How to Connect PostgreSQL

Step 1 - Create a Database in PostgreSQL

  1. Log in to your pgAdmin tool and select the server in which you want to create the database.
    PostgreSQL - admin 1

  2. Create a new database from the Databases section.
    PostgreSQL - admin 2

  3. Name your database and click on Save. If you prefer the query tool or psql, see the SQL sketch below these steps.
    PostgreSQL - admin 3
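
If you would rather create the database with SQL than through the pgAdmin UI, a minimal sketch is shown below. The database name dataddo_destination and the role dataddo_user are placeholders; substitute your own names and a strong password.

CREATE DATABASE dataddo_destination;
-- Optional: a dedicated role for Dataddo instead of reusing an existing superuser account
CREATE ROLE dataddo_user WITH LOGIN PASSWORD 'change-me';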

Step 2 - Create a Postgres Data Destination

  1. Sign in to your Dataddo account and click on Destinations at the top of the page.

  2. Click on Create Destination in the top right corner.

  3. From the list of destinations, choose Universal PostgreSQL. You can type the name of the connector into the search bar to find it faster.
    Data destinations - selection

  4. Fill out all the fields on the sign-in page to give Dataddo all the necessary information to connect to your database, and click on Save. If you want to double-check the account's privileges first, see the sketch below these steps.
    PostgreSQL - new destination
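
Before saving, it can help to confirm that the account you hand to Dataddo is actually allowed to create and fill tables. A minimal sketch, assuming the placeholder names dataddo_destination for the database, dataddo_user for the user, and the public schema as the target:

-- Run as a superuser or as the database owner
GRANT CONNECT ON DATABASE dataddo_destination TO dataddo_user;
GRANT USAGE, CREATE ON SCHEMA public TO dataddo_user;
-- If the destination tables already exist, also allow inserts into them
GRANT INSERT ON ALL TABLES IN SCHEMA public TO dataddo_user;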

Step 3 - Create a Data Flow

  1. Click on Flows at the top of the page.

  2. Click on Create Flow in the top right corner.

  3. Click on Add Source to connect it to the destination. You can type the connector's name into the search bar to find it faster.

  4. Once you select your data source, click on Add Destination.

  5. From the list of destinations, select PostgreSQL. You can also type the name into the search bar.

  6. Configure the destination by filling out the necessary fields.

  7. Name your flow by typing the name at the top of the page.

  8. Check the Data Preview to make sure your configuration is correct.
    Flow - preview

  9. Click on Save flow. After the first scheduled run, you can verify the delivery directly in your database, as sketched below.
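
A quick way to confirm that data reached your PostgreSQL destination is to query the destination table after the first run. A minimal sketch, assuming the flow writes into a table called table_name (a placeholder, use the table configured in your destination):

SELECT COUNT(*) AS delivered_rows
FROM table_name;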

How to Configure the Data Flow

  1. Click on the three dots next to the flow and choose Config.

  2. A window will pop up with the configuration details.
    PostgreSQL - flow config

Congratulations, your new Data Destination & Data Flow are ready!

WARNING

Network setup required. Make sure to whitelist the following IP addresses to allow incoming connections from the Dataddo application to your server. If you manage the PostgreSQL host yourself, see the pg_hba.conf sketch after the list.

  • 52.17.68.150
  • 52.30.37.137
  • 52.214.115.147
  • 54.77.45.35
  • 52.210.57.95
  • 18.200.46.19
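
How the whitelisting is done depends on where your server runs (cloud firewall, security group, or the PostgreSQL host itself). If you manage PostgreSQL directly, a minimal pg_hba.conf sketch that admits these addresses might look like the following, assuming the placeholder database dataddo_destination, the placeholder user dataddo_user, and scram-sha-256 authentication (use md5 on older versions):

# TYPE  DATABASE             USER          ADDRESS            METHOD
host    dataddo_destination  dataddo_user  52.17.68.150/32    scram-sha-256
host    dataddo_destination  dataddo_user  52.30.37.137/32    scram-sha-256
host    dataddo_destination  dataddo_user  52.214.115.147/32  scram-sha-256
host    dataddo_destination  dataddo_user  54.77.45.35/32     scram-sha-256
host    dataddo_destination  dataddo_user  52.210.57.95/32    scram-sha-256
host    dataddo_destination  dataddo_user  18.200.46.19/32    scram-sha-256

Remember that listen_addresses in postgresql.conf must also accept remote connections and that the configuration needs to be reloaded afterwards.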

Delivering Data to Storage

Append-only

When data delivery is scheduled, Dataddo takes the current contents of the data source and performs one or more INSERT operations against your storage. No DELETE or UPDATE operations are performed, so the data is written in append-only mode.
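
Conceptually, each scheduled run only appends rows; nothing already in the table is touched. An illustrative sketch, with purely placeholder table names, column names, and values:

-- first scheduled run
INSERT INTO table_name (campaign, clicks, dataddo_insert_date) VALUES ('spring_sale', 120, NOW());
-- next scheduled run: earlier rows stay in place, a fresh snapshot is appended
INSERT INTO table_name (campaign, clicks, dataddo_insert_date) VALUES ('spring_sale', 135, NOW());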

Deleting Historic Rows

In certain cases, the append-only approach can have drawbacks because the database keeps growing. The best way to cope with this is to define an AFTER INSERT trigger that calls a function which deletes the historic data. In the examples below, the function deletes all rows that meet the insert_date < NOW() - INTERVAL '1 days' or dataddo_insert_date < NOW() - INTERVAL '1 days' condition. Dataddo connectors allow extraction of various date information, so the retention period can be set flexibly.

  • In case the source has insert_date as the timestamp column:
CREATE FUNCTION delete_old_rows() RETURNS trigger
    LANGUAGE plpgsql
    AS $$
BEGIN
  DELETE FROM table_name WHERE insert_date < NOW() - INTERVAL '1 days';
  RETURN NULL;
END;
$$;
  • In case the source has dataddo_insert_date as the timestamp column:
CREATE FUNCTION delete_old_rows() RETURNS trigger
    LANGUAGE plpgsql
    AS $$
BEGIN
  DELETE FROM table_name WHERE dataddo_insert_date < NOW() - INTERVAL '1 days';
  RETURN NULL;
END;
$$;

The associated trigger calls the function after every insert operation on the table_name table.

CREATE TRIGGER trigger_delete_old_rows
    AFTER INSERT ON table_name
    EXECUTE PROCEDURE delete_old_rows();
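
To confirm that the trigger fires as intended, you can insert one row inside and one row outside the retention window and check what remains. A minimal sketch, assuming table_name has an integer id column and a timestamp insert_date column (placeholders, use your real schema):

INSERT INTO table_name (id, insert_date) VALUES (1, NOW() - INTERVAL '2 days');
INSERT INTO table_name (id, insert_date) VALUES (2, NOW());
-- Only the recent row (id = 2) should remain
SELECT id, insert_date FROM table_name;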

Need assistance?

Feel free to contact us and we will help you with the setup. To speed up the resolution of your issue, make sure you provide us with sufficient information.

