How to Connect Universal PostgreSQL

Step 1 - Create a Database in PostgreSQL

  1. Log into your PgAdmin tool. Select the server that you want to create the database in.
    PostgreSQL - admin 1
  2. Create a new database from the database section.
    PostgreSQL - admin 2
  3. Name your database and then click on save.
    PostgreSQL - admin 3
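
If you prefer working outside PgAdmin, the same database can also be created from the command line. A minimal sketch, assuming you have psql access to the server; the database name is only an example:

```sql
-- "dataddo_data" is a placeholder name; use whatever name fits your setup.
-- Run while connected to the target PostgreSQL server, e.g. via psql.
CREATE DATABASE dataddo_data;
```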

Step 2 - Create a PostgreSQL Data Destination

  1. Sign in to your Dataddo account and click on Destinations at the top of the page. 
  2. Click on Create Destination in the top right corner.
  3. From the list of destinations, choose Universal PostgreSQL. You can type the name of the connector into the search bar to find it faster.
    Data destinations - selection
  4. Fill out all the fields on the sign-in page to give Dataddo all the necessary information to connect your data, and click on Save.

    PostgreSQL - new destination

Step 3 - Create a Data Flow

  1. Click on Flows at the top of the page. 
  2. Click on Create Flow in the top right corner.
  3. Click on Add Source to connect it to the destination. You can type the connector's name into the search bar to find it faster.
  4. Once you select your data source, click on Add Destination.

  5. From the list of destinations, select PostgreSQL. You can also type the name into the search bar.
  6. Configure the destination by filling out the necessary fields.
  7. Name your flow by typing it into the field at the top.
  8. Check the Data Preview to make sure your configuration is correct.
    Flow - preview
  9. Click on Save flow.

How to Configure the Data Flow

  1. Click on the three dots next to the flow and choose Config.
  2. A window with the configuration details will pop up.
    PostgreSQL - flow config

Congratulations, your new Data Destination & Data Flow are ready!

WARNING

Network setup required. Make sure to whitelist the following IP addresses to allow incoming connections from the Dataddo application to your server:

52.17.68.150

52.30.37.137

52.214.115.147
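
Depending on your server setup, whitelisting may involve both your firewall and PostgreSQL's own client authentication. As a sketch, the corresponding entries in pg_hba.conf could look like the following; the database name, user, and authentication method here are assumptions, so adjust them to your configuration:

```
# Example pg_hba.conf entries: allow the Dataddo IP addresses to connect
# to the "dataddo_data" database as user "dataddo" (placeholder values).
host    dataddo_data    dataddo    52.17.68.150/32      scram-sha-256
host    dataddo_data    dataddo    52.30.37.137/32      scram-sha-256
host    dataddo_data    dataddo    52.214.115.147/32    scram-sha-256
```

After editing pg_hba.conf, reload the server configuration for the changes to take effect.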


Delivering Data to Storage

Append-only

When data delivery is scheduled, Dataddo retrieves the current contents of the data source and performs one or more INSERT operations against your storage. No DELETE or UPDATE operations are performed, so the data is written in append-only mode.
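
For illustration, each scheduled delivery behaves like a plain batch of inserts. The table and column names below are hypothetical:

```sql
-- Hypothetical table and columns; each delivery only appends new rows.
INSERT INTO campaign_metrics (date, campaign, clicks, dataddo_insert_date)
VALUES
    ('2024-01-01', 'brand', 120, NOW()),
    ('2024-01-02', 'brand',  98, NOW());
-- No UPDATE or DELETE is ever issued, so existing rows stay untouched.
```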

Deleting Historic Rows

In certain cases, the append-only approach may have drawbacks because the database keeps growing. The best way to cope with this is to define an AFTER INSERT trigger that calls a stored procedure to delete the historic data. In the examples below, the stored procedure deletes all rows matching the condition insert_date (or dataddo_insert_date) < NOW() - INTERVAL '1 day'. Dataddo connectors allow extraction of various date information, so the retention period can be set flexibly.

  • If the source uses insert_date as the timestamp column:
CREATE FUNCTION delete_old_rows() RETURNS trigger
LANGUAGE plpgsql
AS $$
BEGIN
    DELETE FROM table_name WHERE insert_date < NOW() - INTERVAL '1 day';
    RETURN NULL;
END;
$$;
  • If the source uses dataddo_insert_date as the timestamp column:
CREATE FUNCTION delete_old_rows() RETURNS trigger
LANGUAGE plpgsql
AS $$
BEGIN
    DELETE FROM table_name WHERE dataddo_insert_date < NOW() - INTERVAL '1 day';
    RETURN NULL;
END;
$$;

The associated trigger calls the stored procedure after every INSERT operation on table_name.

CREATE TRIGGER trigger_delete_old_rows
AFTER INSERT ON table_name
FOR EACH STATEMENT
EXECUTE PROCEDURE delete_old_rows();

 


Need assistance?

Feel free to contact us or create a ticket and we will help you with the setup. To speed up the resolution of your ticket, make sure you provide us with sufficient information.