How to Connect Universal PostgreSQL
Step 1 - Create a Database in PostgreSQL
- Log in to your pgAdmin tool and select the server in which you want to create the database.
- Create a new database from the Databases section.
- Name your database and click Save.
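Equivalently, the database can be created from SQL instead of the pgAdmin UI — a minimal sketch, assuming a hypothetical database name dataddo_destination:

```sql
-- Hypothetical database name; run as a role with the CREATEDB privilege
CREATE DATABASE dataddo_destination;
```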
Step 2 - Create a PostgreSQL Data Destination
- Sign in to your Dataddo account and click Destinations at the top of the page.
- Click Create Destination in the top right corner.
- From the list of destinations, choose Universal PostgreSQL. You can type the connector's name into the search bar to find it faster.
- Fill out all the fields on the sign-in page to give Dataddo the information it needs to connect to your database, then click Save.
Step 3 - Create a Data Flow
- Click Flows at the top of the page.
- Click Create Flow in the top right corner.
- Click Add Source to connect it to the destination. You can type the connector's name into the search bar to find it faster.
- Once you have selected your data source, click Add Destination.
- From the list of destinations, select PostgreSQL. You can also type the name into the search bar.
- Configure the destination by filling out the necessary fields.
- Name your flow by typing the name at the top of the page.
- Check the Data Preview to verify your configuration.
- Click Save Flow.
How to Configure the Data Flow
- Click the three dots next to the flow and choose Config.
- A window with the configuration details pops up.
Congratulations, your new Data Destination & Data Flow are ready!
Network setup required. Make sure to whitelist the following IP addresses to allow incoming connections from the Dataddo application to your server.
52.17.68.150
52.30.37.137
52.214.115.147
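Depending on your setup, whitelisting typically means both opening your firewall for these addresses and allowing them in PostgreSQL's host-based authentication. A sketch of the corresponding pg_hba.conf entries, assuming a hypothetical database dataddo_destination and role dataddo_user:

```
# pg_hba.conf — hypothetical database, user, and auth method; adjust to your setup
host    dataddo_destination    dataddo_user    52.17.68.150/32      scram-sha-256
host    dataddo_destination    dataddo_user    52.30.37.137/32      scram-sha-256
host    dataddo_destination    dataddo_user    52.214.115.147/32    scram-sha-256
```

After editing the file, reload the server configuration (for example with `SELECT pg_reload_conf();`) so the new rules take effect.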
Delivering Data to Storage
Append-only
When data delivery is scheduled, Dataddo reads the current contents of the data source and performs one or more INSERT operations against your storage. No DELETE or UPDATE operations are performed, so data is written in append-only mode.
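Because rows are only ever appended, repeated extractions of overlapping source data can leave duplicates in the table. A deduplication sketch, assuming a hypothetical table table_name with an id key and a dataddo_insert_date timestamp column:

```sql
-- Hypothetical table and columns: keep only the latest copy of each id
SELECT DISTINCT ON (id) *
FROM table_name
ORDER BY id, dataddo_insert_date DESC;
```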
Deleting Historic Rows
In certain cases, the append-only approach has drawbacks because the table keeps growing. The best remedy is to define an AFTER INSERT trigger that calls a stored procedure which deletes the historic rows. In the examples below, the stored procedure deletes all rows meeting the insert_date (or dataddo_insert_date) < NOW() - INTERVAL '1 days' condition. Dataddo connectors can extract various date fields, so the retention period can be set flexibly.
If the source has insert_date as the timestamp column:

```sql
CREATE FUNCTION delete_old_rows() RETURNS trigger
    LANGUAGE plpgsql
AS $$
BEGIN
    DELETE FROM table_name WHERE insert_date < NOW() - INTERVAL '1 days';
    RETURN NULL;
END;
$$;
```
If the source has dataddo_insert_date as the timestamp column:

```sql
CREATE FUNCTION delete_old_rows() RETURNS trigger
    LANGUAGE plpgsql
AS $$
BEGIN
    DELETE FROM table_name WHERE dataddo_insert_date < NOW() - INTERVAL '1 days';
    RETURN NULL;
END;
$$;
```
The associated trigger fires the stored procedure after every INSERT statement on table_name.
```sql
CREATE TRIGGER trigger_delete_old_rows
    AFTER INSERT ON table_name
    EXECUTE PROCEDURE delete_old_rows();
```
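The retention period is simply the interval in the DELETE condition, so the procedure above can be adapted — for example, a hypothetical variant that keeps 30 days of history instead of one:

```sql
-- Hypothetical variant of the procedure above: keep 30 days instead of 1
CREATE OR REPLACE FUNCTION delete_old_rows() RETURNS trigger
    LANGUAGE plpgsql
AS $$
BEGIN
    DELETE FROM table_name WHERE dataddo_insert_date < NOW() - INTERVAL '30 days';
    RETURN NULL;
END;
$$;
```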
Feel free to contact us and we will help you with the setup. To speed up the resolution of your issue, make sure you provide us with sufficient information.