How to start

Sending your data from a source to a destination is easy. Follow these steps to get your data flowing immediately.

First, create a source. Second, create a destination. Lastly, connect the two with a flow.


How to create a dataset-based data source

   
Step 1

Sign in to your Dataddo account and click on Sources at the top of the page.
[Screenshot: Home Page - Data Source]

Step 2

Click on Create Source in the top right corner.
[Screenshot: Data Sources - Create Source]

Step 3

From the list of sources, choose the connector. You can type the name of the connector into the search bar to find it faster.
[Screenshot: Facebook Graph - step 3]
Step 4

First, select the Dataset and click on Next.

NOTE: If you are not sure which Dataset you need, but you know which Metrics and Attributes you are looking for, you can use the Search by name or attribute function or you can browse through all the possible Attributes and Metrics for each Dataset by clicking on the specific Dataset.
[Screenshot: Facebook Graph - step 4]

Step 5

From the drop-down menu, choose your Account connected to Dataddo, and then fill in the other fields, such as campaign or ads account. Click on Next to continue with the setup.

[Screenshot: Facebook Graph - step 5]

a) Add new account

If you want to connect a new account that is not on the list, click on Add Account. You will be redirected to the sign-in or authorization page to confirm the account. Once you confirm, you will be redirected back to the Dataddo app.
[Screenshot: Facebook Graph - step 5, add new account]

[Screenshot: Facebook Graph - step 5a]

NOTE: If the authorization requires an API key/token, data label, domain/subdomain, or other additional information, search for the connector's page in the top search bar of the Dataddo Documentation Site for more information.

b) Authorize Your Account

If you have not authorized your Account with Dataddo, click on Authorize and you will be redirected to your account's log-in or authorization page.
[Screenshot: Facebook Graph - step 5, authorize]
[Screenshot: Facebook Page Insights - step 4c]
After you log in and give Dataddo the necessary permissions to access your data, you will be redirected back to the connector.

Step 6

Choose a Name for your Data Source and select the Attributes. To continue to the next page, click Next.

NOTE: The Metrics and Attributes depend on the Dataset you selected in the previous step. If you need different Metrics and Attributes, go back two steps and change the Dataset.

[Screenshot: Facebook Graph - step 6]

Connecting a non-dataset-based data source

Some connectors, such as Google Ads, Facebook (except Facebook Graph), Instagram ... are not dataset-based, and the configuration of such data sources is slightly different.

   
Step 1

Sign in to your Dataddo account and click on Sources at the top of the page.
[Screenshot: Home Page - Data Source]

Step 2

Click on Create Source in the top right corner.
[Screenshot: Data Sources - Create Source]

Step 3

From the list of sources, choose the connector. You can type the name of the connector into the search bar to find it faster.
[Screenshot: Google Ads - step 3]
Step 4

From the drop-down menu, choose your Account connected to Dataddo, and then fill in the other fields, such as campaign or ads account. Click on Next to continue with the setup.
[Screenshot: Google Ads - step 4]
a) Add new account

If you want to connect a new Google Account that is not on the list, click on Add Account. You will be redirected to the sign-in or authorization page to confirm the right account. Once you confirm, you will be redirected back to the Dataddo app.
[Screenshot: Google Ads - step 4, add new account]
[Screenshot: Google Ads - step 4b]

NOTE: If the authorization requires an API key/token, data label, domain/subdomain, or other additional information, search for the connector's page in the top search bar of the Dataddo Documentation Site for more information.

b) Authorize Google Account

If you have not authorized your Google Account with Dataddo, click on Authorize and you will be redirected to your account's log-in or authorization page.
[Screenshot: Google Ads - step 4a]
After you log in and give Dataddo the necessary permissions to access your data, you will be redirected back to the connector.
Step 5

Choose a Name for your Data Source and select the Attributes. To continue to the next page, click Next.

NOTE: The fields may depend on the Report type or other filters you select.

[Screenshot: Google Ads - step 5]
[Screenshot: Google Ads - step 5a]

Storage & Snapshotting

   
Step 6

On the Storage selection page, you can set up the destination configuration. To help with the snapshot mapping, Dataddo lets you create a connection to the destination based on your preference:

  • A) Click on the option Dashboarding App if you want to connect your data to applications such as Google Data Studio, Power BI, or Tableau.
  • B) Click on the option Data Warehouse if you want your data to be sent to warehouse storage, e.g., MySQL, BigQuery, or PostgreSQL.
[Screenshot: Google Ads - step 6]
Step 7

Configure your snapshotting preferences by choosing your Data range*, Sync type, Sync frequency, Time, and Timezone. Confirm your setup by clicking on Next.

NOTE: *If the Data range is available to select, you can load historical data. Read more about it in our guide.

[Screenshot: Google Ads - step 7]

Step 8

In the last step, you can preview your data. Click on Save and your new Data Source is ready.
[Screenshot: Google Ads - step 8, preview]
Broken view / Error message
If you cannot see a preview of your data, go back a few steps and check your setup. The most common causes are:
  • Date range - we recommend a smaller date range. If you need to load historical data, check our guide.
  • Invalid metrics, attributes or breakdowns, or their combination - you may not have any values for them.
[Screenshot: Google Ads - step 8, broken preview]

 


How to connect a Data Destination

The main difference between data warehouses and dashboarding apps is that data warehouses are designed to store large volumes of data and generally allow for more flexibility in how that data is stored.

Some destinations require specific steps during configuration. Search for the name of the destination in the search bar above for more information.

   
Step 1

Sign in to your Dataddo account and click on Destinations at the top of the page.
[Screenshot: Home Page - Destination]

Step 2

Click on Create Destination in the top right corner.
[Screenshot: Data Destination - Create Destination]

Step 3

From the list of destinations, choose the destination connector. You can type the connector's name into the search bar to find it faster.

NOTE: Most dashboarding apps, such as Power BI or Google Data Studio, do not require authorization; you connect them directly when creating a flow.
[Screenshot: Data destinations - all]

Step 4

Choose your account and click on Next.
[Screenshot: Google BigQuery - step 4a]

NOTE: If you have not authorized your account, click on Authorize. You will be redirected to the sign-in page. Once signed in, continue with the steps in Dataddo.

Step 5

Fill out all the fields on the sign-in page to give Dataddo all the necessary information to connect your data, and click on Save.
[Screenshot: Google BigQuery - step 4b]

 

How to connect a Data Flow

   
Step 1

Sign in to your Dataddo account and click on Flows at the top of the page.
[Screenshot: Home Page - Data Flow]

Step 2

Click on Create Flow in the top right corner.
[Screenshot: My Flows - Create Flow]

Step 3

Click on Add Source and select one from the list of all your sources, or type the connector's name into the search bar to find it faster.
[Screenshot: My Flows - New Flow]
Step 4

Once you have selected your data source, click on Add Destination.
[Screenshot: Google BigQuery - Flow 4]
Step 5

 

Choose from the list of destinations: under the Data Storages tab you will find databases, and under the Dashboarding App tab you will find your dashboarding tool. You can also type the name of the destination into the search bar.

Another option is to create a new destination right in this step. You will be redirected to the destination connector authorization and then back to the flow editor.
[Screenshot: Google BigQuery - Flow 5]

Step 6

Configure the destination by filling out the necessary fields. You can view what data will be sent by clicking on the Data Preview tab. Click on Save Flow.
[Screenshot: Google BigQuery - Flow 6a]

NOTE: Remember to schedule the flow with a small delay after your source's synchronization so the data has time to load.

How to configure a Data Flow

Automatic configuration for databases

Dataddo will try to automatically create a table when the write operation is triggered for the first time. Date and time values will be stored as TIMESTAMP type, integers as INTEGER type, floating numerics as FLOAT, and strings as STRING type. If the operation fails, please proceed to the Troubleshooting section or continue with a manual configuration.
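
To make the mapping concrete, here is a minimal sketch of the kind of table the automatic configuration aims to produce, assuming a hypothetical source with a date, two numeric metrics, and a campaign name. The table and column names are invented for illustration, and the exact type names differ between databases (for example, MySQL and PostgreSQL use VARCHAR or TEXT where BigQuery uses STRING).

```sql
-- Illustrative sketch only: a table shaped the way Dataddo's automatic
-- configuration maps source fields to column types.
CREATE TABLE ads_performance (
    date_start  TIMESTAMP,     -- date and time values -> TIMESTAMP
    impressions INTEGER,       -- integers             -> INTEGER
    cost        FLOAT,         -- floating numerics    -> FLOAT
    campaign    VARCHAR(255)   -- strings              -> STRING (VARCHAR/TEXT in most SQL databases)
);
```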

Manual configuration

   
Step 1

To configure the flow, go to your Flows and click on the three dots next to the new flow. Click on Configure.
[Screenshot: Google BigQuery - configuration 1]

Step 2

You will see instructions for setting up your flow in the destination. After that, your data flow is ready.

If you selected a Dashboarding App, the system will generate the parameters to connect with. Click on the Dashboarding App section to see them.
[Screenshot: Dashboarding app - configuration 2]

If you chose a Data Warehouse or storage, you will see instructions for setting up your table. After that, your data flow will be live.
[Screenshot: Google BigQuery - configuration 2]

Flow Troubleshooting

The table was not automatically created

Applies to SQL databases: MySQL, Azure SQL, Universal SQL Server, Vertica, Snowflake, CockroachDB, AWS Redshift, AWS RDS (MySQL), AWS RDS (SQL Server), AWS RDS (PostgreSQL), BigQuery, Google Cloud SQL (MySQL), Google Cloud SQL (PgSQL), Universal PostgreSQL, Universal MySQL.

If the flow turned into a broken state after its creation, the table failed to be created. Click on the three dots next to the flow and choose Display Log to look for the error description. In most cases, the problem is one of the following:

  • Insufficient permissions to create the table. Make sure that the authorized user has at least a WRITER role or equivalent write privileges.
  • The table already exists. Delete the existing table and restart the flow. Both situations are sketched in the SQL example after this list.
  • Check the flow configuration.
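
The first two causes can usually be fixed directly in the destination database. Below is a hedged, PostgreSQL-style sketch; the user, schema, and table names are examples only, and the exact privilege statements depend on your database.

```sql
-- Example only: grant the Dataddo-authorized user the rights to create and write tables.
GRANT USAGE, CREATE ON SCHEMA public TO dataddo_user;
GRANT INSERT, SELECT ON ALL TABLES IN SCHEMA public TO dataddo_user;

-- Example only: if the table already exists, drop it so the flow can recreate it on restart.
DROP TABLE IF EXISTS public.ads_performance;
```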

Flow is broken after changing the source

Applies to SQL databases

In order to maintain data consistency, Dataddo does not propagate changes made at the flow level to downstream database destinations (i.e., table schemas are not updated automatically without your knowledge).

If your flow broke after changing the source, the updated schema most likely does not match the table that was already created. Click on the three dots next to the flow and choose Display Log to look for the error description. If the data collected in the database table can be deleted, delete the entire table and reset the flow; Dataddo will attempt to create a new table. If the data cannot be deleted, try manually adding the missing columns.
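
Adding the missing columns manually is typically one ALTER TABLE statement per column. A minimal sketch, assuming the hypothetical ads_performance table from above gained an ad group attribute and a clicks metric after the source change; take the actual column names and types from the schema reported in the flow's error log.

```sql
-- Example only: add the columns the updated source schema now expects.
ALTER TABLE ads_performance ADD COLUMN ad_group VARCHAR(255);
ALTER TABLE ads_performance ADD COLUMN clicks INTEGER;
```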

Experiencing data duplicates

For destinations that are primarily append-only, the recommended approach is to use the INSERT write strategy. However, this approach can result in duplicates in your data. To avoid that, consider the other write strategies, illustrated in the sketch after this list:

  • TRUNCATE INSERT. This strategy removes all the contents of the destination table prior to data insertion.
  • UPSERT. This strategy inserts new rows and updates existing ones. To perform this correctly, it is necessary to set a unique key representing one or multiple columns.
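
Dataddo performs these writes for you, but the following sketch shows in plain SQL what each strategy amounts to, using the hypothetical ads_performance table with (date_start, campaign) as the unique key. The UPSERT syntax shown is PostgreSQL's INSERT ... ON CONFLICT; MySQL and BigQuery express the same idea with INSERT ... ON DUPLICATE KEY UPDATE and MERGE respectively.

```sql
-- Illustration only; table and column names are invented for the example.

-- INSERT: append new rows; re-running the same load can create duplicates.
INSERT INTO ads_performance (date_start, campaign, impressions, cost)
VALUES ('2024-01-01', 'spring_sale', 1200, 34.5);

-- TRUNCATE INSERT: empty the destination table, then insert the fresh snapshot.
TRUNCATE TABLE ads_performance;
INSERT INTO ads_performance (date_start, campaign, impressions, cost)
VALUES ('2024-01-01', 'spring_sale', 1200, 34.5);

-- UPSERT: insert new rows and update existing ones matched by the unique key
-- (PostgreSQL syntax; requires a unique constraint on (date_start, campaign)).
INSERT INTO ads_performance (date_start, campaign, impressions, cost)
VALUES ('2024-01-01', 'spring_sale', 1250, 36.0)
ON CONFLICT (date_start, campaign)
DO UPDATE SET impressions = EXCLUDED.impressions,
              cost        = EXCLUDED.cost;
```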

Flow with UPSERT strategy is failing with invalid_configuration message

The combination of columns that you have chosen does not produce a unique index. Edit the flow and include more columns in the index.
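
You can check directly in the destination which column combination actually identifies rows uniquely. A minimal sketch, again assuming the hypothetical ads_performance table: if the query returns any rows, the chosen key columns are not unique and the flow's key needs more columns.

```sql
-- Example only: rows returned here share the same (date_start, campaign) key,
-- so that combination cannot serve as a unique index on its own.
SELECT date_start, campaign, COUNT(*) AS duplicate_rows
FROM ads_performance
GROUP BY date_start, campaign
HAVING COUNT(*) > 1;
```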


Need assistance?

Feel free to contact us or create a ticket and we will help you set up the Data Source.