Dataddo enables you to load historical data and add it to your Data Source.
If you come across a case where you need to add a specific range of your historical data to your data set, you can do so with a simple one-time request in the Dataddo app.
If your Data Source is already set up and running, taking snapshots on schedule and sending them to your data destination, and you only want to load some historical data (and possibly send it to your destination as well), you can skip straight to Step 2. If not, Step 1 walks you through setting up a Data Source and a Data Flow.
STEP 1 - Creating a Data Source and a Data Flow
Let's say we want to connect Google Analytics to your MySQL database and have data pushed into it weekly. Every Friday morning, we want to fetch data from the previous Friday through the end of Thursday of the current week and send it to MySQL.
Navigate to Sources and click on New source in the top right corner. From the list of sources, pick the one you want, in this example Google Analytics. Make sure that the Google Account and Analytics Profile fields are filled in correctly and that you have picked all the Metrics and Dimensions you want.
Next, we need to select a date range. In our case, this is Last 7 days. If you want to pull data daily, set this to Yesterday; if you want to do this monthly, set it to Last month.
Lastly, we can set up the AUTOMATIC DATA SYNCHRONIZATION. We want to load data every week on Friday morning. The setup should look something like this:
Clicking on Connect creates your Data Source, which is then ready to be used in a Data Flow. Every Friday at 6 am Zurich time, Dataddo will pull data from the last 7 days, as intended.
To connect this Data Source with a database or a BI tool, click on Add to flow. Next, click on New destination and choose from the list of destinations, in our case MySQL. Label your source and pick a name for your database table. Next, you can configure automation. You want your data to be saved to your table every Friday morning, just after Dataddo fetches it. The setup should look something like this:
Click on Save Data Flow, and your data will be sent to the newly created table. Your Data Source and Data Flow are now all set, and you can move on to Step 2 - loading historical data.
STEP 2 - Loading historical data
For most sources (such as Google Analytics, Google Ads, Facebook Ads, Instagram Ads, etc.), this process is rather simple and straightforward. Navigate to Sources in your Dataddo account, find the source you want to work with, and click on the "Ad-hoc data load" button.
Now you can set a specific date range for your historical data. Click Apply, choose whether you want to append this historical data to your existing data or replace it, confirm your action, and load the data.
You can run Display log on your source panel to check whether the operation completed successfully.
If you want to send this historical data to your database, in our case MySQL, navigate to the Data Flows tab in your account, find the Data Flow connected to this source, and simply click on Ad-hoc data insert. Confirm your action, and the historical data is sent to your table.
You can run Log to check whether the operation completed successfully. Your historical data has now been loaded into Dataddo and sent to your database, and you are done.
DATE RANGE in Ad-hoc data load not possible
In some cases, selecting DATE RANGE in the Ad-hoc data load panel is not possible. When this occurs, there is still a way to get your historical data. The process is a bit more advanced but feasible by following this guide. Nonetheless, if you still need help with loading historical data, feel free to contact us.
Navigate to Data Sources in your Dataddo account, find the source you want to work with, and click on the "Edit data source" button.
Scroll down to the URL field. Here, you can see the current setting. You will always see something like "startDate" and "endDate" in the link; in this example, we have "since_send_time" and "before_send_time". These specify the date range of the data to be fetched by Dataddo.
The expression in the curly brackets defines a dynamic date range. It is originally set to the date range you chose during the configuration of the source (such as Yesterday, Last 7 days, Last 30 days, etc.). In this case, the range is set to the last 30 days, including today.
If you want to load historical data, the easiest way to do this is to either specify the exact start and end date and time of the period for which you want data, or use the dynamic date range. You set this in the URL.
Case 1: Static date range
Let's say we want to pull data from 1 September 2020 until the end of 14 September 2020. In that case, you would need to change the term in the curly brackets to the exact dates in ISO 8601 format, so our example URL would be:
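As an illustrative sketch only: the base URL below is hypothetical (your real source URL points at the connector's actual endpoint), and the parameter names are the "since_send_time" / "before_send_time" pair from the example above. The idea is simply to replace the dynamic expression with fixed ISO 8601 timestamps for the period you want:

```python
from datetime import datetime
from urllib.parse import urlencode

# Hypothetical base URL for illustration; your real source URL will differ.
BASE = "https://api.example.com/v3/reports"

# The exact period: 1 September 2020 through the end of 14 September 2020,
# written as ISO 8601 timestamps.
params = {
    "since_send_time": "2020-09-01T00:00:00+00:00",
    "before_send_time": "2020-09-14T23:59:59+00:00",
}

url = BASE + "?" + urlencode(params)
print(url)

# Sanity check: both timestamps are valid ISO 8601.
datetime.fromisoformat(params["since_send_time"])
datetime.fromisoformat(params["before_send_time"])
```

Note that the end timestamp is the last second of 14 September, so the whole final day is included in the range.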
Case 2: Dynamic date range
Let's say we want to pull data for the last year. Then the expression in the curly brackets needs to be set to 1y1, so our example URL would be:
If we wanted, for example, 90 days ago until yesterday, we would use 90d1, so our example URL would be:
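To make the expressions above concrete, here is a small sketch that resolves them to actual dates. The reading of the notation is an assumption based on the examples in this article (not official documentation): the leading number and unit say how far back the range starts, and the trailing digit is the end offset in days (1 = yesterday, 0 = today):

```python
import re
from datetime import date, timedelta

def resolve_range(expr, today=None):
    """Resolve a dynamic date-range expression such as '90d1' or '1y1'.

    Assumed reading (inferred from the examples, not official docs):
    '<N><unit><offset>' means a range starting N units back and ending
    <offset> days back (1 = yesterday, 0 = today).
    """
    today = today or date.today()
    m = re.fullmatch(r"(\d+)([dmy])(\d)", expr)
    if not m:
        raise ValueError(f"unrecognised expression: {expr}")
    n, unit, end_offset = int(m.group(1)), m.group(2), int(m.group(3))
    end = today - timedelta(days=end_offset)
    if unit == "d":
        start = today - timedelta(days=n)
    elif unit == "m":
        start = today - timedelta(days=30 * n)   # rough month
    else:  # "y"
        start = today - timedelta(days=365 * n)  # rough year
    return start, end

# 90 days ago through yesterday, evaluated on 15 September 2020:
start, end = resolve_range("90d1", today=date(2020, 9, 15))
print(start, end)  # 2020-06-17 2020-09-14
```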
Once the date range of the historical data we want to load is set in the URL, we save the changes. Now we are ready to load the historical data, simply by clicking on the "Ad-hoc data load" button. Choose whether you want to append this historical data to the data you have already fetched, or replace it. Confirm, and the historical data gets loaded. You can once again run Display log on your source panel to check whether the operation completed successfully.
The last thing you need to do is either return the URL to its original setting or, using the dynamic range, pick a new date range to be pulled every time Dataddo takes a new snapshot.
For example, if the schedule is set to Weekly and we always want to collect only the data from the last seven days (not including today), we simply use the expression 7d1 in the URL. We can also choose whether we want the new snapshot to replace all the previous ones or append to them. Choosing the latter, our edit window would look like this:
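As a quick sanity check: assuming 7d1 means "start seven days back, end one day back", a Friday-morning run covers the previous Friday through Thursday, which is exactly the weekly window described in Step 1:

```python
from datetime import date, timedelta

# A weekly Friday-morning snapshot with expression 7d1 should cover the
# previous Friday through yesterday (Thursday). Check with a sample Friday:
run_day = date(2021, 1, 15)          # a Friday
start = run_day - timedelta(days=7)  # seven days back: previous Friday
end = run_day - timedelta(days=1)    # one day back: Thursday
print(start.strftime("%A"), "->", end.strftime("%A"))  # Friday -> Thursday
```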
Now, you have all the historical data loaded and you are all set for the next snapshot.
If you want to send this historical data to your database, navigate to the Data Flows tab in your account, find the Data Flow connected to this source, and simply click on Ad-hoc data insert. Confirm your action, and the historical data is sent to your database.