Core concepts

Dataddo allows you to connect your sources and destinations in a few simple steps. To get the most out of your experience, let's take it from the beginning.

Basics

What is a Data Source?

A Data Source represents a connection to any third-party system from which data is extracted, such as SaaS apps (e.g. Salesforce, NetSuite, HubSpot, Stripe, Klaviyo, Facebook, Google Analytics), databases (e.g. MySQL, Postgres, SQL Server), cloud data warehouses (e.g. BigQuery, Redshift, Snowflake), or file storages (e.g. S3, SFTP). The initial setup of a source is configured via Connectors.
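
For illustration only, the sketch below models what a source conceptually bundles together: a connector, credentials, and an extraction schedule. The class and field names are assumptions made for this example, not the actual Dataddo API; in practice you configure all of this through the Dataddo interface.

```python
from dataclasses import dataclass

# Purely illustrative model of a Data Source -- not the Dataddo API.
@dataclass
class DataSource:
    name: str          # human-readable label, e.g. "Salesforce - Opportunities"
    connector: str     # third-party service the data is extracted from
    credentials: dict  # authorized account used by the connector
    schedule: str      # how often extraction runs, e.g. "daily"

# A hypothetical Salesforce source, extracted once per day.
salesforce_source = DataSource(
    name="Salesforce - Opportunities",
    connector="salesforce",
    credentials={"account": "my-authorized-salesforce-account"},
    schedule="daily",
)
```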

For more information see Data Sources.

What is a Data Destination?

A Data Destination represents an endpoint to which data from sources is delivered, such as databases (e.g. MySQL, Postgres, SQL Server), cloud data warehouses (e.g. BigQuery, Redshift, Snowflake), file storages (e.g. S3, SFTP), or dashboarding apps (e.g. Tableau, Power BI, Qlik, Data Studio).


If Dataddo supports direct connections to dashboards, do I need a data warehouse at all?

Although Dataddo features embedded storage, SmartCache, it is not meant to be a replacement for a data warehouse, but rather a straightforward solution for cases when you do not need to store large volumes of data or run heavy data transformations. As a rule of thumb, if you need to store more than 100,000 rows per source, or need transformations more complex than joins, we recommend using a data warehouse.
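
The rule of thumb above can be captured in a short, purely illustrative helper. The function name and its inputs are assumptions made here for clarity and are not part of any Dataddo tooling.

```python
# Illustrative only: encodes the rule of thumb above, not any Dataddo logic.
def needs_data_warehouse(rows_per_source: int, transforms_beyond_joins: bool) -> bool:
    """Return True when SmartCache alone is likely not enough."""
    ROW_LIMIT = 100_000  # rule-of-thumb threshold per source
    return rows_per_source > ROW_LIMIT or transforms_beyond_joins

# Example: 250,000 rows per source with only simple joins -> use a data warehouse.
print(needs_data_warehouse(250_000, transforms_beyond_joins=False))  # True
```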

For more information see Data Destinations.

What is a Data Flow?

One of the great benefits of the Dataddo architecture is that data extraction and data delivery operations are decoupled. In practice, this means that you can define loose associations between Data Sources and Data Destinations. As an example, you can route a single extraction of Salesforce data to multiple data warehouses, and vice versa.
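
Because extraction and delivery are decoupled, the association between sources and destinations can be many-to-many. The sketch below is a conceptual illustration of that idea using made-up names; it does not reflect the actual Dataddo API.

```python
# Conceptual illustration of decoupled extraction and delivery -- not the Dataddo API.
# One extraction of Salesforce data is reused by two flows,
# each delivering to a different data warehouse.
salesforce_extraction = "salesforce_opportunities"  # extracted once

flows = [
    {"source": salesforce_extraction, "destination": "bigquery_warehouse"},
    {"source": salesforce_extraction, "destination": "snowflake_warehouse"},
]

# And vice versa: another source can feed an already used destination.
flows.append({"source": "google_analytics_sessions", "destination": "bigquery_warehouse"})

for flow in flows:
    print(f"{flow['source']} -> {flow['destination']}")
```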

For more information see Data Flows.

Configuring your first data pipeline

Assuming you have successfully set up your Dataddo account and understand the Basics, let's proceed with the creation of your first data pipeline. The process itself is very straightforward; a short end-to-end sketch follows the steps below.

  1. Create a Data Source.
  2. Create a Data Destination.
  3. Create a Data Flow.
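
As a recap of these three steps, the sketch below wires one hypothetical source to two destinations through a single flow. All names and structures are assumptions made for illustration; in practice each step is performed in the Dataddo interface.

```python
# Illustrative end-to-end recap of the three steps -- not the Dataddo API.
# Step 1: create a Data Source (where data is extracted from).
source = {"name": "Salesforce - Opportunities", "connector": "salesforce"}

# Step 2: create Data Destinations (where data is delivered to).
warehouse = {"name": "BigQuery - analytics", "type": "bigquery"}
dashboard = {"name": "Tableau - sales board", "type": "tableau"}

# Step 3: create a Data Flow routing the single extraction to both destinations.
flow = {
    "source": source["name"],
    "destinations": [warehouse["name"], dashboard["name"]],
}

print(f"Flow delivers '{flow['source']}' to: " + ", ".join(flow["destinations"]))
```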

Need assistance?

Feel free to contact us and we will help you with the setup. To speed up the process of resolving your issue, make sure you provide us with sufficient information.

