Data Destinations Overview

A data destination is a location or system where data is sent for further processing, analysis, or storage. This could be a dashboarding application (e.g., Power BI, Tableau, or Looker Studio), an on-premise or hosted database (MySQL, Postgres, SQL Server), a cloud data warehouse (Redshift, Snowflake, Azure SQL Database, Google BigQuery), object storage or a data lake (e.g., S3, Azure Blob Storage, Google Cloud Storage), a business application (Salesforce, Exact Online, Klaviyo), or any other platform designed to receive and manage incoming data.

Choosing the right Data Destination

Selecting an appropriate data destination is a critical step in your data management strategy. The table below matches common use cases with suitable data destinations to help you make an informed decision that supports your specific needs and objectives.

Use case: Data Visualization, Reporting, and BI of simple and small data
Recommended destination: If your datasets consist of fewer than ~100K rows and do not require complex data transformations, a dashboarding app can serve as an efficient destination. This bypasses the need for intermediate storage and streamlines your overall data workflow. Dataddo integrates with a wide range of popular BI tools, including Tableau, Power BI, and Looker Studio.

Use case: Data Visualization, Reporting, and BI of complex or large data
Recommended destination: For data visualization, reporting, and business intelligence tasks involving large or complex datasets, we recommend cloud data warehouses such as Snowflake, BigQuery, Databricks, or Redshift. These platforms are optimized to handle vast amounts of data and perform complex transformations using SQL, and their seamless integration with popular BI tools greatly simplifies the management and analysis of your data (see the verification sketch after this table).

Use case: Building a data lake
Recommended destination: To build a data lake, a repository for large and diverse datasets in their raw form, use object storage such as S3, Azure Blob Storage, or Google Cloud Storage.

Use case: Intermediate storage for complex ETL processes
Recommended destination: For intricate ETL processes involving systems such as Databricks or Azure Synapse, it can be advantageous to use S3, Azure Blob Storage, or Google Cloud Storage as staging areas for raw data.

Use case: Data activation
Recommended destination: If you need to feed enriched and transformed data back into the original applications for improved operations or insights, you can use Dataddo to send data to applications like Salesforce, HubSpot, or any other CRM/ERP system.
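
As a quick sanity check after a flow starts writing to a warehouse destination, you can run an aggregate query against the target table. The sketch below is a minimal example that assumes a Google BigQuery destination; the project, dataset, and table names are hypothetical placeholders, so adapt them (and the client library) to your own warehouse.

```python
# Minimal sketch: verify that rows have landed in a BigQuery destination.
# Assumes application default credentials are configured; the project,
# dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

query = """
    SELECT COUNT(*) AS row_count
    FROM `my-project.analytics.marketing_costs`  -- hypothetical table
"""

for row in client.query(query).result():  # runs the job and waits for it
    print(f"Rows delivered so far: {row.row_count}")
```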

Authorize the Connection to a Destination

  1. Navigate to the Authorizers tab and click on Authorize new service. From the options available, select a destination; you can use the search bar to find it quickly.
  2. In the destination's authorizer configuration, enter all the required information. For assistance, refer to the documentation page for the specific destination, or see the optional connectivity check sketched below.
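
Optionally, before entering credentials into the authorizer, you can confirm from your own environment that the destination accepts remote connections at all. The sketch below is a minimal check that assumes a PostgreSQL destination; the host, database, user, and password are hypothetical placeholders, and other destination types would need their own client library.

```python
# Minimal sketch: confirm a PostgreSQL destination is reachable with the
# same credentials you plan to enter in the authorizer.
# Host, database, user, and password are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="db.example.com",   # hypothetical host
    port=5432,
    dbname="analytics",      # hypothetical database
    user="dataddo_writer",   # hypothetical user
    password="********",
    connect_timeout=10,
)
with conn, conn.cursor() as cur:
    cur.execute("SELECT version();")
    print("Connected:", cur.fetchone()[0])
conn.close()
```

If this check fails, fix network access or credentials on the database side first; the authorization is likely to fail for the same reason.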

Create a Destination

  1. Under the Destinations tab, click on the Create Destination button and select the destination from the list.
  2. Select your account from the drop-down menu.
  3. Name your destination and click on Save to create your destination.
Need to authorize another connection?

Click on Add new Account in the drop-down menu during authorizer selection and follow the on-screen prompts. You can also go to the Authorizers tab and click on Add New Service.

Edit a Destination

Navigate to the Destinations tab, click on the three dots next to your destination, and select Edit. The fields you can edit depend on the destination type.

Apart from configuration details, it is usually also possible to change the destination name. If you need to update the account connected to your destination, you can do so in the Authorizers tab.

Test and Debug Connection

If your destination is broken (marked with a red badge), you can click on the three dots next to it and select Test Connection. A connection log will appear with more details about the issue.

If the test fails, in most cases you will need to reauthorize your service.
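
If reauthorization does not help, the problem is often network-related, for example a firewall on the destination side. As a rough first check from your own environment, you can verify that the destination's host and port are reachable at all; the sketch below is a generic TCP check with hypothetical host and port values.

```python
# Minimal sketch: check whether a destination host accepts TCP connections
# on the expected port. Host and port are hypothetical placeholders.
import socket

host, port = "db.example.com", 5432  # e.g. a PostgreSQL destination

try:
    with socket.create_connection((host, port), timeout=5):
        print(f"{host}:{port} is reachable")
except OSError as exc:
    print(f"Cannot reach {host}:{port}: {exc}")
```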

Delete a Destination

  1. Locate your destination in the Destinations tab and click on the trash can icon next to it.
  2. Type DELETE to confirm your action.
DATADDO TIP

If your destination is connected to a flow, you will need to delete the flow first.


