Google BigQuery

Google BigQuery is a cloud-based data warehouse and analytics platform provided by Google Cloud. It uses a columnar storage format, which allows for fast query performance and efficient data compression, making it well-suited for handling large-scale analytical workloads on vast amounts of data.


Prerequisites

  • You have created a BigQuery project and a dataset.
  • You have authorized a Google BigQuery user account or service account.
  • Your user or service account has the following IAM roles on the project:
    • roles/bigquery.dataEditor
    • roles/bigquery.jobUser
    • roles/bigquery.readSessionUser

Authorize Connection to Google BigQuery

To authorize the connection to Google BigQuery, choose one of the following methods.

  • User account: A simple, standard authorization method using your Google account.
  • Service account: Suitable for larger and more complex deployments.

If you don't have the required permissions to authenticate your service, assign the authorizer role to a team member with these permissions.

User Account Authorization

In Google Cloud Console

To create a user account, follow these steps:

  1. In your Google Cloud Console, select your project and navigate to IAM & Admin, then IAM.
  2. Click on + Add to add a new user and enter the user's email address.
  3. Assign the following roles to the user:
    1. BigQuery Data Editor
    2. BigQuery Job User
    3. BigQuery Read Session User
  4. Save your configuration.

In Dataddo

Authorization via a standard user account is the default option and follows the standard OAuth 2.0 mechanism.

  1. On the Authorizers tab, click on Authorize New Service.
  2. Select Google BigQuery.
  3. Follow the on-screen prompts to finish the authorization process.

Continue with creating a new Google BigQuery data destination.

Service Account Authorization

In general, for larger and more complex deployments, using a Service Account is recommended.

In Google Cloud Console

To create a service account, follow these steps:

  1. In your Google Cloud Console, select your project and navigate to IAM & Admin, then Service Accounts.
  2. At the top of the page, click on + Create Service Account.
  3. Name your service account and, in the Service account permissions (optional) section, assign the following roles:
    1. BigQuery Data Editor
    2. BigQuery Job User
  4. Save your service account.
  5. Click on the three dots next to your newly created service account, choose Manage keys, then Add key.
  6. Choose JSON as the key type. This downloads a JSON key file to your computer, which you will later upload to Dataddo.
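If you prefer the command line, the console steps above can also be scripted with the gcloud CLI. This is a sketch, not Dataddo-provided tooling: PROJECT_ID and the account name dataddo-writer are placeholders you must replace with your own values.

```shell
# Sketch of the console steps above using the gcloud CLI.
# PROJECT_ID and the account name "dataddo-writer" are placeholders.

# 1. Create the service account.
gcloud iam service-accounts create dataddo-writer \
  --project=PROJECT_ID \
  --display-name="Dataddo BigQuery writer"

# 2. Grant the two required BigQuery roles on the project.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:dataddo-writer@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:dataddo-writer@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# 3. Create and download a JSON key file to upload to Dataddo later.
gcloud iam service-accounts keys create key.json \
  --iam-account=dataddo-writer@PROJECT_ID.iam.gserviceaccount.com
```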

In Dataddo

To authorize your service account, make sure you have the JSON key file at hand.

  1. On the Security page, navigate to the Certificates tab and click on Add Certificate.
  2. Name your certificate, select Google Service Account Key as the certificate type, and upload the JSON key file you obtained when creating the service account.
  3. On the Authorizers tab, click on Authorize New Service and select Google BigQuery (Service account).
  4. Select the newly added certificate.
  5. Click on Save.

When creating the authorizer, make sure you select Google BigQuery (Service account).

Create a New BigQuery Destination

  1. On the Destinations page, click on the Create Destination button and select the destination from the list.
  2. Select your authorizer from the drop-down menu.
  3. Name your destination and click on Save.
Need to authorize another connection?

Click on Add New Account in the drop-down menu during authorizer selection and follow the on-screen prompts. You can also go to the Authorizers tab and click on Add New Service.

Create a Flow to BigQuery

  1. Navigate to Flows and click on Create Flow.
  2. Click on Connect Your Data to add your source(s).
  3. Click on Connect Your Data Destination to add the destination.
  4. Choose the write mode and fill in the other required information.
  5. Check the Data Preview to see if your configuration is correct.
  6. Name your flow and click on Create Flow to finish the setup.

Table Naming Convention

When naming your table, make sure the table name does NOT contain:

  • Whitespace characters
  • Dashes (hyphens)
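These two rules can be checked programmatically before you configure the flow. A minimal sketch in Python; the function name is ours for illustration, not part of Dataddo:

```python
import re

def is_valid_table_name(name: str) -> bool:
    """Return True if the name is non-empty and contains no whitespace or dashes."""
    return bool(name) and not re.search(r"[\s-]", name)

is_valid_table_name("sales_2024")   # True
is_valid_table_name("sales 2024")   # False: contains a space
is_valid_table_name("sales-2024")   # False: contains a dash
```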

Table Partitioning

Dataddo supports table partitioning, which splits large datasets into smaller, manageable partitions based on certain criteria, e.g. date.

During flow creation, add date range patterns to your table name:


Using this naming pattern, Dataddo will create a new table every day, e.g. xyz_20xx0101, xyz_20xx0102, etc.
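The exact date-pattern syntax is shown in the Dataddo UI during flow creation. Purely as an illustration of the resulting names, a daily _YYYYMMDD suffix like the one above can be generated as follows (the base name xyz follows the example; the helper is ours):

```python
from datetime import date, timedelta

def daily_table_name(base: str, day: date) -> str:
    """Append a _YYYYMMDD partition suffix to the base table name."""
    return f"{base}_{day.strftime('%Y%m%d')}"

start = date(2024, 1, 1)
names = [daily_table_name("xyz", start + timedelta(days=i)) for i in range(3)]
# names == ["xyz_20240101", "xyz_20240102", "xyz_20240103"]
```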


Troubleshooting

Invalid Grant Error


This issue may be caused by:

  1. Revoked or expired token: Please reauthorize your account.
  2. Insufficient permissions: Make sure that both BigQuery Data Editor and BigQuery Job User roles are assigned to the User account or Service account that you are using.

Invalid Configuration Error


This issue is caused by insufficient row-level uniqueness of your composite key. To solve the issue, please follow these steps:

  1. In the Flows tab, click on your flow.
  2. Edit the composite key to include more columns for distinct row identification when using upsert or delete write modes.
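To judge whether a candidate composite key is unique enough, you can count duplicate key tuples in a sample of your data. A minimal sketch; the helper and the sample rows are illustrative, not a Dataddo API:

```python
from collections import Counter

def duplicate_keys(rows, key_columns):
    """Return composite-key tuples that occur more than once, with their counts."""
    counts = Counter(tuple(row[c] for c in key_columns) for row in rows)
    return {key: n for key, n in counts.items() if n > 1}

rows = [
    {"date": "2024-01-01", "campaign": "a", "clicks": 10},
    {"date": "2024-01-01", "campaign": "a", "clicks": 12},  # same (date, campaign) twice
    {"date": "2024-01-01", "campaign": "b", "clicks": 7},
]
duplicate_keys(rows, ["date"])               # {('2024-01-01',): 3}: too coarse
duplicate_keys(rows, ["date", "campaign"])   # {('2024-01-01', 'a'): 2}: add more columns
duplicate_keys(rows, ["date", "campaign", "clicks"])  # {}: rows uniquely identified
```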

Failed to Prepare Table Error

Erroneous communication with Google Big Query API: failed to prepare table: Failed to create new table

This issue may be caused by:

  1. Table name already in use: Please edit the flow and choose a different table name.
  2. Insufficient permissions: Your BigQuery account does not have sufficient permissions. Make sure that both BigQuery Data Editor and BigQuery Job User roles are assigned to your User account or Service account.

Failed to Convert Value GBQ Data Type Error

Failed to convert value GBQ data type: Column '...' schema is set to '...'

This issue is caused by a data type mismatch between the table in BigQuery and the sources attached to the flow. To solve the issue, please follow these steps:

  1. Check the error message to identify which column is affected and change the data type.
  2. Manually load data in the source.
  3. Restart the flow.
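A quick way to spot the offending column before restarting the flow is to check a sample row against the expected BigQuery types. A rough sketch; the type map is a simplification of ours (it covers only a few common types), not Dataddo's logic:

```python
# Map a few common BigQuery types to the Python types we expect in sample data.
# This mapping is a simplification for illustration only.
EXPECTED = {
    "STRING": str,
    "INT64": int,
    "FLOAT64": (int, float),
    "BOOL": bool,
}

def mismatched_columns(row, schema):
    """Return columns whose sample value does not match the declared BigQuery type."""
    return [
        col for col, bq_type in schema.items()
        if not isinstance(row.get(col), EXPECTED[bq_type])
    ]

schema = {"clicks": "INT64", "cost": "FLOAT64", "campaign": "STRING"}
row = {"clicks": "10", "cost": 1.5, "campaign": "a"}  # clicks arrived as text
mismatched_columns(row, schema)  # ["clicks"]
```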

Column Not Found in Schema Error

Column not found in schema: Column ... not found in schema

This issue may be caused by:

  1. Updated schema due to a column in the source being renamed:
    • Dataddo generates flow schemas based on the attached sources. If you changed a column name in the source connected to the flow, it will result in a mismatch between the expected target table schema and the current one.
    • Either revert the column name in the source to its previous value or adjust the column name in the target table to match the new name.
  2. Changed source within the flow that led to a schema discrepancy: For the sake of data integrity, changes in the source schema are not passed down to downstream database destinations. This means that table schemas won't update automatically without your explicit consent.
    1. If you can delete data in the BigQuery table: Delete the table in its entirety and reset the flow. Upon the next execution, Dataddo will attempt to establish a new table.
    2. If you cannot delete data in the BigQuery table: Consider manually adding any absent columns directly in Google BigQuery.
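To find out which columns the target table is missing before adding them manually, compare the flow's expected column list with the table's actual one. A minimal sketch; the column lists are illustrative:

```python
def missing_columns(expected, actual):
    """Return columns the flow expects that the target table lacks, in order."""
    actual_set = set(actual)
    return [c for c in expected if c not in actual_set]

expected = ["date", "campaign", "clicks", "cost"]
actual = ["date", "campaign", "clicks"]
missing_columns(expected, actual)  # ["cost"]
```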

Billing Not Enabled


Error 403: Billing has not been enabled for this project. Enable billing at Datasets must have a default expiration time and default partition expiration time of less than 60 days while in sandbox mode.

This issue is caused by a billing problem in Google BigQuery. Please enable billing for your project in the Google Cloud Console.

Frequent OAuth Reauthorization Required

Google accounts authorized using OAuth 2.0 may require frequent reauthorization. If not anticipated, this may cause your flows to break unexpectedly.

To avoid this issue, authorize your Google account using a service account instead.
