
Configuring a direct connection to Google BigQuery

Updated by Ramya Priya

BigQuery Connector

Loading from a table

Ensure the table's schema is compatible with Tellius for seamless integration and data loading.
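
To check compatibility, the table's column names and data types can be reviewed before loading by querying the dataset's INFORMATION_SCHEMA. The sketch below is only an illustration; my_project, my_dataset, and my_table are placeholder names:

-- List the columns and data types of the table to be loaded,
-- so they can be checked against the types supported by Tellius.
SELECT column_name, data_type
FROM `my_project.my_dataset`.INFORMATION_SCHEMA.COLUMNS
WHERE table_name = 'my_table'
ORDER BY ordinal_position;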

Loading from external tables

External tables refer to data that is stored outside of BigQuery. For external tables, we recommend using a query (such as 'SELECT * FROM external_table') to load the data.

This is important because external tables differ from regular tables in that they do not have a storage layer within BigQuery. Instead, the table data is stored externally, with only the metadata layer present in BigQuery.
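
For example, assuming a placeholder project my_project, dataset my_dataset, and external table my_external_table, the load query could look like this:

-- Load the external table through a query rather than a direct table read,
-- since the data itself is stored outside BigQuery.
SELECT *
FROM `my_project.my_dataset.my_external_table`;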

Loading from a View

By default, BigQuery does not materialize the data returned by a View. The View must first be materialized by saving its output to a separate table, and the data is then loaded from that table.

The intermediate table is a temporary table with a Time-To-Live (TTL) of 24 hours and is deleted once the TTL expires. The TTL can be set to less than 24 hours. For this integration, we recommend creating a separate materialization dataset to which Tellius has full write access.
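
Tellius handles this materialization automatically, but the step is roughly equivalent to the following BigQuery SQL; the project, dataset, and View names are placeholders:

-- Materialize the View into an intermediate table that expires after 24 hours.
CREATE TABLE `my_project.tellius_materialization.my_view_tmp`
OPTIONS (
  expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 24 HOUR)
)
AS SELECT *
FROM `my_project.my_dataset.my_view`;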

Loading from a SQL query

Loading data from an SQL query is similar to loading data from a View. The results of the query are saved in an intermediate, temporary table, from which the data is loaded.

The following details are required to load data from external tables, Views, or SQL queries (a setup sketch for the materialization dataset follows this list):

  • Materialization project
  • Materialization dataset
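
If a dedicated materialization dataset does not exist yet, it can be created with a default table expiration so that intermediate tables are cleaned up automatically. The names my_project and tellius_materialization below are placeholders:

-- Create a dataset for the intermediate tables; tables in it expire
-- after one day unless a shorter expiration is set per table.
CREATE SCHEMA IF NOT EXISTS `my_project.tellius_materialization`
OPTIONS (
  default_table_expiration_days = 1
);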

Prerequisites for configuring BigQuery in Google Cloud

Service Account Creation

  1. In the Google Cloud console, click on the IAM and Admin section in the left pane.
  2. Choose Service accounts and select the required project.
  3. Click on + Create service account in the top pane.
  4. Enter the name, ID, and description of the service account.
  5. Click on Done to create the service account.
  6. Copy the service account email, which is of the format <account_name>@<project_name>.iam.gserviceaccount.com.
  7. Select the required service account.
  8. Navigate to the Keys section at the top and click on Add Key.
  9. Under Key type, choose the JSON option and click on Create.
  10. When prompted to download the file, save it so that it can be uploaded to Tellius later.
  11. Verify that the JSON file has the following structure:
{
"type": "service_account",
"project_id": "<project_name>",
"private_key_id": "random_id",
"private_key": "-----BEGIN PRIVATE KEY-----\nxxxxxx\n-----END PRIVATE KEY-----\n",
"client_email": "<account_name>@<project_name>.iam.gserviceaccount.com",
"client_id": "<numeric_client_id>",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "<custom_cert_url>"
}

Project Level Permissions

The following roles need to be granted at the project level:

  • BigQuery Read Session User: required to read data in parallel through the BigQuery Storage API.
  • BigQuery Job User: required to load data from Views or run live-mode SQL queries directly against BigQuery, and to run the jobs that materialize Views into temporary tables before the data is read.

  1. Select the required project.
  2. Under Actions, click on Manage permissions.
  3. Click on Grant access.
  4. Under Add principals, paste the service account email copied from the previous section.
  5. Under Assign roles, choose the roles: BigQuery Read Session User and BigQuery Job User.
  6. Click on Save.
  7. A "Policy updated" toast will be displayed, confirming the change.

Dataset Level Permissions

  • The BigQuery Data Viewer role should be assigned to the service account on the dataset from which the data will be read.
  • If Views are read into Tellius, the BigQuery Data Editor role also needs to be assigned to the service account, either on the same dataset or on a new dataset created solely for Tellius, so that temporary tables can be created. These roles can also be granted with SQL, as shown in the sketch after the steps below.
  1. Click on the dataset and select Options → Share.
  2. Under Add principals, paste the service account email address copied from the first section.
  3. Under Assign roles, provide the required role.
  4. Click on Save.
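
As an alternative to the console steps above, dataset-level roles can also be granted with BigQuery's SQL GRANT statement. The dataset names below are placeholders, and the service account email is the one copied earlier:

-- Allow the Tellius service account to read the source dataset.
GRANT `roles/bigquery.dataViewer`
ON SCHEMA `my_project.my_dataset`
TO "serviceAccount:<account_name>@<project_name>.iam.gserviceaccount.com";

-- Allow it to create temporary tables in the materialization dataset.
GRANT `roles/bigquery.dataEditor`
ON SCHEMA `my_project.tellius_materialization`
TO "serviceAccount:<account_name>@<project_name>.iam.gserviceaccount.com";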
