To export data from Triple Whale to Google BigQuery, you must provide the necessary project details and grant specific permissions. This guide walks you through the setup process to ensure a seamless connection.
BigQuery Export Setup
For step-by-step setup guidance, see the BigQuery Integration guide. The following is a quick reference of the setup information you will need:
- Project ID and Dataset ID: Users must provide a project ID and dataset ID where data will be written.
- Service Account Permissions: Triple Whale requires you to grant BigQuery editor permission to the [email protected] service account on the specific dataset you are exporting, so that it can create and write to tables in that dataset.

To grant BigQuery editor permission to the service account:
- On the dataset, navigate to Share > Manage Permissions.
- Click Add Principal.
- Add [email protected] as the New principal, with the role of BigQuery Data Editor.
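Under the hood, the console steps above add an entry to the dataset's access list (the `access` array you see in `bq show --format=prettyjson`). As a rough sketch of that grant — the service-account address and owner email below are placeholders, not the real Triple Whale account:

```python
# Sketch: patching a BigQuery dataset's access list to add WRITER access
# for a service account. The addresses are placeholders -- substitute the
# Triple Whale service account shown in this guide.

TRIPLE_WHALE_SA = "service-account@example.iam.gserviceaccount.com"  # placeholder

def grant_writer(access_entries, email):
    """Return a new access list with a WRITER entry for `email`.

    `access_entries` uses the same shape as the "access" array in a
    BigQuery dataset resource.
    """
    entry = {"role": "WRITER", "userByEmail": email}
    if entry in access_entries:
        return list(access_entries)  # already granted; avoid a duplicate entry
    return list(access_entries) + [entry]

existing = [{"role": "OWNER", "userByEmail": "owner@example.com"}]
patched = grant_writer(existing, TRIPLE_WHALE_SA)
```

In practice you would apply the patched list through the console as described above, or via the BigQuery API; the sketch only shows the shape of the change.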
Connecting to Triple Whale
Once your BigQuery dataset is set up, connect it to Triple Whale so it can be used as a Data Warehouse Export destination:
- Go to the Data Warehouse Export page.
- Click Connect on the BigQuery integration.
- Enter the Project ID and Dataset ID created in Step 1.
- Click the checkbox confirming that you added the Triple Whale service account (Step 2).
- Click Save to complete the setup.
Creating a Data Warehouse Export
After connecting your BigQuery warehouse to Triple Whale, you can export data using the Data Warehouse Export feature.
- Go to the Data Warehouse Export page.
- Click New Export, then select your connected BigQuery destination and specify a New Table ID. The new table will be created automatically in your selected dataset.
- Define the SQL Query whose results should be exported to BigQuery.

  Select Specific Columns: Avoid using SELECT *. Triple Whale’s schema is dynamic, and wildcard selection can lead to broken or inconsistent results as fields change. Always specify the exact columns you need.
- Choose How the Export Runs - You can either:
  - Export Now to run a one-time export, or
  - Schedule Recurring Exports (for example, hourly or daily) to configure automated recurring exports.
- Create Export - Once saved, the export will either run immediately or on its configured schedule, and will append new rows to the destination table.
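To follow the "select specific columns" guidance above, write the export query with an explicit column list. The table and column names below are illustrative, not Triple Whale's actual schema, and the wildcard check is just a rough lint you might run before saving an export:

```python
import re

# Illustrative export query with an explicit column list (table and column
# names are hypothetical, not Triple Whale's actual schema).
EXPORT_QUERY = """
SELECT
  event_date,
  order_id,
  total_revenue
FROM orders
WHERE event_date >= '2024-01-01'
"""

def uses_wildcard(sql: str) -> bool:
    """Rough check for `SELECT *`, which the export guide advises against."""
    return re.search(r"select\s+\*", sql, flags=re.IGNORECASE) is not None
```

A query that passes this check still pins its output schema explicitly, so later changes to upstream fields cannot silently alter what lands in the destination table.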
Handling Data De-duplication (Recommended)
Now that your BigQuery dataset is connected to Triple Whale, it's important to handle duplicate records to ensure accurate data in your warehouse. Since each export appends new rows rather than updating existing ones, querying the raw data directly may produce inflated totals if duplicates aren’t managed.
Because unique row keys depend on the structure of your query, Triple Whale does not automatically de-duplicate records. You must define the right approach based on your data model.
To prevent duplication-related issues, follow the best practices outlined in the De-duplication for Data Warehouse Export guide:
- Tracking the most recent version of each record
- Identifying unique row keys based on your data model
- Filtering out older duplicate rows in your queries
Implementing de-duplication ensures that your analysis reflects the most accurate and up-to-date data in your warehouse.
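The "keep the most recent version of each record" pattern from the guide is commonly expressed in BigQuery with a ROW_NUMBER() window filter. Because unique keys depend on your query, the column names here (order_id, exported_at) are assumptions, not part of Triple Whale's schema. The same keep-latest logic, sketched in Python alongside the equivalent SQL:

```python
# Keep-latest de-duplication: for each unique key, keep only the most
# recently exported row. Column names (order_id, exported_at) are
# hypothetical; use whichever key is unique in YOUR export query.

def dedupe_latest(rows, key="order_id", ts="exported_at"):
    """Return one row per key, keeping the row with the greatest timestamp."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return list(latest.values())

# Equivalent BigQuery SQL over the destination table (same naming assumptions;
# the project/dataset/table path is a placeholder):
DEDUPE_SQL = """
SELECT * EXCEPT (rn) FROM (
  SELECT t.*, ROW_NUMBER() OVER (
    PARTITION BY order_id ORDER BY exported_at DESC
  ) AS rn
  FROM `my_project.my_dataset.my_table` AS t
)
WHERE rn = 1
"""
```

Running the SQL as a view (or inside downstream queries) leaves the appended raw rows intact while analyses only ever see the latest version of each record.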