Google BigQuery
Store your event and dispatch data in Google BigQuery through S3 integration
Google BigQuery Data Warehouse Integration
The Google BigQuery Data Warehouse integration provides a powerful solution for storing and analyzing your event and dispatch data. We use BigQuery's Data Transfer Service for Amazon S3 to automatically move your data from S3 into BigQuery, enabling efficient querying and analysis of your complete event dataset.
How the Integration Works
- Daily S3 to BigQuery Transfers: Events and dispatches are automatically transferred from S3 to BigQuery once per day
- Complete Data: All events and dispatches from your account are included in the transfers
- Efficient Querying: BigQuery's SQL-based querying engine allows for fast, in-place analysis of your data
- Scalable Storage: BigQuery's infrastructure handles large volumes of data efficiently
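To illustrate the kind of in-place analysis BigQuery enables once transfers are running, here is a minimal query sketch using the google-cloud-bigquery Python client. The project, dataset, table, and column names below are hypothetical placeholders, not the schema your transfer actually produces:

```python
from google.cloud import bigquery

# Hypothetical project/dataset/table names -- substitute your own.
client = bigquery.Client(project="your-project-id")

# Count events by name; `event_name` is an assumed column for illustration.
query = """
    SELECT event_name, COUNT(*) AS event_count
    FROM `your-project-id.ours_privacy.events`
    GROUP BY event_name
    ORDER BY event_count DESC
    LIMIT 10
"""

# Run the query in BigQuery and iterate over the result rows.
for row in client.query(query).result():
    print(row.event_name, row.event_count)
```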
S3 Data Organization
Data in your S3 bucket is partitioned by date, with separate prefixes for dispatches and events:

```
dispatches/
└── year=YYYY/
    └── month=MM/
        └── day=DD/
events/
└── year=YYYY/
    └── month=MM/
        └── day=DD/
```
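Because the layout is date-partitioned, a single day's files can be located by prefix. Below is a minimal sketch using boto3, assuming a hypothetical bucket name and AWS credentials already configured locally:

```python
import boto3

# Hypothetical bucket name -- use the bucket from your Ours Privacy S3 setup.
s3 = boto3.client("s3")

# List the event files written for a single day's partition.
response = s3.list_objects_v2(
    Bucket="your-bucket-name",
    Prefix="events/year=2024/month=06/day=01/",
)
for obj in response.get("Contents", []):
    print(obj["Key"])
```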
Getting Started
To set up the Google BigQuery integration:
- Contact your account manager to enable the integration and to get the role for your AWS S3 Bucket
- Follow the Ours Privacy S3 Setup Guide
- Configure your BigQuery Data Transfer Service from S3
Simple Getting Started Guide
Note: This is the simplest way to get started and does not use daily partitioning. The Ours Privacy S3 sync already writes date-partitioned paths, so you can adopt partitioning later. For advanced partitioning options, see the BigQuery S3 transfer documentation.
Follow these steps to set up your BigQuery integration (a scripted version of both steps is sketched after the list):
- Create a table in BigQuery:
  - Download a sample file from your S3 bucket
  - Upload it to BigQuery to create your initial table structure
- Configure data transfer:
  - Set up the BigQuery Data Transfer Service to automatically populate your table daily
  - The file format is "PARQUET"
  - The Amazon S3 URI is s3://{bucket name}/events/*
  - Note: If you want to sync dispatches, you will need to set up one data transfer for dispatches and one for events.
  - Note: You can introduce date parameters in the Amazon S3 URI by following the BigQuery partitioning documentation, but the easiest way to get set up is to configure s3://{bucket name}/events/*.
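For teams that prefer to script these two steps rather than use the console, the sketch below shows one possible approach with the google-cloud-bigquery and google-cloud-bigquery-datatransfer Python clients. All project, dataset, table, file, and credential values are hypothetical placeholders, and the transfer parameters reflect BigQuery's Amazon S3 data source as we understand it; check the BigQuery S3 transfer documentation for the authoritative parameter list.

```python
from google.cloud import bigquery, bigquery_datatransfer

PROJECT = "your-project-id"   # hypothetical project id
DATASET = "ours_privacy"      # hypothetical dataset (must already exist)
TABLE = "events"              # hypothetical table name

# Step 1: create the table by loading a sample Parquet file downloaded
# from your S3 bucket. Parquet carries its own schema, so BigQuery
# infers the table structure from the file.
bq = bigquery.Client(project=PROJECT)
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
)
with open("sample-event-file.parquet", "rb") as f:  # sample file from S3
    load_job = bq.load_table_from_file(
        f, f"{PROJECT}.{DATASET}.{TABLE}", job_config=job_config
    )
load_job.result()  # wait for the load to finish

# Step 2: create a daily transfer from S3 into that table. Repeat with
# a dispatches data_path and a second table if you also sync dispatches.
dts = bigquery_datatransfer.DataTransferServiceClient()
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id=DATASET,
    display_name="Ours Privacy events",
    data_source_id="amazon_s3",
    schedule="every 24 hours",
    params={
        "data_path": "s3://your-bucket-name/events/*",
        "destination_table_name_template": TABLE,
        "file_format": "PARQUET",
        "access_key_id": "YOUR_AWS_ACCESS_KEY_ID",          # placeholder
        "secret_access_key": "YOUR_AWS_SECRET_ACCESS_KEY",  # placeholder
    },
)
dts.create_transfer_config(
    parent=f"projects/{PROJECT}/locations/us",
    transfer_config=transfer_config,
)
```

In the console, the same values appear as the file format, Amazon S3 URI, and schedule fields of the transfer configuration.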
Below are examples of the table creation and data transfer configuration interfaces:
Example: Table creation via file upload
Example: Data transfer configuration