Databricks
Store your event and dispatch data in Databricks through S3 integration
Databricks Data Warehouse Integration
The Databricks Data Warehouse integration provides a solution for storing and analyzing your event and dispatch data. We use Amazon S3 as an intermediary storage layer, so you can easily import the data into your Databricks environment.
How the Integration Works
- Daily S3 Deposits: Events and dispatches are automatically deposited into your S3 bucket once per day
- Complete Data: All events and dispatches from your account are included in the deposits
- Flexible Import: You can configure Databricks to read directly from your S3 bucket
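Because deposits arrive once per day, a common pattern is to enumerate the per-day locations when backfilling or scheduling imports. The sketch below is illustrative only: the bucket name and the date-partitioned prefix layout (`events/YYYY/MM/DD/`) are assumptions, not the documented deposit structure.

```python
from datetime import date, timedelta

def daily_prefixes(bucket: str, start: date, end: date):
    """Yield hypothetical date-partitioned S3 prefixes for each day in [start, end]."""
    day = start
    while day <= end:
        # The events/YYYY/MM/DD/ layout is an illustrative assumption,
        # not the actual deposit structure.
        yield f"s3://{bucket}/events/{day:%Y/%m/%d}/"
        day += timedelta(days=1)

# Enumerate three days of hypothetical deposit locations.
prefixes = list(daily_prefixes("my-deposit-bucket", date(2024, 1, 1), date(2024, 1, 3)))
```

A list like this can then drive whatever import job you schedule on the Databricks side.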
Setting Up Databricks Access to S3
There are several ways to configure Databricks to access your S3 data, such as instance profiles, Unity Catalog external locations, or stored credentials.
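As a minimal sketch of one approach: once your Databricks workspace has credentials for the bucket (for example via an instance profile), you can query the deposited files directly with Spark SQL. The bucket name and path below are placeholders, and the file format (JSON here) is an assumption about the deposit format.

```sql
-- Query the deposited files in place; path and format are placeholders.
SELECT * FROM json.`s3://your-deposit-bucket/events/`;

-- Or materialize the data into a Delta table for faster repeated queries.
CREATE TABLE events_raw AS
SELECT * FROM json.`s3://your-deposit-bucket/events/`;
```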