Use the BigQuery connector when you want Convertmax event data copied from the raw events database into reporting tables in Google BigQuery.
What the connector writes
The connector writes Convertmax event data into reporting tables in your BigQuery dataset.
The exact table set depends on the reporting schema configured for your environment.
1. Prepare the target BigQuery dataset
Create or confirm:
- a GCP project that will own the export
- a BigQuery dataset for Convertmax reporting tables
- the reporting tables required by your Convertmax setup
At minimum, you need:
- project_id
- bigquery_dataset
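If you manage the project with the gcloud and bq CLIs, the dataset can be created and confirmed like this. The project ID and dataset name below are placeholders, not values Convertmax requires:

```shell
# Placeholder names: replace with your own project ID and dataset name.
PROJECT_ID="my-gcp-project"
DATASET="convertmax_reporting"

# Create the dataset in the project that will own the export.
bq mk --dataset --description "Convertmax reporting tables" "${PROJECT_ID}:${DATASET}"

# Confirm the dataset exists and inspect its settings.
bq show --dataset "${PROJECT_ID}:${DATASET}"
```

The reporting tables themselves depend on your configured schema, so they are not shown here.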
2. Choose an authentication pattern
Convertmax supports two setup options for connecting to your BigQuery project.
Service account JSON
Use this option when you want to share a dedicated BigQuery service account key with Convertmax.
This is a good fit when:
- you want to provide a dedicated service account key
- you are not using cross-project impersonation
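A typical way to set this up with the gcloud CLI is sketched below. The account name and roles are assumptions: roles/bigquery.dataEditor and roles/bigquery.jobUser are common choices for a BigQuery writer, but confirm the exact roles your Convertmax setup requires, and consider scoping data access to the dataset rather than the whole project.

```shell
# Placeholder names: adjust the project ID and account name to your setup.
PROJECT_ID="my-gcp-project"
SA_NAME="convertmax-writer"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create a dedicated service account for Convertmax.
gcloud iam service-accounts create "${SA_NAME}" --project="${PROJECT_ID}"

# Grant it permission to write data and to run jobs.
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" --role="roles/bigquery.jobUser"

# Export a JSON key to share with Convertmax.
gcloud iam service-accounts keys create convertmax-key.json --iam-account="${SA_EMAIL}"
```

Treat the exported key file as a secret: share it only through the channel Convertmax provides, and rotate it on your usual key-rotation schedule.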
Service account impersonation
This is the recommended option for most customer-managed GCP projects.
With this setup:
- Convertmax uses a central runtime service account
- your BigQuery writer service account is granted access to the target dataset
- the Convertmax runtime service account is allowed to impersonate your BigQuery writer service account
This approach avoids sharing long-lived service account keys while still allowing Convertmax to write into your project.
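The impersonation grant can be sketched with the gcloud CLI as follows. Both service account emails are placeholders; use your own writer account and the runtime account email that Convertmax gives you:

```shell
# Placeholder identities: replace with your writer service account and the
# Convertmax runtime service account email provided to you.
WRITER_SA="convertmax-writer@my-gcp-project.iam.gserviceaccount.com"
RUNTIME_SA="convertmax-runtime@convertmax-prod.iam.gserviceaccount.com"

# Allow the Convertmax runtime service account to mint short-lived tokens for
# (i.e. impersonate) your BigQuery writer service account.
gcloud iam service-accounts add-iam-policy-binding "${WRITER_SA}" \
  --member="serviceAccount:${RUNTIME_SA}" \
  --role="roles/iam.serviceAccountTokenCreator"
```

Note that this binding is on the writer service account itself, not on the project: it only lets the runtime account act as that one identity.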
3. Validate the export
After setup, verify:
- the target dataset exists in the correct GCP project
- Convertmax has the required access to write into that dataset
- new rows appear in the target BigQuery tables
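One way to spot-check the last item is a quick count of today's rows. The table and timestamp column names below are examples; substitute the tables from your configured reporting schema:

```shell
# Placeholder table and column names; adjust to your reporting schema.
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS row_count
   FROM `my-gcp-project.convertmax_reporting.events`
   WHERE DATE(event_timestamp) = CURRENT_DATE()'
```

A nonzero count after the first export window is a good sign; zero rows usually points back to the access checks above.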
Troubleshooting
Permission denied on impersonation
Confirm:
- the runtime service account has roles/iam.serviceAccountTokenCreator on your BigQuery writer service account
- your BigQuery writer service account has access to the target dataset
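Both grants can be inspected from the CLI. The service account email and dataset name below are placeholders:

```shell
# Show who may impersonate the writer service account; look for the runtime
# account under roles/iam.serviceAccountTokenCreator.
gcloud iam service-accounts get-iam-policy \
  convertmax-writer@my-gcp-project.iam.gserviceaccount.com

# Show the dataset metadata, including its access entries.
bq show --format=prettyjson my-gcp-project:convertmax_reporting
```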