dbt Core
Integrating dbt Core with SYNQ
This guide shows you how to connect your dbt Core project to SYNQ to track model runs, test results, and metadata changes.
Prerequisites:
- Admin access to your SYNQ workspace
- Ability to modify your dbt orchestration tool (Airflow, GitHub Actions, etc.)
⏱️ Estimated time: 15 minutes
Using dbt Cloud? You can integrate directly through Settings → Integrations → Add Integration → dbt Cloud instead of following this guide.
Set up dbt Core integration
Create integration in SYNQ
- Navigate to Settings → Integrations → Add Integration
- Select dbt Core from the integration options
Configure integration settings
Integration name: Enter a descriptive name (e.g., `Production dbt Core`).
Generate token: Click Create to generate your integration token. You'll use this token with the `synq-dbt` tool to send artifacts securely to SYNQ.
Git integration: Select your Git provider to link model changes to repository commits. This enables change tracking and lineage visualization.
Relative path to dbt: If your dbt project isn't in the repository root, specify the directory path (e.g., `analytics/dbt/`).
Manage integration tokens
Access token management through Settings → Integrations, then select your dbt Core integration and click Manage tokens.
From the token management screen, you can:
- Create new tokens for different environments
- Invalidate compromised tokens
- Copy token snippets for easy integration
Install synq-dbt
About synq-dbt
`synq-dbt` is a command-line wrapper that runs your existing dbt Core commands and automatically uploads artifacts to SYNQ. It's compatible with any dbt Core version and works seamlessly with orchestration tools such as Airflow, GitHub Actions, and Dagster.
Collected artifacts:
- `manifest.json` — Project structure and dependencies
- `run_results.json` — Execution status and performance metrics
- `catalog.json` — Complete data warehouse schema information
- `sources.json` — Source freshness test results
How it works:
- Executes your dbt Core command with the provided arguments
- Captures the original exit code
- Reads your `SYNQ_TOKEN` environment variable
- Uploads artifacts from the target directory to SYNQ
- Returns the original dbt Core exit code (preserving pipeline behavior)
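The exit-code handling above can be sketched in a few lines of shell. This is our illustration of the control flow only — `synq-dbt` itself is a binary, and the upload step is elided:

```shell
# Sketch of the wrapper's behavior (illustrative, not the real implementation).
run_wrapped() {
  "$@"                # execute the wrapped dbt command as given
  dbt_exit=$?         # capture dbt's original exit code
  # ...reading $SYNQ_TOKEN and uploading target/*.json would happen here...
  return "$dbt_exit"  # surface dbt's exit code so the pipeline behaves as before
}

run_wrapped true;  echo "wrapped success -> $?"   # -> wrapped success -> 0
code=0; run_wrapped false || code=$?
echo "wrapped failure -> $code"                   # -> wrapped failure -> 1
```

Because the original exit code is returned, a failed `dbt run` still fails your pipeline exactly as before.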
Installation methods
Choose the installation method that matches your dbt orchestration setup:
Airflow with DockerOperator
- Set environment variable: In the Airflow UI, create a new environment variable `SYNQ_TOKEN` with your integration token.
- Update Dockerfile:
- Update your operator:
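As a sketch, assuming a Debian-based image and an amd64 Linux build of `synq-dbt` (the release URL, asset name, and version here are assumptions — verify them on the synq-dbt releases page), the Dockerfile and operator changes might look like:

```dockerfile
# Illustrative only — confirm the release asset name and version before use.
FROM my-dbt-image:latest
ADD https://github.com/getsynq/synq-dbt/releases/download/v1.8.0/synq-dbt-amd64-linux /usr/local/bin/synq-dbt
RUN chmod +x /usr/local/bin/synq-dbt
```

```python
# Pass SYNQ_TOKEN into the container and call synq-dbt in place of dbt.
# Image name and Variable lookup below are hypothetical examples.
from airflow.providers.docker.operators.docker import DockerOperator

dbt_run = DockerOperator(
    task_id="dbt_run",
    image="my-dbt-image:latest",
    command="synq-dbt run",   # was: "dbt run"
    environment={"SYNQ_TOKEN": "{{ var.value.SYNQ_TOKEN }}"},
)
```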
Airflow with dbt Plugin
- Set environment variable: Create `SYNQ_TOKEN` in the Airflow UI.
- Install synq-dbt:
- Update DbtOperator:
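A sketch of the two steps above, assuming an amd64 Linux worker and the `airflow-dbt` plugin's `dbt_bin` parameter (verify the release URL and your plugin's parameter name):

```shell
# Install synq-dbt on the Airflow workers (check the releases page for the
# correct asset name and latest version).
wget -O /usr/local/bin/synq-dbt \
  https://github.com/getsynq/synq-dbt/releases/download/v1.8.0/synq-dbt-amd64-linux
chmod +x /usr/local/bin/synq-dbt
```

```python
# Point the dbt operators at synq-dbt instead of the dbt binary.
from airflow_dbt.operators.dbt_operator import DbtRunOperator

dbt_run = DbtRunOperator(
    task_id="dbt_run",
    dbt_bin="synq-dbt",          # was: "dbt"
    dir="/path/to/dbt/project",  # hypothetical project path
)
```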
Dagster
- Configure environment: Add `SYNQ_TOKEN=<your-token>` to your `.env` file.
- Update resources in `definitions.py`:
- Update assets in `assets.py`:
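Dagster loads variables from `.env` automatically, so `SYNQ_TOKEN` reaches the dbt process. A minimal sketch of the resource change, assuming `dagster-dbt`'s `DbtCliResource` (the field names and project path here should be verified against your installed version):

```python
# definitions.py — illustrative sketch, not the verified API.
from pathlib import Path

from dagster import Definitions
from dagster_dbt import DbtCliResource

dbt_resource = DbtCliResource(
    project_dir=Path("analytics/dbt"),  # hypothetical project path
    dbt_executable="synq-dbt",          # was: "dbt"
)

defs = Definitions(
    assets=[],  # your @dbt_assets-decorated assets go here
    resources={"dbt": dbt_resource},
)
# assets.py typically needs no change beyond using this resource: the
# dbt.cli(...) invocations it issues now run through synq-dbt, which
# uploads artifacts after each run.
```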
Docker
Add to your Dockerfile:
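A sketch, assuming a Python base image and an amd64 Linux build (the adapter, release URL, and version are placeholders — confirm them on the synq-dbt releases page):

```dockerfile
# Illustrative only — substitute your own base image and dbt adapter.
FROM python:3.11-slim
RUN pip install dbt-core dbt-snowflake
ADD https://github.com/getsynq/synq-dbt/releases/download/v1.8.0/synq-dbt-amd64-linux /usr/local/bin/synq-dbt
RUN chmod +x /usr/local/bin/synq-dbt
ENTRYPOINT ["synq-dbt"]
```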
Linux/macOS
Download and install:
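For example, on an amd64 Linux host (pick the binary matching your OS and architecture from the synq-dbt releases page — the URL and version below are assumptions):

```shell
# Download, mark executable, and move onto the PATH.
curl -fsSL -o synq-dbt \
  https://github.com/getsynq/synq-dbt/releases/download/v1.8.0/synq-dbt-amd64-linux
chmod +x synq-dbt
sudo mv synq-dbt /usr/local/bin/
```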
Use synq-dbt
Basic usage
Replace your existing dbt Core commands with `synq-dbt`:
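For example:

```shell
synq-dbt run --select my_model+   # was: dbt run --select my_model+
synq-dbt test                     # was: dbt test
synq-dbt build --full-refresh     # was: dbt build --full-refresh
```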
All dbt Core arguments and options work exactly the same way.
Upload existing artifacts
If you have already generated dbt artifacts and want to upload them to SYNQ:
Include dbt logs:
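The subcommand and flag names below are placeholders, not verified — check the synq-dbt GitHub repository for the exact syntax before use:

```shell
# Placeholder syntax — consult the synq-dbt README for the real subcommand/flags.
export SYNQ_TOKEN="st-your-token"
synq-dbt upload --target-dir target/              # upload previously generated artifacts
synq-dbt upload --target-dir target/ --logs logs/ # also include dbt logs
```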
Token migration (v2 tokens)
If your `SYNQ_TOKEN` doesn't start with `st-`, you're using a legacy token that will be deprecated. Migrate to v2 tokens for improved security and performance.
Migration steps:
- Upgrade synq-dbt first (backward compatible):
- Generate a new v2 token through Settings → Integrations → Manage tokens
- Replace your `SYNQ_TOKEN` environment variable with the new token (starts with `st-`)
- Test the integration to ensure artifacts upload successfully
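A sketch of the upgrade-and-swap on an amd64 Linux host (release URL and version are assumptions — verify the latest release on the synq-dbt releases page):

```shell
# Replace the binary in place, then swap the token; st- marks a v2 token.
curl -fsSL -o /usr/local/bin/synq-dbt \
  https://github.com/getsynq/synq-dbt/releases/download/v1.8.0/synq-dbt-amd64-linux
chmod +x /usr/local/bin/synq-dbt
export SYNQ_TOKEN="st-new-token"   # replaces the legacy token
synq-dbt run                       # artifacts should appear in SYNQ shortly
```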
v2 tokens require synq-dbt v1.8.0 or later. Legacy tokens will continue working during the transition period.
Configuration options
Environment variables:
- `SYNQ_TOKEN` — Your integration token (required)
- `SYNQ_TARGET_DIR` — Artifact directory path (default: `target/`)
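For example, set both before invoking the wrapper (values here are examples, not real credentials):

```shell
export SYNQ_TOKEN="st-example-token"            # required; v2 tokens start with st-
export SYNQ_TARGET_DIR="analytics/dbt/target"   # optional; defaults to target/
# synq-dbt reads both from the environment when it runs:
echo "token set: ${SYNQ_TOKEN:+yes}"            # -> token set: yes
```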
Network requirements:
- Allow outbound HTTPS traffic to `dbtapi.synq.io:443` when using a legacy token or the old uploader.
- Allow outbound HTTPS traffic to `developer.synq.io:443`.
Data visibility:
- Artifacts appear in SYNQ within 2-3 minutes of upload
- Failed uploads are logged and can be retried
For advanced configuration options and troubleshooting, see the synq-dbt GitHub repository.