This guide shows you how to connect your dbt Core project to SYNQ to track model runs, test results, and metadata changes.

Prerequisites:

  • Admin access to your SYNQ workspace
  • Ability to modify your dbt orchestration tool (Airflow, GitHub Actions, etc.)

⏱️ Estimated time: 15 minutes

Using dbt Cloud? You can integrate directly through Settings → Integrations → Add Integration → dbt Cloud instead of following this guide.

Set up dbt Core integration

Create integration in SYNQ

  1. Navigate to Settings → Integrations → Add Integration
  2. Select dbt Core from the integration options

Configure integration settings

Integration name: Enter a descriptive name (e.g., Production dbt Core)

Generate token: Click Create to generate your integration token. You’ll use this token with the synq-dbt tool to send artifacts securely to SYNQ.

Git integration: Select your Git provider to link model changes to repository commits. This enables change tracking and lineage visualization.

Relative path to dbt: If your dbt project isn’t in the repository root, specify the directory path (e.g., analytics/dbt/).

Manage integration tokens

Access token management through Settings → Integrations, then select your dbt Core integration and click Manage tokens.

From the token management screen, you can:

  • Create new tokens for different environments
  • Invalidate compromised tokens
  • Copy token snippets for easy integration

Install synq-dbt

About synq-dbt

synq-dbt is a command-line wrapper that runs your existing dbt Core commands and automatically uploads artifacts to SYNQ. It’s compatible with any dbt Core version and works seamlessly with orchestration tools like Airflow, GitHub Actions, and Dagster.

Collected artifacts:

  • manifest.json — Project structure and dependencies
  • run_results.json — Execution status and performance metrics
  • catalog.json — Complete data warehouse schema information
  • sources.json — Source freshness test results
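For illustration, here is a minimal Python sketch of the kind of information run_results.json carries. The sample document below is abbreviated and hand-written for this example; real artifacts contain many more fields per node:

```python
import json

# Abbreviated, hand-written stand-in for a run_results.json payload.
# Real files include metadata, timing breakdowns, and more per result.
sample_run_results = json.loads("""
{
  "results": [
    {"unique_id": "model.jaffle_shop.orders", "status": "success", "execution_time": 1.42},
    {"unique_id": "model.jaffle_shop.customers", "status": "error", "execution_time": 0.31}
  ],
  "elapsed_time": 2.05
}
""")

# Summarize execution status per node -- the kind of signal SYNQ
# surfaces after artifacts are uploaded.
statuses = {r["unique_id"]: r["status"] for r in sample_run_results["results"]}
failed = [uid for uid, status in statuses.items() if status == "error"]
print(failed)  # ['model.jaffle_shop.customers']
```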

How it works:

  1. Executes your dbt Core command with provided arguments
  2. Captures the original exit code
  3. Reads your SYNQ_TOKEN environment variable
  4. Uploads artifacts from the target directory to SYNQ
  5. Returns the original dbt Core exit code (preserving pipeline behavior)
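The flow above can be sketched in Python. This is an illustrative stand-in, not the actual implementation: the dbt invocation is replaced by a harmless subprocess call and the upload step is stubbed out:

```python
import os
import subprocess
import sys

def run_wrapped(cmd_args):
    """Sketch of the synq-dbt flow: run the command, remember its exit
    code, read the token, (pretend to) upload, return the original code."""
    # Steps 1-2: execute the command and capture its exit code.
    # A trivial Python invocation stands in for `dbt <args>` here.
    result = subprocess.run([sys.executable, "-c", "print('dbt output')"])
    exit_code = result.returncode

    # Step 3: read the token from the environment.
    token = os.environ.get("SYNQ_TOKEN")

    # Step 4: upload artifacts from the target directory (stubbed out).
    if token:
        pass  # the real tool uploads manifest.json, run_results.json, etc.

    # Step 5: return the original exit code so pipeline behavior is preserved.
    return exit_code

code = run_wrapped(["build"])
print(code)  # 0
```

Because the original exit code is passed through, a failed dbt run still fails your orchestrator's task exactly as before.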

Installation methods

Choose the installation method that matches your dbt orchestration setup:

Airflow with DockerOperator

  1. Set Airflow Variable: In the Airflow UI, create an Airflow Variable named SYNQ_TOKEN containing your integration token.

  2. Update Dockerfile:

ENV SYNQ_VERSION=v1.8.0
RUN wget -O /usr/bin/synq-dbt https://github.com/getsynq/synq-dbt/releases/download/${SYNQ_VERSION}/synq-dbt-amd64-linux && \
    chmod +x /usr/bin/synq-dbt
  3. Update your operator:
KubernetesPodOperator(
    env_vars={
        "SYNQ_TOKEN": Variable.get("SYNQ_TOKEN")
    },
    cmds=["synq-dbt"],
    arguments=["build"],  # Your dbt command here
    # ... other configuration
)

Airflow with dbt Plugin

  1. Set Airflow Variable: Create SYNQ_TOKEN as an Airflow Variable in the Airflow UI.

  2. Install synq-dbt:

export SYNQ_VERSION=v1.8.0
wget -O ./synq-dbt https://github.com/getsynq/synq-dbt/releases/download/${SYNQ_VERSION}/synq-dbt-amd64-linux
chmod +x ./synq-dbt && mv synq-dbt /usr/local/bin/synq-dbt
  3. Update DbtOperator:
dbt_run = DbtRunOperator(
    dbt_bin='synq-dbt',
    env={"SYNQ_TOKEN": Variable.get("SYNQ_TOKEN")},
    # ... other configuration
)

Dagster

  1. Configure environment: Add SYNQ_TOKEN=<your-token> to your .env file.

  2. Update resources in definitions.py:

import os
from dagster_dbt import DbtCliResource

resources = {
    "dbt": DbtCliResource(
        dbt_executable="synq-dbt",
        project_dir=os.fspath(dbt_project_dir),
    ),
}
  3. Update assets in assets.py:
from pathlib import Path
from dagster import AssetExecutionContext
from dagster_dbt import DbtCliResource, dbt_assets

@dbt_assets(manifest=dbt_manifest_path)
def jaffle_shop_dbt_assets(context: AssetExecutionContext, dbt: DbtCliResource):
    dbt_target_path = Path('target')
    yield from dbt.cli(["build"], target_path=dbt_target_path, context=context).stream()

Docker

Add to your Dockerfile:

ENV SYNQ_VERSION=v1.8.0
RUN wget -O /usr/bin/synq-dbt https://github.com/getsynq/synq-dbt/releases/download/${SYNQ_VERSION}/synq-dbt-amd64-linux && \
    chmod +x /usr/bin/synq-dbt

Linux/macOS

Download and install:

# For Linux
export SYNQ_VERSION=v1.8.0
wget -O ./synq-dbt https://github.com/getsynq/synq-dbt/releases/download/${SYNQ_VERSION}/synq-dbt-amd64-linux

# For macOS
wget -O ./synq-dbt https://github.com/getsynq/synq-dbt/releases/download/${SYNQ_VERSION}/synq-dbt-arm64-darwin

# Make executable and move to PATH
chmod +x ./synq-dbt && mv synq-dbt /usr/local/bin/synq-dbt

# Set your token and test
export SYNQ_TOKEN=<your-token>
synq-dbt --version

Use synq-dbt

Basic usage

Replace your existing dbt Core commands with synq-dbt:

# Instead of: dbt run --select finance --threads 5
synq-dbt run --select finance --threads 5

# Instead of: dbt test --select reports  
synq-dbt test --select reports

# Instead of: dbt build
synq-dbt build

All dbt Core arguments and options work exactly the same way.

Upload existing artifacts

If you have already generated dbt artifacts and want to upload them to SYNQ:

export SYNQ_TOKEN=<your-token>
synq-dbt synq_upload_artifacts

Include dbt logs:

dbt build | tee dbt.log
synq-dbt synq_upload_artifacts --dbt-log-file dbt.log

Token migration (v2 tokens)

If your SYNQ_TOKEN doesn’t start with st-, you’re using a legacy token that will be deprecated. Migrate to v2 tokens for improved security and performance.

Migration steps:

  1. Upgrade synq-dbt first (backward compatible):
export SYNQ_VERSION=v1.8.0  # or later
wget -O ./synq-dbt https://github.com/getsynq/synq-dbt/releases/download/${SYNQ_VERSION}/synq-dbt-amd64-linux
chmod +x ./synq-dbt && mv synq-dbt /usr/local/bin/synq-dbt
  2. Generate new v2 token through Settings → Integrations → Manage tokens

  3. Replace your SYNQ_TOKEN environment variable with the new token (starts with st-)

  4. Test the integration to ensure artifacts upload successfully

v2 tokens require synq-dbt v1.8.0 or later. Legacy tokens will continue working during the transition period.
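If you script your rollout, the prefix check described above is straightforward. The `is_v2_token` helper below is hypothetical, written for this guide rather than part of synq-dbt:

```python
def is_v2_token(token: str) -> bool:
    """v2 SYNQ tokens start with the 'st-' prefix; anything else is legacy."""
    return token.startswith("st-")

print(is_v2_token("st-abc123"))   # True  -> v2 token
print(is_v2_token("legacy-xyz"))  # False -> legacy token, plan migration
```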

Configuration options

Environment variables:

  • SYNQ_TOKEN — Your integration token (required)
  • SYNQ_TARGET_DIR — Artifact directory path (default: target/)
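To illustrate the default, here is a hypothetical sketch of how a wrapper could resolve SYNQ_TARGET_DIR, falling back to target/ when the variable is unset:

```python
import os

# When SYNQ_TARGET_DIR is unset, the default artifact directory applies.
os.environ.pop("SYNQ_TARGET_DIR", None)
target_dir = os.environ.get("SYNQ_TARGET_DIR", "target")
print(target_dir)  # target

# Setting the variable overrides the default, e.g. for a nested dbt project.
os.environ["SYNQ_TARGET_DIR"] = "analytics/dbt/target"
target_dir = os.environ.get("SYNQ_TARGET_DIR", "target")
print(target_dir)  # analytics/dbt/target
```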

Network requirements:

  • Allow outbound HTTPS traffic to dbtapi.synq.io:443 when using a legacy token or the legacy uploader
  • Allow outbound HTTPS traffic to developer.synq.io:443

Data visibility:

  • Artifacts appear in SYNQ within 2-3 minutes of upload
  • Failed uploads are logged and can be retried

For advanced configuration options and troubleshooting, see the synq-dbt GitHub repository.