This guide explains how SYNQ automatically links your dbt models with the Airflow tasks that execute them, providing end-to-end visibility from orchestration to data models.

Prerequisites:
  • Airflow integration configured in SYNQ
  • dbt Core integration with synq-dbt v1.8.0+ installed
  • synq-dbt running within Airflow tasks
  • Valid SYNQ integration token (US region requires a v2 token, which starts with st-)
⏱️ Estimated time: none; linking works automatically once the prerequisites are met

How it works

When you run dbt Core commands using synq-dbt inside Airflow tasks, SYNQ automatically captures the Airflow execution context and links it to your dbt models. This creates a bidirectional connection between your orchestration layer and data models.

What you get:
  • Model → Airflow: See which Airflow tasks executed each dbt model, with DAG and task information on model pages
  • Airflow → Models: View all dbt models executed by an Airflow task, with run status and metrics
  • Lineage: Visualize orchestration alongside your data models in SYNQ lineage (Orchestration mode)
  • History: Historical tracking of model executions (last 30 days)
  • Airflow context in issues: When a model fails, SYNQ includes the specific Airflow DAG and task information in the issue, making it easier to identify and troubleshoot failures

Automatic context collection

When synq-dbt uploads your dbt artifacts (run_results.json), it automatically detects and collects Airflow-specific environment variables.

Required variables (for linking to work):
  • AIRFLOW_CTX_DAG_ID — DAG identifier
  • AIRFLOW_CTX_TASK_ID — Task identifier
  • AIRFLOW_CTX_DAG_RUN_ID — Unique run identifier
  • AIRFLOW_CTX_TRY_NUMBER — Retry attempt number
Optional variables (enhance tracking and context):
  • AIRFLOW_CTX_DAG_OWNER — DAG owner
  • AIRFLOW_CTX_EXECUTION_DATE — Execution timestamp
These environment variables are set automatically by Airflow in the task execution context, so you don’t need to configure them manually. However, operators such as KubernetesPodOperator run your command in a separate environment (a new pod) that doesn’t inherit them, so you must pass them explicitly using Jinja templating, as shown in the setup examples below.
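If several operators need the same variables, you can keep the Jinja templates in one place and reuse them. A minimal sketch; the synq_airflow_env helper is illustrative, not part of synq-dbt or Airflow:

from airflow.models import Variable

def synq_airflow_env():
    """Env vars synq-dbt reads for Airflow linking; the AIRFLOW_CTX_*
    values are Jinja templates that Airflow renders at task runtime."""
    return {
        "SYNQ_TOKEN": Variable.get("SYNQ_TOKEN"),
        # Required for linking
        "AIRFLOW_CTX_DAG_ID": "{{ dag.dag_id }}",
        "AIRFLOW_CTX_TASK_ID": "{{ task.task_id }}",
        "AIRFLOW_CTX_DAG_RUN_ID": "{{ dag_run.run_id }}",
        "AIRFLOW_CTX_TRY_NUMBER": "{{ task_instance.try_number }}",
        # Optional context
        "AIRFLOW_CTX_EXECUTION_DATE": "{{ execution_date }}",
        "AIRFLOW_CTX_DAG_OWNER": "{{ dag.owner }}",
    }

Pass the returned dict as env_vars (KubernetesPodOperator) or env (DbtRunOperator). Jinja rendering only happens for templated operator fields, so check that the field is templated in your operator version.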

Setup verification

If you’ve already set up both integrations, linking is active. To verify:

1. Check your Airflow setup

Ensure you’re using synq-dbt to execute dbt commands in your Airflow tasks.

Example with KubernetesPodOperator:
# Import paths shown for recent cncf.kubernetes provider versions;
# adjust to your installed provider.
from airflow.models import Variable
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

run_dbt = KubernetesPodOperator(
    env_vars={
        "SYNQ_TOKEN": Variable.get("SYNQ_TOKEN"),
        # Required: Airflow context for linking
        "AIRFLOW_CTX_DAG_ID": "{{ dag.dag_id }}",
        "AIRFLOW_CTX_TASK_ID": "{{ task.task_id }}",
        "AIRFLOW_CTX_DAG_RUN_ID": "{{ dag_run.run_id }}",
        "AIRFLOW_CTX_TRY_NUMBER": "{{ task_instance.try_number }}",
        # Optional: Additional context
        "AIRFLOW_CTX_EXECUTION_DATE": "{{ execution_date }}",
        "AIRFLOW_CTX_DAG_OWNER": "{{ dag.owner }}",
        # For US region workspaces only:
        # "SYNQ_API_ENDPOINT": "https://api.us.synq.io"
    },
    cmds=["synq-dbt"],
    arguments=["build"],
    # ... other configuration
)
Regional Configuration: EU region customers (default) don’t need to set SYNQ_API_ENDPOINT. US region customers must add "SYNQ_API_ENDPOINT": "https://api.us.synq.io" to env_vars and use v2 tokens (starting with st-).
Example with DbtRunOperator:
from airflow.models import Variable
from airflow_dbt.operators.dbt_operator import DbtRunOperator  # from the airflow-dbt package

dbt_run = DbtRunOperator(
    dbt_bin='synq-dbt',
    env={
        "SYNQ_TOKEN": Variable.get("SYNQ_TOKEN"),
        # Required: Airflow context for linking
        "AIRFLOW_CTX_DAG_ID": "{{ dag.dag_id }}",
        "AIRFLOW_CTX_TASK_ID": "{{ task.task_id }}",
        "AIRFLOW_CTX_DAG_RUN_ID": "{{ dag_run.run_id }}",
        "AIRFLOW_CTX_TRY_NUMBER": "{{ task_instance.try_number }}",
        # Optional: Additional context
        "AIRFLOW_CTX_EXECUTION_DATE": "{{ execution_date }}",
        "AIRFLOW_CTX_DAG_OWNER": "{{ dag.owner }}",
        # For US region: "SYNQ_API_ENDPOINT": "https://api.us.synq.io"
    },
    # ... other configuration
)
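If dbt runs directly on the Airflow worker (for example via BashOperator), Airflow has already set the AIRFLOW_CTX_* variables in the task environment, so only the token needs to be supplied. A minimal sketch, assuming synq-dbt is on the worker’s PATH and a recent Airflow 2.x release that supports append_env:

from airflow.models import Variable
from airflow.operators.bash import BashOperator

dbt_build = BashOperator(
    task_id="dbt_build",
    bash_command="synq-dbt build",
    env={"SYNQ_TOKEN": Variable.get("SYNQ_TOKEN")},
    # append_env merges env into the existing environment instead of
    # replacing it, keeping Airflow's AIRFLOW_CTX_* variables visible
    # to synq-dbt.
    append_env=True,
)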

2. Run a dbt task in Airflow

Execute any DAG that runs dbt models using synq-dbt.
Artifacts and linking data typically appear in SYNQ within a few minutes of upload. Allow 5-10 minutes after your first execution for metadata to sync before checking.
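You can trigger the run from the Airflow UI, or programmatically via Airflow’s stable REST API. A minimal sketch, assuming Airflow 2.x with basic auth enabled; the host, credentials, and the DAG id dbt_daily are placeholders for your deployment:

import requests

# POST /api/v1/dags/{dag_id}/dagRuns creates a new DAG run.
resp = requests.post(
    "http://localhost:8080/api/v1/dags/dbt_daily/dagRuns",
    auth=("admin", "admin"),
    json={"conf": {}},
)
resp.raise_for_status()
print(resp.json()["dag_run_id"])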

3. View linked data in SYNQ

On model pages

Navigate to any dbt model in SYNQ to see:
  • Airflow Execution History: Recent task executions that ran this model
  • DAG Information: Which DAG and task executed the model
  • Execution Timestamps: When the model was last executed via Airflow

On Airflow execution pages

Navigate to any Airflow task execution in SYNQ to see:
  • Executed dbt Models: Complete list of all dbt models that ran in this task
  • Model Run Status: Success/failure status for each model
  • Execution Details: Runtime, rows affected, and other metrics per model
  • Direct Links: Quick navigation to individual model pages

In lineage view

  1. Open Lineage in SYNQ
  2. Switch to Orchestration mode (toggle in the view settings)
  3. View your complete data pipeline including:
    • Airflow DAGs and tasks
    • dbt models executed by each task
    • Connections between orchestration and data layers

Airflow context in issues

When a dbt model fails during execution in an Airflow task, SYNQ automatically enriches the issue with Airflow execution context, making it easy to identify which DAG and task encountered the failure.

Information included:
  • DAG ID: The Airflow DAG that was running
  • Task ID: The specific task within the DAG that executed the failed model
  • Execution Date: When the failure occurred
  • DAG Run ID: Unique identifier for the DAG run
  • Owner: DAG owner for faster escalation
This context appears directly in the issue details, allowing you to:
  • Jump directly to the relevant Airflow task logs
  • Identify if the failure is specific to a particular orchestration path
  • Correlate model failures with DAG execution patterns
  • Route issues more effectively based on DAG ownership
The Airflow context is captured at the time of execution and remains associated with the issue, even if you investigate it later.

Troubleshooting

Models not showing Airflow context

Check the following:
  1. Using synq-dbt: Verify you’re running synq-dbt instead of dbt in your Airflow tasks
  2. synq-dbt version: Ensure you’re using v1.8.0 or later (run synq-dbt --version)
  3. Environment variables passed: Verify that the required Airflow context variables (AIRFLOW_CTX_DAG_ID, AIRFLOW_CTX_TASK_ID, AIRFLOW_CTX_DAG_RUN_ID) are being passed to your operator (see the debug sketch after this list)
  4. Token format: For US region, ensure you’re using v2 tokens (starting with st-)
  5. Regional endpoint: US region customers must set SYNQ_API_ENDPOINT=https://api.us.synq.io
  6. Recent execution: Linking shows executions from the last 30 days
  7. Airflow version: Ensure your Airflow version sets the context variables (supported in Airflow 1.10+)
  8. Upload success: Check that synq-dbt successfully uploaded artifacts (check task logs for upload confirmation)
  9. Network connectivity: Verify outbound HTTPS access to developer.synq.io:443 (EU) or api.us.synq.io:443 (US)
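To check item 3 empirically, a throwaway task can print the variables synq-dbt looks for from inside the task environment. A minimal sketch using PythonOperator; for containerized operators, print the variables inside the container instead:

import os

from airflow.operators.python import PythonOperator

def _print_synq_context():
    # Log the Airflow context variables synq-dbt collects for linking.
    for key in (
        "AIRFLOW_CTX_DAG_ID",
        "AIRFLOW_CTX_TASK_ID",
        "AIRFLOW_CTX_DAG_RUN_ID",
        "AIRFLOW_CTX_TRY_NUMBER",
    ):
        print(f"{key}={os.environ.get(key, '<not set>')}")

debug_synq_context = PythonOperator(
    task_id="debug_synq_context",
    python_callable=_print_synq_context,
)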

Lineage not showing orchestration

  1. Orchestration mode enabled: In Lineage view, ensure “Orchestration mode” is toggled on
  2. Airflow integration active: Verify your Airflow integration is properly configured
  3. Metadata sync: Wait 5-10 minutes after first execution for metadata to sync
