This guide explains how SYNQ automatically links your dbt models with the Airflow tasks that execute them, providing end-to-end visibility from orchestration to data models.

Prerequisites: the SYNQ Airflow integration and synq-dbt must already be set up.

⏱️ Estimated time: none; linking works automatically once the prerequisites are met

How it works

When you run dbt Core commands using synq-dbt inside Airflow tasks, SYNQ automatically captures the Airflow execution context and links it to your dbt models. This creates a bidirectional connection between your orchestration layer and data models.

What you get:
  • Model → Airflow: See which Airflow tasks executed each dbt model, with DAG and task information on model pages
  • Airflow → Models: View all dbt models executed by an Airflow task, with run status and metrics
  • Lineage: Visualize orchestration alongside data models in SYNQ lineage (Orchestration mode)
  • Historical tracking: Execution history for each model over the last 30 days
  • Airflow context in issues: When a model fails, SYNQ includes the specific Airflow DAG and task information in the issue, making it easier to identify and troubleshoot failures

Automatic context collection

When synq-dbt uploads your dbt artifacts (run_results.json), it automatically detects and collects Airflow-specific environment variables:
  • AIRFLOW_CTX_DAG_OWNER — DAG owner
  • AIRFLOW_CTX_DAG_ID — DAG identifier
  • AIRFLOW_CTX_TASK_ID — Task identifier
  • AIRFLOW_CTX_EXECUTION_DATE — Execution timestamp
  • AIRFLOW_CTX_TRY_NUMBER — Retry attempt number
  • AIRFLOW_CTX_DAG_RUN_ID — Unique run identifier
These environment variables are automatically set by Airflow in your task execution context. You don’t need to configure anything beyond using synq-dbt as your dbt command wrapper.
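For illustration, the sketch below shows what this collection amounts to in Python. The AIRFLOW_CTX_* names are the real variables listed above, but the collect_airflow_context helper is hypothetical and not part of synq-dbt's public API:

import os

# The context variables Airflow exports into each task's environment.
AIRFLOW_CONTEXT_VARS = [
    "AIRFLOW_CTX_DAG_OWNER",
    "AIRFLOW_CTX_DAG_ID",
    "AIRFLOW_CTX_TASK_ID",
    "AIRFLOW_CTX_EXECUTION_DATE",
    "AIRFLOW_CTX_TRY_NUMBER",
    "AIRFLOW_CTX_DAG_RUN_ID",
]

def collect_airflow_context() -> dict[str, str]:
    """Hypothetical helper: gather whichever Airflow context variables are set."""
    return {name: os.environ[name] for name in AIRFLOW_CONTEXT_VARS if name in os.environ}

# Inside an Airflow task this returns the DAG, task, and run identifiers;
# outside Airflow it returns an empty dict, so uploads still work without them.
print(collect_airflow_context())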

Setup verification

If both integrations are set up, linking is active without any additional configuration. To verify:

1. Check your Airflow setup

Ensure you’re using synq-dbt to execute dbt commands in your Airflow tasks.

Example with KubernetesPodOperator:
from airflow.models import Variable
# Import path for the apache-airflow-providers-cncf-kubernetes provider package
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

run_dbt = KubernetesPodOperator(
    task_id="run_dbt",                     # placeholder task id
    name="run-dbt",
    image="your-registry/dbt:latest",      # your image with dbt and synq-dbt installed
    env_vars={"SYNQ_TOKEN": Variable.get("SYNQ_TOKEN")},
    cmds=["synq-dbt"],                     # run synq-dbt instead of dbt
    arguments=["build"],
    # ... other configuration
)
Example with DbtRunOperator:
from airflow.models import Variable
from airflow_dbt.operators.dbt_operator import DbtRunOperator  # airflow-dbt package

dbt_run = DbtRunOperator(
    task_id="dbt_run",               # placeholder task id
    dbt_bin="synq-dbt",              # point the operator at synq-dbt instead of dbt
    env={"SYNQ_TOKEN": Variable.get("SYNQ_TOKEN")},
    # ... other configuration
)

2. Run a dbt task in Airflow

Execute any DAG that runs dbt models using synq-dbt.
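For context, here’s a minimal sketch of a complete DAG that wires the KubernetesPodOperator example above into a schedulable pipeline. The DAG id, schedule, and image are placeholder values, and it assumes Airflow 2.4+ with the cncf.kubernetes provider installed:

from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

with DAG(
    dag_id="daily_dbt_build",              # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                     # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
):
    KubernetesPodOperator(
        task_id="run_dbt",
        name="run-dbt",
        image="your-registry/dbt:latest",  # placeholder image with dbt and synq-dbt installed
        env_vars={"SYNQ_TOKEN": Variable.get("SYNQ_TOKEN")},
        cmds=["synq-dbt"],
        arguments=["build"],
    )

Once the DAG has run, synq-dbt uploads the run results together with the captured Airflow context.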

3. View linked data in SYNQ

On model pages

Navigate to any dbt model in SYNQ to see:
  • Airflow Execution History: Recent task executions that ran this model
  • DAG Information: Which DAG and task executed the model
  • Execution Timestamps: When the model was last executed via Airflow

On Airflow execution pages

Navigate to any Airflow task execution in SYNQ to see:
  • Executed dbt Models: Complete list of all dbt models that ran in this task
  • Model Run Status: Success/failure status for each model
  • Execution Details: Runtime, rows affected, and other metrics per model
  • Direct Links: Quick navigation to individual model pages

In lineage view

  1. Open Lineage in SYNQ
  2. Switch to Orchestration mode (toggle in the view settings)
  3. View your complete data pipeline including:
    • Airflow DAGs and tasks
    • dbt models executed by each task
    • Connections between orchestration and data layers

Airflow context in issues

When a dbt model fails during execution in an Airflow task, SYNQ automatically enriches the issue with Airflow execution context. This helps you quickly identify which specific DAG and task encountered the failure.

Information included:
  • DAG ID: The Airflow DAG that was running
  • Task ID: The specific task within the DAG that executed the failed model
  • Execution Date: When the failure occurred
  • DAG Run ID: Unique identifier for the DAG run
  • Owner: DAG owner for faster escalation
This context appears directly in the issue details, allowing you to:
  • Jump directly to the relevant Airflow task logs
  • Identify if the failure is specific to a particular orchestration path
  • Correlate model failures with DAG execution patterns
  • Route issues more effectively based on DAG ownership
The Airflow context is captured at the time of execution and remains associated with the issue, even if you investigate it later.

Troubleshooting

Models not showing Airflow context

Check the following:
  1. Using synq-dbt: Verify you’re running synq-dbt instead of dbt in your Airflow tasks
  2. Recent execution: Linking only covers executions from the last 30 days, so older runs won’t appear
  3. Airflow environment variables: Ensure your Airflow version sets the AIRFLOW_CTX_* context variables (supported in Airflow 1.10+); the sketch after this list shows one way to confirm they are set
  4. Upload success: Check that synq-dbt successfully uploaded artifacts (check task logs)
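To check items 1 and 3, one option is adding a temporary debugging task next to your dbt task that logs the Airflow context variables. A minimal sketch, with a placeholder task id (place it inside your DAG definition):

import os

from airflow.operators.python import PythonOperator

def print_airflow_context() -> None:
    """Log every AIRFLOW_CTX_* variable visible to this task."""
    for name, value in sorted(os.environ.items()):
        if name.startswith("AIRFLOW_CTX_"):
            print(f"{name}={value}")

# Runs inside your DAG; the task log shows which context variables
# Airflow sets in your environment.
debug_context = PythonOperator(
    task_id="print_airflow_context",
    python_callable=print_airflow_context,
)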

Lineage not showing orchestration

  1. Orchestration mode enabled: In Lineage view, ensure “Orchestration mode” is toggled on
  2. Airflow integration active: Verify your Airflow integration is properly configured
  3. Metadata sync: Wait 5-10 minutes after first execution for metadata to sync
