This guide explains how SYNQ automatically links your dbt models with the Airflow tasks that execute them, providing end-to-end visibility from orchestration to data models.

Prerequisites:
- Airflow integration configured in SYNQ
- dbt Core integration with `synq-dbt` installed
- `synq-dbt` running within Airflow tasks
How it works
When you run dbt Core commands using `synq-dbt` inside Airflow tasks, SYNQ automatically captures the Airflow execution context and links it to your dbt models. This creates a bidirectional connection between your orchestration layer and your data models.
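In practice, the only change inside the task is the command itself: call `synq-dbt` wherever you would otherwise call `dbt`. The sketch below is a minimal, hypothetical BashOperator task; the project path, the `SYNQ_TOKEN` handling, and the Airflow Variable name are placeholders, and parameter availability varies with your Airflow version. See the dbt Core integration guide for the exact configuration `synq-dbt` expects.

```python
from airflow.operators.bash import BashOperator

# Hypothetical task: identical to a plain `dbt run`, except the command is
# wrapped with synq-dbt, which forwards the arguments to dbt and then uploads
# the run artifacts (run_results.json) to SYNQ.
run_dbt = BashOperator(
    task_id="dbt_run",
    bash_command="synq-dbt run --project-dir /opt/dbt/my_project",
    # The SYNQ token must be available to the process; a templated Airflow
    # Variable is shown here purely as a placeholder.
    env={"SYNQ_TOKEN": "{{ var.value.synq_token }}"},
    append_env=True,  # merge with the existing environment instead of replacing it
)
```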
What you get:
- Model → Airflow: See which Airflow tasks executed each dbt model, with DAG and task information on model pages
- Airflow → Models: View all dbt models executed by an Airflow task, with run status and metrics
- Visualize orchestration in SYNQ lineage (Orchestration mode)
- Historical tracking of model executions (last 30 days)
- Airflow context in issues: When a model fails, SYNQ includes the specific Airflow DAG and task information in the issue, making it easier to identify and troubleshoot failures
Automatic context collection
When `synq-dbt` uploads your dbt artifacts (`run_results.json`), it automatically detects and collects Airflow-specific environment variables:
- `AIRFLOW_CTX_DAG_OWNER` — DAG owner
- `AIRFLOW_CTX_DAG_ID` — DAG identifier
- `AIRFLOW_CTX_TASK_ID` — Task identifier
- `AIRFLOW_CTX_EXECUTION_DATE` — Execution timestamp
- `AIRFLOW_CTX_TRY_NUMBER` — Retry attempt number
- `AIRFLOW_CTX_DAG_RUN_ID` — Unique run identifier
These environment variables are automatically set by Airflow in your task execution context. You don’t need to configure anything beyond using `synq-dbt` as your dbt command wrapper.
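If you want to confirm what Airflow exposes to your tasks, you can log these variables from inside a task. The helper below is purely an illustrative sketch (the function name is made up); `synq-dbt` performs this collection on its own.

```python
import logging
import os

# Illustrative only (not part of synq-dbt): dump the context variables that
# Airflow sets for the task and that synq-dbt picks up automatically.
def log_airflow_context() -> None:
    for key, value in sorted(os.environ.items()):
        if key.startswith("AIRFLOW_CTX_"):
            logging.info("%s=%s", key, value)
```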
Setup verification

If you’ve already set up both integrations, the linking is already active. To verify:

1. Check your Airflow setup
Ensure you’re using `synq-dbt` to execute dbt commands in your Airflow tasks.
Example with KubernetesPodOperator:
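The following is a minimal sketch rather than the exact snippet from the SYNQ docs: the container image, namespace, project path, and token handling are placeholders, and the KubernetesPodOperator import path depends on your cncf.kubernetes provider version. The image is assumed to contain both your dbt project and the `synq-dbt` binary.

```python
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

# Hypothetical pod task: every value below is a placeholder for your own setup.
dbt_run = KubernetesPodOperator(
    task_id="dbt_run",
    name="dbt-run",
    namespace="data",
    image="your-registry/your-dbt-image:latest",
    # Call synq-dbt exactly as you would call dbt; it forwards the arguments to
    # dbt and uploads the resulting run_results.json to SYNQ after the run.
    cmds=["synq-dbt"],
    arguments=["run", "--project-dir", "/dbt"],
    env_vars={
        # SYNQ API token; in a real deployment, mount this from a Kubernetes secret.
        "SYNQ_TOKEN": "{{ var.value.synq_token }}",
    },
    get_logs=True,
)
```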
2. Run a dbt task in Airflow
Execute any DAG that runs dbt models using `synq-dbt`.
3. View linked data in SYNQ
On model pages
Navigate to any dbt model in SYNQ to see:
- Airflow Execution History: Recent task executions that ran this model
- DAG Information: Which DAG and task executed the model
- Execution Timestamps: When the model was last executed via Airflow
On Airflow execution pages
Navigate to any Airflow task execution in SYNQ to see:
- Executed dbt Models: Complete list of all dbt models that ran in this task
- Model Run Status: Success/failure status for each model
- Execution Details: Runtime, rows affected, and other metrics per model
- Direct Links: Quick navigation to individual model pages
In lineage view
- Open Lineage in SYNQ
- Switch to Orchestration mode (toggle in the view settings)
- View your complete data pipeline, including:
  - Airflow DAGs and tasks
  - dbt models executed by each task
  - Connections between orchestration and data layers
Airflow context in issues
When a dbt model fails during execution in an Airflow task, SYNQ automatically enriches the issue with Airflow execution context. This helps you quickly identify which specific DAG and task encountered the failure.

Information included:
- DAG ID: The Airflow DAG that was running
- Task ID: The specific task within the DAG that executed the failed model
- Execution Date: When the failure occurred
- DAG Run ID: Unique identifier for the DAG run
- Owner: DAG owner for faster escalation
This context makes it possible to:
- Jump directly to the relevant Airflow task logs
- Identify if the failure is specific to a particular orchestration path
- Correlate model failures with DAG execution patterns
- Route issues more effectively based on DAG ownership
The Airflow context is captured at the time of execution and remains associated with the issue, even if you investigate it later.
Troubleshooting
Models not showing Airflow context
Check the following:
- Using synq-dbt: Verify you’re running `synq-dbt` instead of `dbt` in your Airflow tasks
- Recent execution: Linking shows executions from the last 30 days
- Airflow environment variables: Ensure your Airflow version sets the context variables (supported in Airflow 1.10+)
- Upload success: Check that `synq-dbt` successfully uploaded artifacts (check task logs)
Lineage not showing orchestration
- Orchestration mode enabled: In Lineage view, ensure “Orchestration mode” is toggled on
- Airflow integration active: Verify your Airflow integration is properly configured
- Metadata sync: Wait 5-10 minutes after first execution for metadata to sync
Learn more
- Airflow integration guide — Connect SYNQ to Airflow
- dbt Core integration guide — Install and configure synq-dbt
- synq-dbt GitHub repository — Source code and advanced configuration