This guide explains how SYNQ automatically links your dbt models with the Airflow tasks that execute them, providing end-to-end visibility from orchestration to data models.
Prerequisites:
- Airflow integration configured in SYNQ
- dbt Core integration with synq-dbt v1.8.0+ installed
- synq-dbt running within Airflow tasks
- Valid SYNQ integration token (v2 format starting with st- required for US region)
How it works
When you run dbt Core commands using synq-dbt inside Airflow tasks, SYNQ automatically captures the Airflow execution context and links it to your dbt models. This creates a bidirectional connection between your orchestration layer and data models.
What you get:
- Model → Airflow: See which Airflow tasks executed each dbt model, with DAG and task information on model pages
- Airflow → Models: View all dbt models executed by an Airflow task, with run status and metrics
- Visualize orchestration in SYNQ lineage (Orchestration mode)
- Historical tracking of model executions (last 30 days)
- Airflow context in issues: When a model fails, SYNQ includes the specific Airflow DAG and task information in the issue, making it easier to identify and troubleshoot failures
Automatic context collection
When synq-dbt uploads your dbt artifacts (run_results.json), it automatically detects and collects Airflow-specific environment variables:
Required variables (for linking to work):
- AIRFLOW_CTX_DAG_ID — DAG identifier
- AIRFLOW_CTX_TASK_ID — Task identifier
- AIRFLOW_CTX_DAG_RUN_ID — Unique run identifier
- AIRFLOW_CTX_TRY_NUMBER — Retry attempt number
Additional variables (also collected):
- AIRFLOW_CTX_DAG_OWNER — DAG owner
- AIRFLOW_CTX_EXECUTION_DATE — Execution timestamp
These environment variables are automatically set by Airflow in your task execution context. You don’t need to configure them manually — they’re available in the task environment. However, when using operators like KubernetesPodOperator, you need to pass them explicitly using Jinja templating, as shown in the setup examples below.
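Conceptually, the collection step amounts to scanning the task environment for AIRFLOW_CTX_* variables at upload time. Below is a minimal Python sketch of the idea — illustrative only, not synq-dbt’s actual implementation:

```python
import os

AIRFLOW_CTX_PREFIX = "AIRFLOW_CTX_"

def collect_airflow_context() -> dict:
    """Return every AIRFLOW_CTX_* variable present in the task environment."""
    return {
        key: value
        for key, value in os.environ.items()
        if key.startswith(AIRFLOW_CTX_PREFIX)
    }
```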
Setup verification
If you’ve already set up both integrations, the linking is already active. To verify:
1. Check your Airflow setup
Ensure you’re using synq-dbt to execute dbt commands in your Airflow tasks:
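For example, with a BashOperator you substitute synq-dbt for dbt, and Airflow exports the AIRFLOW_CTX_* variables into the task environment automatically. A minimal sketch, assuming synq-dbt is on the worker’s PATH, the dbt project lives at /opt/airflow/dbt, and the token is stored in an Airflow Variable named synq_token (all placeholders to adapt):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily",  # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        # synq-dbt accepts the same arguments as dbt, then uploads the artifacts
        bash_command="cd /opt/airflow/dbt && synq-dbt run",
        env={"SYNQ_TOKEN": "{{ var.value.synq_token }}"},  # assumed Variable name
        append_env=True,  # keep the worker env; Airflow injects AIRFLOW_CTX_* itself
    )
```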
Example with KubernetesPodOperator:
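A minimal sketch, assuming an image that bundles dbt and synq-dbt; the image name and the synq_token Airflow Variable are placeholders, and the operator goes inside a DAG block like the example above. Unlike BashOperator, the pod does not inherit the worker’s environment, so the AIRFLOW_CTX_* variables must be forwarded explicitly:

```python
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

run_dbt_models = KubernetesPodOperator(
    task_id="run_dbt_models",
    name="run-dbt-models",
    image="your-registry/dbt-synq:latest",  # assumed image with dbt + synq-dbt
    cmds=["synq-dbt"],
    arguments=["run"],
    env_vars={
        "SYNQ_TOKEN": "{{ var.value.synq_token }}",  # assumed Airflow Variable
        # Airflow only sets AIRFLOW_CTX_* in the worker process, not in the pod,
        # so pass them through via Jinja templating:
        "AIRFLOW_CTX_DAG_ID": "{{ dag.dag_id }}",
        "AIRFLOW_CTX_TASK_ID": "{{ task.task_id }}",
        "AIRFLOW_CTX_DAG_RUN_ID": "{{ run_id }}",
        "AIRFLOW_CTX_TRY_NUMBER": "{{ task_instance.try_number }}",
        "AIRFLOW_CTX_DAG_OWNER": "{{ dag.owner }}",
        "AIRFLOW_CTX_EXECUTION_DATE": "{{ ts }}",
    },
)
```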
Regional Configuration: EU region customers (default) don’t need to set SYNQ_API_ENDPOINT. US region customers must add "SYNQ_API_ENDPOINT": "https://api.us.synq.io" to env_vars and use v2 tokens (starting with st-).
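Applied to the operator sketches above, the US-region settings would look like this (the synq_token Variable name is still a placeholder):

```python
# US-region additions to env_vars; EU (default) needs no endpoint override.
us_region_env = {
    "SYNQ_TOKEN": "{{ var.value.synq_token }}",     # must be a v2 token ("st-" prefix)
    "SYNQ_API_ENDPOINT": "https://api.us.synq.io",  # omit for EU region
}
```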
2. Run a dbt task in Airflow
Execute any DAG that runs dbt models using synq-dbt.
Artifacts and linking data typically appear in SYNQ within minutes of upload. Allow 5-10 minutes after your first execution for the metadata to sync before checking.
3. View linked data in SYNQ
On model pages
Navigate to any dbt model in SYNQ to see:
- Airflow Execution History: Recent task executions that ran this model
- DAG Information: Which DAG and task executed the model
- Execution Timestamps: When the model was last executed via Airflow
On Airflow execution pages
Navigate to any Airflow task execution in SYNQ to see:
- Executed dbt Models: Complete list of all dbt models that ran in this task
- Model Run Status: Success/failure status for each model
- Execution Details: Runtime, rows affected, and other metrics per model
- Direct Links: Quick navigation to individual model pages
In lineage view
- Open Lineage in SYNQ
- Switch to Orchestration mode (toggle in the view settings)
- View your complete data pipeline including:
- Airflow DAGs and tasks
- dbt models executed by each task
- Connections between orchestration and data layers
Airflow context in issues
When a dbt model fails during execution in an Airflow task, SYNQ automatically enriches the issue with Airflow execution context. This helps you quickly identify which specific DAG and task encountered the failure.
Information included:
- DAG ID: The Airflow DAG that was running
- Task ID: The specific task within the DAG that executed the failed model
- Execution Date: When the failure occurred
- DAG Run ID: Unique identifier for the DAG run
- Owner: DAG owner for faster escalation
This context lets you:
- Jump directly to the relevant Airflow task logs
- Identify if the failure is specific to a particular orchestration path
- Correlate model failures with DAG execution patterns
- Route issues more effectively based on DAG ownership
The Airflow context is captured at the time of execution and remains associated with the issue, even if you investigate it later.
Troubleshooting
Models not showing Airflow context
Check the following:
- Using synq-dbt: Verify you’re running synq-dbt instead of dbt in your Airflow tasks
- synq-dbt version: Ensure you’re using v1.8.0 or later (run synq-dbt --version)
- Environment variables passed: Verify that the required Airflow context variables (AIRFLOW_CTX_DAG_ID, AIRFLOW_CTX_TASK_ID, AIRFLOW_CTX_DAG_RUN_ID) are being passed to your operator (see the sketch after this list)
- Token format: For the US region, ensure you’re using v2 tokens (starting with st-)
- Regional endpoint: US region customers must set SYNQ_API_ENDPOINT=https://api.us.synq.io
- Recent execution: Linking shows executions from the last 30 days
- Airflow version: Ensure your Airflow version sets the context variables (supported in Airflow 1.10+)
- Upload success: Check that synq-dbt successfully uploaded artifacts (check task logs for upload confirmation)
- Network connectivity: Verify outbound HTTPS access to developer.synq.io:443 (EU) or api.us.synq.io:443 (US)
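To check the “Environment variables passed” item, a throwaway task dropped into the same DAG can log what the worker environment actually contains. A debugging sketch (the task and function names are placeholders); note that with KubernetesPodOperator the variables live inside the pod instead, so inspect the pod’s environment (e.g. run env as the pod command) rather than the worker’s:

```python
import os

from airflow.operators.python import PythonOperator

def _log_airflow_context():
    # Airflow exports AIRFLOW_CTX_* into the worker environment before execution
    for key, value in sorted(os.environ.items()):
        if key.startswith("AIRFLOW_CTX_"):
            print(f"{key}={value}")

log_airflow_context = PythonOperator(
    task_id="log_airflow_context",
    python_callable=_log_airflow_context,
)
```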
Lineage not showing orchestration
- Orchestration mode enabled: In Lineage view, ensure “Orchestration mode” is toggled on
- Airflow integration active: Verify your Airflow integration is properly configured
- Metadata sync: Wait 5-10 minutes after first execution for metadata to sync
Learn more
- Airflow integration guide — Connect SYNQ to Airflow
- dbt Core integration guide — Install and configure synq-dbt
- synq-dbt GitHub repository — Source code and advanced configuration