rabbyn
Regular Visitor

Stuck with MS Learn tutorial for Apache Airflow Jobs for dbt-fabric orchestration

Hello, has anyone managed to implement the sample/tutorial Transform data using dbt - Microsoft Fabric | Microsoft Learn?
For us, with a Starter Pool running Apache Airflow version 2.10.5, only astronomer-cosmos==1.5.1 and dbt-fabric==1.5.0 validate.
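
For reference, a requirements.txt matching the pins above (reconstructed from our setup; only these versions validated on the Starter Pool):

astronomer-cosmos==1.5.1
dbt-fabric==1.5.0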

rabbyn_0-1747759252096.png

...but when running the DAG dags/my_cosmos_dag.py we systematically get a "Failed to start" error. Thanks

rabbyn_1-1747759413976.png

4 REPLIES
v-csrikanth
Community Support

Hi @rabbyn 
Thanks for reaching out to the Fabric Community.
Here are a few checkpoints that might resolve your issue.

  • Ensure the required environment variables (FABRIC_WORKSPACE_ID, FABRIC_WAREHOUSE_ID, FABRIC_CAPACITY_ID) are explicitly defined in the "Environment variables" section of the Airflow configuration in Fabric.
  • Even if the requirements.txt file validates successfully with astronomer-cosmos==1.5.1 and dbt-fabric==1.5.0, re-deploy the environment from scratch to eliminate issues caused by corrupted or cached dependencies.
  • Check the Airflow logs for any Python-related import errors such as No module named cosmos or dbt not found, which indicate module loading issues during DAG startup (a quick import check is sketched after this list).
  • Verify that your my_cosmos_dag.py file is located directly under the /dags directory and follows correct DAG declaration syntax required by Airflow.
  • The Starter Pool in Fabric may impose limits on compute or task parallelism, so reduce task concurrency if your DAG is calling resource-heavy operations like dbt transformations.
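
As a quick way to reproduce the import errors mentioned in the third point, a minimal sketch (our assumption, not from the tutorial; it assumes it runs inside the Airflow environment where the requirements were installed):

import importlib

# Hypothetical sanity check: confirm the pinned packages are importable
for module in ("cosmos", "dbt"):
    try:
        importlib.import_module(module)
        print(f"{module}: OK")
    except ImportError as exc:
        print(f"{module}: FAILED ({exc})")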

Suggested troubleshooting steps:
Add your .env or DAG-level configuration using:

import os

os.environ["FABRIC_WORKSPACE_ID"] = "<your-workspace-id>"
os.environ["FABRIC_CAPACITY_ID"] = "<your-capacity-id>"
os.environ["FABRIC_WAREHOUSE_ID"] = "<your-warehouse-id>"

If you are using DbtDag, validate all project and profile paths. Note that in cosmos 1.x the project path is passed through ProjectConfig rather than directly to DbtDag:

from cosmos import DbtDag, ProjectConfig

DbtDag(
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt"),
    profile_config=profile_config,  # a ProfileConfig pointing at your profiles.yml
    # other parameters
)
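
A minimal ProfileConfig sketch to go with it, reusing the tutorial's profile and target names (the profiles.yml path here is an assumption):

from pathlib import Path
from cosmos import ProfileConfig

profile_config = ProfileConfig(
    profile_name="nyc_taxi_green",   # profile name inside profiles.yml
    target_name="fabric-dev",        # target within that profile
    profiles_yml_filepath=Path("/usr/local/airflow/dags/dbt/profiles.yml"),
)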
Rebuild the environment or delete and recreate the Airflow workspace to clear possible internal caching.

 

If the above information helps you, please give us a Kudos and mark it as the Accepted Solution.

Best Regards,
Community Support Team _ C Srikanth.

rabbyn
Regular Visitor

Hi @v-csrikanth, thanks for your support. I followed all the troubleshooting steps you mentioned, but with no success: despite the requirements.txt being "validated", the Airflow cluster is not starting up (timeout after 18 min). I tried with a starter cluster, then recreated a brand-new Apache Airflow job in another workspace, and the outcome is the same. Below are a screenshot and a code snippet from my latest DAG script (based on your instructions; if something is wrong, let me know). Thanks

rabbyn_1-1749192284274.png

import os
from pathlib import Path
from datetime import datetime
from cosmos import DbtDag, ProjectConfig, ProfileConfig

# Add environment variables here
os.environ["FABRIC_WORKSPACE_ID"] = "18584879-394b-****-8d34-61e864c0bd1c"
os.environ["FABRIC_CAPACITY_ID"] = "FAC80AA7-5E69-****-88E7-DE388FC23422"
os.environ["FABRIC_WAREHOUSE_ID"] = "60a45649-7e92-****-90b3-237243a35114"

DEFAULT_DBT_ROOT_PATH = Path(__file__).parent.parent / "dags" / "nyc_taxi_green"
DBT_ROOT_PATH = Path(os.getenv("DBT_ROOT_PATH", DEFAULT_DBT_ROOT_PATH))

profile_config = ProfileConfig(
    profile_name="nyc_taxi_green",
    target_name="fabric-dev",
    profiles_yml_filepath=DBT_ROOT_PATH / "profiles.yml",
)

dbt_fabric_dag = DbtDag(
    # cosmos 1.x ProjectConfig takes the dbt project path; project_dir/profiles_dir
    # are not valid keywords here, so point it at the dbt project root instead
    project_config=ProjectConfig(DBT_ROOT_PATH),
    operator_args={"install_deps": True},
    profile_config=profile_config,
    schedule_interval="@daily",
    start_date=datetime(2024, 9, 10),
    catchup=False,
    dag_id="dbt_fabric_dag",
)
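
Since the cluster times out before any task runs, it can still help to rule out the DAG file itself by parsing it locally; a minimal sketch, assuming a local Airflow install with the same requirements.txt:

from airflow.models import DagBag

# Parse the dags folder locally and print any import errors
# (e.g. cosmos/dbt-fabric problems) before deploying to Fabric
bag = DagBag(dag_folder="dags", include_examples=False)
print(bag.import_errors or "No import errors")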

rabbyn
Regular Visitor

Hi @v-csrikanth, did you manage to make it work? Thanks

v-csrikanth
Community Support

Hi @rabbyn 
Thank you for being part of the Microsoft Fabric Community.
I am trying to implement the sample tutorial in my own workspace; once it is successfully set up, I will post the detailed steps, which might help you resolve your issue.

Best Regards,
Cheri Srikanth.
