Hello, has anyone managed to implement the sample tutorial Transform data using dbt - Microsoft Fabric | Microsoft Learn?
For us, on a Starter Pool running Apache Airflow version 2.10.5, only astronomer-cosmos==1.5.1 and dbt-fabric==1.5.0 are valid versions.
...but when running the DAG dags/my_cosmos_dag.py we consistently get a "Failed to start" error. Thanks
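For reference, those are the only two packages we pin in the Airflow job's requirements:
astronomer-cosmos==1.5.1
dbt-fabric==1.5.0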
Hi @rabbyn
Thanks for reaching out to the Fabric Community.
Here are a few suggested troubleshooting steps that might resolve your issue:
Set the required Fabric environment variables in your .env file or at the DAG level:
import os
os.environ["FABRIC_WORKSPACE_ID"] = "<your-workspace-id>"
os.environ["FABRIC_CAPACITY_ID"] = "<your-capacity-id>"
os.environ["FABRIC_WAREHOUSE_ID"] = "<your-warehouse-id>"
If you are using DbtDag, validate the dbt project path and profile configuration you pass in (DbtDag takes these through ProjectConfig and ProfileConfig rather than as direct project_dir/profiles_dir arguments):
DbtDag(
    project_config=ProjectConfig(
        dbt_project_path="/usr/local/airflow/dags/dbt",  # folder containing dbt_project.yml
    ),
    profile_config=profile_config,
    # other parameters
)
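To confirm those paths are what the scheduler actually sees, a quick illustrative check (not from the tutorial) is:
from pathlib import Path

# Verify the dbt project and profiles.yml exist where the DAG expects them.
project_dir = Path("/usr/local/airflow/dags/dbt")
for required_file in (project_dir / "dbt_project.yml", project_dir / "profiles.yml"):
    if not required_file.exists():
        raise FileNotFoundError(f"{required_file} not found")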
Rebuild the environment or delete and recreate the Airflow workspace to clear possible internal caching.
If the above information helps you, please give us Kudos and mark it as the Accepted Solution.
Best Regards,
Community Support Team _ C Srikanth.
Hi @v-csrikanth , thanks for your support. I followed all the troubleshooting steps you mentioned, but with no success: despite the requirements.txt being "validated", the Airflow cluster is not starting up (it times out after 18 minutes). I tried with a starter cluster, then recreated a brand new Apache Airflow instance in another workspace, and the outcome is the same. Below is a code snippet from my latest DAG script (based on your instructions; if something is wrong, let me know). Thanks
import os
from pathlib import Path
from datetime import datetime
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig

# Fabric resource identifiers (masked), exposed as environment variables
os.environ["FABRIC_WORKSPACE_ID"] = "18584879-394b-****-8d34-61e864c0bd1c"
os.environ["FABRIC_CAPACITY_ID"] = "FAC80AA7-5E69-****-88E7-DE388FC23422"
os.environ["FABRIC_WAREHOUSE_ID"] = "60a45649-7e92-****-90b3-237243a35114"

# Path to the dbt project folder (contains dbt_project.yml and profiles.yml)
DEFAULT_DBT_ROOT_PATH = Path(__file__).parent.parent / "dags" / "nyc_taxi_green"
DBT_ROOT_PATH = Path(os.getenv("DBT_ROOT_PATH", DEFAULT_DBT_ROOT_PATH))

profile_config = ProfileConfig(
    profile_name="nyc_taxi_green",
    target_name="fabric-dev",
    profiles_yml_filepath=DBT_ROOT_PATH / "profiles.yml",
)

dbt_fabric_dag = DbtDag(
    # ProjectConfig takes the dbt project path; there is no profiles_dir argument,
    # since the profile is resolved through profile_config above.
    project_config=ProjectConfig(dbt_project_path=DBT_ROOT_PATH),
    operator_args={"install_deps": True},
    profile_config=profile_config,
    schedule_interval="@daily",
    start_date=datetime(2024, 9, 10),
    catchup=False,
    dag_id="dbt_fabric_dag",
)
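One thing I also considered (my own assumption, not from the tutorial): since ExecutionConfig is imported but unused, cosmos can be pointed explicitly at the dbt executable in case it is not on the scheduler's PATH:
# Assumed venv location, adjust to your environment; this would be passed
# to DbtDag as execution_config=execution_config.
execution_config = ExecutionConfig(
    dbt_executable_path="/usr/local/airflow/dbt_venv/bin/dbt",
)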
Hi @rabbyn
Thank you for being part of the Microsoft Fabric Community.
I am trying to implement the sample tutorial in my workspace; once it is successfully created, I will post detailed steps that might help you resolve your issue.
Best Regards,
Cheri Srikanth.