dbt Labs Delivers Significant Cost Optimization Results and Agentic AI Features, Powered by Fusion
New capabilities accelerate developer workflows, automatically reduce compute spend, and bring intelligent AI assistance to structured data
PHILADELPHIA, Oct. 14, 2025 /PRNewswire/ -- On the keynote stage at Coalesce 2025, dbt Labs – the leader in standards for AI-ready structured data – showcased the continued evolution of the dbt platform, unveiling cost optimization outcomes and performance enhancements powered by the dbt Fusion engine. dbt Labs also introduced dbt Agents, a suite of intelligent AI assistants built into dbt and made accessible via the remote dbt MCP server, to supercharge development, improve governance, and deliver trustworthy AI outcomes. Collectively, these platform updates accelerate development, support cross-platform portability, and lay an important foundation for new analytics use cases.
"Open standards and AI are fueling the next era of analytics, and the dbt Fusion engine is the bridge that data teams need to move toward that future," said Tristan Handy, founder and CEO at dbt Labs. "Fusion delivers robust context, tools and error-correction mechanisms for both humans and agents. It is the enabler of next generation, AI-powered data infrastructure."
Fusion, now in Preview for eligible projects on BigQuery, Databricks, Snowflake and Redshift, builds on its supercharged developer experience and enables customers to dramatically optimize compute spend, eliminate wasted cycles, and focus teams on innovation and faster insight delivery. State-aware orchestration, in Preview for projects running Fusion, allows teams to reduce compute spend by approximately 10% simply by activating the feature. It ensures pipelines run only the models that have changed, letting organizations cut unnecessary compute costs without rewriting projects or restructuring jobs. Teams can further tune their pipelines by specifying data freshness requirements, and state-aware orchestration then determines the most efficient job execution path to meet those needs. Organizations can expect an additional estimated 15%+ in data platform cost savings with these tuned configurations, and in early testing some organizations have realized over 50% in total savings, a meaningful reduction in spend on data infrastructure.