Attending the Databricks Data + AI Summit 2025 in San Francisco was nothing short of inspiring for me and the Infocepts team. The energy in the room, the innovation on display, and the conversations we had with peers and leaders all pointed to one thing: AI is no longer experimental; it's becoming foundational.
Databricks has always been a strong partner in this journey, but what struck us this year was how seamlessly its roadmap aligns with the very challenges and opportunities we hear from our clients every day. From democratizing AI agents to modernizing legacy warehouses, Databricks is enabling enterprises to rethink the way they work with data.
Below, we break down the most important announcements, sharing why they matter for enterprises and our advice on how to maximize their value while avoiding common pitfalls.
Databricks introduced AgentBricks, a no-code platform for building, optimizing, and deploying production-grade AI agents using enterprise data.
Highlights
- Natural language agent builder for extraction, chat, and summarization tasks.
- Auto-generated evaluation suites leveraging LLM judges and human feedback.
- End-to-end optimization pipeline with prompt tuning and fine-tuning.
- Cost vs. quality configurations for flexibility.
- Unity Catalog–backed governance and auditability.
Why it Matters for Enterprises
What excites us most about AgentBricks is its potential to truly democratize AI agent development. Imagine business teams spinning up intelligent assistants that are aware of enterprise context—without depending on specialized engineering every step of the way. That kind of acceleration can fundamentally change how fast organizations deliver value from data. And because it’s powered by Databricks’ compute and governed through Unity Catalog, the scale and trust are already built in.
Our Advice
We see two key things to watch out for. First, don't fall into the trap of thinking no-code means no oversight: keeping humans in the loop is essential for quality. Second, costs can creep up fast if governance isn't set up early. Our advice: put clear guardrails around who can build agents, embed monitoring from day one, and make agent creation part of your governed development workflow.
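Guardrails like these can start small. The sketch below is a pure-Python illustration of an approval-plus-budget gate an agent-creation workflow might run; the team allowlist and budget threshold are hypothetical policy values, not an AgentBricks API.

```python
# Illustrative guardrail for governed agent creation; not an AgentBricks API.
# ALLOWED_BUILDERS and MONTHLY_BUDGET_USD are hypothetical policy values.

ALLOWED_BUILDERS = {"data-platform-team", "finance-analytics"}
MONTHLY_BUDGET_USD = 500.0

def can_create_agent(requesting_team: str, spend_to_date: float) -> tuple:
    """Gate agent creation on an allowlist and a running cost budget."""
    if requesting_team not in ALLOWED_BUILDERS:
        return False, f"team '{requesting_team}' is not approved to build agents"
    if spend_to_date >= MONTHLY_BUDGET_USD:
        return False, "monthly agent budget exhausted; request a review"
    return True, "approved"

ok, reason = can_create_agent("finance-analytics", spend_to_date=120.0)
```

The same shape extends naturally: swap the static allowlist for a Unity Catalog permission check, and the budget counter for real usage telemetry.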
A long-awaited integration now brings Databricks into SAP Business Data Cloud with zero-copy data sharing.
Highlights
- Databricks embedded in SAP BDC as a native service.
- Zero-copy data sharing via Delta Sharing.
- Spark and ETL workloads on SAP data.
- Common interfaces (SQL editors, notebooks) for cross-platform analytics.
- Unity Catalog governance across both platforms.
Why it Matters for Enterprises
For years, we’ve seen SAP landscapes act like black boxes—rich with data but isolated from the rest of the enterprise. This integration feels like a turning point. Suddenly, you can bring SAP data to life alongside customer, supply chain, or IoT datasets in Databricks to create insights that weren’t possible before. No messy ETL, no redundant copies—just real, governed access.
Our Advice
Yes, this capability lives inside SAP Business Data Cloud, so cost is a factor. But in our experience, the best approach is to target high-impact use cases first—those where blending SAP data with external sources moves the needle most. Also, SAP's schema complexity can't be ignored. That's where having the right expertise (which we bring from years of SAP + analytics work) makes all the difference. With proper schema mapping and governance, this integration can quickly pay for itself.
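To make the zero-copy model concrete: Delta Sharing clients address a shared table with a `profile#share.schema.table` coordinate. The sketch below builds that coordinate with the standard library; the share, schema, and table names are hypothetical placeholders, and the actual load call (which needs the `delta-sharing` client and valid credentials) is shown as a comment.

```python
# Compose a Delta Sharing table URL: <profile-file>#<share>.<schema>.<table>.
# Names below are hypothetical placeholders for a SAP-backed share.

def sharing_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Compose the table coordinate used by Delta Sharing clients."""
    return f"{profile_path}#{share}.{schema}.{table}"

url = sharing_url("config.share", "sap_share", "sales", "billing_documents")
# With the delta-sharing client installed and a valid profile file:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(url)  # zero-copy read into pandas
```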
Databricks unveiled Databricks Apps, which simplify the creation of custom data and AI-driven applications directly within the platform.
Highlights
- Serverless app deployment in managed environments.
- Framework support (Streamlit, Node.js, Gradio, etc.).
- Developer tooling with CLI and IDE integration.
- Built-in OAuth2/OIDC security.
- CI/CD pipeline support for version-controlled deployments.
- Unity Catalog–backed governance.
Why it Matters for Enterprises
This one hit home for us. For a long time, Databricks has been the go-to for engineering but not always for end-user consumption. With Apps, that gap is closing. We now have a simple way to create interactive dashboards, AI-driven workflows, or even lightweight domain-specific data products—all while staying inside the governed Databricks ecosystem. For business teams, this means faster insights without waiting in line for IT.
Our Advice
That said, apps aren’t a shortcut for design expertise. We’ve seen organizations underestimate the need for thoughtful UX and seasoned developers. Our recommendation: treat Databricks Apps like any enterprise app—design with the end-user in mind, build with experienced hands, and run every deployment through performance, cost, and security reviews. Done right, they can become game-changers for data consumption.
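As one concrete shape such an app can take, here is a minimal Streamlit sketch (Streamlit being one of the supported frameworks). The data and chart are placeholders; a real app would query governed tables. Note the aggregation logic is kept in a plain function, so it can be reviewed and tested like any other enterprise code, which is exactly the discipline we advocate above.

```python
# Minimal Databricks App sketch using Streamlit (one of the supported
# frameworks per the announcement). Data and names are illustrative.

def revenue_by_region(rows):
    """Aggregate (region, amount) pairs into totals; plain, unit-testable logic."""
    totals = {}
    for region, amount in rows:
        totals[region] = totals.get(region, 0.0) + amount
    return totals

try:
    import streamlit as st  # available in the app's managed environment
except ImportError:         # keeps the module importable outside that runtime
    st = None

if st is not None:
    st.title("Revenue by Region")
    sample = [("EMEA", 120.0), ("AMER", 200.0), ("EMEA", 80.0)]
    st.bar_chart(revenue_by_region(sample))
```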
Databricks extended Unity Catalog to support Iceberg, making governance truly interoperable across ecosystems.
Highlights
- Unified Delta + Iceberg governance (GA).
- Federated catalog access to Glue, Hive, and Snowflake without duplication.
- REST APIs for Iceberg (read/write).
- Managed Iceberg tables with optimization and multi-engine access.
- Delta Sharing for Iceberg (preview).
- Business metrics as first-class assets with lineage and auditability.
Why it Matters for Enterprises
This announcement felt like a big maturity milestone. With Iceberg support, Unity Catalog becomes a true federated governance hub. Now we can register external catalogs, manage them in one place, and make data lineage, provenance, and trust more transparent than ever. For global enterprises with fragmented data estates, this is a real step toward a single version of truth.
Our Advice
The flip side of more power is more responsibility. A growing catalog can easily become messy. Our advice: set up governance rituals—periodic audits, cleanup cycles, and access reviews—so Unity Catalog stays the trusted foundation it’s meant to be. Don’t just adopt the features; adopt the discipline.
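One such ritual, a periodic staleness audit, can be as simple as flagging tables nobody has touched in a while. The stdlib sketch below is illustrative only: the table records and the 180-day threshold are assumptions, not a Unity Catalog API.

```python
from datetime import date, timedelta

# Illustrative audit ritual: flag catalog tables with no recent access.
# Records and the 180-day threshold are assumptions, not a Unity Catalog API.

STALE_AFTER = timedelta(days=180)

def stale_tables(tables, today):
    """Return names of tables whose last access predates the staleness window."""
    return [t["name"] for t in tables if today - t["last_accessed"] > STALE_AFTER]

catalog = [
    {"name": "sales.orders",   "last_accessed": date(2025, 6, 1)},
    {"name": "legacy.exports", "last_accessed": date(2024, 9, 15)},
]
flagged = stale_tables(catalog, today=date(2025, 7, 1))  # candidates for review
```

In practice you would feed this from catalog metadata and route the flagged list into your access-review and cleanup cycles.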
Databricks announced Lakebridge, an open-source solution to simplify migration from Teradata, Oracle, and Redshift to the Databricks Lakehouse.
Highlights
- AI-assisted SQL & ETL conversion.
- Migration profiling and complexity analysis.
- Validation framework for row/column-level checks.
- Support for phased hybrid migration.
- Extensible and integration-friendly architecture.
Why it Matters for Enterprises
Lakebridge really resonated with us because we’ve seen how hard legacy migrations can be. By automating SQL conversion, ETL translation, and validation, Databricks is lowering the barrier to modernization. Even more importantly, the hybrid migration support lets enterprises move at their own pace, reducing risk while still gaining early wins in Databricks.
Our Advice
But here’s the reality: no tool fully automates decades of custom business logic. We always recommend starting with a solid assessment—use Lakebridge to accelerate the 70–80% that can be automated, and plan for human-led effort where exceptions and custom rules come into play. A blended approach is the safest path to a successful migration.
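The row/column-level validation mindset is worth internalizing even outside the tool. Here is a toy reconciliation sketch in pure Python, comparing a source extract against the migrated target; it is illustrative only and not Lakebridge's API.

```python
# Toy migration validation: compare row counts and per-column checksums
# between source and target extracts. Illustrative only, not Lakebridge's API.

def column_checksum(rows, col):
    """Order-insensitive checksum of one column's values."""
    return sum(hash(r[col]) for r in rows)

def reconcile(source, target, columns):
    """Map each check name to a pass/fail result for row and column checks."""
    results = {"row_count": len(source) == len(target)}
    for col in columns:
        results[f"col:{col}"] = column_checksum(source, col) == column_checksum(target, col)
    return results

src = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 20.5}]
tgt = [{"id": 2, "amt": 20.5}, {"id": 1, "amt": 10.0}]  # same data, new order
checks = reconcile(src, tgt, columns=["id", "amt"])     # all checks pass
```

Checks like these run cheaply after every migration phase, which is what makes the hybrid, phased approach safe.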
The latest release, MLflow 3.0, strengthens Databricks’ position as a unified advanced analytics platform.
Highlights
- LoggedModel in Unity Catalog for models, prompts, and agents.
- One-click observability for LLM latency and token usage.
- Evaluation pipelines with Mosaic AI and human review.
- Prompt Registry for optimization and reuse.
- GenAI-native UI for visual management.
Why it Matters for Enterprises
This was one of our favorite announcements. MLflow has long been the workhorse for ML operations, but with 3.0 it now embraces GenAI as a first-class citizen. The ability to govern prompts, monitor LLMs, and version everything inside Unity Catalog makes it feel like MLOps finally meets GenAIOps. For our clients, it’s an opportunity to unify all model operations under a single roof.
Our Advice
Our perspective is simple: automation is powerful, but human oversight is non-negotiable. The new evaluation and review features are fantastic, but production deployments still need checkpoints and manual approvals. We advise clients to design workflows that blend speed with accountability, so the tech empowers teams without exposing them to unnecessary risks.
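The idea behind a prompt registry, versioned prompts treated as governed assets, can be sketched in a few lines. This is a toy stdlib illustration of the concept, not MLflow 3.0's actual API.

```python
# Toy prompt registry: versioned, reusable prompts with an audit trail.
# Illustrative of the concept only; not MLflow 3.0's actual API.

class PromptRegistry:
    def __init__(self):
        self._versions = {}  # name -> list of prompt texts (index = version - 1)

    def register(self, name, text):
        """Store a new version of a prompt and return its version number."""
        self._versions.setdefault(name, []).append(text)
        return len(self._versions[name])

    def get(self, name, version=None):
        """Fetch a specific version, or the latest if none is given."""
        history = self._versions[name]
        return history[-1] if version is None else history[version - 1]

registry = PromptRegistry()
v1 = registry.register("summarize", "Summarize the document in 3 bullets.")
v2 = registry.register("summarize", "Summarize the document in 5 bullets.")
latest = registry.get("summarize")      # newest version for new runs
pinned = registry.get("summarize", v1)  # reproducible, auditable reuse
```

Pinning a version is the mechanism that lets a manual-approval checkpoint sign off on exactly the prompt that will run in production.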
The 2025 Databricks Data + AI Summit reaffirmed that Databricks continues to evolve from a data engineering powerhouse into a holistic enterprise AI platform. From agentic AI to governed interoperability, SAP integration to modernization accelerators, Databricks is pushing boundaries that align perfectly with the real-world challenges we help clients solve every day.
At Infocepts, we’re excited to help enterprises harness these innovations—not just by turning features on, but by ensuring they deliver sustainable business value. Governance, cost optimization, and human oversight will be the levers that turn these announcements into lasting impact.
Want to explore how these new Databricks capabilities can accelerate your data and AI journey? Talk to Infocepts today and let us help you build the right strategy for your enterprise.