The BI and data analysis industry is reaching an inflection point, and data teams must navigate a moment of massive upheaval in their day-to-day work. LLMs and agentic analytics tools are rapidly advancing, and stakeholder expectations are shifting: they want answers faster, and they want to find insights themselves.
In a recent fireside chat, Julio Avalos, Co-CEO of Observable, and Marisa Morby, our Director of Research, unpacked what these transformations mean for data teams and proposed ways analysts can stay at the forefront of this changing BI and analytics landscape. Their conversation touched on findings from our recent State of BI and Analytics survey, and outlined where AI is having real impact, where it falls short, and how analysts can get ahead of changing stakeholder expectations.
What are some of the broader changes across the BI and analytics industry?
The data space continues to reckon with the now-familiar challenges of technological evolution, rising user expectations, budget constraints, and resource limitations. But now AI is significantly changing the landscape. Simultaneously, stakeholders demand faster, self-serve answers. This combination is reshaping how data teams operate.
In the webinar, Julio pointed out that data is everywhere, not just in large data warehouses. It also shows up as small, personal data in consumer and internal apps, meaning people interact with data constantly, often without realizing it's "data work." Yet exposure doesn't always lead to data fluency. While people may be more comfortable with charts, they aren't necessarily better at asking good questions.
There’s also an issue of scale and added complexity. Organizations collect more data than ever, but many existing tools and workflows were designed for a different era. Stakeholders are breaking away from the paradigm of passively consuming charts and reports that defined corporate data work for decades. This shift creates human and technical uncertainty around what questions to ask, and how to trust data and results. Issues of provenance, credibility, and trustworthiness are also exacerbated by AI.
As Julio described, this means we’re reaching a moment where human ingenuity matters more, not less:
If AI takes an increasing role, the value of human insight — the "spark" — becomes paramount. Data analysis within businesses needs to become more like continuous, collaborative flows, demanding new tools and paradigms to support this reality.

Julio Avalos, Co-CEO, Observable
AI’s impact on data analyst workflows
How is AI meaningfully helping analysts today?
AI is helping analysts with the more mundane parts of the job, Julio observed, including data profiling, anomaly detection, quality checks, and other repetitive tasks that require a significant investment of time, and that don’t directly benefit from human creativity or judgment. By automating these rote tasks, analysts can spend more time adding value to their organization by doing the work they were hired for: finding and communicating useful insights in the data.
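To make the rote tasks above concrete, here is a minimal sketch of the kind of data profiling and quality check that such tools automate. The function name, fields, and thresholds are illustrative assumptions, not any specific product's API.

```python
from statistics import mean, stdev

def profile_column(values, valid_range=None):
    """Profile one column: count nulls and duplicates, flag anomalies.

    Illustrative sketch of rote profiling/quality checks, not a real tool's API.
    """
    non_null = [v for v in values if v is not None]
    report = {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "duplicates": len(non_null) - len(set(non_null)),
    }
    if len(non_null) > 1:
        # Flag simple anomalies: values more than 3 standard deviations from the mean
        mu, sigma = mean(non_null), stdev(non_null)
        report["outliers"] = [v for v in non_null if sigma and abs(v - mu) > 3 * sigma]
    if valid_range is not None:
        lo, hi = valid_range
        report["out_of_range"] = [v for v in non_null if not (lo <= v <= hi)]
    return report

# Example: an "age" column with a null, a duplicate, and an impossible value
ages = [34, 29, None, 41, 29, 178]
print(profile_column(ages, valid_range=(0, 120)))
```

Checks like these are tedious to write and rerun by hand across hundreds of columns, which is exactly why analysts in the webinar described them as prime candidates for automation.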
Marisa shared that in research calls, analysts often echo that AI is particularly useful at automating the less creative or administrative functions of data work. She also pointed out that debugging complex SQL — especially SQL written by other developers — was another popular use case for AI among data analysts. AI also has the potential to improve data literacy, and to help users develop new skills and techniques.
Where is there a gap between promised AI capabilities and what AI can deliver?
AI for data analysis is in its early stages, Julio said, and early generations of AI tools have not been built specifically with data analysis in mind. This has led to a few key gaps.
Reproducibility is one such critical gap. If an insight cannot be reproduced using the same tool, data, and prompt, it cannot be trusted by the analyst or stakeholders. A second, related gap is verifiability. Much of today's AI operates as a black box. As Julio said in the webinar, "bad AI doesn't smell" — it can appear correct while being fundamentally wrong.
Third, most AI tools for data analysis are built on general-purpose LLMs and not tailored specifically for data analysis and visualization. They often lack the context needed to evaluate particular datasets and edge cases. Finally, there’s an institutional memory gap. Organizations invest significant resources into efforts and decisions whose rationale is quickly forgotten. AI has a real opportunity to dramatically improve how corporate knowledge and principles are communicated and operationalized.
What do you think the biggest untapped opportunity is as it relates to AI in the data workflow?
At Observable, Julio said, we view AI as a human accelerant that reinforces human reasoning and perception for data professionals. Transparency is an area of untapped potential: we need inspectable AI that explains its work, so that users can track and interact with results.
Marisa added that while AI may handle many administrative tasks, freeing humans from unwanted work, it still requires human discernment and reasoning. She suggested AI could boost data literacy by helping stakeholders collaborate better with data teams. When AI is used to explain data work, the process becomes more transparent, trustworthy, and understandable.
Julio pointed out that data literacy is multimodal, requiring stakeholders to understand the data, and data teams to educate collaborators on how to interpret visualizations. If data teams simply assume that charts are self-explanatory, they end up undervaluing data work. He sees an opportunity in building a richer dialogue around data work, emphasizing that data professionals must focus on the human elements of learning and exploration:
There is massive underinvestment and underappreciation in the role of advanced data visualization in making your data immediately more intelligible, interpretable, and subject to interrogation. There's a lot there, with respect to data literacy, that can be helped by contemporary tooling and some of the advances that are happening in AI.

Julio Avalos, Co-CEO, Observable
Changing stakeholder expectations
How have stakeholder expectations around speed to insight changed?
Julio highlighted that stakeholders view decision making as an ongoing process, requiring data to be immediately available rather than delivered weeks later as a finished artifact. But data teams and stakeholders remain siloed, which slows iteration.
Instead, stakeholders want data and proactive insights in the tools they are already using, such as Slack and Microsoft Teams, so they don’t have to adopt another tool to analyze data. The future of self-serve analytics will involve more collaboration, facilitated by better tools and integrations with the data team, or interactions with chat agents.
Where do stakeholders expect to encounter data?
Data visualizations are ubiquitous in BI tools, but the performance and expressiveness of data visualizations in business applications lags behind that of consumer-facing apps, Julio said. Data teams must now build visualizations across a wide range of surfaces, and beyond traditional dashboards. The dashboard-centric paradigm for data analysis and BI is fading. The goal shouldn’t be a particular presentation mode: it should be better human understanding and insights that enable confident decision making.
Marisa emphasized the importance of meeting people where they are, and noted that role hybridization is a major shift in the broader workplace:
More people and functions are having to look at and understand the data, have data literacy, and share that information with people across the entire company. That is going to be a big shift as roles become more hybridized. Not only do we have to get data to people — the right data, at the right time — but also more people are being exposed to it and expected to do something with it.

Marisa Morby, Director of Research
How have stakeholder expectations changed around the quality and performance of visualizations?
Advanced visualizations are key to unlocking data potential, Julio asserted, because they’re often the best way to find answers in large, complex data. But many incumbent BI tools fail to provide the interactive, expressive charts required to uncover hidden insights. Good visualizations help stakeholders engage with data, mitigating dashboard rot, which costs organizations millions and destroys confidence in data teams. Focusing on making data more intelligible and engaging through visualization can greatly improve corporate data culture.
Conclusion
During this fireside chat, Julio and Marisa discussed how data teams can navigate the changing BI and analytics landscape by moving away from dashboard-driven workflows and toward continuous, collaborative analysis that meets stakeholders where they are. AI is already reducing mundane tasks, and stakeholders are upping the ante with greater expectations around speed, accessibility, and visualization quality. Data teams can rise to the occasion by combining advanced data visualizations, collaborative analytics, and thoughtful AI integration.
For a deeper dive into our research on how data teams are adapting, read our blog post recapping our survey on the State of BI and Data Analytics.