The data analysis and BI industry is no stranger to change, but the speed and reach of advances in AI have caused major shifts in analytical workflows. AI is rapidly transitioning from an experimental tool to operationalized technology that is core to data teams' daily work. As AI automates tasks throughout the data analysis lifecycle, analysts can refocus their effort on the work that requires human ingenuity and expertise: asking better questions, interpreting data, and applying their judgment at scale.

While AI is meaningfully improving data workflows in some areas, in others AI tools still miss the mark.

Below, we walk through how AI is contributing to data team workflows, and where its promised capabilities don't always align with today's reality. We also share what we see as the biggest untapped potential of AI tools, and what data teams can do to make the most of this new technology.

Where AI is helping analysts today

Automating the “least human” parts of data work

When we spoke to data analysts and BI professionals about the role of AI in their day-to-day work, many pointed to using AI for mundane tasks. Steps like initial data profiling, summarization, and cleaning are critical in the data analysis process, but typically aren't how the average analyst wants to spend the majority of their time. So it should come as no surprise that data teams reach for AI tools to automate these repetitive tasks, which are invaluable for avoiding thrash and common mistakes but don't always require human creativity or judgment.

AI can be a powerful tool for exploratory data analysis and wrangling, enabling analysts to quickly inspect datasets and identify issues to avoid costly errors down the line. As data teams adopt AI tools to automate mundane tasks, analysts can shift their resources and energy to more interesting aspects of data analysis like model evaluation and interpretation, or fine-tuning data visualizations.

Here are example AI prompts that can help with your next data exploration (a code sketch of what the first two might produce follows the list):

  • Are there any null or missing values in this dataset?

  • Do missing values appear to follow a pattern?

  • Summarize the schema of this dataset in plain English. What does each table represent?
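
To make the first two prompts concrete, here's the kind of profiling code an AI assistant might hand back. It's a minimal sketch, assuming a pandas DataFrame loaded from a hypothetical orders.csv with illustrative column names:

```python
# Minimal sketch: profile missing values and check whether they follow a pattern.
# "orders.csv", "discount_code", and "channel" are illustrative placeholders.
import pandas as pd

df = pd.read_csv("orders.csv")

# 1. Count and rank null/missing values per column.
null_counts = df.isna().sum().sort_values(ascending=False)
print(null_counts[null_counts > 0])

# 2. Check for a simple pattern: does missingness in one field cluster by category?
if {"discount_code", "channel"}.issubset(df.columns):
    missing_rate_by_channel = (
        df["discount_code"].isna().groupby(df["channel"]).mean().sort_values(ascending=False)
    )
    print(missing_rate_by_channel)  # a heavily skewed rate suggests the gaps aren't random
```

Even a quick pass like this can surface issues before they become costly errors downstream.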

Simplifying complexity and growing data skills

Another area where AI shines is breaking down complex information into its component parts, which encourages learning and skill-building. This is particularly helpful with technical tasks like understanding and debugging code, especially code written by other developers or data scientists.

Analysts’s resources are often stretched thin, with only the time and bandwidth to ship a final artifact. In the rush to move on to the next request, however, helpful documentation and explanatory materials are often left by the wayside. But expecting the insights to be self-evident to stakeholders can in fact devalue the analyst’s choices and work. AI tools can fill the gap by summarizing key insights and creating documentation to guide stakeholders as they explore dashboards and other deliverables.

Finally, data is everywhere, and stakeholders need a baseline level of data literacy to keep up. AI can help here as well! These tools are upleveling data skills and knowledge organization-wide, giving cross-functional teams access to a personalized coach that adapts to their level of expertise and supports their data skill development.

Where AI falls short in data work

Reproducibility and trustworthiness are still major challenges

When AI tools produce insights or analyses, it’s critical that the analyst can reproduce the output. We think of the challenge as akin to a reporter looking for confirmation on a story from a second source — if the AI uses the same data and prompt, but comes back with a different response, that’s an intolerable situation for organizations trying to make sense of their data. No analyst is going to want to put that report in front of stakeholders or clients and no stakeholder will feel confident acting upon it.
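
To make that failure mode concrete, here's a minimal sketch of the check an analyst might run before trusting a response: send the identical prompt twice and compare the answers. It assumes an OpenAI-style Python client, and the model name and prompt are illustrative; even with temperature set to 0, general-purpose LLMs don't guarantee identical free-form answers.

```python
# Minimal sketch: ask the same question twice and compare the responses.
# The model name and prompt are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = "Given the attached monthly revenue summary, what was Q3 revenue growth?"

def ask(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        temperature=0,         # lowers, but does not eliminate, variation
    )
    return response.choices[0].message.content

first, second = ask(PROMPT), ask(PROMPT)
print("Responses match:", first == second)  # may be False even with identical inputs
```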

By now, most people understand that AI tools can misinterpret requests, or lack context while providing a technically correct answer. Worse, major AI tools continue to hallucinate, generating synthetic datasets and erroneous responses when they can't find the appropriate tables in a database. The problem is that bad AI "doesn't smell": its outputs may look legitimate while being entirely fictitious or flat-out wrong.

So, skepticism around using AI tools for mission-critical analytical work is entirely valid. Analysts must be able to trust the results of any tool they use.

General-purpose LLMs weren't designed for the real-world challenges of data analysis

In addition, general-purpose LLMs were not built specifically for data analysis workflows. They don't automatically understand the nuances of a particular schema, the reasons behind certain business operations, or the edge cases presented by real-world data. They also don't retain the rationale behind past business decisions, which leaves a major gap: why a metric was defined a certain way, why a filter was applied, what trade-offs were made. And when prompted to generate visualizations, these models typically produce only basic, non-interactive charts, potentially leaving insights on the table.

AI’s untapped potential

At the same time, these challenges present an opportunity for data teams to use AI to create more efficient processes. For example, AI tools could help data teams avoid losing institutional memory and recreating past work by enshrining business decisions, the reasoning behind them, and corporate values directly into their analytical processes. Specialized tools, designed for the rigors of modern data analysis, can augment analysts' work through transparent, interpretable interfaces and encourage deeper exploration. AI tools can also deliver a more collaborative analytics process and bridge the data literacy gap between analysts and stakeholders, helping members of cross-functional teams become active participants in insight discovery.
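
As a sketch of what enshrining those decisions could look like in practice, here's a hypothetical structure (the class, field names, and example entry are all illustrative, not a specific product feature) that records metric definitions alongside their rationale so they can be handed to an AI tool as explicit context:

```python
# Minimal sketch: capture metric definitions plus the reasoning behind them,
# then render that institutional memory into a prompt preamble for an AI tool.
from dataclasses import dataclass

@dataclass
class MetricDefinition:
    name: str
    formula: str
    rationale: str    # why the metric is defined this way
    decided_in: str   # where/when the decision was made

METRICS = [
    MetricDefinition(
        name="active_customer",
        formula="customer with at least one paid order in the trailing 90 days",
        rationale="90-day window matches the typical renewal cycle",  # illustrative
        decided_in="annual planning review",                          # illustrative
    ),
]

def build_context(metrics: list[MetricDefinition]) -> str:
    """Render metric definitions into text that can prefix an analysis prompt."""
    lines = [
        f"- {m.name}: {m.formula} (rationale: {m.rationale}; decided in: {m.decided_in})"
        for m in metrics
    ]
    return "Company metric definitions:\n" + "\n".join(lines)

print(build_context(METRICS))
```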

There's also a big, often-overlooked gap here around data visualization. If a chart requires deep knowledge of the underlying analysis to be useful, that can be a sign the visualization (or the way it's being shared) is missing the mark. Most organizations massively underinvest in expressive, clear data visualizations, even though better visuals can make data instantly more intelligible and interpretable. AI can step in here to help analysts deliver advanced, expressive visualizations that communicate insights clearly.
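
As one hedged example of what "more expressive" can mean, here's a small sketch using Plotly Express (the dataset and column names are invented for illustration) that produces an interactive, faceted line chart with tooltips rather than a single static chart:

```python
# Minimal sketch: an interactive, faceted chart instead of a basic static one.
# The sample data below is invented purely for illustration.
import pandas as pd
import plotly.express as px

sales = pd.DataFrame({
    "month":   ["2024-01", "2024-02", "2024-03"] * 2,
    "region":  ["East"] * 3 + ["West"] * 3,
    "revenue": [120, 135, 150, 90, 110, 95],
})

fig = px.line(
    sales,
    x="month",
    y="revenue",
    color="region",
    facet_col="region",   # small multiples, one panel per region
    markers=True,
    title="Monthly revenue by region",
)
fig.show()  # opens an interactive chart with hover tooltips and zoom
```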

Conclusion

While AI excels at automating repetitive tasks and breaking down complexity, the human elements of data work — curiosity, judgment, contextual understanding, and the ability to ask the right questions — remain irreplaceable. Perhaps paradoxically, that makes human input and expertise increasingly critical as AI handles more of the mechanical work.

For data teams navigating this transition, automating repetitive tasks such as data profiling, basic exploration, documentation, and debugging is a proven use case for AI tools, as is improving collaboration and data literacy organization-wide.

At Observable, we see AI as an accelerator for human capability and ingenuity. The data teams that thrive in this new era will be those that treat AI as a powerful tool to augment their work, not a replacement, freeing analysts to focus on what they do best: turning data into insights.