Over the last several years, large language models (LLMs) have made tremendous progress in writing code, answering questions, and assisting with natural language interfaces. One of the most impressive shifts in the data world is their ability to generate SQL from plain-English prompts, often called text-to-SQL. This development raises the question: if anyone can ask a database a question in plain English and get back an answer in seconds, do we still need data analysts?
The short answer is yes. However, the role of data analysts is changing.
SQL sits at the intersection of business logic, data modeling, and decision-making. LLMs are making queries easier to write, and in doing so they're reshaping what analysts spend their time on. Are we moving from dashboard builders to prompt engineers?

1. The Traditional Role of a Data Analyst
The classic data analyst workflow starts with a business question. A business user wants to know: Why did sign-ups drop last month? Which campaign drove the most revenue? The analyst then:
- Uses the data model to locate the relevant datasets
- Cleans and joins the data
- Writes SQL queries to refine the question and move towards the answer
- Builds a visualization or dashboard
The bottlenecks are often not in the logic but in the grunt work — writing the same query structure repeatedly, translating vague stakeholder requests into precise logic, formatting reports for clarity.
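To make that grunt work concrete, the sketch below shows the kind of boilerplate query an analyst might write (and rewrite) for the "why did sign-ups drop last month?" question. The table and column names (users, created_at, signup_channel) and the Postgres-style syntax are illustrative, not a real schema.

```sql
-- Illustrative, Postgres-style query: monthly sign-up counts by channel.
-- Table and column names are hypothetical.
SELECT
    DATE_TRUNC('month', created_at) AS signup_month,
    signup_channel,
    COUNT(*) AS signups
FROM users
WHERE created_at >= CURRENT_DATE - INTERVAL '6 months'
GROUP BY 1, 2
ORDER BY 1, 2;
```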
Analysts also act as domain experts. They bridge the gap between raw data and organizational decision-making. Their role isn’t just to write queries — it’s to interpret data, contextualize it, and guide business strategy.
2. What LLMs Can Do Today
LLMs excel at translating natural language into structured SQL, especially when the schema is known. Tools like Galaxy let users ask a question that would normally require knowledge of the data model to answer, like:
“What was the average revenue by channel in Q2?”
and get back a working SQL query in seconds.
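The generated query might look something like the sketch below. The orders table, its columns, and the assumed calendar year are hypothetical; the actual output depends entirely on your schema.

```sql
-- Illustrative output for "What was the average revenue by channel in Q2?"
-- Assumes a hypothetical orders table and calendar-year quarters.
SELECT
    channel,
    AVG(revenue) AS avg_revenue
FROM orders
WHERE order_date >= DATE '2024-04-01'
  AND order_date <  DATE '2024-07-01'
GROUP BY channel
ORDER BY avg_revenue DESC;
```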
Beyond simple translation, LLMs also support context-aware code completion. In an interface like Galaxy, the model understands the query you're building and suggests filters, groupings, or JOINs based on context. This is especially useful for:
- Speeding up repetitive query writing
- Reducing SQL syntax errors
- Onboarding junior analysts or non-technical users by allowing them to “chat with their database”
AI-assisted SQL tools also reduce cognitive load, letting users focus more on insights and less on syntax. They offer autocomplete, query optimization (an awesome bonus of Galaxy), and plain-English explanations of existing queries.
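As a rough illustration of the kind of optimization an assistant can suggest, compare a filter that wraps a column in a function (which usually prevents index use) with an equivalent range predicate. The orders table here is hypothetical, and whether the rewrite helps depends on your engine and indexes.

```sql
-- Before: the function on order_date typically blocks index usage.
SELECT COUNT(*)
FROM orders
WHERE EXTRACT(YEAR FROM order_date) = 2024;

-- After: an equivalent range filter that most planners can satisfy with an index.
SELECT COUNT(*)
FROM orders
WHERE order_date >= DATE '2024-01-01'
  AND order_date <  DATE '2025-01-01';
```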
3. Where LLMs Fall Short
Not everything is perfect in the world of LLMs and SQL just yet! Despite their usefulness, LLMs still have a few important blind spots:
- Schema awareness: They often hallucinate table or column names unless provided with an explicit schema. This is why simple text-to-SQL tools without a database context tend to lack precision.
- Business context: They can’t distinguish between revenue and ARR (two similar metrics) unless explicitly told — context that analysts pick up from experience.
- Real-world KPIs: Metrics aren’t just math; they’re definitions. What counts as “active” or “churned” varies by company and product, and LLMs can’t infer those definitions on their own (see the sketch after this list).
- Temporal logic and edge cases: Many business questions involve complicated time windows, exceptions, or workaround logic that an LLM will typically miss.
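To see why these definitions matter, here is a sketch of one possible “active user” definition: a user with at least three qualifying events in the last 28 days. Every table, column, event name, and threshold below is hypothetical, and another company might define “active” entirely differently, which is exactly the context an LLM can’t guess.

```sql
-- Illustrative, company-specific "active user" definition:
-- at least 3 qualifying events in the last 28 days.
-- Table, column, and event names are hypothetical.
SELECT COUNT(*) AS active_users
FROM (
    SELECT user_id
    FROM events
    WHERE event_name IN ('query_run', 'dashboard_view')
      AND event_time >= CURRENT_DATE - INTERVAL '28 days'
    GROUP BY user_id
    HAVING COUNT(*) >= 3
) AS recently_active;
```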
Comparison Table: When LLMs Work — and When They Don’t

| Scenario | LLMs Work Well | LLMs Fall Short |
| --- | --- | --- |
| Basic SQL generation | Natural language → SQL queries for known schemas | Schema is unknown or has non-intuitive naming |
| Data exploration | Initial questions like “top users by spend” or “avg revenue by channel” | Deep dives that require iterative, hypothesis-driven analysis |
| Repetitive query tasks | Writing boilerplate queries (e.g., filtering by date, aggregations) | Queries that require creative joins or non-standard business logic |
| KPI tracking and metrics | Generating charts or summaries from structured definitions | Understanding nuanced, company-specific metric definitions (e.g., “active user”) |
| Assisting junior analysts | Teaching SQL patterns, autocompleting JOINs or filters | Teaching judgment, context, and domain-specific implications |
| Trend detection or anomaly spotting | Limited; lacks a historical baseline and domain intuition | Analysts can connect unexpected trends to business changes |
| Final reporting and decisions | Drafting initial narratives or summaries | Analysts are still needed for critical thinking, storytelling, and decision support |
4. The Human Advantage
Humans are more than SQL writers; they are storytellers and decision makers. They understand why a query needs to be written in the first place and the story they need to tell to support their findings.
For example, at Galaxy, we noticed a dip in usage for a certain product area. An LLM wouldn’t flag this without being told what “good usage” looks like. A human analyst, however, spots the pattern, investigates further, and discovers that a recent change to the onboarding and usage flows is driving the drop.
Analysts bring:
- Domain-specific intuition
- Institutional knowledge
- The ability to ask better questions, not just answer them
- Strategic thinking that extends beyond the data
They also build trust. A well-reasoned explanation from a human builds more confidence in decisions than a machine-generated report.
5. The Analyst + AI Workflow
The future isn’t analyst or AI — it’s analyst with AI.
In many cases, analysts can:
- Use LLMs to generate 80% of a query
- Review, refine, and add nuance in the last 20%
- Focus more on insight, less on syntax
Galaxy users already work this way: rapid prototyping with AI, followed by analyst validation. This creates a faster feedback loop from question to insight.
Prompt engineering becomes part of the workflow: tuning inputs for better output. Over time, the best analysts will build and refine prompt libraries, much as they save and reuse query snippets today.
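Here is a sketch of what that 80/20 split can look like in practice: the AI drafts a reasonable query, and the analyst layers in business rules it couldn’t know. The table names, the test-account filter, and the fiscal-quarter boundary below are all hypothetical.

```sql
-- AI-generated draft: syntactically fine, but missing business context.
SELECT channel, SUM(revenue) AS total_revenue
FROM orders
WHERE order_date >= DATE '2024-04-01'
  AND order_date <  DATE '2024-07-01'
GROUP BY channel;

-- Analyst refinement: exclude internal accounts and refunds,
-- and align the window to the company's (assumed) fiscal quarter.
SELECT o.channel, SUM(o.revenue) AS total_revenue
FROM orders AS o
JOIN accounts AS a ON a.account_id = o.account_id
WHERE o.order_date >= DATE '2024-05-01'
  AND o.order_date <  DATE '2024-08-01'
  AND a.is_internal = FALSE
  AND o.status <> 'refunded'
GROUP BY o.channel;
```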
LLMs also democratize access to data. Business teams can self-serve more effectively, allowing analysts to tackle deeper, higher-impact questions.
6. The Future of the Analyst Role
As LLMs perform more mechanical work, the analyst role shifts toward judgment and communication.
- Less querying, more interpreting: Analysts will spend more time explaining why something happened, what should happen next, and which queries to run next to draw further conclusions.
- Curators of data context: They’ll shape and maintain definitions, guardrails, and metric governance.
- Organizational translators: Bridging the gap between technical systems and business needs becomes more important than query optimization.
- Decision accelerators: Analysts will move from reactive to proactive, surfacing trends, risks, and opportunities early.
Companies should start to re-skill their teams:
- Invest in data literacy across departments
- Encourage prompt-based workflows
- Reframe the analyst role as strategic, not operational
- Support tools that facilitate collaboration between humans and LLMs
7. How Galaxy Is Revolutionizing AI‑Assisted SQL
While many tools use LLMs to assist with basic SQL generation, Galaxy takes it further by being context-aware. This means:
- Schema-Aware Querying: Galaxy understands your data model, including relationships and naming conventions.
- Intelligent Query Optimization: It can revise inefficient queries, suggest better filters, and structure joins properly.
- Chat with Your Data: Users can interact with their database conversationally — refining, explaining, or drilling down into results.
- Iterative Debugging and Learning: Galaxy doesn’t just spit out code. It learns from your corrections, making future interactions more accurate.
Unlike traditional AI assistants that act like autocomplete engines, Galaxy becomes part of the analytics loop: generating queries, validating results, and enabling real-time iteration with human oversight.
Galaxy empowers analysts to stay in flow, reduce cycle time from question to insight, and bridge the gap between non-technical users and the full power of their data.
Conclusion
LLMs are not replacing data analysts — they’re making them faster, more strategic, and more impactful. The core job remains: translating data into decisions. But now, with AI doing some of the heavy lifting, analysts can focus on the parts that matter most. Final thought: The best analysts won’t be replaced by AI. They’ll be the ones who know how to use it.