One thing Excel AI assistants like Copilot and Claude really need is a way to load system-level preferences and modeling instructions.
Right now, when you ask these tools to build a workbook, they often fall back on very generic Excel patterns. That usually means things like hardcoded numbers in formulas, calculations scattered across sheets, inconsistent formatting, or outputs that break the moment the data grows.
My guess is that this happens because there’s simply more training data reflecting older Excel habits. Modern practices like structured tables, dynamic arrays, and consistent modeling standards have been adopted much more slowly across organizations. As a result, unless you guide the model, it often defaults to those older patterns.
The good news is that the quality of AI-generated workbooks improves dramatically once you start giving the assistant a few guardrails. A helpful way to think about it is this: treat the AI the same way you would treat a new analyst joining your team. If you want consistent models, you need to explain the standards you expect.

Start with how you want models structured
Most analysts follow certain structural habits when building workbooks, even if those habits are rarely written down anywhere. Over time, you develop a mental model for how a workbook should be organized so that it stays understandable and maintainable.
Typically, inputs live in one place where they can be easily edited. Calculations happen somewhere else, where formulas can operate on those inputs without clutter. Outputs are separated again so the final results can be presented clearly to whoever is consuming the analysis.
When you make those expectations explicit in your instructions to an AI assistant, the quality of the workbook it generates improves dramatically. Instead of scattering formulas and values throughout the file, the assistant has a clear blueprint to follow.
A few simple structural instructions can go a long way. For example, you might include guidance like:
- “Organize the workbook into Inputs, Calculations, and Outputs sections or sheets.”
- “Store input datasets in Excel tables, not loose cell ranges.”
- “Name tables using a tbl_ prefix (for example tbl_sales or tbl_expenses).”
- “Avoid hardcoding numbers inside formulas. Reference input cells or parameters instead.”
- “Use structured references to table columns rather than fixed ranges like A2:A100.”
Using Excel tables is especially helpful here. Tables allow formulas to reference columns by name instead of pointing to specific cell ranges, which makes the logic much easier to understand. They also expand automatically as new rows are added, so formulas and analyses don’t silently break when the dataset grows.
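As a concrete illustration, assuming a hypothetical input table named tbl_sales with an amount column, the structured version of a simple total is:

```excel
=SUM(tbl_sales[amount])
```

Compare that with the fixed-range equivalent, =SUM(B2:B100), which silently stops counting once the data grows past row 100. The structured reference follows the table automatically as rows are added.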
These kinds of rules help the assistant build something closer to what an experienced analyst would produce. Instead of a spreadsheet that mixes inputs, formulas, and results in unpredictable ways, you get a model with a clear structure that someone else can actually understand.
In practice, that small structural decision can prevent many of the subtle errors that creep into poorly organized spreadsheets.
Encourage modern Excel features
Another simple instruction that can make a big difference is telling the assistant which generation of Excel functions you want it to favor.
Left on its own, an AI model will often default to older patterns like VLOOKUP, nested IF statements, or fixed ranges such as A2:A100. That’s largely because there is far more training data reflecting those older habits. Newer capabilities, especially dynamic array functions and tools like LET() and LAMBDA(), have spread more slowly across organizations, so the model sees fewer examples of them.
Modern Excel tools are usually much better suited for building resilient models. Functions like XLOOKUP(), FILTER(), UNIQUE(), SORT(), and XMATCH() work naturally with dynamic datasets, while dynamic arrays allow a single formula to spill results automatically as the data grows. LET() can also make complex formulas easier to read by naming intermediate calculations.
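To make the contrast concrete, here are a few modern-style formulas of the kind these instructions encourage (the table, column, and parameter names are illustrative, not from any specific workbook):

```excel
=XLOOKUP([@product_id], tbl_products[product_id], tbl_products[price], "Not found")

=FILTER(tbl_sales, tbl_sales[region] = p_region, "No matches")

=LET(net, [@revenue] - [@cost],
     margin, net / [@revenue],
     margin)
```

Each of these spills or recalculates naturally as the underlying tables change, and the LET() version makes the intermediate steps readable instead of burying them in nested parentheses.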
Because of this, it often helps to include explicit preferences such as:
- “Prefer XLOOKUP() instead of VLOOKUP() or HLOOKUP() when performing lookup operations.”
- “Use modern dynamic array functions such as FILTER(), UNIQUE(), and SORT() when generating lists or subsets of data.”
- “Write formulas that spill automatically rather than copying formulas down rows or columns.”
- “Use LET() to define intermediate variables and simplify complex formulas.”
- “Reference Excel table columns using structured references instead of fixed ranges such as A2:A100.”
These small instructions help ensure the model uses modern Excel patterns, which tend to produce workbooks that adapt much more gracefully as the data changes.
Set some naming conventions
Another helpful preference to define is how things should be named.
Consistent naming makes a workbook much easier to understand once formulas start referencing multiple tables, parameters, and calculations. Without it, even a well-built model can become hard to follow.
You can guide the assistant with simple conventions such as:
- “Use snake_case for all named ranges and variables.”
- “Prefix Excel tables with tbl_ (for example tbl_sales, tbl_expenses).”
- “Prefix input parameters with p_.”
- “Prefix calculated metrics or measures with m_.”
These may seem like small details, but they make formulas far easier to read and navigate. The same principle applies in programming and analytics: clear, consistent names reduce confusion and make systems much easier to maintain and extend.
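With conventions like these in place, a formula becomes largely self-describing. For instance (the names here are illustrative):

```excel
=SUM(tbl_sales[amount]) * (1 + p_growth_rate)
```

Even without opening the Inputs sheet, a reader can tell that tbl_sales is an input table and p_growth_rate is an editable parameter rather than a magic number.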
Encourage documentation inside the workbook
Another useful preference is asking the assistant to include basic documentation inside the workbook.
A good analyst rarely hands over a model without explaining how it works, and the same expectation can improve AI-generated files. You might ask the assistant to add brief comments explaining complex formulas, include short descriptions at the top of worksheets, or create a simple Assumptions sheet.
An assumptions sheet is especially helpful because it gives readers a clear place to see the key inputs driving the model: things like growth rates, scenario parameters, or cost estimates.
You can also include instructions like:
- “Add comments to explain any complex or non-obvious formulas.”
- “Include a short description at the top of each worksheet explaining its purpose.”
- “Create an Assumptions sheet listing key inputs, parameters, and model drivers.”
This begins to overlap with the idea of test-driven instructions, where models include validation checks and error flags. That’s a deeper topic for another post. For now, the goal is simply encouraging the assistant to make its logic visible.
A model that documents itself is much easier for someone else to understand and trust.
Set visual expectations
Excel workbooks are not just computational models. They’re also documents people need to read and interpret, which means visual consistency matters.
You can help the assistant by specifying simple formatting preferences, such as using one color for inputs, another for outputs, applying consistent number formats, or building charts with a specific color palette.
For example, you might include instructions like:
- “Use a consistent color to highlight editable input cells.”
- “Apply clear and appropriate number formats to outputs (such as currency, percentages, or thousands separators).”
- “Use consistent table styles and chart formatting throughout the workbook.”
- “Apply the company color palette when creating charts or dashboards.”
Some teams even upload their brand colors or example dashboards so new workbooks align with existing reporting standards.
These small details may seem cosmetic, but they go a long way toward making AI-generated workbooks feel less like rough drafts and more like finished deliverables.
Don’t forget the “last mile” work
Another category of instructions analysts often forget to include is the last mile formatting work.
This is the cleanup that happens right before a file gets sent to a manager or included in a presentation: freezing panes so headers stay visible, autofitting column widths, setting print areas, aligning charts, and making sure number formats are consistent.
None of this work is particularly difficult, but it can quietly consume a surprising amount of time.
The good news is that AI assistants can usually handle these tasks just fine, as long as you tell them to. You might include instructions like:
- “Freeze panes so table headers remain visible when scrolling.”
- “Autofit column widths to improve readability.”
- “Apply consistent number formats across output tables and reports.”
- “Set appropriate print areas and page layout for printable sheets.”
- “Align charts and tables neatly on output or dashboard sheets.”
If there are small formatting tweaks your boss regularly asks you to make before sharing a workbook, those are exactly the kinds of expectations worth building into your instructions.
Give the assistant examples of good work
Another powerful way to improve results is simply showing the assistant examples of good workbooks.
A well-structured template, a modeling standards document, or a sample dashboard can give the AI a clear signal about what “good Excel” looks like in your environment. Instead of generating models completely from scratch each time, the assistant can start to mirror the layouts, conventions, and patterns used in those examples.
You can reinforce this by including instructions like:
- “Follow the structure and formatting used in the provided workbook template.”
- “Use the uploaded dashboard as a reference for layout and chart styling.”
- “Follow the conventions outlined in the modeling standards document.”
- “Match the color palette and formatting used in the example reports.”
Even a single well-built workbook can act as a powerful reference point. When the assistant has a concrete example to follow, it becomes much easier for it to produce models that align with the way your team already works.
Conclusion
Excel AI assistants are already surprisingly capable. They can generate formulas, structure workbooks, and even assemble fairly sophisticated models with very little prompting.
What they still lack, however, is something most experienced analysts rely on every day: a modeling standard. In most organizations, analysts develop informal conventions for how workbooks should be structured, how formulas should be written, and how results should be presented. Those habits make models easier to maintain, review, and extend. But AI assistants don’t automatically know those expectations unless we tell them.
As these tools evolve, it’s likely we’ll see better ways to load persistent style guides, templates, and organizational preferences directly into the assistant. When that happens, the quality and consistency of AI-generated Excel models will improve dramatically.
Until then, the best approach is fairly simple. Don’t just tell the AI what model you want to build; tell it how you want the workbook built. A short set of preferences around structure, formulas, naming conventions, and formatting can go a long way toward producing models that behave more like the work of an experienced analyst.
If you’re experimenting with these tools, try writing a small Excel style guide for your prompts and see how much the results improve.
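As a starting point, a minimal style guide might look something like this (a sketch that collects the preferences above; adapt the specifics to your own team’s conventions):

```text
Excel modeling preferences:
- Organize workbooks into Inputs, Calculations, and Outputs sheets.
- Store input data in Excel tables prefixed with tbl_; prefix parameters with p_.
- Prefer XLOOKUP(), FILTER(), UNIQUE(), SORT(), and LET() over legacy patterns.
- Use structured references; never hardcode numbers inside formulas.
- Highlight editable inputs in a consistent color; format outputs appropriately.
- Add an Assumptions sheet and brief notes explaining non-obvious logic.
- Freeze header rows, autofit columns, and set print areas before finishing.
```

Pasting a block like this at the start of a session, or saving it as a persistent instruction where the tool supports that, is usually enough to shift the assistant away from its generic defaults.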
And if you’re interested in more practical examples of modern Excel workflows, automation techniques, and AI-assisted analytics, I share a growing collection of guides, templates, and resources inside my Modern Excel + AI membership:
That’s where I’m collecting many of the patterns, prompts, and tools I’m experimenting with as Excel and AI continue to evolve.
