It’s getting harder to keep things current as a trainer and course builder. Not long ago, you could release a course and expect it to hold up for a while. Now you can publish something and see a major feature change the next week. I’m not kidding when I say I’ve had to go back and redo an entire book manuscript because a core workflow shifted underneath it.
That doesn’t make training less valuable. But it does change what training needs to focus on. The old model of following along with clicks and steps in a specific interface is becoming less reliable when the interfaces keep changing.
## What still holds up
What tends to hold up much better is a foundation in digital literacy, including understanding how data is structured, being comfortable working with different file types, and knowing how to move between tools rather than relying on any single interface.
It sounds simple, but it’s often missing. A lot of otherwise capable professionals have never had to think about this layer explicitly because the pointing and clicking masked the gap. Tools were stable enough that muscle memory did most of the work, and the underlying concepts stayed hidden. When I started out in analytics, you could get pretty far by memorizing the ribbon. That is no longer true, and it hasn’t been for a while.
## Why AI exposes the gap
AI assistants collapse the distance between what you want and what ends up in the spreadsheet. But the collapse only works if you can describe what you want in the first place, and then recognize whether what came back is actually correct.
A few things that come up constantly in finance workflows:
- Can you OCR text out of a PDF into a usable format?
- Can you convert a CSV to an Excel workbook and understand what changed in the process?
- Can you tell a delimited file from a fixed-width one?
- Do you recognize when numbers are stored as text, and do you know why that matters for a PivotTable?
- Can you move a table between Excel, Power Query, and a database without the data types silently breaking?
These are small, unglamorous skills. They are also the substrate on which any serious AI workflow has to sit. If a team struggles with whether to open a CSV in Excel or in Power Query, asking them to “adopt AI” is skipping a few steps. The AI does not rescue anyone from the underlying confusion. It just moves the confusion one step further down the pipeline, usually into a place where it is harder to notice.
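The numbers-stored-as-text problem above is a good illustration of how small these skills are and how quietly they bite. A minimal sketch, using a hypothetical CSV export (column names and values invented for illustration): every field a CSV reader hands back is a string, including the amounts a PivotTable or an AI assistant would need as numbers.

```python
import csv
import io

# Hypothetical export: every field in a parsed CSV arrives as a string,
# including the "amount" column that downstream tools need as numbers.
raw = io.StringIO("account,amount\nOps,1200\nSales,950\n")
rows = list(csv.DictReader(raw))

print(type(rows[0]["amount"]))  # still text, not a number

# Coerce explicitly, and flag anything that refuses to convert,
# instead of letting a tool silently treat "1200" as a label.
amounts = []
for row in rows:
    try:
        amounts.append(float(row["amount"]))
    except ValueError:
        print(f"stored as text, not a number: {row['amount']!r}")

total = sum(amounts)
print(total)
```

The point is not the five lines of Python; it is knowing that the question "is this column actually numeric?" needs to be asked at all, whether the answer comes from code, Power Query's type column, or a green triangle in an Excel cell.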
There used to be some slack in this. A person could avoid the fundamentals as long as they stayed inside one tool and followed a well-worn path. That slack is gone. The moment you start asking an assistant to pull from a PDF, reshape a CSV, push results into a workbook, and write them back out somewhere else, the fundamentals stop being optional.
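The delimited-versus-fixed-width question from the list above is a concrete instance of "thinking about a file before opening it." A sketch of the distinction, using two invented exports of the same tiny table: Python's standard `csv.Sniffer` can recover a delimiter from a delimited sample, but a fixed-width file has no delimiter to find and has to be sliced by column position instead.

```python
import csv

# Two hypothetical exports of the same table:
# one semicolon-delimited, one fixed-width (space-padded, no delimiter).
delimited = "account;amount\nOps;1200\nSales;950\n"
fixed_width = "account   amount\nOps       1200\nSales     950\n"

# A delimited file has a separator the Sniffer can recover...
dialect = csv.Sniffer().sniff(delimited)
print(repr(dialect.delimiter))

# ...a fixed-width file has to be sliced at known column positions.
parsed = []
for line in fixed_width.splitlines()[1:]:  # skip the header row
    account, amount = line[:10].strip(), line[10:].strip()
    parsed.append((account, float(amount)))
print(parsed)
```

Someone who can make this distinction can also tell an AI assistant, precisely, what kind of file it is being handed, which is most of the battle.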
## Why this is a tough sell
I also understand why this is a hard conversation to have with leadership. It can feel like going backwards. The ROI is less obvious than investing in a new tool with a vendor demo and a case study.
Perhaps worse, from a manager’s perspective, it pattern-matches to remedial training: low ROI, with a side of bruised ego. You are telling a seasoned professional that they need training on file formats. That is a pitch most managers will instinctively avoid, and one most analysts will quietly resist. Nobody wants to tell a director that the real gap is not a missing license for the latest copilot, but a missing understanding of how data actually moves between systems.
The table below is the awkward version of this conversation:
| What budgets naturally fund | What actually drives AI adoption |
|---|---|
| New tools and licenses | Fluency with the data underneath |
| Feature-specific training | Skills that port across tools |
| Pilots and proofs of concept | Day-to-day working habits |
The right-hand column does not photograph well. But the teams that are quietly getting the most out of AI in Excel right now are the ones who already had the fundamentals, and the teams that keep stalling on rollouts tend to be the ones that did not. Managers who want the top row to produce outcomes without the bottom row are going to be disappointed, and most of them already are.
## What this means for training
There is a reality on the training side too. If the expectation is constantly up-to-date content keyed to whatever version of the tool is live this month, that is possible, but it takes a lot more ongoing effort than it used to. A course built around a specific feature path can be outdated before the season is over.
The alternative is to anchor training on the parts that do not change as quickly. This includes understanding how data is shaped, how tools connect, how to read an error message without panicking, and how to think about a file before opening it. These skills transfer across releases, across tools, and across the AI assistants that will continue to arrive for the foreseeable future. While specific screenshots may become outdated, the underlying capability remains relevant.
That is also a more honest conversation to have with a client. A workshop that promises permanent mastery of a specific copilot UI is setting up both sides for disappointment. A workshop that builds fluency with files, formats, and flow of data is betting on something that will still be true next year.
## Conclusion
If you take nothing else away from this post:
- Point-and-click training is aging out faster than it used to.
- Digital fundamentals are aging in, because they are the layer AI actually sits on top of.
- This will feel like an unglamorous investment. It is also the one that compounds.
If you are running a team and trying to decide between sponsoring another AI pilot or quietly sending a few people back to the basics, the basics are almost always the better bet right now.
If this matches how you are thinking about your team, you can get a sense of how I approach this kind of training on my How I Work page.
