I was scrolling through my feed last week when something stopped me. It looked like an interview, someone in a professional setting, answering thoughtful questions about AI adoption. It had the shape of a think piece. It was a sales presentation. The product was expertise itself.
I have seen this advertisement before. Not this one specifically, but this shape. Every generation of technology produces it, usually within the first few years after the tools become accessible. Someone moves quickly, learns the mechanics, and positions themselves between the technology and the people who need it. The argument is always the same: this is complicated. You will need help. Here is who to call.
What I noticed, scrolling past, was not irritation. It was recognition.
I have two people close to me doing exactly this. I am not going to say what fields. Both are smart. Both have spent years learning something real. Both saw an opening and moved toward it, which is what capable people do. I am not going to challenge them in public, because they have a right to make a living from skills they have developed. They have a right to shoot their shot.
But I do not believe what they are selling. And I have been sitting with what that means.
Most organizations are running AI in some form now. Most of those initiatives are failing to meet their goals. The consultants point to this gap as their argument. Someone has to close it. Fine. But they are not the someone.
It is worth pausing on this. The tools at the center of this conversation are not even two years old. Nobody has a decade of experience with them. The person selling AI expertise has an 18-month head start on something you can learn by asking the tool itself. That is the gap. That is what is being sold back to you.
I have been building AI-assisted workflows in my office for the past two years. The tools I use are not specialized. They are available to anyone with a browser and an account. What makes the work land is not the model. It is the 25 years of context I carry into the conversation with it. I know which retention flags are real and which are artifacts of how the data is collected. I know that a student who has not registered yet in April is different from a student who has not registered yet in August, and the model does not know that unless I tell it. I know what to ask because I have spent years learning what questions matter.
An AI product built by a generalist, however well-intentioned, cannot carry that knowledge. It is built for no institution in particular. It will produce answers that look like answers. It will generate reports that look like reports. And the people receiving them will not always know the difference, because the reports will be formatted correctly and will use the right words and will not announce their own insufficiency.
This is what stays with me after I scroll past the advertisement. Not anger at the people selling. Something closer to worry about the people buying.
The same pattern ran through every general-purpose technology before this one. When spreadsheets arrived, people hired spreadsheet consultants. When data visualization tools followed, firms charged to build dashboards that institutional staff eventually learned to build themselves. The consultants were not fraudulent; some of them built genuinely useful things. But the premise that ordinary practitioners could not master a general-purpose tool turned out to be wrong, as it always does. The field caught up. The gap closed. The tools became ordinary.
What it cost to wait is harder to name. In higher education, where the people being served are often first-generation students and families making consequential decisions with incomplete information, the cost is not abstract. An enrollment model built by someone who does not understand the population will not fail dramatically. It will fail quietly, at the margin, in ways that are easy to attribute to other causes.
I think about my friends, and I do not feel superior to them. I think they believe they are doing something useful. I think they probably are, in certain moments, for certain clients. What I resist is what sits underneath the pitch: the idea that the person who has spent a career learning a domain needs to be brokered through an intermediary to access a tool she could, with some time and some trust, learn herself. That she is not the expert on her own problem. That someone else, who has learned the AI side of things, is.
The person who can do this work is already employed somewhere. Twenty years in. Knows which numbers lie and which ones tell the truth. Probably underwater with requests right now, with no time or permission to sit with the tool long enough to learn what it can do. That is the gap worth closing. Not the gap between the practitioner and the technology. The gap between the practitioner and the time to figure it out.