
Technology often begins with a promise of liberation. It tells us that things will be easier, faster, more accessible. The early days of the personal computer were filled with that spirit: empowering individuals to write, create, and calculate without relying on large institutions. The internet was meant to bring the world’s knowledge to our fingertips. AI, in its early forms, carried the same energy: a tool that would dissolve friction, flatten hierarchies, and remove the need for technical fluency.
And yet, in many areas, we find ourselves in the opposite situation. The more advanced the software becomes, the harder it is to use without guidance. Access is still available, but mastery is hidden behind a curtain. Interfaces are layered. Functions are buried. Features are sold in pieces. Even asking a simple question requires knowing the correct pathway, vocabulary, or plugin.
This is not just the result of technological complexity. It is, increasingly, a business strategy. The software is not made difficult by accident. It is made difficult by design. Complexity is power, and power, once acquired, is rarely given up freely. As we will see, the software industry has quietly developed its own priesthoods: groups of certified, branded, and authorized mediators who stand between the user and the machine. And now, even the world of AI may be following that same path.
The Business of Obscurity
There was a time when the most important thing about software was what it could do. Today, for many enterprise products, the more relevant question is who understands how to use it. Tools like Adobe’s Creative Cloud suite are famous for their capabilities, but also for their opacity. Photoshop, Illustrator, After Effects, Premiere Pro, and Lightroom each offer thousands of features, but unlocking their potential often takes weeks or months of formal training. Entire industries have emerged around teaching, supporting, and certifying users of these tools.
This is not an unfortunate side effect. It is the foundation of a deliberate strategy. Complexity justifies cost. It creates dependence. It protects pricing models by making it difficult for users to migrate to alternatives. It keeps trainers, support staff, and consultants employed. When the software is hard to learn, the product is not just the tool; it is the whole ecosystem of services around it. Knowledge becomes a subscription.
Large-scale enterprise systems operate the same way. SAP, Salesforce, and Oracle all require extensive onboarding, documentation, and customization. No one uses them intuitively. Even skilled professionals often need help desks and consultants. The same is true of many government portals, HR systems, and supply chain platforms. These tools are not user-first. They are vendor-first. The interface is not designed to welcome, but to intimidate.
This approach creates power structures. Some users become priests, others remain supplicants. And while it may seem natural in the context of massive business tools, it quietly affects our broader expectations of technology itself.
The Promise AI Once Carried
When large language models first entered public awareness, there was a feeling of freshness. Instead of dropdowns and dashboards, we were speaking to the machine directly. Natural language became the interface. People asked questions. They received answers. They made requests. They saw results. It felt like a return to simplicity.
In those early moments, AI offered a glimpse of something different from the traditional software world. It wasn’t just that it could write or summarize. It was the way it bypassed layers of instruction. A student could ask for help without learning Excel. A designer could brainstorm without knowing Adobe’s interface. A manager could draft a presentation without navigating PowerPoint’s menus. It was direct, forgiving, and sometimes even conversational.
The dream was not just convenience. It was liberation from the priesthood. No certification was required. No training session. No tiered access. Anyone with a question could speak. Anyone with a need could act. For a moment, it felt like AI might reverse the trend that had defined enterprise software for decades. It could give power back to the user.
But as more companies joined the race, and more features were added, that initial clarity began to blur.
The New Priesthood
It did not take long for AI to start mimicking the software world it once challenged. OpenAI’s product landscape has grown into a confusing array of overlapping models: GPT‑3.5, GPT‑4, GPT‑4‑turbo, GPT‑4o, as well as o-series reasoning models like o1, o3, and o4-mini that sit behind different parts of ChatGPT. Each variant has different capabilities, speeds, costs, and levels of compatibility with features like file uploads, memory, code execution, or web browsing. As a result, users are often unsure which model they are interacting with, what limitations it carries, or how to get the best performance. What began as a promise of simplicity has gradually returned to the very maze it once aimed to replace.
Other companies quickly followed the same pattern. Anthropic’s Claude, Google’s Gemini, Perplexity, Mistral, and Cohere have all introduced their own naming schemes, access tiers, and selectively gated features. Google’s Gemini lineup alone has gone from 1.0 Ultra, Pro, and Nano, to 1.5 Pro and 1.5 Flash, then 2.0 Flash and Flash-Lite, with many of these versions already deprecated or quietly phased out as the 2.5 family takes center stage. Users are left to guess: Should they use Claude Opus or Sonnet? Gemini Pro or Ultra? Is this version agentic, multimodal, code-capable, or optimized for long context? The choices multiply, but clear guidance remains scarce. What started as a push for universal usability now resembles a tangle of branded silos, more shaped by market segmentation than by the needs of ordinary users.
The result is confusion. And where there is confusion, there is opportunity for intermediaries. Just as Adobe created trainers and certifiers, AI has begun producing its own support class. Prompt engineers, AI coaches, prompt libraries, and influencer-led workshops now surround the field. Many of these resources are well-meaning and helpful. But they also reflect a broader trend: AI has stopped being intuitive. It now requires translation.
This is how a new priesthood forms. Not through malice, but through complexity. And in this case, complexity is both technical and symbolic. The very terms in use (agents, inference, tokens, temperature, context length) begin to form a kind of sacred language. It is not spoken by all. But those who master it can charge for access.
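To make the point concrete, consider what a routine request looks like once the sacred language is spelled out. The sketch below is hypothetical Python: the model name and field names are loosely modeled on common chat-completion APIs, not any specific vendor’s contract.

    # What "just asking a question" involves once the jargon is explicit.
    # Every name here is hypothetical, modeled on common LLM chat APIs.
    request = {
        "model": "some-model-4-turbo",  # which variant? the user must already know
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize this report for me."},
        ],
        "temperature": 0.7,   # the randomness knob, meaningful only to insiders
        "max_tokens": 1024,   # the output budget, priced and measured in tokens
        # context length, tool access, and memory are configured elsewhere,
        # each with its own vocabulary and often its own pricing tier
    }
    print(request)

None of these knobs is difficult in isolation. The barrier is that each one is a word the user must learn before the tool will serve them well.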
The Spectacle of Change
There is another layer to this problem. It is not just complexity that creates distance between people and tools. It is also the way new technology is marketed. Each new release is presented not as a steady improvement, but as a revolution. GPT-4o is not just faster; it is “human-like.” Claude Opus is not just smarter; it is “creative and kind.” Google’s Gemini is not just an update; it is “a new era of AI.”
This language is familiar. It resembles the language of spiritual movements. We are told that a great change is upon us. We are urged to believe, to subscribe, to upgrade. But when the user actually tries the tool, the result is often mundane. The improvement is real, but modest. The revolution is postponed.
This gap between language and experience creates a kind of fatigue. People begin to tune out. They lose the ability to tell what is genuinely new and what is simply repackaged. The hype that was once exciting becomes noise. And this, too, benefits the priesthood. When ordinary users cannot tell what matters, they look for someone to guide them.
So the industry continues to produce not just tools, but rituals. Keynotes, teaser videos, roadmaps, influencer breakdowns. Each model release is celebrated like a prophecy fulfilled. But very few stop to ask: what is actually changing for the user?
What AI Could Still Do
Despite all this, the potential of AI remains real. If guided carefully, it can still serve the purpose it once hinted at: reducing the need for complexity. It can become the interpreter between humans and difficult systems. It can give voice to people who do not speak the language of menus and modules.
Imagine being able to say, “Create a video with a cinematic filter using the footage I took last weekend,” and the system handles both Premiere and After Effects. Or: “Generate a report on regional sales and convert it into a client-ready PowerPoint deck.” No need to open tabs, click through menus, or memorize workflows.
This is not about magic. It is about interface design. If AI truly succeeds, we will not need to learn more terms. We will need fewer. We will not need to study new dashboards. We will simply say what we need.
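In practice, that interface would be an orchestration layer rather than another application. Here is a minimal sketch of the idea, with a stub keyword matcher standing in where a real system would ask a language model for the plan; every tool name below is a hypothetical stand-in, not a real API.

    # A minimal sketch of AI as interpreter: one plain-language request
    # is mapped to tool invocations the user never has to learn.
    # plan() is a stub standing in for a real language model, and the
    # tools below are hypothetical stand-ins, not real application APIs.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Step:
        tool: str
        args: dict

    def plan(request: str) -> list[Step]:
        """Stand-in planner: a real system would ask an LLM for these steps."""
        steps = []
        if "video" in request.lower():
            steps.append(Step("video_editor", {"preset": "cinematic"}))
        if "report" in request.lower() or "deck" in request.lower():
            steps.append(Step("slides", {"template": "client-ready"}))
        return steps

    # Registry of wrapped tools; each entry hides an entire application's menus.
    TOOLS: dict[str, Callable[..., None]] = {
        "video_editor": lambda **kw: print(f"editing footage with {kw}"),
        "slides": lambda **kw: print(f"building deck with {kw}"),
    }

    def handle(request: str) -> None:
        for step in plan(request):
            TOOLS[step.tool](**step.args)  # the user only spoke a sentence

    handle("Create a video with a cinematic filter from last weekend's footage")

The point of the sketch is the shape, not the details: the complexity does not vanish, it moves behind an interface that speaks the user’s language instead of the vendor’s.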
But for that to happen, companies must resist the temptation to mystify. They must reject the comfort of opacity. They must stop using complexity as a product and start treating usability as a responsibility.
This will not be easy. It means giving up some forms of profit. It means letting go of the ecosystems that have long made software valuable not because it works, but because only a few know how to make it work.
The Choice Ahead
Every priesthood begins with the idea that certain knowledge is sacred. In the world of software, that sacredness has often been artificial. It was not that the knowledge was rare. It was that the systems were designed to make it feel rare. That feeling sustained business models. It created dependency. It ensured that users would remain followers, not authors.
But now we have a chance to change that. AI, when designed with care, can help dissolve the barriers that older systems reinforced. It can make power feel like something shared, not leased. It can return technology to its original role: not a ladder of access, but a tool for human clarity.
We are at a moment of decision. Either AI becomes another priesthood, full of impressive language, expensive trainers, and gated experiences, or it becomes something quieter, more useful, more human. It will not be decided by breakthroughs alone, but by choices in design, business, and trust.
The most powerful technology is the one that disappears into everyday life. It does not need to dazzle. It only needs to help.