If It Doesn’t Feel Easy, People Won’t Use It
One observation I keep coming back to in conversations with colleagues is this: it doesn’t really matter how advanced or intelligent a system is if people don’t want to use it.
That may sound obvious, but it’s surprisingly easy to overlook, especially in conversations about artificial intelligence. There is often a strong focus on what a system can do—how sophisticated the model is, how accurate the outputs are, how many edge cases it can handle. These are important considerations, but they tend to overshadow something more immediate and more human: the initial experience of interacting with the system.
When someone encounters a tool for the first time, they are not evaluating its underlying capabilities. They are making a much faster, more intuitive judgment. Does this look simple? Does it feel approachable? Can I figure this out without too much effort?
If the answer to those questions is unclear, most people will not proceed. They will hesitate, delay, or avoid the system altogether, even if it is objectively powerful and well-designed beneath the surface.
This is not a new dynamic. It is a reflection of basic human behavior. People are drawn to experiences that feel easy and, in some small way, enjoyable. They are more likely to engage with something that gives them a sense of progress early on, rather than something that feels like work from the outset.
Artificial intelligence does not change this. If anything, it heightens expectations. Many people approach AI systems assuming they will be intuitive, responsive, and nearly effortless. When that expectation is not met—when the interaction feels confusing, heavy, or unclear—the gap becomes immediately noticeable.
In that moment, it does not matter how capable the system is. What matters is how it feels to use.
This creates an interesting tension in the design of AI-powered tools. On one hand, there is a desire to expose the full capability of the system. On the other, there is a need to create an experience that feels simple, even if the underlying system is not.
The systems that succeed tend to resolve this tension well. They present themselves as easy to engage with, even when they are doing something complex behind the scenes. They reduce the cognitive load required to get started and provide early signals that the interaction will be manageable.
In many ways, this is less about technology and more about perception. A system does not need to be simple in order to feel simple. But if it fails to create that feeling, most people will never stay long enough to discover what it can actually do.