
Service Design for AI: Why Human Experience (HX) Matters

As AI reshapes how we interact with systems and each other, we need to expand our understanding of what’s at stake.

I’ve spent years talking and speaking globally about consequence design, a framework that has evolved to interrogate the ways interfaces and technology proliferate through daily life, and I’ve written about the ways these problems show up in everyday life. Where things always got a little stuck with this framework was eventually reaching a place where you’d shrug and go okay, I agree - but now what?

In a world increasingly mediated by AI agents and large-language models, it’s difficult to know where the people start and where the machines end.

Last week, two blog posts sent me back to my own work: one in which Mathias Biilmann introduced the concept of Agent Experience (AX), followed by another from Steven Fabre on the importance of agent-compatible, collaborative products. They led me to a concept I’ve been using privately for years but hadn’t yet written about: disintermediation.

Understanding disintermediation

Disintermediation sounds more complicated than it is. An ATM giving you money at midnight because the bank is closed is one example of helpful disintermediation: the machine stands in for a teller. This didn’t put anyone out of a job - banks were never open late, and without it (back when we still used cash for things), you’d have to wait until morning.

On the other hand, using an AI agent to purchase something based on its knowledge of you, or leveraging a platform to contact another person’s agent to make a purchase or arrange a service, creates a series of assumptions about user intent. In one-off situations, this isn’t a huge deal - might even be preferable. But in the aggregate, these mediated interactions can have huge downstream effects across society.

Disintermediation in AI-mediated systems is the process by which people become separated from direct relationships, capabilities, and decision-making as AI interfaces interpret and act on their behalf.

Unlike traditional automation that simply makes processes more efficient, disintermediation fundamentally changes how people interact with services and systems, often requiring them to adapt to AI limitations rather than the other way around.

For example, when AI chatbots become the primary way to access customer service, people don’t just lose direct contact - they lose the ability to express needs in their own terms, must learn to communicate in ways the AI understands, and lose access to human judgment in complex situations. You can ask for a human, but in time, you might not get one.

Why we need to expand our lens to human experience

On its face, Human Experience sounds like a fork of traditional human-computer interaction or user experience - principles that have evolved over the last 40+ years and have a wide canon and community of practitioners.


While the tech industry focuses on optimizing Agent Experience (AX), we’re missing something crucial: how these systems reshape human capability and relationship patterns across society. This isn’t just about better interfaces or user experiences - it’s about understanding and designing for how AI mediation changes human behavior, decision-making, and social connections.

Human Experience (HX) builds on traditional UX principles but expands our lens to consider:

  • How AI agents interpret and act on human intent
  • Where system assumptions create risks
  • When mediation helps vs. hinders
  • How capabilities shift over time
  • What patterns need attention

From Consequence Design to HX

My work on consequence design has always focused on understanding how interfaces and technology reshape daily life. As AI agents become primary mediators of human experience, we need frameworks that help us:

  1. Identify patterns of disintermediation
  2. Understand system implications
  3. Design better alternatives
  4. Preserve human capability

HX isn’t just another layer of design - it’s a critical practice for ensuring AI-mediated systems work for people rather than forcing people to work for them.

What’s Next

As organizations rush to implement AI agents and large language models, we need people focused on:

  • Understanding how mediation changes behavior
  • Identifying where assumptions create risks
  • Designing for actual human patterns
  • Preserving critical capabilities

This isn’t about resisting AI implementation - it’s about ensuring these systems enhance rather than erode human experience. I’ll continue expanding on this topic through writing and speaking more in 2025.

Feel free to get in touch if you want to talk more about this concept, too.
