
Not a Roadmap

A reflection on design, systems, and the seductive drift of automation

There’s something familiar about this time of year. Maybe it’s the flood of trend decks. Maybe it’s the breathless declarations that we’ve turned another corner — into an era of agents, of ambient UX, of design without design. Or maybe it’s just the recursive loop of professionals trying to articulate a future while conveniently sidestepping the wreckage of the present.

I don’t write trend reports. I’ve spent too much time inside the guts of broken systems to pretend we can deck our way out of deep structural rot. What I offer instead is a kind of reckoning. Not critique for critique’s sake, but a gesture of maintenance. A different kind of yearly ritual: not looking at what’s next, but what we’ve already chosen to forget.

And we have forgotten a lot. We’ve forgotten how many of our systems still don’t work for the people who need them most. We’ve forgotten how many products have rebranded friction as inefficiency, and complexity as failure. We’ve forgotten that design is more than interface. It’s intention, governance, ethics, and care.

We’ve now entered what some are calling the age of “agent experience,” or AX — where the interface dissolves, the user issues a command, and the system silently handles the rest. There’s power in that vision. I don’t reject it outright, and I’ve contributed my own thinking to it. But I do question what gets lost along the way: human navigability, informed decision-making, the humble friction of learning. When you remove the path, you remove the ability to understand how you got somewhere or what it means to go back.

As we build more of these intelligent, automated loops, we’re not just offloading effort. We’re also offloading accountability. Who gets to design the agent? Who decides what it can refuse, what it overlooks, and whose outcomes it optimizes for? These aren’t academic questions. They sit at the center of how modern systems operate, and most of the time, no one is answering them seriously.

That’s not just a design problem. It’s a professional crisis. Ours is a field with no meaningful guardrails. No certification. No licensing. No regulatory framework. I’m not someone who loves gatekeeping, but I do believe in friction that forces a pause. At the very least, those mechanisms signal that the work has weight, that the systems we build are consequential enough to require vetting. Right now, a person can be hired to shape access to critical public services without any moral orientation, historical grounding, or formal accountability. Just a portfolio. Just vibes.

We talk a lot about being “user-centered.” We hear words like trust, equity, and access invoked with regularity. But when the systems fail — when a form locks someone out, when an algorithm misfires, when an AI agent mangles a decision — we don’t take responsibility. We say, “that wasn’t my call,” or “it was out of scope.” We defer to product owners, stakeholders, models. We keep shipping and keep smiling.

That culture of plausible deniability is eroding the credibility of the entire field. And the consequences don’t land on us. They land downstream — on the people who have to navigate the mazes we call progress. The people who never get to see the Figma files or the sprint retros. The ones who live with the aftereffects of our decisions, even when we pretend those decisions were neutral.

Every project eventually hits a point where someone asks you to move faster. To cut a corner. To trust the model. To stop asking questions. And if you aren’t grounded in something sturdier than metrics and brand language, you’ll say yes. Maybe not because you don’t care, but because everyone else already has.

This is what it looks like when a profession loses its moral center. It becomes aestheticized. Marketable. Easier to sell than to question. In this moment — where AI is accelerating, where systems are scaling faster than we can trace them, where design becomes orchestration — we don’t just need skill. We need conscience. We need people who will ask what happens next, who will push to test the systems before they go live, who will refuse to mask harm behind performance.

This piece is my small contribution to that kind of practice. It’s not predictive, and it’s not comprehensive. It’s just an effort to pay attention — to resist the amnesia that tech culture encourages. Because a discipline that can’t explain itself, or be held to account, isn’t a discipline. It’s damage control.

The truth is, most of the people driving this moment in automation aren’t designing systems to accommodate human lives. They’re building toward efficiency, toward optimization, toward the fantasy of frictionless living. Spend enough time in these circles, and it becomes clear how little space there is for culture, joy, humor, or contradiction. There’s a suspicion of mess. A discomfort with ambiguity. A disinterest in anything that can’t be streamlined or vectorized. That’s not intelligence. That’s impoverishment.

We should be asking ourselves why people who hate fun, food, history, or care are shaping the tools we’ll use to organize our lives. Why the developers of systems without context are being entrusted with the task of interpreting the complexity of actual humans. Why we’re surrendering our interfaces to models that don’t know how — or why — we do the things we do.

We don’t have to. The future isn’t predetermined. It’s not a trendline or a keynote or a feature rollout. It’s built — every day — by people who show up and make decisions. And that means it’s still malleable. Still open. Still worth fighting over.

That doesn’t just apply to the big, obvious moves. It applies to the small ones too. The language you replicate without thinking. The choice you rubber-stamp because it’s easier than questioning. The friction you eliminate without understanding what it protected. The harm you obscure because it was never logged in Jira. These choices matter.

This work is hard. But it’s also ours to do.

We can build systems that recognize where they are.
We can design tools that slow down when they need to.
We can write documentation that tells the truth.
We can build agents that don’t erase people.
We can insist that joy and culture and community aren’t nice-to-haves.

The people writing the future are just people. And people get it wrong all the time. Which means we still have time to do something different.

That’s not a roadmap. That’s a practice. And it starts with deciding to care.
