IxD Essentials: conversations, engagements, embodiments
An update for the age of generative AI
Over 25 years ago, in a graduate seminar at Carnegie Mellon taught by Richard Buchanan, I received an orientation toward design that has never left me. Buchanan’s intellectual lineage ran through Richard McKeon back to John Dewey, whose book Art as Experience remains one of the most illuminating accounts of what it means for a human being to encounter — and be changed by — an experience. That seminar offered no UI patterns or UX checklists. 😇
It offered something far more durable: an architectonic stance, at once rhetorical, strategic, and pluralistic, for approaching problems of varying complexity & scale.
From that foundation I’ve long advocated that interaction design rests on three core elements: conversations, engagements, and embodiments. I first drafted that notion over fifteen years ago, and I think it still holds. But each word/concept now involves an additional layer of interpretation, mainly because AI is changing not just the tools we use daily, but the nuances of the practice itself…
❖ Conversations: from facilitation to co-authorship
Conversations are central to what we do: staging meaningful dialogues among designers, stakeholders, and users, mediated by some “it” being designed — an interface, a service, a task or activity, an organizational system, etc. That “it” is the conveyance of social, philosophical, and emotional values through visual, behavioral, and structural facets. Buchanan taught that every design is a rhetorical argument: a coordinated appeal to logos (functional logic), ethos (the designer’s character and credibility), and pathos (emotional resonance with the audience). An effective design, on this account, is a balanced argument — useful, usable, and desirable.
What changes with genAI is that the artifact is no longer a passive conveyance of that argument — it becomes an active co-author of it!
The product now actually converses, often in ways the designer didn’t script. ChatGPT’s radical openness — no guardrails on the prompt surface (with potentially highly questionable, risky ends) — is a philosophical bet on user autonomy, an argument that the human should author the scope and direction of the encounter without constraint. Grammarly’s and MS Copilot’s inline suggestion models take a different stance: keep the human in the driver’s seat, treating AI as a supporting voice rather than a principal one.
Each is a different answer to a key rhetorical question: whose logos, ethos, and pathos does this system project, and on whose behalf?
The designer’s job is thus no longer only to compose the argument, but to direct its conditions — the interpretive frames, behavioral constraints, and ethical commitments that govern how the system speaks & responds.
AI UX patterns like disclosure (clearly marking AI-generated content) and personality (defining the AI’s tone and character) are really ethos & pathos decisions, not just technical choices!
❖ Engagements: from getting to know each other, to being known
Engagements are the product encounters themselves — the actual using of “it” to act, achieve, or explore, not some simplistic vanity metric for product measurement (like engagement metrics — ugh!). Engagement in this phenomenological sense refers to something personally significant, directed toward a particular goal or purpose. As you explore a product’s controls & responses, you cultivate a relationship, discovering fit and value. So all of this — and here’s that Dewey framework — transpires in seconds through the extraordinary physio-cognitive capabilities of mind + body. The “consummate doing & undergoing” of a well-designed encounter is what interaction design (writ large) is ultimately in pursuit of enabling. 🤔
That phenomenological model remains intact today amid AI usage. What shifts is the relational dynamic. In AI-mediated systems, the product is not simply waiting to be known — it is getting to know you back. It models your behavior, adapts to your inferred patterns, accumulates context across sessions. Claude Code with Projects bears this out as a philosophical position: AI as project collaborator, embedded in the ongoing context of your work, building familiarity over time.
The engagement is no longer symmetrical — but it’s that asymmetry that carries considerable ethical weight. 😬
Thus, trust must be earned through repeated interaction and validated through transparency. Users need to understand how the system arrived at its outputs, not just what it produced. The Shape of AI’s memory pattern (giving users control over what the AI retains about them) and stream of thought (surfacing the system’s reasoning for oversight) are how this plays out in practice. Progressive autonomy — designing systems that earn greater independence through demonstrated reliability, with every stage reversible and human authority always ultimate — is literally Dewey’s “doing and undergoing” applied to a relationship that is itself learning. But now the “undergoing” includes being accompanied by something that is changing in response to you!
❖ Embodiments: from authorship to governance
Embodiments are the manifestations of a designer’s ideas into perceptible form — the blend of visual, behavioral, and structural qualities that constitute the “it” through which meaning emerges. This is where IxD craft lives: where philosophical commitments become something you can see, touch, and act through.
AI disrupts a foundational assumption here: that the embodiment is authored, stable, and intentional. Increasingly, embodiments are generated — adaptive, variable, shifting in response to context & inference. Consider various postures from today’s rapidly changing AI landscape. Perplexity bets on epistemic accountability — grounding output in verifiable, cited reality. Notion AI treats the document as a shared canvas, with AI woven into the artifact’s ongoing formation rather than applied from outside. V0 and Replit push even further: AI as agentic co-creator, generating functional interfaces with working code from natural language, with interactive previews of the reasoning in real time. 😅 Mind-blowing!
So the designer no longer composes elements within a form — they govern a process that produces form. The “it” is summoned or shaped, not just authored, and the designer’s responsibility evolves to the conditions, constraints, and intentions that make the summoning meaningful. 🤨
This is where a new vocabulary of AI design lands with most force:
Latency becomes anticipation: space for reflection and intent.
Non-determinism becomes serendipity: a source of discovery rather than an error to suppress.
Token limits become clarity: demanding precision and meaning.
These are not apologies for AI’s limitations. They are design materials as consequential as color or typography can be, and learning to work with them skillfully is a new craft of embodiment.
The designer’s role has shifted from author to curator: from composing every element of meaning to governing the conditions under which meaning can emerge.
⏳ On temporal instability…
John Dewey’s persistent critical insight is that experience is never static — always a dynamic encounter between an agent and a resistant world. 🫠 What AI adds is that the digital apparatus is also learning. It adapts, drifts, and changes post-deployment through model updates, fine-tuning, and the accumulation of usage in ways designers didn’t specify and may not fully understand. Stewardship in this manner is never finished. The interaction designer working with AI enters into an ongoing, relational practice of governance & re-interpretation — keeping the system honest, legible, humane, and consequential over time. 🙏🏽
That is, I believe, the defining challenge for the field in the next quarter century and beyond. The foundational questions Buchanan posed in that CMU seminar — about the nature of interaction, the elements of a product, the rhetorical structure of a designed argument — turn out to be exactly the right questions to bring to this AI disruption moment. They just need to be asked again, in the presence of a world that is asking them back. 🤔

