Canvas

Giving AI output a place to live,
evolve, and become something real

C3 Generative AI: Canvas Workspace
Company C3.ai
My Role Product Designer
Scope Designed Canvas for structured, long-form AI output across products
Team Multiple PMs, Engineering, Data Science
Tools Figma
01 — Context

When AI output outgrew the chat window

Chat is good at answering questions. It’s not built for creating something you need to keep. Responses disappear into scrollback. Edits mean re-prompting. A campaign brief or structured report feels fragile inside a conversation thread.

At the same time, multiple product teams were embedding generative AI into their applications. Each team had different requirements and formats. The challenge: design a workspace that works across all of them without starting from scratch for each.

02 — The Problem

AI was generating great content. Users couldn’t do much with it.

The gap wasn’t AI quality. It was what happened after. A document arrived in a chat bubble. To use it: copy it, paste it elsewhere, reformat it, share it. Each step broke the connection between what the AI produced and how the user could act on it.

“How do we create a persistent workspace where AI output doesn’t just land, it lives? Where users can iterate on it, own it, and take it somewhere?”

Before Canvas — ephemeral output in chat

Before: AI responses lived and died in the conversation scroll

03 — Discovery

Watching the workaround

Before defining a solution, I needed to understand the real behavior. Through conversations with PMs embedded in different product teams, a pattern emerged: users weren’t failing at prompting. They were failing at what came next. AI-generated content was being copied out of the product and pasted into other tools to be reformatted, edited, and shared. Every step happened somewhere else. The workaround was the workflow.

That insight reframed the problem entirely. This wasn’t about improving the AI. It was about closing the gap between generation and use, building a space where output could live, be owned, and actually go somewhere.

“Users weren’t asking for better AI output. They were asking, implicitly, for somewhere to put it.”

04 — Concept

The Canvas Resource: a new kind of output

The core concept was the Canvas Resource: a structured, editable block that lives in the AI interface and behaves differently from a chat message. Four properties defined it from the start:

Editable
Full formatting and structure control. The AI generates; the user owns.

Interactive
Give feedback, request changes, iterate inline without leaving the workspace.

Actionable
Canvas content can trigger downstream actions: updating an object, sending a message, launching a workflow.

Composable
Resources fit inside larger flows. A brief or a table isn't standalone; it's part of something bigger.

This gave cross-functional teams a shared vocabulary. Alignment happened because everyone agreed on what a Canvas Resource needed to do, wherever it appeared.
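To make that shared vocabulary concrete, the four properties could be sketched as a small data model. This is an illustrative sketch only; every name here is hypothetical and not drawn from C3's actual implementation.

```typescript
// Illustrative sketch — hypothetical names, not C3's actual API.
type ResourceFormat = "document" | "table" | "email" | "form";

interface CanvasResource {
  id: string;
  format: ResourceFormat;
  content: string;          // editable: the user owns and edits the content
  revisions: string[];      // interactive: inline iteration history
  parentFlowId?: string;    // composable: belongs to a larger flow, not standalone
}

// Interactive: record a revision request instead of re-prompting from scratch.
function revise(r: CanvasResource, instruction: string): CanvasResource {
  return { ...r, revisions: [...r.revisions, instruction] };
}

// Actionable: a downstream hook the resource can trigger.
type DownstreamAction = (r: CanvasResource) => string;
const sendAsEmail: DownstreamAction = (r) => `email-sent:${r.id}`;

const brief: CanvasResource = {
  id: "brief-1",
  format: "document",
  content: "Q3 campaign brief…",
  revisions: [],
  parentFlowId: "campaign-flow",
};

const revised = revise(brief, "tighten the intro");
```

The point of the sketch is the contract, not the shapes: any format added to Canvas had to satisfy all four behaviors, wherever it appeared.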

05 — Design

A transition that feels earned, not abrupt

When an AI response meets the criteria for a Canvas Resource (long-form, structured, meant to persist), Canvas activates automatically. Users don’t switch modes. They just keep working. Resources link back to the conversation as anchors, so moving between dialogue and document feels like one continuous thought.

Once a resource exists, users can edit it in two ways: through the Assistant with natural language commands, or directly in the rich text editor. The minimized view is read-only, and manual editing is only available in the expanded state, which prevents accidental edits in the compact UI and reinforces a clear interaction model: minimized to view, expanded to edit. Editing through the agent lets users delegate complex or repetitive work, like formatting, summarizing, or rewriting, reducing friction for users who prefer not to edit directly.
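The trigger and edit-gating logic described above could be sketched roughly as follows. The heuristics and thresholds here are hypothetical, chosen only to illustrate the interaction model, not the product's actual rules.

```typescript
// Illustrative sketch — hypothetical heuristics, not the product's actual rules.
interface AiResponse {
  text: string;
  structured: boolean; // e.g. headings, tables, or form fields detected
}

// Automatic trigger: long-form, structured output activates Canvas —
// no mode switch required from the user.
function shouldBecomeResource(r: AiResponse, minLength = 800): boolean {
  return r.structured && r.text.length >= minLength;
}

type ViewState = "minimized" | "expanded";

// Minimized to view, expanded to edit: direct editing is only
// available in the expanded state, preventing accidental edits.
function canEditDirectly(state: ViewState): boolean {
  return state === "expanded";
}
```

The deliberate asymmetry is the design point: activation is automatic so users never choose a mode, while editing is gated by view state so the compact UI stays safe to skim.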

Editing and iterating on an existing resource via the Assistant

Creating a new object from the expanded view

Full-screen editing was non-negotiable. Enterprise users creating briefs or structured plans need space to think. Full-screen removes all chrome and puts content first.

Canvas full experience — all states

All Canvas states: Search link, Canvas view, full-screen editing

06 — Expansion

A text editor became a content system

We started with a rich text editor. Canvas quickly expanded as new needs emerged: tables, email templates, form components. The four properties (editable, interactive, actionable, composable) became the test for whether a new format belonged in Canvas or not.

Canvas resource types — text, table, email, form

The Canvas type system: documents to structured data to email templates

07 — Alignment

Design as the integrating layer

Multiple PMs had different visions for how AI output should behave in their contexts. My job was partly to design, partly to mediate. I authored a unified interaction specification in Figma: shared behaviors, edge cases, and system logic across all Canvas implementations. It gave stakeholders something concrete to align around.

The spec also made engineering conversations more grounded. We could point to specific interactions and have real discussions about feasibility instead of negotiating in abstractions.

Canvas interaction spec — Figma documentation

The unified interaction spec: shared behaviors, edge cases, system logic

08 — Trade-offs

The tensions we held

Consistency vs Flexibility

Shared behaviors had to work for a marketing brief and a code output. The framework needed principles, not rules.

Automatic Trigger vs Explicit Control

Canvas activates when output meets criteria. Clear rules for when it should, and deliberate restraint for when it shouldn’t.

AI Continuity vs Clean Workspace

Users needed conversation context and document space. The transition model kept both without forcing a choice.

09 — Outcome

A workspace that became infrastructure

Key learning

The copy-paste behavior I observed early on turned out to be the most important signal in the project. Users weren't asking for a better editor; they were telling me, through their workarounds, that AI output had no home. Naming that gap made everything else easier to design. Enterprise AI work isn't really about interface styling. It's about understanding what users do when the product doesn't give them what they need, and building for that.
