Figma Opened the Canvas to AI Agents. Designers Should Pay Attention.

Web Development
April 19, 2026
12 min read

AWZ Team


On March 24, 2026, Figma announced that AI agents can now write directly to Figma files through the MCP server. Not read. Write.

The new use_figma tool lets Claude Code, OpenAI's Codex, GitHub Copilot CLI, Cursor, and any other MCP-compatible client generate and modify design assets on the Figma canvas. Those assets link to your existing design system. Components, variables, auto layout, the whole stack.

"Teams at OpenAI use Figma to iterate, refine, and make decisions about how a product comes together," said Ed Bayes, design lead at Codex. "Now, Codex can find and use all the important design context in Figma to help us build higher quality products more efficiently."

This is not a plugin. It is native to Figma's MCP server infrastructure. It works today in beta, for free, with plans to move to usage-based pricing once the beta ends.

Two Tools, Two Directions

Figma now has two complementary tools for AI agent workflows.

generate_figma_design takes HTML from live applications and websites and converts it into editable Figma layers. This is the reverse-engineering direction. You have a running product, and you want its current UI in Figma so you can iterate on it.

use_figma goes the other direction. It lets agents build new designs on the canvas using your existing components and variables. This is the generative direction. You describe what you want, and the agent builds it with your brand's buttons, spacing, typography, and color tokens.

The two tools connect in a practical workflow: when designs drift out of sync with code (which happens constantly on any active product), generate_figma_design pulls the current state into Figma. Then use_figma edits or extends those designs using the design system. The agent stays within your established conventions instead of inventing its own.
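
Under the hood, MCP clients invoke server tools through JSON-RPC 2.0 `tools/call` requests. As a rough sketch of what a client might send to Figma's MCP server, here is the request shape with illustrative arguments; the tool names come from the announcement, but the argument names are hypothetical:

```python
import json

def mcp_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request, the message shape
    MCP clients use to invoke a server-side tool."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Pull a running page's UI into editable Figma layers
# (the "html" argument is a hypothetical placeholder).
pull_request = mcp_tool_call(
    "generate_figma_design",
    {"html": "<button class='btn-primary'>Sign up</button>"},
)

# Then extend the design on the canvas using existing components
# (the "prompt" argument is likewise hypothetical).
edit_request = mcp_tool_call(
    "use_figma",
    {"prompt": "Add a disabled state using the existing button component"},
    request_id=2,
)
```

The point of the sketch is the framing, not the arguments: any MCP-compatible client speaks this same protocol, which is why the compatibility list below is so long.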

The demo Figma published shows Claude Code generating 72 button variants in a single command, all correctly linked to the design system's component definitions. For context, manually creating 72 variants following proper naming conventions and style tokens is a full day of work.

Skills Are the Real Feature

The use_figma tool is the mechanism. Skills are the strategy.

Skills are markdown files: plain text, written by humans, that instructs AI agents on how to work within Figma. They define workflows, sequencing, naming conventions, spacing rules, accessibility requirements, anything that represents your team's design decisions and intent.

The foundational skill is /figma-use, which gives every agent a baseline understanding of how Figma works: node structure, layers, auto layout principles. Teams customize on top of this to enforce their own standards.

Nine skills launched on day one. These examples show the range of what is possible:

  • /figma-generate-library: Creates new Figma components from an existing codebase. If your React components have evolved beyond what is in Figma, this brings Figma up to date.
  • /figma-generate-design: Builds new screens using existing components and variables. The workhorse skill.
  • /create-voice: Generates screen reader specifications (VoiceOver, TalkBack, ARIA labels) from UI specs. Built by a designer at Uber.
  • /apply-design-system: Connects existing loose designs to system components. If someone designed a screen without using the component library, this skill wires it up after the fact.
  • /sync-figma-token: Syncs design tokens between code and Figma variables with drift detection. Built by Firebender.
  • /multi-agent: Runs parallel agent workflows to implement designs. Built by Augment Code.
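
Firebender's sync skill is described only at a high level, but the core idea of drift detection is simple to sketch: compare the tokens defined in code against the variables defined in Figma and report what disagrees. Everything below (function name, token names, values) is hypothetical illustration, not Firebender's implementation:

```python
def detect_token_drift(code_tokens: dict, figma_variables: dict) -> dict:
    """Compare design tokens defined in code against Figma variables
    and report what is missing or mismatched on each side."""
    code_keys, figma_keys = set(code_tokens), set(figma_variables)
    return {
        "missing_in_figma": sorted(code_keys - figma_keys),
        "missing_in_code": sorted(figma_keys - code_keys),
        "value_mismatch": sorted(
            k for k in code_keys & figma_keys
            if code_tokens[k] != figma_variables[k]
        ),
    }

# Hypothetical token sets that have drifted apart.
code = {"color/primary": "#3B82F6", "space/md": "16px", "space/lg": "24px"}
figma = {"color/primary": "#2563EB", "space/md": "16px"}

drift = detect_token_drift(code, figma)
# drift flags "space/lg" as missing in Figma and "color/primary" as mismatched
```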

Anyone can write a skill. You do not need to build a Figma plugin, write JavaScript, or use the Plugin API. A markdown file with clear instructions is sufficient. This is a deliberate accessibility decision: the people who know how design should work are designers and design leads, not necessarily engineers. Skills let them encode their expertise directly.
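
As an illustration, a team skill might look like the file below. It is entirely hypothetical, since Figma's only stated requirement is a markdown file with clear instructions; the library names, variable names, and rules are invented for the example:

```markdown
# Skill: acme-checkout-screens (hypothetical example)

## When to use
Generating or editing any checkout-flow screen in the Acme design system.

## Rules
- Use only components from the `Acme/Core` library; never draw raw shapes.
- All spacing must come from the `space/*` variables (4px base grid).
- Primary actions use `Button/Primary`; destructive actions use `Button/Danger`.
- Every interactive element needs an accessible name for screen readers.

## Workflow
1. List the components and variables available in the current file.
2. Build the screen with auto layout, top to bottom.
3. Screenshot the result and verify spacing and component usage.
4. Fix any deviations before reporting done.
```

Notice that nothing here is code: it is design judgment, written down in a form an agent can follow.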

"Skills teach Claude Code how to work directly in the design canvas, so you can build in a way that stays true to your team's intent and judgment," said Cat Wu, head of product for Claude Code at Anthropic.

Self-Healing Loops

The detail that separates this from a simple API integration: skills enable iterative refinement through screenshots.

When an agent generates a screen, it can take a screenshot of the result, evaluate whether it matches expectations, and iterate on the parts that do not. Because the output is real Figma structure (components, variables, auto layout), adjustments happen within the actual system rather than just pushing pixels around.

AI models are non-deterministic. The same prompt can produce different results. Skills make behavior more predictable by encoding specific steps and constraints. But the self-healing loop adds a second layer of quality control. The agent checks its own work and fixes deviations.

This is the same pattern we see in agentic AI systems more broadly: plan, execute, evaluate, iterate. The difference is that the evaluation happens against a visual design standard rather than test cases.
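
The plan-execute-evaluate-iterate loop can be sketched in a few lines. The helper functions here are stand-ins for real agent and MCP tool calls, and the "spacing" check is an invented stand-in for a visual evaluation; only the loop structure reflects the pattern described above:

```python
# Stand-in implementations so the loop runs end to end; a real agent
# would call MCP tools and a vision model here instead.
def generate_design(prompt):
    return {"prompt": prompt, "spacing": 12}       # deliberately off-spec

def take_screenshot(design):
    return design                                   # pretend the render is the design

def evaluate(image, prompt):
    return ["spacing"] if image["spacing"] != 16 else []  # spec: 16px grid

def refine(design, issues):
    return {**design, "spacing": 16} if "spacing" in issues else design

def self_healing_generate(prompt: str, max_iterations: int = 3) -> dict:
    """Generate a design, screenshot it, check it against the spec,
    and refine only the parts that failed, up to a fixed budget."""
    design = generate_design(prompt)                # execute
    for _ in range(max_iterations):
        image = take_screenshot(design)             # observe the result
        issues = evaluate(image, prompt)            # compare to expectations
        if not issues:
            break                                   # converged
        design = refine(design, issues)             # fix only the deviations
    return design

result = self_healing_generate("Sign-up screen on a 16px grid")
# result["spacing"] is now 16: the loop caught and fixed the deviation
```

The iteration cap matters in practice: because models are non-deterministic, the loop needs a budget rather than a promise of convergence.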

MCP as the Protocol Layer

Figma chose the Model Context Protocol (MCP) as the integration layer for a reason. MCP is becoming the standard way AI agents interact with external tools and services. Claude Code, Codex, Copilot, Cursor, Warp, Factory, and Firebender all support it. By building on MCP rather than a proprietary API, Figma automatically works with any MCP-compatible client.

We wrote extensively about MCP's security risks in March. The protocol is powerful but immature. At the time, it had 102 documented CVEs and no built-in authentication standard. Figma's implementation benefits from Figma's own security infrastructure, which adds a layer of protection that raw MCP connections lack. But the fundamental challenge remains: when you give an AI agent write access to production design files, you need to think carefully about permissions, access scoping, and audit trails.

Figma currently handles this through its existing user and team permission model. The MCP server respects the same access controls as the Figma UI. An agent operating under a particular user's credentials can only modify files that user can modify. It is a reasonable starting point, but as agent usage scales, more granular controls (per-file, per-component, per-action) will likely be necessary.

What This Changes for Design Teams

The honest assessment: this is early. The beta launched three weeks ago. Skills are new. The community is still figuring out which workflows translate well to agent execution and which do not.

But the direction is clear, and it matters for three reasons.

Design systems become executable infrastructure. When components, variables, and tokens can be consumed and applied by AI agents, the design system is no longer just documentation. It is a runtime specification that governs how products look and behave, whether a human or an agent builds the screen.

The gap between design and code narrows further. Vibe coding already showed that AI can generate functional frontends from natural language. The missing piece was design fidelity. Vibe-coded UIs work but look generic. When the agent can pull your actual components, spacing variables, and color tokens from Figma while generating code, the output starts matching your brand rather than looking like a default template.

Design expertise becomes more valuable, not less. Skills are the mechanism for encoding design judgment. Someone has to write them. Someone has to define what good spacing looks like, how components should compose, what accessibility standards the team follows. Agents can execute faster than any individual designer, but only if someone defines what "correctly" means. That is design leadership.

The Compatibility List

Figma's MCP server currently works with: Augment, Claude Code, Codex, Copilot CLI, Copilot in VS Code, Cursor, Factory, Firebender, and Warp.

The Figma team says they are expanding the use_figma tool toward parity with the Plugin API. Image support and custom fonts are the next features on the roadmap. They are also working on making skills easier to share through the Figma community.

Where This Goes

For small teams and agencies, the immediate application is productivity on repetitive design tasks: generating component variants, building standard screens from existing libraries, keeping Figma and code in sync. A designer defining skills and reviewing agent output can cover ground that previously required a team of three or four.

For enterprise design operations, the value is consistency at scale. When every product team generates screens using the same skills and the same design system, brand coherence improves without manual enforcement.

The long-term question is how design workflows reorganize around this. If agents can generate and modify designs, and skills control the quality, the designer's role shifts from production (making the screens) toward direction (defining how screens should be made). That is a meaningful change in how design teams spend their time.

We build products at AWZ Digital where design-to-code speed directly impacts project timelines. If you are rethinking how your team handles design workflows, or exploring how AI agents fit into your development process, reach out.

Tags

Figma
MCP
AI Agents
Design Systems
Claude Code
Web Development


Related Articles

One App, Five AI Coding Tools, Zero Consensus

Claude Code, Cursor, Windsurf, Replit Agent, and GitHub Copilot all built the same task management app. Copilot had zero security issues. Windsurf was the fastest. Claude Code wrote the cleanest code. Nobody won outright.

Web Development · April 21, 2026 · 15 min read
