Figma MCP Server

Role
Senior Design Technologist — designed and built the MCP server end-to-end
User Problem
Figma's API wasn't designed for AI consumption — deeply nested structures made it unusable for agents.
Business Problem
AI coding agents couldn't read design specs, creating a gap between design intent and implementation.
Impact
Built June 2025, before Figma's official MCP server. Used with Cursor and Jetski.

What it does

Point it at a Figma file (or a specific node), and it traverses the document tree, resolves all component instances — including references to external library files — simplifies the structure, and returns it as YAML that an AI agent can reason about. Dual transport: stdio mode for MCP client integration (Cursor, Jetski) and HTTP mode for direct testing.
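The simplification step can be sketched roughly like this — prune each raw Figma node down to the few fields an agent needs and recurse into children. Field names (`id`, `name`, `type`, `children`, `characters`, `componentId`) follow Figma's public REST API; the `KEEP` list is illustrative, and the real server serializes the pruned tree to YAML with PyYAML rather than leaving it as a dict:

```python
# Hypothetical sketch of the "simplify" pass over a Figma document tree.
# Fields not in KEEP (bounding boxes, strokes, effects, ...) are dropped,
# which is what shrinks the deeply nested API payload into something an
# agent can reason about. The server then dumps the result as YAML.

KEEP = ("id", "name", "type", "characters", "componentId")

def simplify(node: dict) -> dict:
    """Recursively strip a Figma node to agent-relevant fields."""
    out = {k: node[k] for k in KEEP if k in node}
    children = [simplify(c) for c in node.get("children", [])]
    if children:
        out["children"] = children
    return out

raw = {
    "id": "1:2", "name": "Card", "type": "FRAME",
    "absoluteBoundingBox": {"x": 0, "y": 0, "width": 320, "height": 200},
    "children": [
        {"id": "1:3", "name": "Title", "type": "TEXT",
         "characters": "Hello", "strokes": [], "effects": []},
    ],
}
compact = simplify(raw)
```
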

I built this in June 2025, months before Figma shipped their official MCP server. Figma’s API wasn’t designed for AI consumption — deeply nested structures, huge payloads, cross-file component references. This server bridges that gap.

The hard parts

External component resolution. When a Figma file uses instances from a shared library, the component definitions live in a different file. The server pre-loads specified library files and builds a cross-file node map so instances resolve correctly.
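The cross-file node map might look like the following sketch: walk every pre-loaded library document once, index each `COMPONENT` node by id, and then resolve any `INSTANCE` against that index regardless of which file the definition lives in. The structures and names here are illustrative stand-ins, not the server's actual internals:

```python
# Hedged sketch of the cross-file node map. Library files are pre-loaded
# as parsed Figma file responses; every COMPONENT node found in any of
# them is indexed by id so INSTANCE nodes in the main file can resolve.

def build_component_map(library_files: list[dict]) -> dict[str, dict]:
    """Index every COMPONENT node across all loaded files by its id."""
    index: dict[str, dict] = {}
    for doc in library_files:
        stack = [doc["document"]]            # iterative tree walk
        while stack:
            node = stack.pop()
            if node.get("type") == "COMPONENT":
                index[node["id"]] = node
            stack.extend(node.get("children", []))
    return index

# A shared library file containing one component definition.
lib = {"document": {"id": "0:0", "type": "CANVAS", "children": [
    {"id": "10:1", "type": "COMPONENT", "name": "Button/Primary", "children": []},
]}}

components = build_component_map([lib])
instance = {"id": "1:9", "type": "INSTANCE", "componentId": "10:1"}
definition = components.get(instance["componentId"])
```
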

“Request too large” mitigation. Large Figma files exceed the API’s response size limit. When a node_id is provided, the server fetches only that node via the nodes endpoint instead of the whole file.

Iterative local dependency resolution. With resolve_local_dependencies=True, the server discovers all component instances within the requested node and automatically fetches their definitions — the caller doesn’t need to know which libraries are involved.
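The iteration can be sketched as a worklist loop: scan the requested subtree for `INSTANCE` nodes, batch-fetch any unresolved component definitions, then rescan what came back, since fetched components can themselves contain nested instances. `fetch_components` is a hypothetical stand-in for the server's batched API call:

```python
# Hedged sketch of iterative local dependency resolution. Repeats until
# a fetch round introduces no new component ids.

def collect_instance_ids(node: dict) -> set[str]:
    """Find the componentId of every INSTANCE in a subtree."""
    ids, stack = set(), [node]
    while stack:
        n = stack.pop()
        if n.get("type") == "INSTANCE":
            ids.add(n["componentId"])
        stack.extend(n.get("children", []))
    return ids

def resolve_dependencies(root: dict, fetch_components) -> dict[str, dict]:
    resolved: dict[str, dict] = {}
    pending = collect_instance_ids(root)
    while pending:
        fetched = fetch_components(pending)   # one batched request per round
        resolved.update(fetched)
        pending = set()
        for comp in fetched.values():         # nested instances → new deps
            pending |= collect_instance_ids(comp)
        pending -= resolved.keys()
    return resolved

# Toy registry: component c1 itself contains an instance of c2.
registry = {
    "c1": {"id": "c1", "type": "COMPONENT", "children": [
        {"id": "i2", "type": "INSTANCE", "componentId": "c2", "children": []}]},
    "c2": {"id": "c2", "type": "COMPONENT", "children": []},
}
def fetch(ids):
    return {i: registry[i] for i in ids}

root = {"id": "r", "type": "FRAME", "children": [
    {"id": "i1", "type": "INSTANCE", "componentId": "c1", "children": []}]}
deps = resolve_dependencies(root, fetch)
```

The caller passes only `root`; both `c1` and its transitive dependency `c2` come back resolved, which is the point of the flag.
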

Stack: Python, FastMCP, FastAPI, httpx (async), Pydantic, PyYAML.

GitHub repo

Artifact Evidence

Project: figma-mcp-server
