@createlex/figgen · v1.7.4 · 4,767 downloads
CreateLex MCP runtime for Figma-to-SwiftUI generation and Xcode export
CreateLex Figma to SwiftUI MCP
A local MCP (Model Context Protocol) runtime that connects your Figma plugin session to Claude Code, Cursor, Windsurf, or any MCP-compatible AI tool — and writes production-ready SwiftUI directly into your Xcode project.
Table of contents
- Architecture
- Install & start
- MCP client configuration
- Generation tiers
- Recommended workflow
- MCP tools reference
- Generation modes
- Output file structure
- Building from source
- code.ts internals
- Key data types
- Bridge protocol
- Environment variables
- Key source files
- How to extend the codebase
Architecture
Three components run together whenever you use the plugin:
Figma desktop app
└── Plugin sandbox (code.ts → code.js)
│ WebSocket ws://localhost:7765/bridge
▼
companion/bridge-server.cjs HTTP + WebSocket on :7765
│ WebSocket
▼
companion/mcp-server.mjs MCP stdio server
│ MCP protocol
▼
AI IDE (Claude Code / Cursor / Windsurf / …)

| Layer | File | Role |
|---|---|---|
| Figma plugin | code.ts → code.js | Reads Figma node tree, exports PNG/SVG assets, generates SwiftUI code |
| Bridge | companion/bridge-server.cjs | Routes messages between plugin and MCP agents over WebSocket; exposes REST endpoints |
| MCP server | companion/mcp-server.mjs | Wraps bridge in MCP protocol; exposes 23 tools; orchestrates BYOK generation |
| Xcode writer | companion/xcode-writer.cjs | Writes Swift files + asset catalog entries to the Xcode project on disk |
| Local LLM | companion/local-llm-generator.cjs | Calls Anthropic / OpenAI / HuggingFace APIs for Tier 2 generation |
| Auth | companion/createlex-auth.cjs | JWT validation and refresh for Tier 3 CreateLex subscription |
Port 7765 is the single shared port. The bridge server listens on HTTP (REST) and WebSocket (/bridge path) simultaneously.
Install & start
Recommended: Multi-IDE setup (Cursor, Claude, VS Code, etc.)
This approach allows multiple IDEs to share a single Figma connection without port conflicts.
Install globally:
npm install -g @createlex/figgen
Run once (terminal or login item):
figgen bridge --project /path/to/MyApp/MyApp
This starts the persistent daemon on port 7765.
Auto-configure your IDEs:
figgen setup
This detects installed IDEs and adds the figma-swiftui MCP server automatically.
Legacy: Single-IDE mode
If you only use one IDE and want it to manage the bridge lifecycle:
figgen start --project /path/to/MyApp/MyApp
MCP client configuration
Most users should use figgen setup to configure this automatically. If configuring manually:
{
"mcpServers": {
"figma-swiftui": {
"command": "figgen",
"args": ["mcp"],
"env": {
"FIGMA_SWIFTUI_PROJECT_PATH": "/path/to/MyApp/MyApp",
"ANTHROPIC_API_KEY": "sk-ant-..."
}
}
}
}
Note: Use figgen mcp as the command. It is a lightweight adapter that connects to the running figgen bridge.
For Hugging Face:
{ "env": { "HF_API_TOKEN": "hf_...", "HF_MODEL": "Qwen/Qwen2.5-Coder-32B-Instruct" } }
For Ollama (fully local):
{ "env": { "OPENAI_API_KEY": "ollama", "OPENAI_BASE_URL": "http://localhost:11434/v1", "OPENAI_MODEL": "llama3" } }
Generation tiers
| Tier | How | Cost |
|---|---|---|
| 1 — AI-native | AI IDE calls figma_to_swiftui; server exports assets and returns either a validated one-shot write or a prompt package for the IDE model to finish with write_generated_swiftui_to_xcode | Your existing AI subscription |
| 2 — BYOK | Set ANTHROPIC_API_KEY, OPENAI_API_KEY, or HF_API_TOKEN; MCP server calls the model and only accepts the result if it passes a fidelity check | Your API key |
| 3 — CreateLex hosted | No keys set; server may use hosted semantic generation, but rejects low-fidelity output and falls back to the AI-native prompt package | CreateLex subscription |
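The precedence implied by the table can be sketched as a small selection function. This is a hypothetical reading of the tier order (BYOK keys win when set, hosted requires a CreateLex session, AI-native is the universal fallback); the real logic lives in companion/mcp-server.mjs and additionally gates Tier 2/3 output behind a fidelity check.

```javascript
// Hypothetical tier selection, mirroring the table above. The env var names
// match the ones documented later in this README; the return shape is ours.
function pickTier(env) {
  // Tier 2 — BYOK: any provider key enables server-side generation.
  if (env.ANTHROPIC_API_KEY || env.OPENAI_API_KEY || env.HF_API_TOKEN) {
    return { tier: 2, via: 'byok' };
  }
  // Tier 3 — CreateLex hosted: no keys, but an active subscription session.
  if (env.FIGMA_SWIFTUI_ACCESS_TOKEN) {
    return { tier: 3, via: 'createlex-hosted' };
  }
  // Tier 1 — AI-native: return a prompt package for the IDE's own model.
  return { tier: 1, via: 'ai-native-prompt-package' };
}
```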
Login (Tier 3 only)
npx @createlex/figgen login
# Saves session to ~/.createlex/auth.json
Recommended workflow
1. Select a frame in Figma
2. Ask your AI: "Generate SwiftUI for my selected Figma frame"
3. AI calls `figma_to_swiftui`
4. If `oneShot=true`, files are already written to Xcode
5. If `oneShot=false`, AI uses `generationPrompt` to generate SwiftUI and then calls `write_generated_swiftui_to_xcode`
Critical: write_selection_to_xcode overwrites the Swift file on every call and is now best treated as a direct-write scaffold tool, not the default high-fidelity path. Apply all manual refinements (adaptive layout, hidden node removal, real buttons, etc.) after the last MCP call, not before.
To hide a node from export: set it invisible in Figma before running. Hidden nodes (visible: false) are skipped in both the asset export plan and code generation.
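The visibility rule above can be sketched as a pre-pass over the node tree. The node shape here is a simplified stand-in for Figma's SceneNode, and the assumption that a hidden node's whole subtree is dropped matches how Figma renders invisible layers.

```javascript
// Collect only the nodes that survive the visible: false filter described
// above. A hidden node prunes its entire subtree from the result.
function visibleNodes(node, out = []) {
  if (node.visible === false) return out; // skip hidden subtree entirely
  out.push(node);
  for (const child of node.children ?? []) visibleNodes(child, out);
  return out;
}
```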
MCP tools reference
Bridge & status
| Tool | Description |
|---|---|
| bridge_status | WebSocket health, protocol version, connected agent count |
| auth_status | CreateLex subscription status |
| get_document_summary | File name, pages, current page |
| get_viewport_context | Viewport center, zoom, current page |
Inspection
| Tool | Parameters | Description |
|---|---|---|
| get_metadata | scope: 'selection'|'page'|'node' | Node metadata, fills, effects, layout |
| get_node_snapshot | nodeId, maxDepth 0–8 (default 3) | Rich structural snapshot for a specific node ID |
| get_selection_snapshot | maxDepth (default 3) | Snapshot of current selection |
| get_page_snapshot | maxDepth (default 2) | Full page hierarchy |
| find_nodes | name (substring), type (optional), limit ≤200 | Search nodes by name |
| dump_tree | nodeIds (optional) | Readable indented text tree of selection or specific nodes |
Design context & generation
| Tool | Parameters | Description |
|---|---|---|
| get_design_context | excludeScreenshot (bool) | Full node tree + asset export plan + generation hints. Also available via GET http://localhost:7765/design-context |
| get_swiftui_generation_prompt | — | Returns systemPrompt + userMessage for the preferred AI-native high-fidelity workflow |
| generate_swiftui | generationMode, scope | Runs plugin generator; returns code without writing to disk |
| analyze_generation | generationMode | Code + per-node diagnostics + refinement hints |
Asset export
| Tool | Parameters | Description |
|---|---|---|
| get_asset_export_plan | — | Lists all icon/vector/image/raster candidates with format and blend mode |
| get_screenshot | nodeId (optional), includeImages (bool) | PNG screenshot of a node |
| export_svg | nodeId | Exact SVG for a vector-friendly node |
Xcode integration
| Tool | Parameters | Description |
|---|---|---|
| get_project_path | — | Reads saved Xcode source folder |
| set_project_path | path | Persists Xcode project path to config |
| write_selection_to_xcode | generationMode, includeOverflow, projectPath, nodeIds | Direct-write scaffold tool. Generates from Figma selection + exports PNG/SVG assets + writes Swift + asset catalog in one call |
| write_generated_swiftui_to_xcode | code, structName, images[], additionalFiles[], projectPath | Writes pre-generated Swift code + manually provided image payloads |
| write_svg_to_xcode | nodeId, assetName | Exports a single SVG directly to Assets.xcassets |
| figma_to_swiftui | scope, generationMode | Preferred universal entrypoint. Exports assets, then either writes validated one-shot output or returns an AI-native prompt package |
Component analysis
| Tool | Parameters | Description |
|---|---|---|
| extract_reusable_components | scope: 'selection'|'page' | Identifies repeated structures for SwiftUI component extraction |
write_selection_to_xcode vs write_generated_swiftui_to_xcode
- write_selection_to_xcode — plugin generates code AND exports assets automatically. Use this when you explicitly want an immediate local scaffold write and can tolerate lower fidelity than the AI-native flow.
- write_generated_swiftui_to_xcode — you supply the Swift code string and base64-encoded image payloads. This is the preferred final write step when an external or IDE-integrated LLM generated the code from the prompt package.
Generation modes
Editable (default)
Generates native SwiftUI: Text, VStack, HStack, ZStack, Image("name"), TextField, Button. Each asset becomes Image("assetName").resizable().blendMode(...). Best for responsive, interactive screens.
Fidelity
Rasterizes entire frames to a single PNG. Applied automatically when shouldRasterizeAbsoluteFrame() returns true (only in fidelity mode): 6+ complex children, layered image+text combos, or unsupported styles. Preserves exact visual appearance at cost of editability.
When the plugin chooses each mode automatically
- Interactive scaffold — maybeGenerateInteractiveScaffold() runs first on every node. If it recognises a semantic screen pattern (welcome/auth, calendar, task list, generic form, timeline), it generates a fully wired @State scaffold and skips the node-by-node path entirely.
- Node-by-node — If no scaffold matches, nodeToSwiftUI() is called recursively.
Output file structure
YourApp/ ← Xcode source folder (FIGMA_SWIFTUI_PROJECT_PATH)
├── FigmaGenerated/
│ ├── Screens/
│ │ ├── Welcome.swift
│ │ └── GetStartedRegister.swift
│ ├── Components/
│ │ └── PrimaryButton.swift
│ ├── DesignTokens.swift ← if design tokens extracted
│ └── Manifest/
│ ├── index.json ← catalog of all generated screens
│ ├── Welcome.json
│ └── GetStartedRegister.json
└── Assets.xcassets/
├── Welcome_BackgroundArt.imageset/
│ ├── Welcome_BackgroundArt.png
│ └── Contents.json ← { scale: "2x" }
├── Welcome_LogoArt.imageset/
│ ├── Welcome_LogoArt.svg
│ └── Contents.json ← { preserves-vector-representation: true }
└── …
Asset naming: {StructName}_{AssetName} — e.g. Welcome_HeroArt, GetStartedRegister_BgTexture.
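The naming scheme can be sketched as a small helper, including a de-duplication pass like the one uniqueImageName() performs against the plugin's imageNames set. The sanitization rules below (strip non-alphanumerics, PascalCase each word) are assumptions for illustration, not the plugin's exact behavior.

```javascript
// Compose {StructName}_{AssetName} catalog names and de-duplicate collisions
// with a numeric suffix, mimicking the uniqueImageName() pattern.
const usedNames = new Set();

function assetName(structName, layerName) {
  const clean = (s) =>
    s.replace(/[^A-Za-z0-9]+/g, ' ') // assumed sanitization, not the real rule
      .trim()
      .split(' ')
      .map((w) => w[0].toUpperCase() + w.slice(1))
      .join('');
  const base = `${clean(structName)}_${clean(layerName)}`;
  let candidate = base;
  let i = 2;
  while (usedNames.has(candidate)) candidate = `${base}${i++}`;
  usedNames.add(candidate);
  return candidate;
}
```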
Building from source
npm install
# Compile code.ts → code.js (required after any change to code.ts)
npm run build
# Watch mode
npm run watch
# Smoke tests
npm run test:bridge-smoke
npm run test:high-fidelity-smoke
After editing code.ts: run npm run build, then reload the plugin in Figma (right-click plugin → Reload plugin).
Figma manifest (manifest.json): plugin ID 1620328279208269762, entry point code.js, UI ui.html.
code.ts internals
code.ts is ~7400 lines. It is organized into logical sections:
Section map
| Lines | Section | What it contains |
|---|---|---|
| 1–50 | Bootstrap | figma.showUI, global state declaration, message handler wiring |
| 51–560 | Serialization | serializeNodeForBridge, serializeNodeLayout, serializeNodeStyle, serializeNodeText, serializePaint, serializeEffects — convert Figma objects to plain JSON for bridge transmission |
| 561–730 | Asset export plan | flattenSceneNodes, isSvgFriendlyNode, isPrimitiveVectorNode, buildAssetExportPlan |
| 731–1075 | Reusable components | buildReusableComponentPlan, buildStructureSignature, buildTextPropCandidates |
| 1076–1160 | Refinement hints | buildManualRefinementHints — produces the manualRefinementHints array in design context |
| 1161–1530 | Bridge handler | handleBridgeRequest, postBridgeResponse, postBridgeEvent, resolveBridgeTargetNodes |
| 1525–1830 | Interfaces & registry | All TypeScript interfaces; imageRegistry, generationDiagnostics, imageNames globals; uniqueImageName, recordGenerationDiagnostic |
| 1831–2190 | Asset utilities | exportNodeAsImage, exportNodeAsPng, exportNodeAsSvg, extractImageFillBytes, hasDropShadow, collectRasterizationReasons, isRasterHeavyNode, shouldRasterizeAbsoluteFrame |
| 2187–2291 | Generation orchestration | buildDesignContext, generateSwiftUIResult, generateSwiftUI |
| 2292–3260 | Interactive scaffolds (task/calendar) | maybeGenerateInteractiveScaffold, detectInteractiveAddTaskScreen, buildInteractiveAddTaskScreenCode, detectInteractiveCalendarScreen, buildInteractiveCalendarScreenCode |
| 3260–3980 | Interactive scaffold helpers | Tab detection, heading/subtitle extraction, bottom bar inference, primary button detection |
| 3981–5250 | Calendar & timeline | detectInteractiveCalendarPickerScreen, detectInteractiveTimelineCalendarScreen and their builders |
| 5258–6158 | Generic list & form | detectInteractiveGenericListScreen, detectInteractiveGenericFormScreen and their builders |
| 6159–6400 | Core node dispatch | nodeToSwiftUI (switch on node type), exportAsImageNode, frameToSwiftUI |
| 6401–6720 | Node type renderers | groupToSwiftUI, textToSwiftUI, rectToSwiftUI, ellipseToSwiftUI, absoluteRasterChildToSwiftUI, exportMaskedNodesToSwiftUI |
| 6720–7060 | Layout utilities | groupAbsoluteChildren, mergeButtonPairs, absoluteChildOrigin, visibleBoundsWithinParent, inferContainer |
| 7060–7390 | SwiftUI modifier builders | buildFrameModifiers, strokeToSwiftUI, shadowToSwiftUI, blendModeToSwiftUI, fillToSwiftUIColor, fontWeightToSwiftUI, sanitizeName, generatePreview |
Global state (mutable across a generation run)
let generationModeOption: GenerationMode = 'editable'; // set at start of generateSwiftUIResult
let includeOverflowOption = false; // whether to ignore Figma clip regions
const imageRegistry: ImageExport[] = []; // accumulates every PNG/SVG export
const generationDiagnostics: GenerationDiagnostic[] = []; // per-node strategy log
const imageNames = new Set<string>(); // prevents duplicate asset names
let imageNameNamespace = ''; // prefix for uniqueImageName()
All of these are reset at the top of generateSwiftUIResult() before each run.
Generation call graph
write_selection_to_xcode (MCP tool)
└── handleBridgeRequest('generate-swiftui-write', …) [code.ts:1307]
└── generateSwiftUIResult(nodes, includeOverflow, generationMode) [code.ts:2228]
├── imageRegistry.length = 0 (reset global state)
├── for each node:
│ ├── maybeGenerateInteractiveScaffold(node, structName) [tries semantic patterns first]
│ │ ├── detectInteractiveWelcomeScreen → buildInteractiveWelcomeScreenCode
│ │ ├── detectInteractiveCalendarScreen → buildInteractiveCalendarScreenCode
│ │ ├── detectInteractiveGenericListScreen → buildInteractiveGenericListScreenCode
│ │ ├── detectInteractiveGenericFormScreen → buildInteractiveGenericFormScreenCode
│ │ └── (returns null if no pattern matches)
│ └── nodeToSwiftUI(node, indent) [if no scaffold]
│ ├── FRAME/COMPONENT/INSTANCE → frameToSwiftUI
│ │ ├── shouldRasterizeAbsoluteFrame? → exportAsImageNode [fidelity only]
│ │ ├── hasImageFill? → image + children ZStack
│ │ ├── hasFill + isPill + allTextChildren? → styled Button
│ │ ├── layoutMode VERTICAL → VStack { children }
│ │ ├── layoutMode HORIZONTAL → HStack { children }
│ │ └── layoutMode NONE → ZStack { absoluteRasterChildToSwiftUI | nodeToSwiftUI }
│ │ └── auto-layout child? → nodeToSwiftUI (recurse)
│ │ └── other child? → absoluteRasterChildToSwiftUI (rasterize)
│ ├── GROUP → groupToSwiftUI → exportAsImageNode (if raster-heavy)
│ ├── TEXT → textToSwiftUI
│ ├── RECTANGLE → rectToSwiftUI
│ ├── ELLIPSE → ellipseToSwiftUI
│ └── VECTOR/STAR/POLYGON/LINE/BOOLEAN_OPERATION → exportAsImageNode
└── returns SwiftUIResult { code, images: imageRegistry, diagnostics }
└── xcode-writer writes Swift file + asset catalog entries
nodeToSwiftUI decision tree
nodeToSwiftUI(node)
│
├── visible === false → return ''
│
├── FRAME / COMPONENT / INSTANCE → frameToSwiftUI
│ │
│ ├── [fidelity mode only] shouldRasterizeAbsoluteFrame?
│ │ → exportAsImageNode (whole frame becomes one PNG)
│ │
│ ├── hasImageFill?
│ │ → export image fill + render children on top
│ │
│ ├── hasFill AND cornerRadius ≥ 40% AND all children are TEXT?
│ │ → styled Text button (pill shape)
│ │
│ ├── layoutMode VERTICAL → VStack(alignment:, spacing:) { each child via nodeToSwiftUI }
│ ├── layoutMode HORIZONTAL → HStack(alignment:, spacing:) { each child via nodeToSwiftUI }
│ │
│ └── layoutMode NONE (absolute) → ZStack(alignment: .topLeading)
│ groupAbsoluteChildren() groups children into:
│ 'button' pair → buttonFromPair() (shape + text → styled Button)
│ 'mask' group → exportMaskedNodesToSwiftUI()
│ single node →
│ TEXT → nodeToSwiftUI
│ auto-layout FRAME → nodeToSwiftUI ← recurse (preserves native SwiftUI)
│ everything else → absoluteRasterChildToSwiftUI (export as PNG)
│
├── GROUP → groupToSwiftUI
│ └── raster-heavy or has blend children → exportAsImageNode
│ otherwise → ZStack of children via nodeToSwiftUI
│
├── TEXT → textToSwiftUI → Text("…").font(…).foregroundStyle(…)…
├── RECTANGLE → rectToSwiftUI → Color / Image / RoundedRectangle
├── ELLIPSE → ellipseToSwiftUI → Circle / Ellipse / Image
└── VECTOR / STAR / POLYGON / LINE / BOOLEAN_OPERATION → exportAsImageNode (always rasterized)
Rasterization decision chain
A node is rasterized (exported as PNG) when any of the following is true:
- Its type is VECTOR, STAR, POLYGON, LINE, or BOOLEAN_OPERATION — always rasterized.
- It is a GROUP with raster-heavy children or blend-mode children.
- It has rotation != 0 (SwiftUI .rotationEffect can approximate, but rasterizing is exact).
- Its blend mode is not NORMAL or PASS_THROUGH.
- It has a drop shadow (hasDropShadow returns true).
- It has an image fill and is not a simple RECTANGLE or FRAME.
- [Fidelity mode only] shouldRasterizeAbsoluteFrame returns true: frame has 6+ complex children, OR 2+ image fills under 1+ text child, OR is a complex absolute ZStack.
collectRasterizationReasons(node) returns an array of string reason codes (e.g. 'rotation', 'blend-mode', 'image-fill', 'mask-descendants') that appear in GenerationDiagnostic.reasons.
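A simplified version of this check can be written as a pure function over a flattened node shape. The reason codes below follow the ones named in this section, but the node fields (`hasDropShadow`, `hasImageFill`) and the exact thresholds are stand-ins for the real collectRasterizationReasons() in code.ts.

```javascript
// Return the reason codes that force PNG export for a node; an empty array
// means the node can stay native SwiftUI. Simplified stand-in for
// collectRasterizationReasons() / isRasterHeavyNode().
const ALWAYS_RASTER = new Set(['VECTOR', 'STAR', 'POLYGON', 'LINE', 'BOOLEAN_OPERATION']);

function rasterizationReasons(node) {
  const reasons = [];
  if (ALWAYS_RASTER.has(node.type)) reasons.push('vector-type');
  if (node.rotation) reasons.push('rotation');
  if (node.blendMode && !['NORMAL', 'PASS_THROUGH'].includes(node.blendMode)) {
    reasons.push('blend-mode');
  }
  if (node.hasDropShadow) reasons.push('drop-shadow');
  if (node.hasImageFill && !['RECTANGLE', 'FRAME'].includes(node.type)) {
    reasons.push('image-fill');
  }
  return reasons;
}
```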
Asset export plan classification
buildAssetExportPlan iterates all descendant nodes and classifies each:
for each node (skip if hidden, skip if already seen):
isSvgFriendlyNode(node)?
→ kind: 'icon' (≤160×160) or 'vector' (larger), format: 'svg'
hasImageFill(node)?
→ kind: 'image', format: 'png'
isRasterHeavyNode(node) AND size > 0?
→ kind: 'raster', format: 'png'
Sort: SVG candidates first, then alphabetically. Cap at 50 candidates.
isSvgFriendlyNode requirements:
- No image fills
- No drop shadows
- No unsupported effects
- Either: is a primitive vector node (VECTOR, BOOLEAN_OPERATION, STAR, POLYGON, LINE, ELLIPSE, RECTANGLE)
- Or: has ≤24 descendants, all of which are primitive vectors or groups, and none are TEXT
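The requirements above translate into a compact predicate over a simplified node shape. The effect flags (`hasImageFill`, `hasDropShadow`, `hasUnsupportedEffects`) are assumed fields for illustration; the real isSvgFriendlyNode() inspects Figma paint and effect arrays directly.

```javascript
// Approximate isSvgFriendlyNode(): primitive vectors qualify outright;
// containers qualify when their subtree is small, vector-only, and text-free.
const PRIMITIVE_VECTORS = new Set([
  'VECTOR', 'BOOLEAN_OPERATION', 'STAR', 'POLYGON', 'LINE', 'ELLIPSE', 'RECTANGLE',
]);

function descendants(node) {
  const kids = node.children ?? [];
  return kids.concat(kids.flatMap((k) => descendants(k)));
}

function isSvgFriendly(node) {
  if (node.hasImageFill || node.hasDropShadow || node.hasUnsupportedEffects) return false;
  if (PRIMITIVE_VECTORS.has(node.type)) return true;
  const subtree = descendants(node);
  return (
    subtree.length <= 24 &&
    subtree.every((n) => PRIMITIVE_VECTORS.has(n.type) || n.type === 'GROUP') &&
    !subtree.some((n) => n.type === 'TEXT')
  );
}
```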
Interactive scaffold detection
Before node-by-node generation runs, maybeGenerateInteractiveScaffold tests the top-level node against a series of semantic detectors:
| Detector | Pattern it recognises | Scaffold it generates |
|---|---|---|
| detectInteractiveWelcomeScreen | Large hero image + 1–3 pill buttons at bottom | @State var selectedAction + Button closures |
| detectInteractiveAddTaskScreen | Input fields + tag list + subtask list | Full task-creation form with @State |
| detectInteractiveCalendarScreen | Calendar grid + event rows | @State var selectedDate + event list |
| detectInteractiveCalendarPickerScreen | Month grid with day cells | Date picker with month navigation |
| detectInteractiveTimelineCalendarScreen | Day columns + time slots | Timeline scroll view |
| detectInteractiveGenericListScreen | Repeated row structures | List with ForEach and row view |
| detectInteractiveGenericFormScreen | Label + value field pairs | Form with @State per field |
If none match, the node is rendered node-by-node via nodeToSwiftUI.
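The detector chain is a first-match-wins loop: each detector returns a blueprint or null, and the matching detector's builder produces the Swift string. The sketch below captures that control flow with a hypothetical toy detector keyed on a `pattern` field; real detectors inspect node structure, not a flag.

```javascript
// First-match-wins scaffold chain, mirroring maybeGenerateInteractiveScaffold.
function makeScaffoldChain(detectors) {
  return function maybeGenerateScaffold(node, structName) {
    for (const { detect, build } of detectors) {
      const blueprint = detect(node);
      if (blueprint) return build(structName, node, blueprint);
    }
    return null; // caller falls back to node-by-node nodeToSwiftUI
  };
}

// Hypothetical usage with a single toy "welcome screen" detector:
const chain = makeScaffoldChain([
  {
    detect: (node) => (node.pattern === 'welcome' ? { buttons: node.buttons } : null),
    build: (structName, _node, bp) =>
      `struct ${structName}: View { /* ${bp.buttons} wired buttons */ }`,
  },
]);
```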
Key data types
// What generateSwiftUIResult returns
interface SwiftUIResult {
code: string; // complete Swift source (may be multiple structs joined by separator)
images: ImageExport[]; // all PNG/SVG assets collected during generation
diagnostics: GenerationDiagnostic[]; // per-node strategy log
selection: { ids: string[]; names: string[] };
}
// One exported asset (PNG or SVG)
interface ImageExport {
name: string; // asset catalog name, used in Image("name") in SwiftUI
base64: string; // PNG bytes as base64 string (SVGs are stored as text in this field)
width: number;
height: number;
}
// Per-node record of what strategy was used and why
interface GenerationDiagnostic {
strategy: string; // e.g. 'absolute-raster-child', 'interactive-welcome-screen'
nodeIds: string[];
nodeNames: string[];
nodeTypes: string[];
reasons: string[]; // e.g. ['mask-descendants', 'absolute-layout-export']
assetName?: string; // set when node was exported as an asset
component?: Record<string, unknown> | null;
}
// One candidate in the asset export plan
interface AssetExportPlanCandidate {
nodeId: string;
nodeName: string;
nodeType: string;
kind: 'icon' | 'vector' | 'image' | 'raster';
suggestedAssetName: string;
suggestedFormat: 'svg' | 'png';
dimensions: { width: number | null; height: number | null };
reasons: string[];
blendMode: string | null; // Figma blend mode string
blendModeSwiftUI: string | null; // ready-to-paste SwiftUI value e.g. '.multiply'
}
// Identified reusable component candidate
interface ReusableComponentCandidate {
source: 'figma-component' | 'repeated-structure';
suggestedComponentName: string;
occurrenceCount: number;
signature: string; // structural hash used to detect repetition
nodeIds: string[];
nodeNames: string[];
parentName: string | null;
figmaComponent: Record<string, unknown> | null;
propCandidates: ReusableComponentPropCandidate[];
}
Bridge protocol
The plugin communicates with the bridge server via WebSocket. All messages are JSON.
Plugin → bridge (outbound events)
// Selection changed
{ "type": "bridge-event", "event": "selection-changed", "data": { "count": 1, "names": ["Welcome"] } }
// Response to a bridge request
{ "type": "bridge-response", "requestId": "uuid", "action": "get-design-context", "ok": true, "payload": { … } }
// Error response
{ "type": "bridge-response", "requestId": "uuid", "action": "generate-swiftui", "ok": false, "error": "No selection" }
Bridge → plugin (inbound requests)
{
"type": "bridge-request",
"requestId": "uuid",
"action": "generate-swiftui-write", // action name maps to handleBridgeRequest() switch cases
"params": {
"generationMode": "editable",
"includeOverflow": false,
"nodeIds": [] // optional: override current selection
},
"protocolVersion": 1,
"timestamp": 1712345678000
}
Action strings (used in handleBridgeRequest)
| Action | What it does |
|---|---|
| get-design-context | Calls buildDesignContext(), returns full node tree + hints |
| generate-swiftui-write | Calls generateSwiftUIResult(), returns SwiftUIResult |
| get-selection-snapshot | Calls buildSelectionSnapshot() |
| get-page-snapshot | Calls buildPageSnapshot() |
| get-node-snapshot | Calls serializeNodeForBridge() on a specific node |
| get-metadata | Calls buildMetadataSnapshot() |
| dump-tree | Calls buildNodeTreeDump() |
| find-nodes | Calls findBridgeNodes() |
| get-document-summary | Calls buildDocumentSummary() |
| get-viewport-context | Calls buildViewportContext() |
| get-asset-export-plan | Calls buildAssetExportPlan() |
| get-screenshot | Calls buildNodeScreenshot() |
| export-svg | Calls buildSvgExport() |
| get-swiftui-generation-prompt | Returns _generationPrompt from design context |
| extract-reusable-components | Calls buildReusableComponentPlan() |
HTTP endpoints (bridge-server.cjs)
| Method | Path | Description |
|---|---|---|
| GET | /ping | Health check |
| GET | /bridge/info | Connected agents, plugin status, protocol version |
| GET | /design-context | Live design context JSON (same as MCP get_design_context) |
| POST | /set-project | { path } — sets Xcode project path |
| POST | /write | { code, structName, images[], … } — writes files to Xcode (50 MB limit) |
| POST | /shutdown | Graceful bridge shutdown |
Environment variables
Bridge server
| Variable | Default | Description |
|---|---|---|
| FIGMA_SWIFTUI_BRIDGE_PORT | 7765 | HTTP/WebSocket port |
| FIGMA_SWIFTUI_BRIDGE_HOST | localhost | Listen hostname |
MCP server
| Variable | Default | Description |
|---|---|---|
| FIGMA_SWIFTUI_BRIDGE_HTTP_URL | http://localhost:7765 | Bridge HTTP base URL |
| FIGMA_SWIFTUI_BRIDGE_WS_URL | ws://localhost:7765/bridge | Bridge WebSocket URL |
| FIGMA_SWIFTUI_BRIDGE_TIMEOUT_MS | 30000 | Bridge request timeout |
| FIGMA_SWIFTUI_RESPONSE_SIZE_CAP | 102400 | Max response payload (bytes) |
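Resolving these variables against their documented defaults can be sketched as a small config function; the return shape is ours, but the variable names and default values come from the table above.

```javascript
// Resolve bridge endpoints from the environment, falling back to the
// documented defaults.
function resolveBridgeConfig(env = {}) {
  return {
    httpUrl: env.FIGMA_SWIFTUI_BRIDGE_HTTP_URL ?? 'http://localhost:7765',
    wsUrl: env.FIGMA_SWIFTUI_BRIDGE_WS_URL ?? 'ws://localhost:7765/bridge',
    timeoutMs: Number(env.FIGMA_SWIFTUI_BRIDGE_TIMEOUT_MS ?? 30000),
    responseSizeCap: Number(env.FIGMA_SWIFTUI_RESPONSE_SIZE_CAP ?? 102400),
  };
}
```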
BYOK generation
| Variable | Description |
|---|---|
| ANTHROPIC_API_KEY | Claude API key (Tier 2, recommended) |
| ANTHROPIC_MODEL | Claude model (default: claude-sonnet-4-6) |
| OPENAI_API_KEY | OpenAI key or "ollama" for local |
| OPENAI_MODEL | Model name (default: gpt-4o) |
| OPENAI_BASE_URL | Custom base URL (Ollama, LM Studio) |
| HF_API_TOKEN | Hugging Face inference token |
| HF_MODEL | HF model (default: Qwen/Qwen2.5-Coder-32B-Instruct) |
Xcode
| Variable | Description |
|---|---|
| FIGMA_SWIFTUI_PROJECT_PATH | Explicit Xcode source folder path |
CreateLex auth
| Variable | Description |
|---|---|
| FIGMA_SWIFTUI_ACCESS_TOKEN | Explicit CreateLex JWT |
| FIGMA_SWIFTUI_BYPASS_AUTH | "true" to skip subscription check (testing) |
| CREATELEX_CONFIG_DIR | Auth config directory (default: ~/.createlex) |
| CREATELEX_AUTH_FILE | Auth file path (default: ~/.createlex/auth.json) |
| CREATELEX_API_BASE_URL | CreateLex API base (default: https://api.createlex.com/api) |
Key source files
| File | Lines | Role |
|---|---|---|
| code.ts | ~7400 | Figma plugin: all node serialization, SwiftUI generation, asset export |
| companion/mcp-server.mjs | ~1000 | 23 MCP tools, BYOK orchestration, auth |
| companion/bridge-server.cjs | ~400 | HTTP + WebSocket bridge on :7765 |
| companion/xcode-writer.cjs | ~400 | Writes Swift files + asset catalog to disk |
| companion/local-llm-generator.cjs | ~300 | Anthropic / OpenAI / HuggingFace API calls |
| companion/createlex-auth.cjs | ~200 | JWT validation and refresh |
| bin/figgen.js | — | CLI entry point (bridge, setup, mcp, start, login) |
| ui.html | — | Figma plugin panel UI |
| manifest.json | — | Figma plugin manifest (ID, permissions, entry) |
| companion-app/ | — | Optional native macOS app for project path selection |
Config persistence path: ~/Library/Application Support/FigmaSwiftUICompanion/config.json (macOS).
How to extend the codebase
Add a new MCP tool
1. Open companion/mcp-server.mjs.
2. Register a new tool with server.tool(name, description, zodSchema, handler).
3. In the handler, call bridgeRequest(ws, action, params) with a new action string.
4. Open code.ts, add a case 'your-new-action': block in handleBridgeRequest() (around line 1307).
5. Implement the logic, call postBridgeResponse(requestId, action, true, { data: result }).
6. Run npm run build and reload the plugin.
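Steps 2–3 can be sketched with a stand-in `server` object, since the real registration goes through the MCP SDK inside companion/mcp-server.mjs. The tool name `dump_styles`, the action string `dump-styles`, and the stubbed bridgeRequest are all hypothetical examples, not existing tools.

```javascript
// Stand-in for the MCP SDK's server.tool(name, description, schema, handler).
function makeToolRegistry() {
  const tools = new Map();
  return {
    tool(name, description, schema, handler) {
      tools.set(name, { description, schema, handler });
    },
    async call(name, params) {
      return tools.get(name).handler(params);
    },
  };
}

const server = makeToolRegistry();
// Stub: the real bridgeRequest forwards over the bridge WebSocket.
const bridgeRequest = async (action, params) => ({ action, params, ok: true });

// Steps 2–3: register a (hypothetical) tool that forwards to a new action.
server.tool('dump_styles', 'Dump style info for selection', /* zodSchema */ null,
  (params) => bridgeRequest('dump-styles', params));
```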
Add a new interactive scaffold detector
1. Define a new interface InteractiveXxxBlueprint near line 1596 in code.ts.
2. Write detectInteractiveXxxScreen(node) returning the blueprint or null.
3. Write buildInteractiveXxxScreenCode(structName, node, blueprint) returning the Swift string.
4. Add a call to both in maybeGenerateInteractiveScaffold() (around line 2250).
5. Build and reload.
Change rasterization behavior
- To prevent a node type from rasterizing: modify isRasterHeavyNode() (~line 2104) or collectRasterizationReasons() (~line 2067).
- To prevent absolute-position children from rasterizing: modify the isAutoLayoutChild check in frameToSwiftUI() (~line 6368).
- To add SVG export support for new node patterns: modify isSvgFriendlyNode() (~line 629).
Add a new SwiftUI modifier
All modifier builder functions live in the 7060–7390 range of code.ts:
- buildFrameModifiers — padding, background, corner radius, border
- strokeToSwiftUI, shadowToSwiftUI, blendModeToSwiftUI, opacityToSwiftUI, rotationToSwiftUI
Support
- Product: https://createlex.com/figma-swiftui
- Help & billing: https://createlex.com/contact