Automated Design Token Pipeline

An automated pipeline that transforms Figma variables into production-ready design tokens for Web, iOS, Android, and Flutter. It addresses the "structural gap" where design intent often gets lost or misinterpreted during manual handoff to engineering.
[Image: a code editor displaying a JSON color-configuration file beside a mobile app preview showing a button and a head-outline graphic.]
Role: Design System
Updated: Jan 7, 2026
Client: Personal

From Figma Variables to Production Code

This side project explores a practical problem many product and design-system designers face: Figma variable data stops at design, while products ship in code. Between those two points, inconsistencies appear: units drift, naming breaks, and intent gets reinterpreted.

The project documents how I built a small, repeatable pipeline that turns Figma variables data into production-ready design tokens across Web, iOS, Android, and Flutter. The goal is not automation for its own sake, but alignment and business agility through automated theming:

  • Multi-Platform Synchronization: Transform Figma variables data into platform-specific logic instantly, ensuring “Brand Blue” is identical from the Figma canvas to the final production build.
  • Scalable Theme Management: Enable seamless switching between Light, Dark, or High-Contrast modes by automating the delivery of variable "modes" directly into the code.
  • Future-Proof Rebranding: Reduce the risk and effort of visual pivots; a single update in Figma propagates through the pipeline to update the entire product ecosystem in minutes, not months.
  • White-Label Readiness: Provide the infrastructure to support multiple brand identities from a single codebase by simply swapping the token data layer.

Design Systems as Infrastructure

What users see, and what we build, is the visible and component layer: buttons, forms, motion, layouts, and full screens. This is the interaction surface, where most design and engineering effort is spent.

Below it is the structural layer. This layer determines consistency, scalability, and maintainability. This is where design rules live: tokens (primitive, semantic, component) and raw values. These are treated as an architecture decision, not a design detail.

Without a strong connection between these layers, predictable failures appear:

  • Manual Mapping: Visual decisions are repeatedly translated into implementation by hand.
  • Visual Drift: Identical components diverge over time. Inconsistencies accumulate without detection.
  • High Change Cost: Foundational changes, such as brand color updates, require widespread manual edits.

The outcome is a polished surface built on an unstable system.

This gap between the visible surface and the structural layer is the core system failure. Solving it requires an explicit bridge that translates design intent into deterministic implementation. That bridge is a unified token pipeline.

Architectural Token Strategy

Transformers like Style Dictionary convert token JSON into CSS, Swift, and Android XML. Raw Figma output is not suitable for this step because it mixes design intent with tool-specific and ambiguous values.

A normalization layer is inserted before platform transformation. It converts Figma data into stable, semantic, platform-agnostic tokens. Only normalized tokens are then transformed into code.

This separation makes the system predictable, scalable, and free of manual fixes.

Each stage of the pipeline is designed with a clear separation of concerns to maintain a scalable Source of Truth:

  • Design Intent (Figma or other Design tool): Where visual styles and variables are authored.
  • Normalization (JSON converter): The engine that cleanses and structures data into a tool-agnostic format.
  • Transformation (Style Dictionary): The logic layer that adapts tokens for specific platform requirements.
  • Production Code: Consistent, automated outputs for CSS, Swift, XML, and Dart.

This unified pipeline bridges the gap between design and production, delivering a system that scales with accuracy and reliability. By removing tight coupling, the architecture remains maintainable as design tools and engineering frameworks evolve, keeping output consistent across brands and platforms.

The Challenge: Platform Transformation

The unified pipeline bridges design and production with scalable accuracy and reliability, but to do so it must directly address a set of core challenges.

Missing Token Hierarchy

Design intent and implementation are conflated in a flat token structure, making it difficult to carry design decisions into production without losing meaning or efficiency. The lack of a semantic layer forces components to reference primitive values directly, tightly coupling UI decisions to raw tokens.

As the system scales across platforms, themes, and brands, this coupling leads to manual overrides, reduced readability for designers, and recurring regressions in production.

Normalization: The Missing Step

Figma exports valid JSON, but not usable tokens. The objective is to map Figma variable scopes and types to Style Dictionary token types. For example, a Figma LINE_HEIGHT variable with a Number type must resolve to a lineHeight token in Style Dictionary.
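As a sketch of that mapping, the helper below routes Figma's resolved variable types and scopes to token $type values. The scope names mirror Figma's variables API, but the table is trimmed for illustration and the function name is hypothetical:

```typescript
// Illustrative mapping from Figma variable metadata to design-token $type
// values. Figma numeric variables ("FLOAT") are ambiguous on their own;
// the variable's scope disambiguates the design intent.
type FigmaResolvedType = "COLOR" | "FLOAT" | "STRING" | "BOOLEAN";

function mapToTokenType(resolvedType: FigmaResolvedType, scope: string): string {
  if (resolvedType === "COLOR") return "color";
  if (resolvedType === "FLOAT") {
    switch (scope) {
      case "LINE_HEIGHT": return "lineHeight";
      case "FONT_SIZE": return "fontSize";
      case "CORNER_RADIUS": return "borderRadius";
      case "GAP":
      case "WIDTH_HEIGHT": return "dimension";
      default: return "number"; // fall back to a plain number token
    }
  }
  return resolvedType === "BOOLEAN" ? "boolean" : "string";
}
```

Keeping this table in the normalizer, rather than inside Style Dictionary, is what preserves the separation of concerns described above.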

Implementing this mapping inside the transformation tool creates tight coupling and fragile logic. Each design tool defines types differently, while Style Dictionary is responsible only for platform-specific output generation, independent of any source tool such as Figma. Forcing input conversion into this layer increases complexity and degrades long-term maintainability.

lineHeight is the most complex and error-prone source of mismatch:

  1. Ambiguous values such as number type 32, 150, or 1.5 (is it px, %, or a unit-less multiplier?)
  2. Platform-specific unit systems (px, rem, em, dp, sp, pt) that require explicit conversion
  3. Different lineHeight models across platforms—CSS, iOS, Android, and Flutter interpret the same value differently

Embedding this logic directly into Style Dictionary would result in brittle, overloaded build scripts.
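To illustrate why this belongs in a dedicated normalization step, here is one possible disambiguation heuristic as a standalone sketch. The threshold and names are assumptions; a real pipeline should rely on explicit unit metadata rather than guessing:

```typescript
// One possible heuristic for classifying raw numeric line-height values.
// The <= 4 threshold is an assumption for illustration only.
type LineHeight =
  | { kind: "px"; value: number }          // fixed, e.g. 32 -> 32px
  | { kind: "multiplier"; value: number }; // unit-less, e.g. 1.5 or 150%

function classifyLineHeight(raw: number, unit?: "PIXELS" | "PERCENT"): LineHeight {
  if (unit === "PERCENT") return { kind: "multiplier", value: raw / 100 };
  if (unit === "PIXELS") return { kind: "px", value: raw };
  // No unit metadata: small values are almost certainly multipliers.
  return raw <= 4
    ? { kind: "multiplier", value: raw }
    : { kind: "px", value: raw };
}
```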

Manual Token Merging Doesn't Scale

Supporting multiple brands and themes introduces combinatorial complexity. Manual workflows break down quickly.

  1. A single update requires changes across multiple files
  2. Brand values drift over time
  3. Maintenance cost grows non-linearly as combinations increase

As a result, consistency becomes difficult to sustain.

Solution: Automated Token Pipeline

The system is built on a strict, layered architecture where higher, more specific layers override lower, more abstract ones. This ensures predictability and control.

Token Architecture

The system uses a four-layer token hierarchy that controls complexity through clear responsibility boundaries.

Layered Model

  • Primitive tokens (Foundation, acting as Global): Raw, context-free values such as hex colors or numeric units that define the available value set, e.g. deep-blue: #040273, spacing-1: 16px, radii-1: 8px.
  • Alias semantic tokens (Context, acting as Base): Mappings that adapt primitives to brands, modes (light/dark), shape, and density variants, e.g. brand-500, brand-dark-500, spacing-md, radii-sm.
  • Semantic tokens (Intent, acting as Theme): Human-readable tokens that describe usage, such as background-primary. Designers work primarily at this layer.
  • Component tokens (Scope, acting as a specific theme): Component-specific bindings, limited to individual UI elements, e.g. button-primary-background, button-padding, button-corner.

Design operates at the semantic layer to preserve intent. Build tooling resolves those semantics down to primitives at output time. This separation keeps design decisions expressive while ensuring the final code remains minimal, deterministic, and platform-safe.
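That resolution chain can be sketched as a flat lookup table whose alias references are walked down to a primitive value. The entries reuse names from the layers above; the resolve function is an illustrative simplification of what the build tooling does:

```typescript
// A minimal sketch of the four-layer hierarchy as name/value pairs.
// A value wrapped in {braces} is an alias reference to another token.
const tokens: Record<string, string> = {
  "deep-blue": "#040273",                         // Primitive (Global)
  "brand-500": "{deep-blue}",                     // Alias (Base)
  "background-primary": "{brand-500}",            // Semantic (Theme)
  "button-primary-background": "{background-primary}", // Component (Scoped)
};

// Follow alias references until a raw primitive value is reached.
function resolve(name: string, table: Record<string, string>): string | undefined {
  let value = table[name];
  while (value?.startsWith("{")) {
    value = table[value.slice(1, -1)];
  }
  return value;
}
```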

Token Normalization

I built a small, independent tool to process JSON data before it reaches Style Dictionary.

  • Converts numeric values into semantic token types
  • Applies unit logic explicitly
  • Aligns output with the W3C Design Token structure

For example:

The converter normalizes token values into explicit units before transformation. For example, a numeric LINE_HEIGHT value such as 32 from Figma is converted into 32px when a fixed line-height is intended, making the value unambiguous for code generation.

After normalization, Style Dictionary operates deterministically rather than defensively, using px as a stable base unit to transform values into platform-specific units such as rem, em, dp, sp, and pt.
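The converter's output shape might look like the following sketch, which follows the W3C Design Tokens draft's $value/$type property names; the function name is hypothetical:

```typescript
// Sketch of the normalizer's output: a raw Figma number becomes a token
// with an explicit unit and $type, unambiguous for code generation.
interface NormalizedToken {
  $type: "lineHeight";
  $value: string; // explicit unit, e.g. "32px"
}

function normalizeFixedLineHeight(raw: number): NormalizedToken {
  return { $type: "lineHeight", $value: `${raw}px` };
}
```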

This division keeps the converter's intent, scope, and responsibility explicit.

Automated Token Merging

Instead of manually patching tokens, Style Dictionary is extended with custom logic that consumes the converted JSON. All cleaning and formatting happens at build time, not by hand.

Transform Groups

Platform-specific transform bundles (custom/css, custom/android, custom/ios) enforce correct units and formats during output generation.

For example:

A line-height value of 32px is transformed into 2rem for CSS, 32sp or 32dp for Android (sp for text size, dp for size dimensions), and CGFloat(32.00) on iOS, where the value is unitless and interpreted by the system as points (pt) based on device scale.

This preserves visual consistency while respecting platform-specific conventions.
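Those per-platform conversions can be sketched as small pure functions. The 16px root font size for rem is an assumption; a real transform group would read it from configuration:

```typescript
// Hedged sketch of the per-platform value transforms described above.
const ROOT_FONT_SIZE = 16; // assumed CSS root font size in px

function toCss(px: number): string {
  return `${px / ROOT_FONT_SIZE}rem`; // 32px -> "2rem"
}

function toAndroid(px: number, isText: boolean): string {
  // sp scales with the user's font settings; dp is for fixed dimensions.
  return isText ? `${px}sp` : `${px}dp`;
}

function toIos(px: number): string {
  // iOS values are unit-less points, interpreted by device scale.
  return `CGFloat(${px.toFixed(2)})`;
}
```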

Custom Transforms

NOTE: Figma does not support a percentage unit for line height.

Normalized values are converted deterministically based on semantic meaning, not raw units.

For example:

A line-height defined as 150% or as a unit-less 1.5 is normalized to 1.5 across platforms, allowing each platform to apply its native line-height calculation model.

As a result, the system remains extensible and is not constrained by Figma’s value definitions.

Automated Merge & Build logic

A layered inheritance architecture isolates concerns to control complexity:

  • Atomic separation: Brand, Mode, Shape, and Density live in independent files.
  • Automated discovery: Build scripts detect token layers automatically; no manual registration.
  • Deterministic merging: Overrides resolve in a fixed order: Brand → Mode (Light/Dark) → Shape → Density
  • Scalable by default: Adding one file (e.g., a new brand) inherits all existing logic.
  • Propagated updates: Changes in base layers flow to all themes, maintaining system-wide consistency.
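The fixed override order can be sketched as a left-to-right merge where later layers win. The layer contents below are hypothetical examples, not the project's actual token files:

```typescript
// Deterministic merge: arguments are ordered Brand -> Mode -> Shape -> Density,
// and object spread resolves conflicts left-to-right, encoding the override rule.
type TokenLayer = Record<string, string>;

function mergeLayers(...layers: TokenLayer[]): TokenLayer {
  return layers.reduce<TokenLayer>((acc, layer) => ({ ...acc, ...layer }), {});
}

// Hypothetical layer files, flattened to name/value pairs:
const brand = { "brand-500": "#040273", "radius-md": "8px" };
const darkMode = { "brand-500": "#6A8DFF" };  // dark mode overrides the brand color
const shapeRound = { "radius-md": "16px" };   // round shape overrides the radius

const theme = mergeLayers(brand, darkMode, shapeRound);
```

Adding a new brand means adding one more layer file to the front of this chain; everything downstream inherits unchanged.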

Output targets

The pipeline generates tokens in a single pass per platform.

Metadata

Each output includes:

  • Version: identifies the exact token release used to generate the output
  • Timestamp: records when the file was generated to support reproducibility
  • Source reference: links the output back to the original token source and pipeline stage
  • Description: explains the intent and usage context of the generated values

This metadata makes changes explicit, simplifies debugging, and supports long-term maintenance across teams and platforms.
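A minimal sketch of such a banner generator, with placeholder field values; the interface and function names are illustrative:

```typescript
// Build metadata prepended to each generated file, mirroring the list above.
interface BuildMetadata {
  version: string;     // exact token release used for this output
  timestamp: string;   // generation time, for reproducibility
  source: string;      // original token source / pipeline stage
  description: string; // intent and usage context
}

function fileHeader(meta: BuildMetadata): string {
  return [
    `/* Version: ${meta.version}`,
    ` * Generated: ${meta.timestamp}`,
    ` * Source: ${meta.source}`,
    ` * ${meta.description} */`,
  ].join("\n");
}
```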

The complete token pipeline

Generate: Build or change a value in the tokens folder, run the build, and updates flow across the system.

Update (add or remove tokens): The script syncs newly added values automatically and removes unused ones.

Add a brand theme and versioning: Drop in a new token file, run the build, and the system generates all combinations.

Developers receiving the generated CSS stylesheet can easily select the desired theme variant; updating styles is as simple as a text edit. The structure follows:

Theme = Brand Name + Mode (Light/Dark) + Shape (Base/Round/Rounded) + Density (Comfortable/Compact).
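That naming contract can be sketched as a simple composition; the brand and variant names below are hypothetical:

```typescript
// Compose a theme identifier from the four variant axes described above.
function themeName(
  brand: string,
  mode: "light" | "dark",
  shape: "base" | "round" | "rounded",
  density: "comfortable" | "compact",
): string {
  return [brand, mode, shape, density].join("-");
}
```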

However, the system follows Style Dictionary’s required token schema. Token files must use agreed-upon structures, naming, and value types. This reduces flexibility when authoring tokens, but creates a clear contract between design and code. The tradeoff is intentional: less freedom at the input level in exchange for predictable outputs, easier maintenance, and consistent behavior across platforms.

Key takeaway

  1. Raw exports are not a system: JSON alone does not equal structure. Without normalization, tokens are just numbers.
  2. Typography exposes every weakness: Font size, line height, and spacing reveal how differently platforms behave. Ignoring this guarantees visual drift.
  3. Designers benefit from understanding constraints: This project was not about becoming an engineer. It was about understanding why designs break in code.
  4. AI accelerates learning, not judgment: AI helped automate scripts and validate structures. Design decisions still required intent and discipline.

Why This Matters

This pipeline clarifies the limitations of traditional handoff and shows how systems reduce ambiguity. It reframes design systems as infrastructure rather than documentation.

For a design-system designer, it shows how separating intent, normalization, and output creates scalability without fragility.

Closing

This side project is not about speed or replacing people with automation. It is about precision. When design intent is expressed as structured data, teams stop negotiating basics and start solving real product problems.

Design systems don’t succeed because of tools. They succeed because decisions survive the journey from design to code.
