My Role

Product Design

Full-Stack Dev

Timeline

05.25–Present

Tools

Figma, React, TypeScript, HTML, OpenRouter

Project Type

Figma Widget
FigBud —
An on‑canvas teacher that turns “how do I do this” into guided actions.

Designed, built, and deployed an AI-powered Figma widget that teaches designers how to create clean, reusable components directly on the canvas.

3 Months

From concept to live, working Version 1.

Team of 2

Designed, engineered, and integrated the entire product end-to-end.

75%

AI Efficiency: Optimized tool calls so three-quarters of AI responses required no manual retry or cleanup.

User Research

Designers’ Biggest Pain Points

We combined market analysis, netnography across Reddit, Twitter, and the Figma Community forums, and four in-depth user interviews with beginner, intermediate, and advanced designers. Across all sources, the same themes emerged:

How This Shaped FigBud

Our findings became the blueprint for FigBud’s feature set.

Timestamped Tutorial Search

Pain Point: Designers waste 30–90 minutes searching YouTube or blogs for one answer.

Feature Response: FigBud uses AI to pull precisely timestamped tutorials from YouTube and the Figma Community, surfacing the exact 30–60 seconds needed to solve a task without leaving the canvas.

“If I could type ‘auto-layout’ and get the video at the right spot, I’d use it every time I’m stuck.”
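To make the mechanic concrete: once the AI has matched a query to a video and a start offset, the deep link itself is trivial, because YouTube natively honors the t= query parameter. A minimal sketch, assuming an illustrative TutorialHit shape (not our production schema):

// timestampedLink.ts (illustrative sketch)
type TutorialHit = {
  videoId: string;      // YouTube video ID returned by our search
  startSeconds: number; // where the relevant 30–60s segment begins
  title: string;
};

// YouTube supports jumping straight to a timestamp via the `t` query parameter.
function toTimestampedUrl(hit: TutorialHit): string {
  return `https://www.youtube.com/watch?v=${hit.videoId}&t=${Math.floor(hit.startSeconds)}s`;
}

// toTimestampedUrl({ videoId: 'abc123', startSeconds: 95, title: 'Auto Layout basics' })
// -> https://www.youtube.com/watch?v=abc123&t=95s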

Sandbox Mode — Learn by Doing

Pain Point: Users want to practice new skills without fear of breaking their designs.

Feature Response: A safe “sandbox” inside Figma where you can try building components, flows, or interactions while FigBud teaches in real time, showing and guiding every step.

“I’d feel more confident if it showed me exactly where to click or what to do next.”
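In the Figma Plugin API, spinning up that protected frame takes only a handful of calls. A simplified sketch of the idea (naming and sizing here are illustrative, not the exact /sandbox implementation):

// sandbox.ts (simplified sketch)
function createSandboxFrame(): FrameNode {
  const frame = figma.createFrame();
  frame.name = '🧪 FigBud Sandbox'; // clearly labeled so it is easy to find and delete
  frame.resize(800, 600);
  // Offset from the current viewport so practice edits never overlap real work.
  frame.x = figma.viewport.center.x + 1000;
  frame.y = figma.viewport.center.y;
  figma.currentPage.appendChild(frame);
  figma.viewport.scrollAndZoomIntoView([frame]); // jump the user straight into it
  return frame;
}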

Always-On Context Awareness

Pain Point: Designers waste time re-orienting themselves in complex files and components.

Feature Response: FigBud always knows what you’ve selected — whether it’s a frame, component, or variant — and adapts its guidance, tutorials, and suggestions to that exact context.

“It would help if I could get advanced tips for the thing I’m working on right now.”
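Under the hood, this hangs off a single Plugin API event. A sketch of the wiring (the message shape is illustrative; buildSelectionDigest is shown in full in the build section below):

// contextListener.ts (sketch; the message type is an illustrative assumption)
figma.on('selectionchange', () => {
  // Re-digest whatever the user now has selected...
  const digest = buildSelectionDigest([...figma.currentPage.selection]);
  if (digest) {
    // ...and let the UI thread pick the tutorials, fixes, or lessons to surface.
    figma.ui.postMessage({ type: 'SELECTION_DIGEST', digest });
  }
});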

Scoping → Designing Flows → Paired Agile Dev/Design

Balancing Ambition with Feasibility

Scoping

In early sprints, mapped user needs to features, cut anything non-essential, and locked a lean backlog of 6 must-haves — only what solved real pain and could ship in <2 weeks.

Designing Flows

In each sprint, turned features into rapid, testable journeys — low-fi to hi-fi in Figma — with clear start/end points so dev could build without guesswork.

Paired Agile Dev/Design

Alternated weekly between design and React dev, shipping small, testable increments each sprint — keeping features moving without handoff delays.

How we built it

Architecture Overview

/widget – Figma Widget UI + Worker logic

/api – OpenRouter + Supabase API integrations

/lib – Utilities for prompt building, context digesting, and patch-plan validation

/sandbox – Protected learning environment for “Learn by Doing” mode

/context – Context-awareness engine that parses selection data into a usable “Selection Digest”

Key Architecture Decisions

Context-Driven AI — Worker listens for selection changes, builds a SelectionDigest, and surfaces only relevant tutorials, fixes, or sandbox lessons.

Performance & Resilience — Tracked cache latency, AI timeouts, and SSE chunk size; added circuit breakers for network, AI, YouTube, and DB failures.

Data Flow — Used postMessage() and HTTP/2 SSE to move data between plugin, backend, AI, and YouTube (client side sketched just below).

Caching — Three-tier system (L1 localStorage, L2 in-memory, L3 Supabase) for speed and reliability (sketched after the digest code below).
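On the client side, the data-flow decision is a short loop: the widget's iframe consumes the SSE stream and relays each chunk into plugin code over postMessage(). A sketch under stated assumptions (the endpoint path, event payload, and message type are illustrative):

// sseBridge.ts (sketch; runs in the widget's UI iframe)
const source = new EventSource('/api/chat/stream'); // hypothetical path: our backend proxies OpenRouter over SSE

source.onmessage = (event) => {
  const chunk = JSON.parse(event.data); // one streamed AI segment per event
  // postMessage() is the standard bridge from a Figma UI iframe into plugin code.
  parent.postMessage({ pluginMessage: { type: 'AI_CHUNK', chunk } }, '*');
};

source.onerror = () => {
  // Hand off to the resilience layer (the circuit breakers above) for retry vs. fallback.
  source.close();
};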

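The SelectionDigest builder below is the heart of the context-driven approach: it inspects the single selected node and distills it into a compact summary the AI can reason over, plus the lint-style issues we surface to the user.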

// selectionDigest.ts
export type SelectionDigest = {
  kind: 'FRAME' | 'COMPONENT' | 'VARIANT' | 'OTHER';
  id: string;
  name: string;
  autolayout?: { direction: 'H' | 'V'; gap: number; padding: [number, number, number, number] };
  constraints?: { x: string; y: string };
  variants?: Array<{ prop: string; value: string }>;
  issues: string[]; // lint-y hints we surface to the user
};

export function buildSelectionDigest(sel: SceneNode[]): SelectionDigest | null {
  // Digest only single selections; multi-select guidance is out of scope for v1.
  if (sel.length !== 1) return null;
  const n = sel[0];

  const digest: SelectionDigest = {
    kind: n.type === 'COMPONENT' ? 'COMPONENT'
       : (n.type === 'INSTANCE' && 'variantProperties' in n) ? 'VARIANT'
       : n.type === 'FRAME' ? 'FRAME' : 'OTHER',
    id: n.id,
    name: n.name,
    issues: []
  };

  if ('layoutMode' in n) {
    if (n.layoutMode === 'NONE') {
      // The node supports Auto Layout but isn't using it; flag it rather than digest it.
      digest.issues.push('No Auto Layout');
    } else {
      digest.autolayout = {
        direction: n.layoutMode === 'HORIZONTAL' ? 'H' : 'V',
        gap: n.itemSpacing ?? 0,
        padding: [n.paddingTop, n.paddingRight, n.paddingBottom, n.paddingLeft]
      };
    }
  }

  if ('constraints' in n) {
    digest.constraints = { x: n.constraints.horizontal, y: n.constraints.vertical };
  }

  if ('variantProperties' in n) {
    const vp = n.variantProperties ?? {};
    digest.variants = Object.entries(vp).map(([prop, value]) => ({ prop, value: String(value) }));
  }

  // Hierarchical names like "Button/Primary" keep component libraries organized.
  if (!/\w+\/\w+/.test(n.name)) digest.issues.push('Non-hierarchical name');

  return digest;
}
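The three-tier cache from the decisions above reduces to a read-through lookup with backfill. A sketch under stated assumptions: the cache table schema and injected credentials are illustrative, and the lookup runs in the widget's iframe, where localStorage is available:

// tieredCache.ts (sketch; table name and env wiring are assumptions)
import { createClient } from '@supabase/supabase-js';

declare const SUPABASE_URL: string;      // assumed to be injected at build time
declare const SUPABASE_ANON_KEY: string; // assumed to be injected at build time

const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);
const memory = new Map<string, string>(); // L2: lives for the session

export async function cachedGet(key: string): Promise<string | null> {
  // L1: localStorage, persisted per browser
  const l1 = localStorage.getItem(key);
  if (l1 !== null) return l1;

  // L2: in-memory map
  const l2 = memory.get(key);
  if (l2 !== undefined) {
    localStorage.setItem(key, l2); // backfill L1
    return l2;
  }

  // L3: Supabase, shared and durable (hypothetical table: key text primary key, value text)
  const { data, error } = await supabase.from('cache').select('value').eq('key', key).maybeSingle();
  if (error || !data) return null;

  memory.set(key, data.value);           // backfill L2
  localStorage.setItem(key, data.value); // backfill L1
  return data.value;
}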

My Contributions

Built the SelectionDigest engine for real-time, context-aware guidance.

Created Sandbox Mode with step-by-step interactive learning.

Integrated Supabase for memory, progress, and telemetry.

Designed optimized OpenRouter prompts for accuracy and efficiency.

Implemented patch-plan validation and undo-safe edits for reliability (sketched below).
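The patch-plan idea works like this: the AI proposes edits as plain data, and nothing touches the canvas until every target node and property checks out. A simplified sketch, where the PatchPlan shape is an assumption rather than the production schema:

// patchPlan.ts (simplified sketch; the op schema is illustrative)
type PatchOp = { nodeId: string; prop: 'name' | 'itemSpacing' | 'layoutMode'; value: string | number };
type PatchPlan = { ops: PatchOp[] };

export async function applyPatchPlan(plan: PatchPlan): Promise<string[]> {
  const errors: string[] = [];
  for (const op of plan.ops) {
    const node = await figma.getNodeByIdAsync(op.nodeId);
    if (!node || node.removed) { errors.push(`Node ${op.nodeId} not found`); continue; }
    if (!(op.prop in node)) { errors.push(`${node.type} has no '${op.prop}'`); continue; }
    (node as any)[op.prop] = op.value; // validated above, so no missing-property throw
  }
  // Mark an undo checkpoint so the whole applied patch reverts as a single Cmd+Z.
  figma.commitUndo();
  return errors; // surfaced to the user instead of silently dropping bad ops
}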


Current State

Where we are right now

We’ve shipped all user flows and are in the final stretch of development — moving our local build toward live deployment while optimizing feature consistency, spec accuracy, multi-user stability, and AI API cost efficiency.


What’s Built

Proof of Concept + MVP tested for core architecture.

AI tool calls live — connects to our server via OpenRouter and DeepSeek.

Working Sandbox Mode that can spin up a practice frame.

YouTube + Figma tutorial search with timestamped links fully functional.

🚧 Features In Progress

Persistent “always-on” docked mode on the right side of the screen.

Full teach-alongside experience in Sandbox Mode (currently behind a feature flag).

Art is solving problems that cannot be formulated before they have been solved. The shaping of the question is part of the answer.

Piet Hein, Architect, Poet & Mathematician

Sarvesh