Reverse-engineering a design system from a production website

I’ve written about building design systems with AI and the project template I use for structuring that work. This post is about a different starting point entirely.


A financial services client needed an application built. They had a polished public website with a clear visual identity: brand colours, consistent typography, spacing and shadows. Whether they had a design system documented internally, I don’t know. What I knew was that I needed their design language. The app had to look like it belonged to the same family.

So I went to the source. I took what was already live and worked backwards.

Step 1: extracting the visual language

This assumes the live site is internally consistent: that what you’re extracting is up to date, not leftover. In this case it was. The website had a clear, coherent visual identity across every section I checked.

Most of the extraction was automated. I pointed Claude Code at the production website and let it pull what it could see. Sometimes I directed it to specific pages on the portal where specialised patterns lived: form-heavy sections, data displays, interactive elements that the homepage didn’t use.

Colours first. The primary brand colour, another for text and headers, an accent, and a set of semantic colours for success, warning, error, and info states. Then the neutrals: the off-white page background, the slightly tinted card surfaces, the border greys. Every hex value went into a brand doc.

Typography next. Two font families: one for Latin script, one for RTL. A type scale from headings down to captions, with specific weights and line heights for each level. Then spacing: an 8px base grid, 16px card border radii, 8px button radii, 30px badge radii. Shadows: a dark shadow at 8% opacity for card elevation. Even transition timings: 150ms for fast interactions, 200ms for standard, 300ms for slow.

The whole extraction took an hour, not a week. By the end of it I had a markdown file with the major design decisions the website had already made.
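The brand doc itself was markdown, but the same extraction can be sketched as a typed token module. The hex values below are placeholders, not the client’s actual colours; the numeric values are the ones extracted above:

```typescript
// Extracted design tokens as a typed module.
// Hex values are placeholders; the real ones came from the live site.
const tokens = {
  color: {
    primary: '#1a4f8b', // placeholder brand colour
    success: '#2e7d32',
    warning: '#ed6c02',
    error:   '#d32f2f',
    info:    '#0288d1',
  },
  radius: { card: 16, button: 8, badge: 30 },          // px
  spacing: { base: 8 },                                 // 8px grid
  shadow: { card: '0 2px 8px rgba(0, 0, 0, 0.08)' },    // dark at 8% opacity
  transition: { fast: 150, standard: 200, slow: 300 },  // ms
} as const;
```

A single file like this is enough to keep every screen pulling from the same source instead of re-guessing values.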

Step 2: building the application

I used those extracted tokens to build the backoffice app. React, Tailwind, six waves.
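Wiring extracted tokens into Tailwind might look like the following sketch of a `tailwind.config.ts`. The hex value and the token names (`brand`, `card`, `badge`) are illustrative assumptions, not the client’s real values:

```typescript
// tailwind.config.ts -- a minimal sketch of mapping extracted tokens
// into Tailwind's theme. Hex value is a placeholder.
import type { Config } from 'tailwindcss';

const config: Config = {
  content: ['./src/**/*.{ts,tsx}'],
  theme: {
    extend: {
      colors: { brand: '#1a4f8b' },                     // placeholder
      borderRadius: { card: '16px', badge: '30px' },    // extracted radii
      boxShadow: { card: '0 2px 8px rgb(0 0 0 / 0.08)' },
      transitionDuration: { fast: '150ms', slow: '300ms' },
    },
  },
};

export default config;
```

Extending the default theme rather than replacing it keeps Tailwind’s full utility set available while the brand values take priority where they exist.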

The real test came when I got past the dashboard. A dashboard is friendly territory for any colour palette: cards, charts, summary numbers. But a detail page with inline comments, status badges, priority flags, and a multi-step timeline exercises completely different parts of the design system. If the extraction had missed something or got a value wrong, this is where it would show.

The first thing I noticed was that the comment thread felt off. The spacing between messages, the way the avatar sat next to the text, the border treatment on nested replies: none of that existed on the public website. I was interpolating from the patterns I’d extracted, and the interpolation was wrong on the first pass. It took a few rounds of checking the live site’s form sections and card layouts to get the right rhythm. But the core tokens held. The shadows, the border radii, the type hierarchy: all of it worked across screens that had nothing in common except the brand.

Step 3: from application to design system

After the app was built and working, I asked Claude to extract the proven patterns into a standalone design system. The first attempt was wrong in a useful way. Claude looked at the app and broke it down by page: here’s a sidebar component, here’s a dashboard component, here’s a ticket detail component. A 1:1 mirror of the application’s structure. That’s not a design system, that’s a component dump.

I had to be specific about what I actually wanted. MUI v6 as the component framework. Atomic Design as the organising principle: atoms, molecules, organisms, templates. The goal was reusable primitives that could serve any future project, not a decomposition of this particular app.

The second run was different. Six atoms: logo, button, text field, avatar, status and priority badges, toggle switch. Five molecules: form fields, search inputs, KPI cards, file upload zones, stat rows. Eight organisms: page headers, filter bars, data tables, comment threads, timelines, sidebar, topbar, mobile navigation. Two templates: the authenticated app shell and the login layout. Each one built on MUI with the same token values from the original extraction, now formalised as a proper theme.
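Formalising the tokens as an MUI theme might look like the sketch below. The hex values are placeholders, and the mapping of the 150/200/300ms timings onto MUI’s `short`/`standard`/`complex` duration keys is my assumption:

```typescript
// A sketch of the extracted tokens formalised as an MUI v6 theme.
// Hex values are placeholders for the client's real brand colours.
import { createTheme } from '@mui/material/styles';

export const theme = createTheme({
  palette: {
    primary: { main: '#1a4f8b' }, // placeholder brand colour
    success: { main: '#2e7d32' },
    warning: { main: '#ed6c02' },
    error:   { main: '#d32f2f' },
    info:    { main: '#0288d1' },
  },
  shape: { borderRadius: 8 }, // button radius; cards override to 16 below
  spacing: 8,                 // 8px base grid
  transitions: {
    duration: { short: 150, standard: 200, complex: 300 }, // ms
  },
  components: {
    MuiCard: {
      styleOverrides: {
        root: {
          borderRadius: 16,
          boxShadow: '0 2px 8px rgba(0, 0, 0, 0.08)',
        },
      },
    },
  },
});
```

Component-level `styleOverrides` are what turn raw token values into defaults every consumer gets for free, which is the practical difference between a theme and a list of hex codes.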

Then the showcase needed work too. The first version crammed every component onto a single scrollable page: the kind of thing that technically shows everything and practically shows nothing. I had it rebuilt as a multi-page showcase: one page per Atomic Design layer, plus an overview and a full token reference. A sidebar for navigation on desktop, bottom nav on mobile. The showcase used the same components it was demonstrating.

What went wrong along the way

A few things that are worth knowing if you try this.

The logo problem. I assumed that if I asked for an icon variant of the client’s logo for the collapsed sidebar, Claude would extract the symbol from the existing logo file. Wrong. It generated a brand new icon from scratch. I had to stop it and redirect: take what exists, don’t invent. This happened more than once across the project. AI defaults to generating. When you want it to extract, you have to say so explicitly every time.

Spacing is never right on the first pass. I went through multiple rounds of fixing card margins, column gaps, and form input spacing. No margins between cards. Then too much margin. Then the right margin but no gap between columns. The token values were correct: the application of those values to layouts required constant correction. Spacing is the last 20% that takes 80% of the polish time.

The brand document wasn’t enough. I assumed a thorough extraction would give me a complete reference I could build from without looking back. Wrong. I kept returning to the live website to check details: button font weights, the way form inputs looked in a specific section, border radii on secondary elements. The extracted doc was a starting point. The live site remained the actual source of truth throughout the build.

Accessibility needed its own pass. Form inputs looked fine against the page background until I checked the contrast ratios. White toggle switches disappeared on white card surfaces. Things that passed a visual check failed programmatically. WCAG 2.2 AA compliance required a dedicated wave, not a checkbox at the end.
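The “passed visually, failed programmatically” gap is easy to reproduce. A minimal contrast-ratio check, using the WCAG relative-luminance formula, shows why the white-on-white toggle fails:

```typescript
// Minimal WCAG contrast-ratio check (sRGB relative luminance).
function luminance(hex: string): number {
  const [r, g, b] = [0, 2, 4].map((i) => {
    const c = parseInt(hex.slice(i + 1, i + 3), 16) / 255;
    // Linearise the sRGB channel value.
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(fg: string, bg: string): number {
  const [lighter, darker] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// A white toggle on a white card surface: ratio 1:1, far below the
// 3:1 minimum WCAG requires for UI components.
contrastRatio('#ffffff', '#ffffff'); // 1
// Black text on white: 21:1, the maximum possible.
contrastRatio('#000000', '#ffffff'); // 21
```

Running this over every foreground/background pairing in the token set is a few lines of scripting, which is why the accessibility wave could be systematic rather than eyeballed.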

The tech stack shift was deliberate. The backoffice app used Tailwind (fast for prototyping, good for one-off projects). The design system used MUI, which is better suited to a reusable component library with standardised APIs. Same tokens, completely different implementation. The design system wasn’t a copy of the app. It was a re-implementation on a framework designed for reuse.

Why build backwards?

I’ve seen design systems built the conventional way: tokens first, then components, then shipping them to products and hoping they fit. You end up with speculative components that may or may not match what the product actually requires. Things designed “just in case” that nobody uses.

Building the app first inverts that. Every component in this design system exists because a real screen needed it. Nothing speculative survived.

The obvious limitation: this design system was proven against one application. A future project with a different shape (say, a reporting tool with heavy charting and no ticket system) will need components this one doesn’t have. But extending a tested foundation is cheaper than finding out on project two that your top-down system doesn’t fit.

What made the backwards approach practical was speed. Extracting a brand identity, building eight screens, and re-implementing everything as a component library: all of it compressed into days with Claude Code. Without that speed, you’d spend so long on the first app that formalising the design system would feel like a luxury rather than a natural next step.


If your client has a website but no design system, you already have one. It’s just not documented yet. Open the site, extract what’s there, build something with it, and formalise the patterns that survive.

About the author

Lucas

UX Lead and AI Transformation Consultant
20+ years shaping B2B SaaS and digital products. Focused on AI-powered design, scalable UX, and turning complex business needs into simple, high-impact user experiences.
Find me on LinkedIn
