We’ve been exploring where AI fits within the NYS Design System, and we’ve come up with an approach that I think has promise.
A lot of teams lean heavily on AI to generate code, but that hands a lot of decisions to the AI. Those choices may work short-term, yet they may not be the best ones for you or your codebase. And if you generate more code with a slightly different context, you may get significantly different results. That inconsistency compounds over time, making maintenance a big challenge and introducing risk.
So we flipped the model.
Our design system designers and developers have mapped the properties and attributes of our code components to our Figma components. That way, after designers do the important work of user research, information architecture, user journeys, and all the other critical sense-making, and have built out a prototype in Figma, engineers can point an LLM at that Figma file and get working code snippets from Figma’s MCP server.
In this case, the AI is doing a lot less generative work, and a lot more stitching together working code snippets into usable, accessible markup. We’re basically asking the AI to do intelligent code snippet assembly. This makes the code output way more reliable and repeatable.
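To make the idea concrete, here's a minimal sketch of what that assembly step looks like. The component names, property names, and tags below are hypothetical stand-ins, not the actual NYS Design System mapping: each Figma component name points to a code component tag plus a translation from Figma properties to HTML attributes, and the "assembly" is just looking up the mapping and emitting markup.

```typescript
// Hypothetical mapping: Figma component name -> code component tag
// plus a Figma-property-to-HTML-attribute translation.
// (Names here are illustrative, not the real NYSDS mapping.)
type Mapping = {
  tag: string;
  props: Record<string, string>; // Figma property name -> HTML attribute
};

const mappings: Record<string, Mapping> = {
  "Button/Primary": { tag: "nys-button", props: { Label: "label", Size: "size" } },
  "Text Input": { tag: "nys-textinput", props: { Label: "label", Required: "required" } },
};

// Given a Figma node (component name + property values from the file),
// emit the mapped code snippet instead of generating markup from scratch.
function toSnippet(name: string, values: Record<string, string>): string {
  const m = mappings[name];
  if (!m) throw new Error(`No mapping for Figma component "${name}"`);
  const attrs = Object.entries(values)
    .filter(([prop]) => prop in m.props)
    .map(([prop, v]) => `${m.props[prop]}="${v}"`)
    .join(" ");
  return `<${m.tag}${attrs ? " " + attrs : ""}></${m.tag}>`;
}

console.log(toSnippet("Button/Primary", { Label: "Submit", Size: "lg" }));
// <nys-button label="Submit" size="lg"></nys-button>
```

The point of the sketch is the shape of the work: the deterministic lookup does the heavy lifting, and the LLM's job shrinks to reading the Figma file and composing already-correct snippets.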
And just as important, this approach keeps designers in the loop. There’s a big push to build MCP servers for design systems. (GitHub released theirs recently.) This has promise, but it brings us right back to the challenge of AI generation: unless those MCP servers convey well-tested patterns, and thoughtful mappings between UI elements and how they should be used, in a well-structured way, the likelihood that AI will put them together well is low.
We chose to anchor things in Figma so we could keep designers in the loop. Especially in government, we desperately need designers helping make sense of complicated requirements, policies, language, systems, etc. As we begin to document those good patterns into structured, well-formed guidance, we can start serving that guidance to LLMs via an MCP server.
By pointing at a design whose components were thoughtfully placed by designers and carefully mapped to real code, we give AI the context it needs to produce consistent, usable, accessible code. It’s really just connecting the dots.
Check out the video… it’s wild to watch it happen live.
