MDS Q2 vision
Q2 is about making the MDS structurally ready for AI, so we can lead with clarity instead of reacting to pressure.
This update sets out why a shift is needed, what Q2 needs to focus on, and what success should look like by the end of the quarter.
The shift we need to make
AI is not a side topic around the MDS. It is starting to shape how teams design, prototype, and build. In many cases, teams are already relying on MDS knowledge, patterns, and components as a foundation for their own AI initiatives.
So far, our approach has been measured, which made sense. But the pace of change is creating an opportunity for us to lead intentionally in how AI and design systems work together.
This shift is not just about supporting designers and developers in the traditional sense. We now also need to support a new kind of user: generative AI. That includes tools, assistants, and agents that need to understand MDS well enough to generate good solutions from it.
This means we need to go beyond design tokens, which define the system’s visual properties, and also define UX tokens: the patterns, relationships, and guidance that describe how the system should be used.
Q2 is about building those foundations, so we can support you better.
The challenge
Our current proposition is not yet clear or structured enough for this next phase.
In particular:
- We don’t yet have comprehensive UX tokens that describe how components and patterns should be used—the dos and don’ts, relationships, and decision logic that guide reliable design and development
- Our guidance has historically been lean, but now needs to become more precise and more prescriptive
- Our examples and supporting material are not yet tight enough to consistently guide AI-generated outputs
- We lack clear visual and interaction guidance for AI-related user experiences
- We are seeing growing pressure from non-developers to generate prototypes through prompting, particularly in tools like Figma Make
At the same time, developers are increasingly enhancing their AI-assisted development workflows by providing MDS-specific context to Copilot and other AI tools. Today, much of this work is unendorsed and fragmented, highlighting the need for MDS to act as the authoritative source of context and guidance for AI-driven usage.
What Q2 needs to be
These challenges make it clear that Q2 should be a break from the normal model of working through a heavily prioritised backlog.
Instead, we are swarming around a shared goal:
Move the MDS AI offering forward so that by the end of the quarter we have a clearer proposition, better foundations, and a path to treating AI work as business as usual rather than reacting to demand case by case.
This does not mean stopping everything else. We still need to handle:
- bugs and quality issues
- requests that are blocking teams
- existing commitments we have already made (especially around enhancements to the table)
But beyond those exceptions, Q2 should be about focus and recalibration.
Q2 core focus areas
These focus areas describe where we will concentrate our thinking and effort in Q2. They set direction and boundaries rather than fixed delivery commitments and give the team space to shape how progress is made within each area.
Structure
Strengthen the semantic backbone of the MDS.
In Q2, we will focus on making the MDS more explicit, structured, and prescriptive by strengthening its underlying semantic model.
This means tightening how components and patterns are described through clearer structure, defined relationships, and explicit dos and don’ts. Whether expressed through structured documentation, metadata, tooling, or a combination of approaches, UX tokens will capture the decision logic and relationships that make the MDS work: when to use a component, what patterns work together, error states, accessibility requirements, and the intent behind each choice.
Just as design tokens define the visual properties of the system, UX tokens define how the system should be used: structured guidance on usage, behaviour, and decision logic. This gives both humans and AI clear, reliable patterns to work from, so the MDS can act as a single source of truth for each.
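To make the idea concrete, a UX token could be expressed as structured metadata alongside a component. The sketch below is purely illustrative: the schema, field names, and example values are assumptions, not a committed MDS format.

```typescript
// Hypothetical sketch of a UX token: structured usage metadata capturing
// the decision logic described above. The shape and values are
// illustrative assumptions, not a real MDS schema.
interface UxToken {
  component: string;
  purpose: string;        // when to use this component
  use: string[];          // dos
  avoid: string[];        // don'ts
  pairsWith: string[];    // components/patterns it works alongside
  errorStates: string[];
  accessibility: string[];
}

const tableToken: UxToken = {
  component: "Table",
  purpose: "Display structured, comparable data sets",
  use: ["Sortable columns for larger data sets", "Pagination for long lists"],
  avoid: ["Layout of non-tabular content", "Single-value display"],
  pairsWith: ["Pagination", "Filter", "EmptyState"],
  errorStates: ["loading-failed", "empty-result"],
  accessibility: [
    "Header cells use scope attributes",
    "Row actions reachable by keyboard",
  ],
};

// A consumer (human tooling or an AI agent) can query the token's
// decision logic rather than inferring it from prose documentation:
function shouldAvoid(token: UxToken, scenario: string): boolean {
  return token.avoid.some((a) =>
    a.toLowerCase().includes(scenario.toLowerCase())
  );
}

console.log(shouldAvoid(tableToken, "layout"));
```

Because the same structure can back documentation, linting, and AI context, the point is less the exact format than that the dos, don’ts, and relationships become machine-readable.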
Usage
Clarify how MDS is used in AI-assisted workflows.
In Q2, we will focus on clarifying how designers, developers, and AI agents are expected to work with the MDS in AI-assisted design and build workflows.
This includes establishing a clear position on prompt-based prototyping in design tools such as Figma Make, continuing to strengthen developer-facing AI support through GitHub Copilot using structured MDS knowledge, and positioning the MDS Assistant as the primary product through which MDS knowledge, rules, and intent are made available to AI systems across tools and surfaces.
The emphasis is on creating shared understanding and alignment around usage, even if the maturity of individual tools and integrations evolves over time.
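One way to picture the “single source of context” role is a uniform query surface that any AI tool (a Figma plugin, a Copilot extension, an agent) calls to retrieve MDS guidance. The names and shapes below are hypothetical assumptions for illustration, not a shipped MDS Assistant API.

```typescript
// Hypothetical sketch: one query surface through which AI tools retrieve
// MDS guidance, regardless of whether they operate on design or code.
// All names and data here are illustrative assumptions.
type Surface = "design" | "code";

interface GuidanceRequest {
  component: string;
  surface: Surface;
}

interface GuidanceResponse {
  component: string;
  guidance: string[];
}

// In a real system this would be backed by structured UX-token data;
// a tiny in-memory map stands in for it here.
const knowledge: Record<string, Record<Surface, string[]>> = {
  Table: {
    design: [
      "Use for comparable, structured data",
      "Pair with Pagination for long data sets",
    ],
    code: [
      "Import the component from the MDS package rather than restyling",
      "Provide accessible header markup",
    ],
  },
};

function getGuidance(req: GuidanceRequest): GuidanceResponse {
  const entry = knowledge[req.component];
  return {
    component: req.component,
    guidance: entry ? entry[req.surface] : [],
  };
}
```

The value of a single surface like this is that design-side prompting and developer-side Copilot context draw on the same endorsed knowledge, rather than each tool maintaining its own fragmented copy.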
Experience
Set clear expectations for AI-human interaction in MDS-based products.
In Q2, we will focus on how AI involvement is communicated to users in interfaces built with the MDS.
This includes establishing clear, production-ready guidance for visual signals, behaviour patterns, and system conventions that help users understand when they are interacting with AI, or when content has been generated or influenced by AI. It also includes productionising foundations such as the AI colour role, alongside direction for supporting components where needed.
The intent is to ensure AI-powered experiences built with the MDS are clear, legible, and trustworthy for human users.
What success looks like by the end of Q2
At a strategic level
A clear and credible AI-ready proposition for MDS, grounded in structure, usage, and experience, that we can confidently communicate to teams and stakeholders.
Structure
A clear semantic direction for the MDS, with more consistent, prescriptive documentation that defines relationships, intent, and constraints.
Usage
A clear position on how MDS supports AI-assisted workflows across design and development.
A concrete recommendation on prompt-based prototyping, including how tools like Figma Make fit into our strategy and whether there is a longer-term case for an MDS-led builder capability.
Experience
Stronger, more concrete guidance for AI-related interfaces and interactions, including how AI involvement is presented clearly and consistently to users.
Team alignment
A shared understanding across the MDS team of how AI fits into normal MDS delivery going forward, with AI treated as part of BAU rather than an exception.
What happens next
- Continue delivering on committed features and bugs
- Sense-check and refine the Q2 direction based on community feedback and team discussion
- Translate the three focus areas into concrete backlog work over the coming days/weeks
- Review and adjust as we learn from real-world usage and feedback, rather than locking everything in up front