A design system is a library of reusable components with defined guidelines that enable product consistency across teams. But launching a design system is only half the battle — you need to know if teams are actually using it. Without systematic tracking, you’re flying blind.
After launching our design system, over 25 projects adopted it within the first year. But we had no clear way to measure adoption, identify underused components, or track migration progress from the legacy library. Manual audits across dozens of repositories weren’t sustainable.
We needed automation.
Four Ways to Measure Adoption
Before building anything, we identified four dimensions of tracking:
- Quantitative data — component usage frequency, project counts, package versions, and the ratio of design system components vs. legacy ones
- Survey-based feedback — user experience assessments, feature satisfaction, and pain points from adopters
- Adoption rate monitoring — new user onboarding metrics, project-level adoption percentages, and trend analysis over time
- Stakeholder engagement — collecting input from both design and development teams
The first dimension — quantitative data — is the one we automated.
The Automated Pipeline
We built a four-job workflow using GitHub Actions, scheduled as a monthly cron job to batch-process all frontend repositories.
Job 1: Repository Collection
Gather the details of all frontend repositories and store them in a matrix output variable. This creates the list of projects to scan.
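The collection step can be sketched as a small Node script that shapes a repository list into the JSON value a matrix strategy consumes. The repo names and the `full_name`/`default_branch` fields (shaped like the GitHub REST API's repository objects) are illustrative; the post does not specify how repositories are discovered.

```javascript
// Sketch of Job 1: turn a list of frontend repositories into the JSON
// value consumed by `strategy.matrix` in the next job. Repo names are
// hypothetical examples.
function buildMatrix(repos) {
  return {
    include: repos.map((repo) => ({
      repo: repo.full_name,
      branch: repo.default_branch,
    })),
  };
}

// Example input shaped like GitHub's "list organization repos" response.
const repos = [
  { full_name: "acme/web-app", default_branch: "main" },
  { full_name: "acme/admin-portal", default_branch: "develop" },
];

const matrix = buildMatrix(repos);
console.log(JSON.stringify(matrix));
// In the workflow, this value would be written to the job's output, e.g.:
//   echo "matrix=$(node collect-repos.js)" >> "$GITHUB_OUTPUT"
```

Emitting the list as a single job output keeps the downstream cloning job decoupled from how repositories are discovered.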
Job 2: Asynchronous Cloning
Using GitHub Actions’ matrix strategy, every project is cloned in parallel. The results are uploaded as GitHub artifacts, making the process significantly faster than sequential cloning.
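A hypothetical workflow excerpt for this job might look like the following. The job names, action versions, and secret name are illustrative, not the original workflow; `collect` stands in for Job 1, which exposes the repository list as a matrix output.

```yaml
clone:
  needs: collect
  runs-on: ubuntu-latest
  strategy:
    matrix: ${{ fromJSON(needs.collect.outputs.matrix) }}
  steps:
    - name: Clone one repository
      uses: actions/checkout@v4
      with:
        repository: ${{ matrix.repo }}
        ref: ${{ matrix.branch }}
        token: ${{ secrets.ORG_READ_TOKEN }}
        path: source
    - name: Upload for the scanning job
      uses: actions/upload-artifact@v4
      with:
        name: source-${{ strategy.job-index }}
        path: source
```

Each matrix entry becomes its own runner, so a fleet of 25+ repositories clones in roughly the time of the slowest single clone.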
Job 3: Parsing and Report Generation
All artifacts are downloaded into a single folder. react-scanner, an npm package that parses React codebases, then scans the combined code to identify which components are used, how often, and in what combinations. The parsed results are committed back to the repository.
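A minimal react-scanner configuration for this step might look like the sketch below. The options shown (`crawlFrom`, `includeSubComponents`, `importedFrom`, `processors`) are react-scanner's documented config keys; the package name `@acme/design-system` and the paths are assumptions for illustration.

```javascript
// scanner.config.js sketch: scan the downloaded artifacts folder and
// count only components imported from the design system package.
const config = {
  crawlFrom: "./artifacts",            // folder holding all cloned repos
  includeSubComponents: true,          // count e.g. Table.Row as usage
  importedFrom: "@acme/design-system", // hypothetical package name
  processors: [
    ["count-components-and-props", { outputTo: "./reports/scan.json" }],
  ],
};
console.log(Object.keys(config).length);
// In the real config file this object is exported: module.exports = config;
```

Restricting `importedFrom` to the design system package is what makes the later design-system-vs-legacy ratio possible: the same scan can be rerun with the legacy library's package name to get the other side of the comparison.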
Job 4: Database Integration
The parsed data is pushed to Metabase (an open-source business intelligence platform) using Prisma as the ORM. This transforms raw scan data into queryable, visualizable dashboards.
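The push step reduces to flattening the scan report into table rows and inserting them. The sketch below assumes a report shaped like react-scanner's `count-components-and-props` output and a hypothetical `componentUsage` Prisma model; neither name comes from the original post.

```javascript
// Flatten a per-project react-scanner report into rows for the database
// behind Metabase. Model and field names are illustrative.
function toRows(projectName, report) {
  return Object.entries(report).map(([component, data]) => ({
    project: projectName,
    component,
    instances: data.instances,
    scannedAt: new Date("2024-01-01"), // fixed date for the example
  }));
}

const report = {
  Button: { instances: 42, props: { variant: 30 } },
  Modal: { instances: 7, props: {} },
};

const rows = toRows("web-app", report);
console.log(rows.length); // 2

// With Prisma, the rows would then be inserted in one call, e.g.:
//   await prisma.componentUsage.createMany({ data: rows });
```

Keeping the transformation as a pure function makes it easy to unit-test independently of the database connection.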
Key Metrics and Dashboards
Once the data lands in Metabase, we track four key dimensions:
Component Instance Count
A line chart showing the total number of design system component instances across the entire organization over time. This is the clearest indicator of overall adoption growth.
Component Distribution per Project
Charts revealing which components each project uses and how many. This helps identify popular components (validating good API design) and underutilized ones (which may need better documentation or promotion).
Adoption Ratio
A comparison of design system components versus legacy frontend library components. This visualization shows migration progress — how quickly teams are replacing old components with new ones.
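As a tiny helper, the ratio described above is simply the share of component instances that come from the design system rather than the legacy library; the function name is illustrative.

```javascript
// Share of all component instances that use the design system.
// Returns a value in [0, 1]; 0 when a project has no components at all.
function adoptionRatio(designSystemCount, legacyCount) {
  const total = designSystemCount + legacyCount;
  return total === 0 ? 0 : designSystemCount / total;
}

console.log(adoptionRatio(300, 100)); // 0.75
```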
Package Version Tracking
A table recording which version of the design system package is installed on each project. This surfaces outdated implementations and helps prioritize upgrade efforts.
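One plausible way to populate this table is to read each project's `package.json` during the scan and look the design system package up in its dependencies. The package name is an assumption carried over from the earlier sketches.

```javascript
// Look up the installed design system version in a parsed package.json.
// Checks both dependencies and devDependencies; returns null when the
// project has not adopted the package at all.
function installedVersion(pkgJson, pkgName = "@acme/design-system") {
  const deps = { ...pkgJson.dependencies, ...pkgJson.devDependencies };
  return deps[pkgName] ?? null;
}

const pkg = {
  dependencies: { "@acme/design-system": "^3.2.0", react: "^18.2.0" },
};
console.log(installedVersion(pkg)); // ^3.2.0
```

A `null` result is itself useful data: it marks projects that show up in the repository list but have not yet adopted the system.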
What We Learned
The automated tracking revealed insights we couldn’t have gathered manually:
- Most-used components confirmed which parts of the system were delivering the most value
- Underadopted components highlighted where we needed better documentation, examples, or API improvements
- Version compliance across projects made it easy to identify who was falling behind on updates
- Migration progress from the legacy library gave stakeholders a clear picture of ROI
- Component-specific trends helped us prioritize which components to invest in next
Why Automate
Manual tracking doesn’t scale. With 25+ projects and growing, an automated pipeline gives you:
- Real-time adoption monitoring without manual effort
- Consistent tracking across all repositories, regardless of team size
- Data-driven decisions about which components to improve, deprecate, or promote
- Stakeholder visibility — dashboards that demonstrate the design system’s impact
- Strategic planning — informed decisions about where to invest engineering time
Key Takeaways
- Define your metrics early — know what you want to measure before building the pipeline
- Automate the scan — GitHub Actions + react-scanner + a matrix strategy handles parallel repo scanning efficiently
- Visualize the data — raw numbers are useless without dashboards that tell a story
- Track the ratio — design system vs. legacy component usage is the single most important migration metric
- Run it regularly — monthly scans give you trend data that reveals whether adoption is accelerating or stalling