Welcome to the 3rd annual DesignOps Assembly report!
Since last year’s report, we’ve set a 5-year objective to mature the DesignOps discipline and create a community of practice informed by emerging trends, operational patterns, and standard practices. Ultimately, we want to empower the DesignOps community with insights that help you optimize results by investing in the right areas of DesignOps.
To this end, this year’s report is different from last year’s. Beyond presenting the current state of DesignOps, it also sheds light on the most common entry points to DesignOps as a career. Additionally, we’ve begun sketching out archetypes for the DesignOps function based on industry, size, reporting structures, and other factors.
In this year’s survey and report, we focus on a set of core goals that support our 5-year objective:
- Chart common DesignOps career paths
- Map organizational factors that influence DesignOps
- Characterize the DesignOps function
- Track the organizational impact of DesignOps
- Develop a standard barometer for health + maturity
- Monitor external factors and emerging trends
We also cover several "zeitgeist" topics that capture the spirit of our time and our community’s attention. In particular, we shine the spotlight on what we learned about the global economy’s impact on the DesignOps job market. You’ll find other timely topics, such as diversity, equity, inclusion, and belonging (DEIB) and findings on AI usage in our Highlights section. To cap it all off, the appendices cover design systems, ResearchOps, salary data, and respondent demographics.
We consider this year's survey and report as setting a baseline in these core areas of DesignOps, both as a function and as a practice. This lays the foundation for tracking trends year over year as we set out to establish a shared language, best practices, standards, and guidelines for doing DesignOps—and doing it well.
The survey
In October 2023, we sent out an online survey that ran for a 2-week period. We collected 316 responses from 30 countries and territories. While the respondent pool is smaller this year, we still have a rich, multi-faceted data set.
This report highlights what we find to be most compelling—from changes in hiring to common challenges, and from widely adopted program pillars to common approaches to increasing DesignOps maturity.
Things to keep in mind
We intend this report to be foundational and illuminating, and it should be read as a guide. That said, the research methodology has some limitations, which are documented in Appendix B.
Our goal for the report articles is to help you make sense of the data. When relevant, we'll compare this year’s data to the data in last year's report or to other things we have seen in large DesignOps communities, including DesignOps Assembly.
Other things to be aware of as you read:
Unless otherwise stated, percentages listed in this report refer to percentages of the whole respondent set. There are a number of instances where we found it made sense to filter out certain responses to get a stronger signal from the data. Common examples of this include:
- Filtering out responses in order to get a count of unique organizations represented by the data set
- Filtering out responses that indicated the respondent’s org doesn’t have a dedicated DesignOps function (for survey questions specific to DesignOps as a function)
The survey and report have been a labor of love by a talented team of DesignOps Assembly volunteers. We hope you get as much value out of reading the report as we got from the energy we put into it.