Team
- 1 x Product manager
- 1 x Product designer
- 2 x Front-end engineers
Role
- Research
- Lead Designer
- Design System
Background
Progression is a tool which helps companies define and measure career growth for teams. Using the framework builder you can design career levels, create skills, define expectations, and encourage and capture simple habits that evidence growth as it happens.
The problem
Currently we have a short onboarding flow with the following steps:
- What’s your role?
- Do you already have content?
- What do you hope to use Progression for?
- Which other tools do you use?
- Select discipline and team size
Customers are then matched with a prebuilt framework from the Progression library and dropped onto that framework within the Progression app.
Our current onboarding flow fails to set customers up in the right way and doesn’t show the unique value of using Progression.
Challenges:
- Prebuilt frameworks may not be the right fit
- Customers are presented with a framework but no actionable next steps
- Difficulty in understanding how positions, skills and teams work together
- Failing to demonstrate the unique value of Progression
Goal
Headline opportunity
We aim to simplify the onboarding process for new users by providing a guided experience to create their first framework. After the first session, customers should feel inspired, motivated, and have a clear understanding of how framework components fit together, along with guidance on what to do next.
Business value
An effective onboarding flow will clarify product value, increase adoption and reduce short term churn. It is an investment in the overall user experience that can lead to long-term customer relationships and sustained product-led growth.
What does success look like?
- Completion rate >70%
- Positive user feedback
Scope and timeline
Over a two-week sprint we aimed to build quickly and reuse existing components where possible, enabling simultaneous design and development with frequent feedback loops and a release-and-learn strategy, all while cutting the flow down to its core value.
Brainstorming
As a team, we ran an Opportunity Tree workshop to draw out all of the ideas we had in this space.
We focused on the “I’m not excited by the task ahead of me” and “I don’t see the value of building here” branches of the Opportunity Tree.
Within those branches we focused on:
- Get people off on the right foot
- Make them excited to use Progression
- Indispensable value from day 1
We then brainstormed ideas on how we could address some of these with onboarding.
Discovery
Who's signing up?
I analyzed the onboarding flow and found that managers were the primary sign-ups, with a reasonably even split between wanting our framework suggestions as a starting point and building from scratch.
People wanted to use the app to build skills and frameworks, create personal career paths and manage team performance. Engineering was also the most popular discipline.
Initial discovery
Week 1:
- 6-8 Progression customers
- Preferably engineering managers
Aims:
- What are the core concepts that new customers need to know to get started on the right foot?
- What’s the best practice for building frameworks?
Usability testing
Week 1/Week 2:
- 3-5 newer Progression customers
- 20 people per test with no prior knowledge of Progression
Aims:
- Monitor behaviours during the flow to improve UX
- Do people understand what you can do with Progression at a high level?
- Can people explain the relationship between the core components?
What core concepts do new customers need to know to get off on the right foot?
Frameworks consist of positions ordered by seniority and split into tracks such as management and IC. Each position has skills; each skill has levels, expectations, and examples; and all of this sits within a team. This relationship is crucial for understanding Progression.
This, along with other concepts such as tracks and best practices, is hard to grasp, so deciding what to explain now versus later was a challenge.
There are also some must-have steps, such as entering a team name, plus some of the marketing questions from the original flow.
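The hierarchy described above can be sketched as a simple data model. This is a hypothetical illustration of the relationships (Team → Position → Skill), not Progression’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    name: str
    # Each skill is described across several levels; each level carries
    # expectations and examples that evidence growth.
    levels: dict[int, str] = field(default_factory=dict)

@dataclass
class Position:
    title: str
    seniority: int                      # ordering within the track
    track: str                          # e.g. "IC" or "Management"
    skills: list[Skill] = field(default_factory=list)

@dataclass
class Team:
    name: str
    discipline: str                     # e.g. "Engineering"
    positions: list[Position] = field(default_factory=list)

# A framework is the set of positions (with their skills) within a team.
team = Team(name="Platform", discipline="Engineering")
team.positions.append(
    Position(title="Senior Engineer", seniority=3, track="IC",
             skills=[Skill(name="Communication")])
)
```

Modelling it this way makes the teaching problem visible: a new customer has to hold three nested concepts in their head before anything on screen makes sense.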
What's the best practice for building frameworks?
I also had lots of internal discussions with customer service, our product lead, founders and engineers, as well as with customers, about how people currently built their frameworks versus what we, as the experts, think is the best way.
What the ideal workflow looks like is a big topic within Progression. At this stage there was no ‘right or wrong’ way, so we decided to go with what was easiest to explain:
Team → Position → Skills
I would then use this as a starting point.
Early exploration
Feedback
In the first few rounds of prototypes I tried to explain too much: asking customers to create a framework, introducing concepts such as best practice for writing tracks, hinting at our library, and generally over-explaining. This made the flow long-winded, with far too much information to remember. It also felt generic and not tailored enough to really wow people.
Streamlining the flow
Following internal feedback, discussions with some customers and with non-customers, we streamlined to just the essentials: team name, discipline (e.g., Engineering), position, and three skills.
We felt these concepts got to the core value; even if we didn’t mention every aspect of how you build, customers would hopefully still understand the key elements and have a great starting point.
We started to realise it would be impossible to build a full framework at this stage and so concentrated more on educating.
Usability testing
Recruiting issues
An issue we had was that customers who had just signed up weren’t very responsive when it came to agreeing to talk to us. The customers most keen on giving feedback were our Progression Champions — most of whom had signed up a long time ago and were already experts in using the app.
How we moved forward
We started to use Maze, a usability testing tool, so we could talk to people who had no prior knowledge of Progression, which made for a better test. However, it wasn’t possible to narrow the user base down to engineering managers who were on the lookout for a tool like this.
Exploration continued
How could we make this new version work harder?
Having one position and three skills explained the structure well enough, but it wasn’t very exciting and didn’t give customers an adequate idea of what was possible.
I wanted to challenge the squad to explore ways to leverage this simplified data and still create a complete framework, maintaining educational value and showcasing what was ultimately possible with Progression. So we set about using this data to auto-generate gaps in the framework, having one step inform the next.
Discipline → Your Position → More Positions
We would ask customers to choose the discipline of their team, then generate a list of positions related to that discipline and ask them to select their own position (or a position). Then, in the next step, instead of asking them to select other positions, we would auto-generate a position either side of theirs, using our prebuilt frameworks as data points.
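The “position either side” step can be sketched as follows, assuming a prebuilt framework reduces to a seniority-ordered ladder of position titles (the ladder below is made up for illustration):

```python
def adjacent_positions(ladder: list[str], chosen: str) -> list[str]:
    """Given a seniority-ordered ladder from a prebuilt framework,
    return the positions immediately either side of the chosen one."""
    i = ladder.index(chosen)
    neighbours = []
    if i > 0:
        neighbours.append(ladder[i - 1])          # one step more junior
    if i < len(ladder) - 1:
        neighbours.append(ladder[i + 1])          # one step more senior
    return neighbours

# Hypothetical engineering ladder drawn from a prebuilt framework
ladder = ["Junior Engineer", "Engineer", "Senior Engineer",
          "Staff Engineer", "Principal Engineer"]

adjacent_positions(ladder, "Senior Engineer")
# → ["Engineer", "Staff Engineer"]
```

Positions at either end of the ladder simply get one neighbour instead of two, which keeps the generated framework sensible for the most junior and most senior sign-ups.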
Skills
Initially, I planned to have customers choose from a limited set of generic skills, like communication, which are applicable to every position. The aim was to show how skills are added to a framework and what they look like once added. It also let you pick more from our Library if you wanted to.
However, that wasn’t very exciting and risked disrupting the flow. Instead, again using our prebuilt frameworks as data points, we could display the most popular skills for the selected positions, making the flow more tailored to customers’ needs while showcasing the content library seamlessly.
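One way to approximate the “most popular skills” step is to count how often each skill appears against a position across the prebuilt frameworks. This is a sketch with made-up data, not the actual implementation:

```python
from collections import Counter

def popular_skills(frameworks: list[dict], position: str,
                   top_n: int = 3) -> list[str]:
    """Count skill occurrences for a position across prebuilt
    frameworks and return the most common ones."""
    counts = Counter(
        skill
        for fw in frameworks
        for skill in fw.get(position, [])
    )
    return [skill for skill, _ in counts.most_common(top_n)]

# Hypothetical prebuilt frameworks mapping positions to skill names
frameworks = [
    {"Senior Engineer": ["Mentoring", "System Design", "Communication"]},
    {"Senior Engineer": ["System Design", "Delivery", "Mentoring"]},
    {"Senior Engineer": ["Mentoring", "Communication"]},
]

popular_skills(frameworks, "Senior Engineer")
# → ["Mentoring", "System Design", "Communication"]
```

Because the suggestions are derived from the same library content, this step doubles as a seamless preview of what the content library holds.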
We’d previously done work on incorporating AI into our skill builder, so we decided to bring some of this into the onboarding flow. Our AI skill builder can create a skill from scratch across multiple levels; however, this takes time to generate as it uses multiple data points. So we used just the skill name entered by the customer to generate a skill across three levels.
The output would be slightly less accurate, but for the purpose of showcasing what is possible we thought this was a nice touch. The time to generate the skill was still too long, so our engineers worked on reducing it to under 20 seconds.
Usability testing
Reduced completion time
This new streamlined approach reduced the completion time of the onboarding, and customers still ended up with a framework containing the same volume of content, but more tailored to them.
Confident and informed
We found customers felt confident about how the product worked after using this auto-generate approach, even though they did less manually.
Experiment
What next?
We introduced a 'get started' screen based on discussions with our Product Lead. The screen presented customers with various next steps, such as viewing their new framework, searching for additional skills, scheduling a demo call, or accessing helpful guides. We added a marketing question on organization size to guide larger organizations to a demo call rather than self-help guides.
Although we initially considered adding more steps to the existing onboarding checklist, we opted to test a completely new and attention-grabbing approach.
Copy
I worked closely with our content team to finalise a lot of the microcopy. We found from testing on Maze that our initial pass was far too wordy, so we cut it down significantly to simple, straight-to-the-point language.
This performed much better in terms of what we asked of people, and it also reduced the visual clutter on screen.
Build and launch
We built and designed simultaneously, having regular conversations with the engineers and feedback sessions to make sure we were aligned. This worked really well and gave every team member high context of the problems and what we were trying to achieve. It also meant we worked really fast. We always tried to use existing components wherever possible.
We released a first version to 50% of signups, monitoring for any drop-offs or problems. We increased this to 100% after some small copy tweaks and an improvement to the Org Name screen.
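A 50% rollout like this is typically done with deterministic bucketing: hashing a stable signup ID so the same user always lands in the same cohort, and ramping up is just a config change. The write-up doesn’t say how Progression implemented it, so this is a generic sketch of the technique:

```python
import hashlib

def in_rollout(user_id: str, percentage: int) -> bool:
    """Deterministically assign a user to the rollout cohort.
    Hashing a stable ID keeps the assignment consistent across
    sessions, so a user never flips between old and new flows."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100      # stable bucket in 0..99
    return bucket < percentage

# Ramping from 50% to 100% means changing only the percentage:
in_rollout("user-42", 100)   # always True at full rollout
```

Keeping assignment deterministic also makes drop-off monitoring cleaner, since each signup’s funnel data belongs unambiguously to one variant.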
Impact and feedback
Onboarding flow
The new onboarding performed well, with the completion rate rising from 68% to 75%. Although we’d added more steps, the completion rate was higher than with the old version. The biggest drop-off was ‘Set your role’, although we didn’t find it high enough to be a significant problem, so we chose to monitor it. From talking to customers, they found the flow easy to use and really useful as a starting point, with some wishing they could use this approach for all subsequent frameworks, as it made it very easy to get from nothing to something.
Get started modal
In the initial two weeks, 20% of people explored their frameworks, while 13% sought prebuilt frameworks. The experiment proved valuable for gauging user intentions and identifying areas for improvement.
We also noticed a lot of customers not retaining their initial frameworks, or immediately looking for other frameworks, which confirmed our thinking that onboarding should be more educational. It also meant we had to do a lot of onboarding once a customer started building ‘for real’.
Design impact
This was the first time we had tested our new process and rituals. I was really impressed by how the rest of the squad engaged with it and really threw themselves into discovery and ideation.
Building and designing simultaneously can be daunting. But I think we made smart decisions around scope and what core value to aim for. Having these rituals and short feedback loops meant we could course correct easily.
Learnings and next steps
We wanted to create a similar onboarding flow to help people build all subsequent teams. The plan was to then guide people with tasks in context using AI to help fill gaps when they build ‘for real’ as a continuation of the onboarding experience.
We also want to introduce a natural way to show examples during onboarding as they are always impactful, and explore different paths for different customers.
We could have done more interviews around the get started screen to uncover some of the issues earlier — as in the end it took longer than we would have liked to build.