The Product
ClosedLoop is a healthcare platform that gives providers, payers, and value-based care organizations the ability to make accurate, explainable, and actionable predictions of individual-level health risks.
The Problem
Machine learning models give users no way to check whether they were trained on a representative sample, or whether groups defined by attributes such as race are affected by label bias.
The Goal
Assess whether a machine learning model uses a representative sample, is impacted by label bias, or may result in the unfair distribution of resources. After reviewing each metric, users can indicate whether the model has been validated and whether it is biased or unbiased.
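To make these checks concrete, here is a minimal, hypothetical sketch in Python of how representativeness and label-bias metrics might be computed; this is an illustration under assumed group names, reference shares, and data, not ClosedLoop's actual product code.

```python
# Hypothetical sketch only: group names, reference shares, and data are
# illustrative assumptions, not ClosedLoop's implementation.
from collections import Counter

def representation_gap(cohort_groups, population_share):
    """Compare each group's share of the model's cohort to its share of the
    reference population; large gaps suggest a non-representative sample."""
    counts = Counter(cohort_groups)
    total = len(cohort_groups)
    return {g: counts.get(g, 0) / total - share
            for g, share in population_share.items()}

def label_rate_by_group(groups, labels):
    """Positive-label rate per group; large disparities can indicate label
    bias (the outcome is recorded differently across groups)."""
    totals, positives = Counter(), Counter()
    for g, y in zip(groups, labels):
        totals[g] += 1
        positives[g] += y
    return {g: positives[g] / totals[g] for g in totals}

# Illustrative data only.
groups = ["A", "A", "B", "B", "B", "A"]
labels = [1, 0, 0, 0, 1, 1]
print(representation_gap(groups, {"A": 0.6, "B": 0.4}))
print(label_rate_by_group(groups, labels))
```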
My Role
Product designer from conception to delivery.
My Responsibilities
User-journey mapping, paper and digital wireframing, low- and high-fidelity prototyping, usability studies, design iteration, and design QA.
Insights from User Testing
User Understanding
Although the platform relies on tooltips to reduce visual weight, users needed clear, helpful instructions for evaluating bias and fairness that were displayed directly on the page rather than hidden in tooltips.