1st-Round-Blind on LinkedIn

Scope

User Research

Lean Re-Branding 

Product Design 

UX Writing

Goals

Reduced Time to Value (TTV)

Higher Completion Rate

System Learnability

Input Error Reduction

Scalability & Increased Accessibility

Context

In today’s job market, visibility isn’t always a good thing. From automated résumé parsing to unconscious bias in recruiter workflows, many candidates—particularly people of color, older professionals, career returners, and even highly experienced applicants—are self-selecting out of the application process entirely. LinkedIn, while central to how people search for work, unintentionally amplifies these inequities through design patterns that prioritize speed over fairness. As a result, frustrated job seekers are increasingly turning to anonymous platforms like Reddit, Glassdoor, and Blind—not to apply, but to commiserate, warn, and opt out. This case study explores how anonymizing the first round of application review could help rebuild that lost trust, reduce bias, and reassert LinkedIn’s relevance as a platform designed not just for hiring—but for inclusion.

Process

I began with a desk-based qualitative audit, analyzing 300+ posts from LinkedIn, Reddit, Glassdoor, and Blind to capture lived experiences. From these I extracted six meta-insights—each tied to personas like Lina the caregiving UX designer and David the senior product manager—highlighting fear, exclusion, and systemic frustration. With insights in hand, I sketched three parallel flows: job-post messaging, anonymized applicant UI, and recruiter dashboards. Over two rounds of peer review with DEI advocates and hiring professionals, I refined the language, badge systems, and reveal mechanics. The end result: a prototype combining fairness-first design, support flows, and clear handoff systems for both users and recruiters.

Outcomes

The final prototype introduced a hiring experience built around fairness, clarity, and trust. Job postings now include a visible badge and messaging system that signals "anonymous first round," giving applicants immediate psychological safety. The user flow anonymizes names, photos, and specific dates while introducing experience bands, allowing career gaps and unconventional trajectories to feel valid rather than penalized. On the recruiter side, a redesigned dashboard supports structured, unbiased reviews in the first round, with identity details unlocked only after shortlisting. To support implementation, the system also includes built-in help flows, oversight prompts, and a roadmap for future iteration. The result is a solution that reflects real-world anxieties—and reframes LinkedIn as a platform that doesn’t just host hiring, but actively protects its fairness.
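The anonymize-then-reveal mechanic above can be sketched in code. This is a hypothetical illustration only: the field names, band thresholds, and `first_round_view` function are assumptions for this case study, not LinkedIn's actual data model or API.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    # Illustrative candidate fields; not LinkedIn's real schema.
    name: str
    photo_url: str
    years_experience: float

# Experience bands replace exact dates, so career gaps and
# unconventional trajectories are not singled out. Thresholds
# here are assumptions chosen for the sketch.
BANDS = [
    (0, 2, "0-2 yrs"),
    (2, 5, "2-5 yrs"),
    (5, 10, "5-10 yrs"),
    (10, float("inf"), "10+ yrs"),
]

def to_band(years: float) -> str:
    """Map an exact experience figure to its coarse band label."""
    for lo, hi, label in BANDS:
        if lo <= years < hi:
            return label
    return BANDS[-1][2]

def first_round_view(profile: Profile, shortlisted: bool = False) -> dict:
    """What the recruiter dashboard shows for one candidate.

    Identity details (name, photo) stay masked until the candidate
    has been shortlisted; experience is always shown as a band.
    """
    if shortlisted:
        return {
            "name": profile.name,
            "photo": profile.photo_url,
            "experience": to_band(profile.years_experience),
        }
    return {
        "name": "Candidate",
        "photo": None,
        "experience": to_band(profile.years_experience),
    }
```

For example, a candidate with 6 years of experience appears as "Candidate, 5-10 yrs" in the first round, and their name and photo unlock only once `shortlisted=True`, mirroring the reveal step in the recruiter flow.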

Download Extended Case Study (PDF)
