PerFits is an online marketplace where people can subscribe to and receive pre-worn, commercially cleaned athletic shoes and assess them using guided review modules and support from the larger user community.

Browse through personalized and curated lists of shoes
See shoes you want using quick-access preference pills and multi-attribute filter controls
Assess, favorite, and pre-select shoes for performance comparison
Compare up to 3 shoes against multiple performance attributes
Use visual cues to intuitively assess performance pros and cons
Deep dive into user-generated performance assessment details


Easily write standardized performance reviews using guided modules and intuitive controls
Properly understand and assess performance attributes using educational tooltips
Visually capture and share your performance experiences with friends and community groups
PROBLEM OVERVIEW + DESIGN JOURNEY
I began exploring this retail e-commerce concept when I experienced difficulty accurately and conveniently assessing the performance of basketball shoes before and after purchase. After exploring a few online basketball shoe communities and having some negative customer experiences with shoe vendors, I came to understand that the problem was more complex than I had initially thought.
Using university resources and classes over the course of 6 months, I explored this problem space and generated a solution using the research and design methods below.
DESIGN PROCESS AND FEEDBACK
PRIMARY USER TASKS
Using inputs from our user research surveys, along with content layout and interaction patterns from popular shoe e-commerce offerings - e.g. Amazon, Finishline, the Nike Snkrs app, the SneakerGeeks app - my design team and I iteratively built high-level task flows of how users could accomplish their goals.
With the intent of creating a consumer facing prototype, we assessed which tasks were the most complex and relevant to our target market's needs. Deprioritizing onboarding and subscription selection user experiences, we designed for shoe browsing, shoe comparison, and performance review writing task flows.
We sourced our usability test participants from the list of respondents who were open to follow-up questioning. We recruited participants who fell within 2 of our target market segments:
- Recreational player and basketball shoe/culture enthusiast
- Competitive, high-frequency runner
All three participants were in their mid-to-late 20s, purchased 4+ pairs of performance shoes per year, and varied in their demographic attributes and occupations.
A compilation of memorable experiences and feedback we captured during our Zoom-hosted usability tests.
Our test participants brought great enthusiasm to our conversations and helped my team identify areas for user experience improvements. Below, in the purple indicators, are specific interface, interaction, and service design deltas that we sourced from our testers and other designers we presented to.
1. The floating controls didn't align well with the copy on the shoe cards. Other content seemed unnecessary or too wordy at this stage in the task flow.
2. It took too long to scroll to desirable controls in the filter and detail pages. Those should be made more readily accessible at appropriate times.
3. The displacement of content on the browse page when the new pre-selected shoes section is introduced is too abrupt a change.
4. Important non-performance shoe details should be placed atop the screen, and colorways should be made more apparent.
5. The numbers aren't clear in the context of the performance attributes, and the phrasing is confusing. The red and green indicators also aren't sufficiently clear.
6. Using the same control for binary and multiple-choice inputs isn't intuitive. Long pressing on mobile, even with instructional text, might be uncommon for users.
7. Native voice input for the text entry field is not a value add, since the OS already handles that capability.
8. Increase the content size to make it visually clear. The additional comments might be unnecessary and too burdensome for users.
9. Seeing how others' experiences compare to yours right after writing a review might overwhelm users and could be rendered at another time.
FINAL DESIGNS + THOUGHTS
After processing the feedback from our test participants and design classmates, my team and I developed a design-change tracker in which each of us identified improvements to details that another team member had designed in the prototype. I encouraged my teammates to take responsibility for others' work as if it were their own, and to help others understand the rationale behind their own prototype designs.
We iteratively justified our new designs in bi-weekly check-ins over a two-week sprint to finish the final high-fidelity prototype. We designated a single team member to develop our brand identity, color palette, typography, and a few replicable interface components. I took it upon myself to make sure everything was aligned and consistent after everyone had finished their designs.
And, this is the result of our semester long work!
Click here to download high resolution screen mockups.
Reflecting on PerFits's evolution, I am proud of how I have been able to iteratively develop the concept - first exploring how I could design for product-based online community interactions and relationships (see my report here), and most recently collaborating with design and marketing students to improve PerFits's value proposition and mobile app design.
I would love to run additional concept validation tests and finish designing the remaining mobile app task flows: user onboarding, subscription plan selection, subscription shoe management, payment processing, and social group interactions.
RESEARCH PROCESS AND OUTCOMES
I approached user needs and context discovery with two separate teams: one focused on understanding the user, the other on validating a market opportunity. I worked between both, using insights from each to refine our understanding of the problem space and what a solution could look like.
Below are the methods used, insights gained, and lessons learned from each team's research activities.
Online Community Participatory Research
Participant-researcher exploration of how users talk about and assess shoes with other like-minded people. We explored and contributed meaningful comments and questions to online communities while being transparent about our intent in participating.

A small but professional community of shoe performance experts who generate detailed and entertaining review content, primarily in YouTube videos as well as on their paid Discord channels.

An open community of sport and shoe performance enthusiasts across several subreddits that engage in less structured discourse than professional reviewers like WearTesters and other commercial online review communities.
In our research we came across detailed accounts of people's opinions, feelings, and recommendations captured in user-generated content. We were surprised but happy to see that non-commercial enthusiasts were filling performance review insight gaps that the commercial reviewers weren't satisfying.
Above are spreadsheets, charts, and scorecards of shoes that users produced and maintained on their own. This finding suggested that the learning and engagement needs of performance shoe consumers and enthusiasts weren't being fully addressed by existing market players.
Exploring User Assumptions:
User Needs + Preferences Surveys
My design team conducted two complementary user surveys. The first was designed to validate assumptions about existing user problems, behaviors, and demographic attributes, as well as to discover new users and attributes. This was important considering that our initial exploration of the problem space had primarily focused on male-dominant basketball shoe communities. Below are key findings:
RESPONDENTS: 162
FOLLOW-UP: 51
1
Runners contributed more responses and open-ended input, but were more concerned about using shoes on a trial, subscription basis
Channels: Instagram ads, school listservs, Reddit communities, Quora threads
2
Basketball players used positive words more frequently in open-ended responses and made up 50% of respondents who were open to follow-up questioning
3
Women, mostly runners, comprised 35% of respondents who were open to follow-up questioning
4
Respondents who reacted positively were 24 years or older, performed their activity 7+ times a week, bought 4+ pairs of shoes per year, and were only moderately satisfied with or indifferent to performance reviews
We sent a second survey to 25 of the follow-up respondents from the first survey. Recognizing these respondents as high activity frequency and performance engagement users, we wanted to understand their preferences for shoe e-commerce and performance review product features and content.

Validating the Idea:
Focus Groups + Concept Test Survey
Working separately with my marketing classmates, we focused on PerFits' market opportunity and consumers' willingness to purchase the concept. We assessed the market readiness of the concept by conducting a focus group and distributing a concept test survey. In each research engagement our main objective was to describe the concept and get respondent feedback; our secondary objective was to build context around user needs, behaviors, and preferences.
Our focus group consisted of mostly undergraduate and graduate students. One of the respondents identified as a woman. Below are our findings:
6 FOCUS GROUP PARTICIPANTS

1
Everyone experienced or recognized that understanding shoe performance is difficult and that retail transactions were inaccessible or too costly
2
Almost all participants were concerned about wearing pre-worn shoes, and a few expressed environmental impact concerns.
3
Participants didn't recognize the community, learning, and ease of use value propositions.
4
3 participants were interested in longer term rentals with the option of buying the shoe.
It's important to note that these participants, apart from two, fell outside the target market segments identified in our previous surveys. We also used no visual aids or prototypes of the solution or comparable solutions during the virtual meeting. We suspect that these factors contributed to the participants' understanding of and reaction to the product concept.
We used our focus group findings to help clarify and refine the problem and concept messaging. We then incorporated the revised concept description into what's called a product concept test.
Companies bringing a new or updated product to market use a concept test to estimate the following:
- Consumer Willingness to Purchase (Trial)
- Consumer Concept Perceptions - Need Relevancy, Uniqueness, Believability
- Target Market Segments
- Consumer Concept Features and Benefits Perceptions
We distributed our concept test through channels similar to those previously stated, but struggled to get a similar volume of responses compared to prior surveys. Below are our findings and lessons learned from the 34 usable responses we received.
21% Trial Rate

1
With 151 opened surveys and only 34 responses, we concluded that respondents didn't enjoy reading a multi-paragraph concept description before answering survey questions.
2
Runners submitted more responses than basketball players, but basketball players were more likely to respond in favor of the concept across all perception questions.
3
Minimal disparity in service price and high disparity in likely usage of test versus rental shoes suggest that there is no incentive to provide rental shoes.