Insight
Roles - lead designer, researcher
Methods - visual design, wireframing, prototyping, user research, user analysis
Tools - Figma
Insight Innovations is developing a mobile app that transforms concussion diagnosis by delivering real-time, on-the-spot insights using only a mobile device. Powered by an AI-driven eye-tracking algorithm, the app helps detect signs of concussion quickly and accurately using a series of three tests: the pupillary light reflex (PLR) test, the vestibular ocular motor screening (VOMS) test, and the Sport Concussion Assessment Tool 6 (SCAT6). Our project focused on redesigning the app’s user interface and experience to make concussion screening fast, intuitive, and reliable—even for first-time users under high-stress conditions.
Here is a link to the Notion site we used as a digital notebook to document and organize the project.
Project Notebook Notion Website
Our Mission
To build a smarter, simpler concussion screening experience
Our goal was to design a concussion screening experience that’s easy to use when it matters most. We focused on making the app simple and clear enough for anyone to administer a test under pressure, while also expanding its features to meet the needs of larger organizations tracking many people’s recoveries. Through user research and testing, we set out to create an interface that feels intuitive, dependable, and ready for real-world use.
Current App Design
Screens from the current app: pupillary light reflex (PLR) test, PLR test results, vestibular ocular motor screening (VOMS) test, and SCAT6 survey
The Solution
More Intuitive Testing
- Real-time guidance and prompts to make PLR and VOMS testing easier and more intuitive
- Calibration step to ensure the device setup and user positioning are correct before starting
- Error prevention through flags for common mistakes, such as incorrect light distance or positioning
Clearer SCAT6 Survey
- Adapted the SCAT6 survey for a mobile-friendly experience
- Reframed prompts as clear, user-friendly questions and simplified complex language
- Added clarification popups for each question
- Built in immediate alerts prompting medical attention if serious symptoms are detected
Customizable Memory Assessment
- Memory assessment questions can be customized to the injury scenario, rather than being limited to sports injuries
- Scenarios include typical concussion causes, such as sports injuries, workplace incidents, and car accidents
- Improves diagnostic accuracy by tailoring the assessment to the specific nature of the injury
Prioritized Results
- Immediate assessment of potential concussion risk
- Guidance on next steps based on test results
- Most relevant results upfront, with detailed data available below
Initial Research
To ground our design process in real-world user needs, we began our project with an in-depth discovery research phase focused on understanding the experiences, expectations, and pain points of those using or impacted by concussion screening tools. Our research combined stakeholder interviews and a review of clinical assessment protocols. The goal: to uncover design opportunities that would improve usability, trust, and accessibility for a broader audience.
We wanted to answer five core questions:
- What do key users (parents, athletes, coaches, and athletic trainers) need in a concussion screening app?
- Where do current tools fall short in usability and guidance?
- How can we design for moments of high stress and uncertainty?
- What functionality is required for organizational use (e.g., sports teams, schools)?
- How do competitor tools and clinical standards inform design best practices?
To explore these questions, we conducted semi-structured interviews with:
- Parents of youth athletes
- College athletes with recent head injury experiences
- Team coaches, especially from sports without on-site medical staff
- Certified athletic trainers who manage concussion protocols daily
What did we find from our research?
Parents need structured, trusted guidance
Parents often face a difficult choice: whether to monitor a head injury at home or seek emergency care. They typically turn to online sources like the Mayo Clinic but lack confidence in knowing when to escalate. They want clear next steps, not medical jargon. Tools that simplify decision-making—especially in borderline cases—were seen as valuable.
Athletes want clarity and education
Athletes shared feelings of uncertainty when going through a concussion screening. While they trust medical professionals, they are often left unsure about their diagnosis and confused about the return-to-play process. They valued symptom checklists but wanted more education and transparency during assessments.
Coaches lack tools and training
Coaches, especially those in sports without traveling medical staff, described feeling helpless when injuries occur off-campus or mid-competition. They rely on athletic trainers but lack visibility into athlete status. They expressed a need for live recovery tracking tools and a simplified, coach-friendly concussion screening method they could use in the field.
Athletic trainers need efficiency and EMR integration
Trainers are responsible for dozens of athletes simultaneously. They praised tools like SWAY for their data richness and integration with medical record systems, but noted that usability suffers when tools become too complex. They emphasized the importance of quick, independent testing and flexibility in selecting appropriate test types.
Competitive Analysis
Current tools are too complex for most users
We evaluated three concussion screening tools: SWAY, Reflex PLR, and SCAT6.
While tools like SWAY and Reflex PLR offer powerful features, their complexity and reliance on prior training make them inaccessible to non-professionals. SCAT6, though clinically trusted, includes confusing terminology and decision flows that are difficult for laypeople to follow under stress.
User Personas
To translate our research findings into actionable design insights, we developed user personas representing key stakeholder groups. These personas capture the goals, pain points, and behaviors of real users we interviewed and help ensure that our design decisions remain grounded in their lived experiences.
Wireframes
We then created low-fidelity wireframes to explore potential design directions.
Initial User Testing
To validate our design decisions and identify opportunities for improvement, we conducted our first round of user testing using a mid-fidelity mobile prototype built in Figma. Our goal was to evaluate the app’s usability across different user types—particularly parents and sports medicine trainers—while simulating realistic concussion screening scenarios.
This round included six participants in total: four athletes and two parents.
Each session followed a structured agenda:
Welcome & Pre-Test Interview
We began by gathering context about participants’ prior experience with concussion care and screening tools. This helped frame their feedback within real-world use cases.
Core Task Flow
Participants completed key workflows in the prototype, including the PLR (pupillary light reflex) test, the VOMS (vestibular/ocular motor screening) test, and the SCAT6 screening. These tasks were presented through scenarios that mimicked real-life situations on the sidelines or at home.
User-Specific Tasks
Parents and trainers were also asked to simulate onboarding flows relevant to their role—setting up the app for the first time and preparing to screen someone for a potential concussion.
Think-Aloud Protocol
Throughout each task, we asked participants to narrate their thoughts to capture their reasoning, hesitations, and moments of confusion in real time.
Post-Test Reflections
After completing the tasks, we gathered feedback through open-ended questions and rating scales to better understand user expectations, task difficulty, and desired improvements.
What did we want to get out of the user tests?
- Assess clarity of instructions and labeling during high-stakes tasks
- Evaluate task flow and identify points of friction or uncertainty
- Understand how different user types interpret test results and follow-up actions
- Collect qualitative feedback to inform design revisions and prioritization
What did we find from the initial user tests?
Instructional Text Was Misinterpreted as Interactive
Many users mistook instruction screens for live test interfaces—particularly during the PLR task—attempting to tap or interact with static text. This revealed a need for clearer visual distinctions between preparatory steps and interactive test elements.
Camera-Based Tests Caused Confusion
Tasks like the VOMS and PLR screenings were the most challenging. Users struggled to align their faces properly and were unsure which camera was being used. Glasses-wearing users also questioned whether they needed to remove them.
Role Clarity Was Lacking in SCAT6
In the SCAT6 screening, users were often unclear whether they were answering questions as the test administrator or as the injured person. This confusion led to inconsistent responses and disrupted task flow.