VR Menu System Design for Visual Impairment Research
A Cross-Scenario Accessibility Tool for Enhanced Usability & Adaptability
Duration: 1 Month
Team: 1 UX/UI Designer (me), 1 Engineer
Responsibilities
- Stakeholder Interviews & Heuristic Evaluation
- Cross-scenario system design
- Collaborate with engineers for testing and iterations

Summary
The project is a system redesign for an accessibility VR research tool, aiming to enable users to experience various Visual Impairment Conditions firsthand and collaboratively brainstorm Accessibility Tools in an immersive setting.
I conducted Stakeholder Interviews and Heuristic Evaluation to identify critical usability and adaptability challenges, as well as key research objectives.
The revamped system design enhanced users’ Contextual Understanding and Navigation Usability, and made the system Adaptable Across Future VR Studies.
Background
The original design presented navigation controls and advanced accessibility terms on a distant flat panel. To create a fully immersive experience, visual effects were applied directly to the entire screen, including the navigation interface itself.

However, this caused severe Usability Problems and required Extensive Pre-Experiment Explanations, which prevented participants from effectively empathising with visual impairment conditions, let alone creatively coming up with new assistive tools afterwards, making it impossible for researchers to achieve their intended study goals.
At the same time, the research team also expected to Expand the system’s Usage beyond visual impairment to broader future VR research use cases.
Faced with these challenges, we began asking the key question:
“How might we enhance usability within the immersive visual impairment experience, while ensuring the system remains adaptable for expanded research use cases?”
Design Process
Research: Conducting Stakeholder Interviews to Understand Core Research Needs
To align the design with research objectives, we conducted in-depth interviews with the Project Owner and the Research Team.

* Research Team demonstrating their key research objectives and describing the challenges they faced
Research Team Stated:
“We want users to fully experience visual conditions in VR to foster empathy, but many reported feeling lost when effects were applied.”
“The experiment is supposed to encourage open-ended exploration with all effects and tools, instead of merely clicking through a predefined effect-tool pair. ”
Synthesis: Identifying Key Challenges
We identified three key insights spanning usability issues and scalability concerns:
Insight #1 - Cognitive Overload & Lack of Context
- The flat grid layout presents all options simultaneously, requiring excessive scanning to track status or to recognise correlations to each visual effect, making it difficult to truly empathise.
- Most users were unfamiliar with the visual impairment terminology, requiring extensive pre-experiment training via lengthy documents or demo videos.

Insight #2 - Effects/Tools Reducing Usability
- Once a visual impairment effect was applied, it altered the entire scene, including the menu itself, making it difficult for users to see, select, or switch options.
- The panel was fixed at a distance, blending into the background interiors. The lack of visual depth difference made it difficult to perceive changes when effects were applied.

Insight #3 - Diverse Use Cases & Interaction Needs
- Beyond the initial Visual Impairment Experiment, the project aimed to expand the system’s use to other VR studies and scenarios.

Design Strategies
To address these challenges, we defined three core design strategies:
1. Understandable Context
Reduce cognitive load by providing clear categories and relevant context descriptions.
2. Intuitive Control
Maintain the Menu’s usability regardless of effects, and enhance spatial depth for better immersion.
3. Scalable System
Ensure the system can support diverse research scenarios with a seamless experience, while keeping the UI consistent and adaptable for future needs.
Final Design Highlights
#1 - Cross-Scenario Hierarchical Menu System
- Integrates two essential global navigation controls, Mic and Locomotion, to provide seamless interaction across research rooms.
- The Layered Menu System reveals only the options relevant at each stage, preventing information overload (see the sketch below).
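To make the layering concrete, here is a minimal, engine-agnostic TypeScript sketch of how such a hierarchical menu model could be represented. The node names, example conditions, and the visibleOptions helper are illustrative assumptions, not the project’s actual implementation.

```typescript
// Hypothetical sketch of a layered menu model: only the children of the
// currently open node are shown, while global controls stay visible.

interface MenuNode {
  id: string;
  label: string;
  children?: MenuNode[];
}

// Global controls (e.g. Mic, Locomotion) are rendered at every stage.
const globalControls: MenuNode[] = [
  { id: "mic", label: "Mic" },
  { id: "locomotion", label: "Locomotion" },
];

// Example hierarchy: scenario -> effect category -> individual effect.
const menuRoot: MenuNode = {
  id: "root",
  label: "Research Menu",
  children: [
    {
      id: "visual-impairment",
      label: "Visual Impairment",
      children: [
        { id: "glaucoma", label: "Glaucoma" },
        { id: "cataract", label: "Cataract" },
      ],
    },
    { id: "assistive-tools", label: "Assistive Tools", children: [] },
  ],
};

// Return only the options relevant at the current stage, plus globals.
function visibleOptions(current: MenuNode): MenuNode[] {
  return [...globalControls, ...(current.children ?? [])];
}

// Usage: drilling one level down narrows what the participant sees.
console.log(visibleOptions(menuRoot).map((n) => n.label));
// -> ["Mic", "Locomotion", "Visual Impairment", "Assistive Tools"]
console.log(visibleOptions(menuRoot.children![0]).map((n) => n.label));
// -> ["Mic", "Locomotion", "Glaucoma", "Cataract"]
```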

#2 - Handle-Attached Menu for Enhanced Usability & Spatial Immersion
- The menu is attached to the left handle whenever it appears, with a pointer from the right handle to hover over or select buttons, ensuring the menu is Always Visible Within Reach with minimal effort needed for selections.
- The increased Visual Depth Contrast between the menu and the distant interior keeps the menu from being obscured by visual impairment effects while creating a greater sense of spatial immersion (see the sketch below).
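Below is a hedged, engine-agnostic TypeScript sketch of this interaction model. The Handle and MenuPanel types, the updateMenuPose and updateSelection helpers, and the offset values are hypothetical stand-ins under the assumption that the menu lives on a render layer excluded from the impairment effect pass; they are not the shipped code.

```typescript
// Hypothetical sketch of the handle-attached menu: the panel follows the
// left handle each frame and lives on a render layer that the impairment
// post-processing does not touch, while a ray from the right handle
// drives hover and selection.

type Vec3 = { x: number; y: number; z: number };

interface Handle {
  position: Vec3;
  triggerPressed: boolean;
}

interface MenuPanel {
  position: Vec3;
  renderLayer: "menu" | "scene"; // "menu" is excluded from the effect pass
  hoveredButton: string | null;
}

// Keep the panel slightly above and in front of the left handle so it
// stays within reach and clearly separated from the distant interior.
const MENU_OFFSET: Vec3 = { x: 0, y: 0.12, z: 0.05 };

function updateMenuPose(menu: MenuPanel, leftHandle: Handle): void {
  menu.position = {
    x: leftHandle.position.x + MENU_OFFSET.x,
    y: leftHandle.position.y + MENU_OFFSET.y,
    z: leftHandle.position.z + MENU_OFFSET.z,
  };
}

// The hovered button comes from a ray cast out of the right handle
// (ray-panel intersection omitted here); pressing the trigger selects it.
function updateSelection(
  menu: MenuPanel,
  rightHandle: Handle,
  hoveredByRay: string | null,
  onSelect: (buttonId: string) => void
): void {
  menu.hoveredButton = hoveredByRay;
  if (rightHandle.triggerPressed && menu.hoveredButton) {
    onSelect(menu.hoveredButton);
  }
}
```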

#3 - Contextual States and Previews as Interaction Guides
- Translates academic terms into intuitive icons, with clear button states indicated (hovered, selected, currently applied), and adds preview dialogues with short descriptions so users know what to expect from each action (see the sketch below).
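As an illustration only, this small TypeScript sketch shows one way the button states and preview descriptions could be modelled. The resolveState helper and the tunnel-vision example content are assumptions made for demonstration, not the study’s actual assets.

```typescript
// Hypothetical sketch of the button-state and preview logic: each effect
// button resolves to one visual state, and hovering surfaces a short
// preview description before the participant commits to applying it.

type ButtonState = "default" | "hovered" | "selected" | "applied";

interface EffectButton {
  id: string;
  icon: string;    // intuitive icon replacing the academic term
  preview: string; // short description shown in a preview dialogue
}

function resolveState(
  button: EffectButton,
  hoveredId: string | null,
  selectedId: string | null,
  appliedId: string | null
): ButtonState {
  if (button.id === appliedId) return "applied";
  if (button.id === selectedId) return "selected";
  if (button.id === hoveredId) return "hovered";
  return "default";
}

// Illustrative example content only.
const tunnelVision: EffectButton = {
  id: "tunnel-vision",
  icon: "icon-tunnel",
  preview: "Narrows your field of view to the centre of the scene.",
};

console.log(resolveState(tunnelVision, "tunnel-vision", null, null)); // "hovered"
```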
