Creating AR interfaces for future spacesuits to aid astronauts during space exploration

Overview

Designing AR interfaces for NASA’s Artemis spacesuit program

NASA SUITS (Spacesuit User Interface Technologies for Students) challenges university teams to build augmented reality spacesuit information displays. Our holographic interfaces are intended to inspire NASA as an agency to develop technology to assist astronauts exploring the lunar surface during future Artemis missions. Through the NASA SUITS Challenge, our team explored how augmented reality can enhance astronaut performance and safety on the Moon.

Company

NASA SUITS Challenge

Collaborators

Engineers, Designers, Hardware, AI

Timeline

7 Months

Industry

Space Exploration

Role

UX Designer

Tools

Tools

Figma, HoloLens 2, Microsoft Mixed Reality Toolkit, Unity

Problem Statement

How might we help astronauts safely access and act on critical information during EVA, when vision is limited, hands are occupied, and no single interaction method can be trusted?

Specific UX Scope

Within the broader AURA system, my work focused on three UX challenges:

1. Presenting safety-critical vitals without overwhelming the astronaut

2. Enabling reliable geological documentation across EVA and mission control

3. Designing multiple interaction methods for when gloves, lighting, and motion break standard input

Solution Preview

Designed a vitals dashboard optimized for extreme environments, surfacing only the most critical suit and biomedical data through a layered visual hierarchy that minimizes cognitive load and prevents missed alerts.

Created an LMCC web interface that receives and organizes geological sample data from AR in real time, making it easier for mission control to monitor astronauts’ work, validate sample quality, and maintain a clean mission-wide database.

Designed multiple complementary input methods, including eye gaze, physical buttons, and voice, to keep astronaut interaction reliable during EVA.

Problem 1

Astronauts need real-time vitals without being overwhelmed

Due to the limited AR field of view, I explored glanceable vitals layouts to reduce cognitive load. The final direction consolidated everything into a single screen to meet implementation constraints. Within that constraint, I focused on clear visual hierarchy and scanability to keep critical information easy to read.

Initial vitals layout with all telemetry consolidated into a single screen for completeness and system visibility.

Refined vitals layout with clearer grouping and visual hierarchy, improving scanability and reducing cognitive load in a limited AR field of view.

Feedback and Iteration

Quick-look system over dashboard

Feedback from Johnson Space Center reinforced the importance of keeping information out of the astronaut’s central field of view. The evaluator emphasized glanceable, peripheral UI and noted that astronauts often want to “go out of their way” to check information rather than have it constantly intruding on their work.

  1. Kept vitals out of the center of the field of view

  2. Made vitals small, glanceable, and peripheral rather than a dominant panel

  3. Explored future directions for head-down or lower-FOV vitals clusters for quick checks

Problem 2

Geological sampling is cognitively heavy during EVA

During EVA, astronauts must collect geological samples while navigating uneven terrain, managing tools, and communicating with mission control. Sample documentation requires recording location, photos, chemistry, and notes, often under time pressure.

The traditional process is:

  • Multi-step and difficult to reference mid-task

  • Easy to mislabel or lose sample context

  • Hard for LMCC to verify progress in real time

Entering Geo-sample mode, part of the AR workflow for the astronaut during sample collection

Previous web database for LMCC to verify, annotate, and track samples live

Based on earlier feedback, I addressed the UI’s weak hierarchy, redundant fields, and limited feedback.

Designing for Scientific Ambiguity

Supporting geological interpretation

Sample appearance is affected by oxidation states, which can lead to inaccurate interpretations during collection. I expanded the color pickers to support accuracy without hard-coding scientific assumptions.

Free-form color pickers proved unreliable with eye gaze, so we constrained the color options and mirrored those constraints across AR and web for consistency.
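The constrained-palette idea can be sketched as a small shared module: both the AR client and the LMCC web app snap any sampled color to the same fixed set, so a selection made by eye gaze on the headset resolves identically on the web side. The palette names and RGB values below are hypothetical placeholders, not the actual mission palette.

```typescript
// Hypothetical shared palette mirrored between the AR client and the LMCC web app.
type PaletteEntry = { name: string; rgb: [number, number, number] };

const GEO_PALETTE: PaletteEntry[] = [
  { name: "rust red", rgb: [140, 60, 40] },
  { name: "basalt gray", rgb: [90, 90, 95] },
  { name: "regolith tan", rgb: [180, 150, 110] },
  { name: "anorthosite white", rgb: [230, 228, 220] },
];

// Snap a free-form sampled color to the nearest constrained palette entry,
// so eye-gaze selections in AR and web-side edits resolve to the same set.
function snapToPalette(rgb: [number, number, number]): PaletteEntry {
  let best = GEO_PALETTE[0];
  let bestDist = Infinity;
  for (const entry of GEO_PALETTE) {
    const d =
      (rgb[0] - entry.rgb[0]) ** 2 +
      (rgb[1] - entry.rgb[1]) ** 2 +
      (rgb[2] - entry.rgb[2]) ** 2;
    if (d < bestDist) {
      bestDist = d;
      best = entry;
    }
  }
  return best;
}
```

Snapping, rather than free selection, also keeps the mission database clean: every sample record carries one of a handful of comparable values instead of an arbitrary hex code.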

Problem 3

No single input method is reliable during EVA

EVA conditions break standard interaction models.

  • Gloves reduce dexterity

  • Hand tracking fails when hands are occupied

  • Eye gaze can misfire with motion or lighting

  • Voice commands can fail in noisy environments

NASA guidance emphasizes redundancy. No interaction method can be a single point of failure.
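The redundancy principle can be sketched as a simple fallback selector: each command routes through the highest-priority input channel that currently reports as reliable, and the absence of any working channel becomes an explicit state rather than a silent failure. The channel names and priority order here are illustrative assumptions, not the implemented system.

```typescript
// Illustrative sketch of input redundancy: every command can be issued through
// several channels, and the system falls back to the next available one.
type Channel = "eyeGaze" | "physicalButton" | "voice";

// Hypothetical priority order for confirming an action during EVA.
const FALLBACK_ORDER: Channel[] = ["eyeGaze", "physicalButton", "voice"];

// Pick the first channel currently reporting as reliable; return null only if
// every channel is degraded, which the UI should surface as an explicit state.
function selectChannel(available: Set<Channel>): Channel | null {
  for (const channel of FALLBACK_ORDER) {
    if (available.has(channel)) return channel;
  }
  return null;
}
```

For example, when hand tracking and eye gaze both drop out in low light, the same confirm action remains reachable through the physical button or a voice command.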

What I Learned

Business vs. Academic

The start-up experience differs significantly from the design process I would follow in school or passion projects. During my internship, I learned how to incorporate business goals into my designs within a start-up environment. Communicating with developers, marketing, and other teams outside of design was crucial to accomplishing our goals.

Design Constraints

In most passion projects or academic research, there is a lot of ambiguity and freedom in how I approach a problem. While I had multiple ideas for optimizing a design, I had to let go of many of them and choose designs that were easiest to implement and had the highest business value.

Let's create impactful designs together!

Looking for ways to create inclusive experiences that empower users.

genniferhom © 2024
