NASA SUITS (Spacesuit User Interface Technologies for Students) is a prestigious design challenge held by NASA in which college students from across the country develop user interface solutions for future spaceflight needs.
In September 2022, I took on the role of co-president of the NASA SUITS club at RISD. Leading a team of over 20 students from Brown, JHU, and RISD, I was responsible for project management, organizing workshops, creating timelines, meeting with mentors, and providing high-level UX direction. I also created the design system that earned us finalist recognition and the award for best visual design among the competing teams.
Sep 2022 - May 2023
RISD SUITS Design AR
Figma
VR
Microsoft HoloLens
Lead UX Designer
Leadership
Project Manager
Design system
The goal of the project is to design an experience that helps astronauts complete critical mission tasks smoothly despite the harsh environmental and physical constraints.
We were honored to be accepted as finalists and invited to NASA Johnson Space Center to conduct usability testing and present our work to NASA evaluators and astronauts.
We developed a tool that allowed our end users to navigate the test site and collect geological samples with perfect accuracy.
To begin, I made sure I understood the tasks astronauts need to complete on the lunar surface. This process allowed me to identify the questions that needed to be asked during the interview phase.
After analyzing the key user flow provided by NASA, I concluded that there are four critical tasks the astronauts need to complete on the lunar surface.
Using instructions and tools from the team-designed UI, the design evaluator will conduct egress procedures in a mock airlock by interfacing with the Umbilical Interface Assembly (UIA).
Next, they will exit the airlock and navigate the test site with the guidance of the test conductor, dropping waypoints as a breadcrumb trail for later return navigation.
Upon arrival at the geology site, they will perform a mock spectrometry task using a radio-frequency identification (RFID) sensor and receive scan data from the telemetry stream.
They will then pilot and command the rover, directing the cart to travel to specific locations and pick up samples for analysis.
I led my team in organizing interviews with three major subgroups: astronauts, field geologists, and augmented reality specialists. From our findings, we identified common themes and pinpointed the following four pain points, which informed how we approached our designs later.
Due to multitasking under high pressure and the tight schedule of lunar missions, astronauts may feel overwhelmed and struggle with processing information.
The highly pressurized astronaut suit makes mission tasks tougher to complete because it's more difficult to grip, walk, and move.
Current headsets have small viewports, requiring users to make precise movements, which makes tasks like documenting geological samples difficult for astronauts.
Due to the harsh lighting conditions, the unfamiliar lunar surface appears homogeneous, making it laborious for astronauts to identify hazardous craters and sharp rocks.
With these pain points in hand, we distilled three design principles to guide our work.
Since the gloves make movement extra difficult, we want to ensure the design can be easily accessed with minimal motion.
We want to minimize text and ensure information can be digested at first glance, always employing "less is more".
We want to make sure the colors stay vivid despite low lighting, and that assets are positioned centrally to compensate for the small viewport.
After determining the key pain points, the team went through lightning rounds of brainstorming, generating concept sketches for each pain point.
After the brainstorming session, we narrowed down our final design features through voting. Since the tasks follow a linear sequence, we decided to create a distinct design mode for each task. Here are the directions.
Egress mode
A clear list of tasks
Egress follows a linear process, so the user will have a checklist indicating the next steps. Once each task is completed, it will be marked off the list.
Navigation mode
Compass and direction
A compass will be positioned at the top of the interface, providing users with directional information and their distance from the home base (a sketch of this computation appears after the mode list below).
Map, breadcrumbs, and marks
An interactive map will be accessible from the upper right corner. Once opened, the map will feature small icons that can be used to add breadcrumbs and markers.
Geological sampling mode
Sample information
Once the user scans the sample, detailed information will appear in a pop-up window on the right side of the screen.
Sample checklist
Users will be able to view checklists of the samples they need to investigate on the lunar surface.
Rover command mode
A remote controller
The user will be able to remotely control the cart via a controller with arrows for four-directional movement.
A map showing the position
When entering rover mode, the map in the upper right corner will switch to a rover map, displaying the route and position of the cart.
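Following up on the compass direction above, here is a minimal sketch of how the heading arrow and home-base distance readout could be computed. It assumes flat local coordinates in meters (a reasonable simplification for a small test site); the names `Position`, `CompassReadout`, and `compassToHome` are hypothetical, not from the actual build.

```typescript
// Hypothetical types for the sketch; all names are illustrative.
interface Position { x: number; y: number } // meters east/north of the site origin

interface CompassReadout { bearingDeg: number; distanceM: number }

// Compute what the compass shows: the bearing of home base relative to the
// user's current facing, and the straight-line distance back to base.
function compassToHome(user: Position, headingDeg: number, homeBase: Position): CompassReadout {
  const dx = homeBase.x - user.x; // east offset
  const dy = homeBase.y - user.y; // north offset
  const absoluteBearing = (Math.atan2(dx, dy) * 180) / Math.PI; // 0 deg = north
  const bearingDeg = ((absoluteBearing - headingDeg) % 360 + 360) % 360;
  const distanceM = Math.hypot(dx, dy);
  return { bearingDeg, distanceM };
}
```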
However, just as we finished the wireframes and were ready to hand them off to the development team, we were informed that developing all the modes within the given time frame would not be possible. We would need to make cuts.
As the usability test dates approached, I decided to use the opportunity to identify features to cut by testing with five NASA participants. The results revealed a glaring issue: with less than 40% of tasks completed successfully, the wireframes were clearly difficult to understand and navigate without external help.
[Chart: first usability test results across eight tasks — complete egress, open map, add breadcrumbs, understand navigation, understand sample, sample checklist, control rover, and mode switch — with fewer than 40% of tasks completed successfully.]
Based on the usability test and follow-up interviews, these were the most important insights to address in the next iteration.
“What’s the red triangle? Warning? Arrow? Volcano? The key thing is about making the buttons consistent in word and icon choice to make it easily understandable every time.”
“It would help to merge ROVER command into the navigation map, since both you and the vehicle are navigating”
“Icons have no outline. Words are too small, line weight is too thin. Warning signs & top right notifications are too big and block my view”
As noted during testing, many testers found the features overwhelming. Based on our prior research, we further reduced the number of features: we merged the navigation and rover functionalities, removed unnecessary features, and ensured the remaining elements were appropriately prioritized.
The headset's blind spots can obstruct vision. To address this drawback, I repositioned the assets within the field of view so that key information is visible at the right time.
To address the inconsistent and ambiguous icons, I rebuilt the entire design system around three pillars: simplicity, glanceability, and visual prominence. To account for limited locomotion, the icons are larger than usual, and color saturation is much higher to accommodate vision in low-light conditions.
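To make the pillars concrete, here is a hedged sketch of what such a design system might look like expressed as tokens. The pillar names and constraints (oversized targets, high saturation) come from the case study; the specific values below are illustrative placeholders, not the shipped spec.

```typescript
// Illustrative design tokens only; values are placeholders, not the shipped spec.
const suitsDesignTokens = {
  // Simplicity: one accent hue, minimal palette
  colorAccent: "hsl(210, 95%, 60%)",  // highly saturated for low-light legibility
  colorWarning: "hsl(10, 95%, 55%)",
  // Glanceability: large type and heavy line weights, readable at a glance
  fontSizeBodyPt: 18,
  iconStrokePx: 3,
  // Visual prominence + limited locomotion: oversized, centered hit targets
  minTargetAngularSizeDeg: 3.5,
} as const;
```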
To ensure clear communication and avoid the missteps of previous implementations, I collaborated closely with our developers to integrate the new designs into the Mixed Reality Toolkit 3 (MRTK3). I held bi-weekly meetings with the development team and maintained regular communication through Zoom, Slack, and GitHub.
By opening the map, astronauts can drop points, delete them, and record voice messages. A breadcrumb trail of waypoints is automatically dropped every 10 meters to help astronauts navigate home.
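A minimal sketch of the automatic breadcrumb rule, assuming position updates in flat local coordinates: the 10-meter interval is from the design, while the class and method names here are hypothetical.

```typescript
const BREADCRUMB_INTERVAL_M = 10; // from the design: drop a waypoint every 10 m

interface Position { x: number; y: number } // meters in local site coordinates

class BreadcrumbTrail {
  private crumbs: Position[] = [];

  // Called on each position update from the headset's tracking
  update(current: Position): void {
    const last = this.crumbs[this.crumbs.length - 1];
    const movedFar =
      !last || Math.hypot(current.x - last.x, current.y - last.y) >= BREADCRUMB_INTERVAL_M;
    if (movedFar) this.crumbs.push(current);
  }

  // Reversing the trail yields the route back to home base
  routeHome(): Position[] {
    return [...this.crumbs].reverse();
  }
}
```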
Scanning with the spectrometer automatically triggers a 3-photo burst immediately after the scan. The photos are then autosaved into the sample menu to record sample information.
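A sketch of the scan-triggered burst, assuming a simple async camera API; `capturePhoto` and `saveToSampleMenu` are hypothetical stand-ins for the real headset and telemetry hooks.

```typescript
// Hypothetical device hooks; the real project integrated through MRTK3/Unity.
declare function capturePhoto(): Promise<Blob>;
declare function saveToSampleMenu(sampleId: string, photos: Blob[]): Promise<void>;

const BURST_COUNT = 3; // from the design: a 3-photo burst after each scan

async function onScanComplete(sampleId: string): Promise<void> {
  const photos: Blob[] = [];
  for (let i = 0; i < BURST_COUNT; i++) {
    photos.push(await capturePhoto()); // fire the captures back to back
  }
  await saveToSampleMenu(sampleId, photos); // autosave against the scanned sample
}
```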
When a ROVER destination pin is placed on the map, a straight line automatically appears connecting the ROVER's current location to the pin, indicating that the ROVER is traveling to that destination.
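One way to express that destination-line behavior, with a hypothetical `MapRenderer` abstraction standing in for the actual map component:

```typescript
interface Position { x: number; y: number }

// Hypothetical map abstraction; the real UI rendered through MRTK3.
interface MapRenderer {
  clearLayer(layer: "roverRoute"): void;
  drawSegment(from: Position, to: Position, layer: "roverRoute"): void;
}

// Redraw the straight destination line whenever the rover moves or the pin changes
function updateRoverRoute(map: MapRenderer, rover: Position, pin: Position | null): void {
  map.clearLayer("roverRoute");
  if (pin) map.drawSegment(rover, pin, "roverRoute"); // rover -> destination
}
```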
I wanted to make sure the full experience worked and made sense, so I conducted a full usability test with another five people. Then the team headed to Houston for NASA's on-site testing. Here are the results.
[Chart: final usability test results for the same eight tasks — complete egress, open map, add breadcrumbs, understand navigation, understand sample, sample checklist, control rover, and mode switch.]
From May 18th to the 23rd, my team and I were invited to Houston to test and present our design at Johnson Space Center. Unfortunately, due to conflicts with final exams, I was not able to join the team. However, here is RISD's final presentation!
During the feature creation process, our team's initial goal was to build more features. However, sometimes a minimal feature set can accomplish more than we imagined.
Onboarding the development team earlier in the process would provide more time to refine our assets and design, as engaging with engineers and developers early on helps ensure feature feasibility.
Since this was a student club, there were times when people were busy with school. As a leader, I learned to build flexibility into the project schedule, accepting that things could go wrong.