[Cover image: NASA cover.png]

Mixed Reality + NASA JPL

Winter 2017

Augmented Reality for NASA’s Mars 2020 Rover Mission

 🚀

The Mars 2020 Rover Mission control team at JPL faces a fundamental interaction problem—the team's 2D interfaces cannot provide the level of spatial awareness needed to think through complex 3D problems, like rover navigation. This is especially problematic when piloting through uncharted terrain filled with geographical obstacles and occlusions, like steep slopes or sand pits. Spatial misjudgements can undermine the entire mission, and balancing these risks against the demands of the science is only possible through active dialogue and collaboration. 

Our capstone explores how emerging mixed reality technologies like HoloLens can bring to life the Mars topography through scalable, interactive, three-dimensional simulations that anchor to real space. By contextualizing the rover's spatial workflow in a shared immersion, mixed reality can help JPL's cross-functional teams achieve alignment in making sense of complex, high-risk environments, and therefore articulate rover strategy with heightened consensus and clarity. 

 
 

 
 
 

OBJECTIVES

01

Analysis

How can we unlock spatial awareness of the Mars terrain through holographic immersion? How can we more effectively translate two-dimensional data into three-dimensional visualizations?

02

Communication

How can we help teams convey complex ideas to each other in the mixed reality environment? How can we help them annotate and share 3D data? How can this help teams with conflicting needs reach consensus upon rover strategy?

03

Scalability

How can we free collaboration from physical constraints and allow teams to scale holograms across different spaces, contexts, and use cases? How can we afford teams better control over the magnitude, legibility, and location of their data?

 
 
 

TESTIMONIALS

"What would help is to put everyone in that same environment. It’s all about overlapping and finding the common areas that are worth exploring."

— Nathaniel Guy, Lead UX Developer at JPL

"The problem is that people have really different stakes in the process. Scientists focus on the science—they won't pay attention to the limitations of the machine. They need to have some sort of risk assessment with engineers around the actual terrain, instead of wasting time hypothesizing about what can and can’t happen."

— Greg Quetin, Mechanical Engineer at JPL

 
 The team's current 2D tools depict an incomplete map of the Mars topography, inhibiting tactical clarity and obscuring the route planning process. These tools can stitch together 2D photography to reconstruct 3D terrains, but they are still accessed through 2D interfaces that add a layer of visual abstraction and complicate the user’s ability to perceive complex spatial relationships.


 
 
 
 

 
 
 

COMPONENTS OF MIXED REALITY

[Image: 2D 3D Real.png]
 
 

As a mixed reality tool, HoloLens allows us to anchor 2D user interfaces and 3D holograms onto our actual environment. Unlike most virtual reality headsets (e.g. Oculus, Vive) that create stereoscopic environments closed off from the world around us, HoloLens allows us to seamlessly blend digital immersions with the contours of real space. This project aims to provide a blueprint for the kinds of spatial microinteractions that can fluidly stitch together these different dimensions.

 
 
 

 
 
 

SCENARIO ONE

Bringing the Mars terrain to life.

 

The Mars 2020 Rover Mission control team comprises geologists, engineers, and rover planners whose work largely involves analyzing massive swathes of data on screens (e.g. spreadsheets, photographs, charts). As such, 2D interfaces, while sometimes more limiting than 3D, are not obsolete, and the ability to move between the two fluidly is critical. 

In this scenario, you receive an incoming slice of terrain on your device from your team member Rashida, who has been investigating the same terrain elsewhere in 3D. Once you don your HoloLens, the terrain on your screen materializes into 3D. You are prompted to pull it onto any surface, where you may move, scale, rotate, and examine the terrain as you wish.  

 
[Image: Drag n Drop.png]
 

Recorded into the hologram is a message from Rashida, which includes her annotations sketched into the terrain, a replay of her gestures and movement, and a voice note to narrate. 

Subtle markings framing the hologram help anchor it to the surface and give you a better sense of scale and elevation.

The window on your device converts to a dashboard with key supplementary information about the captured terrain, rover, and mission status. 

[Image: Desktop and Hologram New.png]
 
 

 
 

THE REST IS UNDER CONSTRUCTION...TIDYING UP SOME PIXELS...BE BACK SOON!