Human Interfaces Group

Hi! The Human Interfaces Group is part of the Operations Laboratory (Ops Lab) at NASA's Jet Propulsion Laboratory in Pasadena, California. Our mission is to create natural user interfaces in support of NASA space exploration missions. We also conduct forward-looking research to create new modes of human-robot interaction.

Projects

Future Robotic Interfaces

Robotic spacecraft will play a critical role in the future of space exploration. Increasing the efficiency of their operation will allow for more rapid exploration of the solar system. Efficiency is inherently limited by the time delay induced by the distance between operator and spacecraft. Traditional control systems become unpredictable and unresponsive in the presence of time delay. Our system decreases perceived time delay by allowing the user to control the spacecraft at a predicted future state. By predicting and displaying a range of expected future states, safe operation is possible within the modeled uncertainty of the spacecraft's behavior. Work includes RAPID, a Delay-Tolerant Networking protocol.
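The core idea of controlling at a predicted future state can be sketched in a few lines. This is an illustrative toy, not the flight system: the function name, the constant-velocity model, and all numbers are assumptions for the sake of the example.

```python
# Toy sketch of predictive display under time delay (illustrative only;
# names, the constant-velocity model, and all values are hypothetical).

def predict_state(position_m, velocity_mps, one_way_delay_s, uncertainty_mps):
    """Extrapolate where the rover will be when a command sent now arrives,
    along with a growing uncertainty bound the operator must stay within."""
    predicted = position_m + velocity_mps * one_way_delay_s
    uncertainty = uncertainty_mps * one_way_delay_s
    return predicted, uncertainty

# A rover at 10.0 m moving 0.05 m/s, with a 600 s one-way light-time delay:
pos, err = predict_state(10.0, 0.05, 600.0, 0.002)
# The operator commands against the predicted state (about 40 m downrange)
# rather than the stale telemetry, staying inside the uncertainty envelope.
```

Displaying a range of such predictions (one per plausible velocity or terrain model) is what lets the operator judge whether a command is safe across the whole modeled uncertainty.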

Controlling Space Missions

Curiosity is a multi-billion-dollar robot pushing the forefront of human knowledge on another planet. We can’t physically repair the robot, so we must rely on software to operate it safely and efficiently. The time delay of sending signals millions of miles makes real-time operation impossible; instead we must plan ahead and send up a large batch of commands once per day. Additionally, the robot is a large and complex creature with data and power limitations as well as many instruments, not all of which can be operated in parallel. The Curiosity mission operations interface combines image browsing, high-level planning, and low-level commanding and validation in one tightly integrated tool.

Earth Science Data Processing

Earth science gives us a way to better understand our planet and its short and long term changes. By generating hypotheses and creating models, we can improve our ability to predict natural disasters, respond more effectively to current changes, and build our knowledge about the complex interactions between atmosphere, vegetation, ice sheets, volcanic activity, tectonic activity, soil moisture, and many other components of our environment. The Human Interfaces Group is developing a system in which Earth science data processing workflows are run in a cloud computing environment, with a web-based interface for managing, creating, and monitoring the computational processes. Our goal is to decrease total processing time, significantly lower the barriers to participation, and provide a common platform for collaboration.

Deep Space Communication

The Deep Space Network (DSN) is the worldwide antenna network that NASA uses to communicate with all active space missions. Network operators have to manage every bit from every space mission, 24x7x365, through the network's 15 antennas. The Human Interfaces Group develops the system that allows operators to manage this extraordinary flow of data.

Partners in NASA Outreach

Working with the Mars Outreach team at NASA, we have created a suite of interactive 3D experiences for the public (starting with the Spirit MER rover and extending to the Curiosity MSL rover and beyond). Partnering with Microsoft Studios, we released the first NASA console game for free on Xbox Live: Mars Rover Landing.

Multi-disciplinary Data Visualization

We bring together computation, interaction, and visual communication to investigate the Universe. The Data to Discovery symposium is one outlet for this trans-disciplinary collaboration.

People

Scott Davidoff

Dr. Scott Davidoff manages Human Interfaces for Mission Operations at NASA’s Jet Propulsion Lab. As principal investigator for NASA’s Space Networking and Mission Automation program, he leads design and development of JPL’s next-gen robot and spacecraft controls. As principal investigator of JPL’s Data-to-Discovery program, he leads efforts to create new ways to interrogate and interact with planetary-scale datasets. Dr. Davidoff serves on steering committees for the National Science Foundation, the Office of Naval Research, and the Association for Computing Machinery. In his more than 18 years of experience, he has introduced numerous lightweight prototyping methods that have become standard industry software practice. Dr. Davidoff has a Ph.D. in Human-Computer Interaction, an MS in Computer Science (research), and an M.HCI in Human-Computer Interaction (practice), all from Carnegie Mellon.

Alex Menzies

Alex Menzies is a software engineer with a passion for taking the fiction out of science fiction. Since joining the lab in 2007 he has worked on a variety of projects. Currently, he is investigating novel interfaces for controlling high-degree-of-freedom mobile robots with time delay. Some of his other projects include a terrain engine capable of rendering multi-gigapixel HiRISE mosaics in real time over the web, a new Cloud Fraction by Altitude science data product for the MISR mission, and “Mars Rover Landing” – NASA’s first console game. In his spare time he enjoys hacking on new technology, cycling, and attempting to surf.

Alexandra Holloway

Alexandra Holloway, Ph.D., designs interactive user experiences for space missions in the Human Interfaces Group (397F), specializing in user-centered design research. Alexandra’s work with operators of the Deep Space Network led to the design of micro-displays as data-driven mental model realignment tools. Before joining JPL, Alexandra taught human-computer interaction at Mills College and computer organization (architecture) at the University of California, Santa Cruz.

Bryan Duran

Bryan Duran has been a software engineer at JPL since 2011, with a background in game development. His previous work with the Microsoft Kinect sensor led him to JPL, which has many projects involving technologies such as the Kinect, so it was an easy transition. During his first year, Bryan worked mainly on Mars public outreach projects, including the “Mars Rover Landing” game on Xbox Live and several online 3D experiences designed to educate the public about Mars and the newly landed rover, Curiosity. Currently, he works on the Human Robotic Systems project, designing and creating natural interfaces for controlling robots with many degrees of freedom.

Erin Murphy

As an Interaction Designer, Erin Murphy designs user experiences between the environments people inhabit, technologies people develop, and social interactions people share. Erin started her career in Design at the University of Washington in Seattle studying Interaction Design. Erin has worked in partnership with Boeing, Microsoft, and Teague designing user experiences for flight decks, urban planning, and branding. She has been an active member of design communities such as the Interaction Design Association in Seattle (IxDA), IxDA UW, Creative Mornings, and Seattle Design Festival.

At JPL, Erin has worked closely with a variety of missions, contributing user research and design to the OCO-2 Virtual Science Data Environment, the Mars Relay Operations Service (MaROS), Human and Robotic Systems (HRS), and Vortex. Currently, Erin supports visual communication design for the M2020 5-Hour Tactical Prototype. She is also assisting Dr. Sarah Milkovich with M2020 Science System Engineering.

Garrett Johnson

Garrett Johnson joined JPL doing software engineering and interaction design in 2012, with a background in game design and development. He works primarily on the Human Robotic Systems project, researching innovative ways to remotely control robotic systems, including the high-degree-of-freedom robot ATHLETE. Currently, he is working on ways to mitigate the negative effects of moderate time delay while remotely driving rovers, as well as designing software to manipulate a 36-degree-of-freedom robot using a stereo display and a 6-degree-of-freedom stylus.

Jon Blossom

Jon Blossom leads a team devising and implementing new approaches to spacecraft design in concurrent engineering sessions, primarily focused on early formulation and analysis of new mission ideas. Jon joined JPL with over 25 years of experience creating software experiences that reach beyond the keyboard, mouse, and monitor. His past projects have included software libraries, computer games, educational toys, art installations, and theme park attractions, many of which received multiple awards for quality and innovation in their own industries. Jon graduated from college with a degree in Religious Studies, a black belt in shotokan karate, a job writing media systems software for Microsoft, and the seed of his book on programming real-time 3D graphics systems in the very early days of Virtual Reality. He appreciates problems that require a similarly broad and eclectic range of knowledge combined with a creative approach to design and software development.

Krys Blackwood

Krys has been designing user experiences, primarily for e-commerce companies, for the last 20 years, in environments as small as five-person startups and as large as Cisco Systems. She started her career as a hardware technician for IBM and most recently was the Lead User Experience Designer for eHealth Medicare. As a lifelong enthusiast of the space program, Krys leapt at the chance to evangelize user experience at JPL.

Krys, her husband, their two cats, and their puppy all relocated from Silicon Valley about a month ago. Her daughter stayed in San Francisco to finish college. Krys is really enjoying getting to know the area and exploring the Lab. In her spare time, she reads, watches, and writes science fiction, as well as practicing ballroom dance, archery, knitting, sewing, and photography.

Mark Powell

Mark Powell is a senior computer scientist in the Human Interfaces group at the Jet Propulsion Laboratory. Since 2001 Mark has designed, led and implemented software systems to support robotic exploration of earth, sea, air, space and the planets. He co-developed the Science Activity Planner operations interface for the Mars Exploration Rover mission which received the NASA Software of the Year Award in 2004. He served as Cognizant Engineer (Product Lead) of the primary activity planning software for the Mars Science Laboratory rover mission from 2009 to 2012. Currently Mark is leading software design and development efforts in support of Earth science collaboration using the cloud and the development and use of mobile and web applications in Mars rover operations and public outreach. Mark’s areas of interest include visualization, agile development with design and Scrum, mobile applications, mapping, image processing and 3D graphics.

Marsette (Marty) Vona

Marty is an Engineering Applications Software Engineer with over 17 years of experience in applied robotics research and software development including 3D graphical human interfaces. He is currently working in the Ops Lab on augmented reality applications for increased operator awareness and enhanced visual communication.

Marty received an A.B. from Dartmouth College (1999) and M.S. (2001) and Ph.D. (2009) degrees in Electrical Engineering and Computer Science from MIT. From 2010 to 2014 he was an Assistant Professor of Computer Science at Northeastern University, where he developed and taught courses in applied robotics algorithms and software, computer graphics, geometric algorithms, and introductory math and programming for computer science. His work in robotics research includes hardware, software, and algorithms for several novel self-reconfiguring modular robots (1998-2007), new metrology systems and planning algorithms for manufacturing (2000-2005), and climbing and walking robots (2006-14). His most recent research, in 3D perception for robots locomoting in uncertain natural environments (2010-14), was funded by an NSF CAREER award. He has published 30 peer-reviewed papers.

Marty previously worked at JPL from 2001 to 2003 as a software engineer. His main contributions during that period were the 3D visualization and graphical user interface components of the MER Science Activity Planner, a recipient of the NASA Software of the Year Award in 2004. Collaboration with JPL continued as he returned to MIT to complete his Ph.D., which included the development of the MSim environment for operating high-degree-of-freedom robots, including JPL’s ATHLETE.

Marty enjoys being outdoors, hiking, cooking, building and fixing mechanical and electronic things, and spending time with his wife and their pug Cleopatra.

Matt Clausen

Matt works on HoloLens projects at JPL.

Nat Guy

Nat joined the Human Interfaces Group in 2016. Prior to joining JPL, Nat spent five years as a software engineer at Nintendo of America and three years as a video game translator, and has also worked on system software testing for SpaceX and on ground station development for the Hakuto Lunar XPRIZE team. He has Master’s degrees in Computer Science and Aerospace Engineering, and greatly enjoys development at the intersection of the two disciplines. At JPL, he works on 2D data visualization and authoring tools for advanced telemetry analysis, as well as 3D augmented reality applications for mission operations and planning (both projects support current Mars Science Laboratory operations). In his spare time he enjoys hardware hacking projects, traveling to Japan, and writing puzzles.

Parker Abercrombie

Parker Abercrombie works in the intersection of computer science, data analysis, and software engineering. Parker holds an M.A. in Geography from Boston University, and a B.S. in Creative Studies with emphasis in Computer Science (which he swears is more technical than it sounds) from the University of California, Santa Barbara. He has a special interest in geographic information systems, and has worked with teams at NASA and the U.S. Department of Energy on systems for geographic visualization and data management. Parker has also worked with several early-stage start-up companies to turn ideas into technology products. In his spare time, Parker enjoys baking bread and playing the Irish wooden flute.

Tom Crockett

Tom Crockett is a computer scientist and engineer who has developed mission operations software at JPL since 2005. One of his first projects was to design and develop a tiled image viewer that allows smooth panning and zooming around extremely large images, such as the stitched panoramas taken by robots on other planets. He also developed a system for creating and sharing annotations on top of such images, to facilitate collaborative planning. For the Mars Science Laboratory he created an integrated development environment (IDE) for developing robot command sequences, exploiting the close analogy between robot sequencing and programming. His technical interests include programming languages, data visualization, concurrency, and armchair math.
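The trick behind smoothly panning and zooming a gigapixel panorama is a tile pyramid: the image is pre-cut into fixed-size tiles at successively halved resolutions, and the viewer fetches only the tiles visible at the current zoom. A minimal sketch of the coordinate math, assuming a 256-pixel tile size and a standard power-of-two pyramid (both assumptions for illustration, not details of the JPL viewer):

```python
# Sketch of tile-pyramid math for a pan/zoom viewer of very large images.
# TILE size and pyramid layout are illustrative assumptions.
import math

TILE = 256  # assumed tile edge length in pixels

def num_levels(width, height):
    """Levels needed so the coarsest level fits in a single tile.
    Each level up halves the resolution in both dimensions."""
    return max(1, math.ceil(math.log2(max(width, height) / TILE)) + 1)

def tile_for_pixel(x, y, level, max_level):
    """Which tile covers full-resolution pixel (x, y) at a given level?
    At coarser levels one tile covers a larger area of the source image."""
    scale = 2 ** (max_level - level)  # downsampling factor at this level
    return (x // (TILE * scale), y // (TILE * scale))

levels = num_levels(65536, 32768)  # 9 levels for a 65536 x 32768 mosaic
```

Because a tile request is just an integer triple (level, column, row), the viewer can stream exactly the handful of tiles in view rather than the whole multi-gigapixel image.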

Work with us

The Operations Lab creates the software that is used to command all JPL spacecraft (like the Curiosity rover), ranging from desktop to AR and VR systems. The Human Interfaces (HI) group researches, designs, prototypes, and develops the ways users interact with these systems.

Full Time and Internship Opportunities

If you’re interested, please send us a resume. Include a one-paragraph summary of your experience and your goals at JPL. Much of our work is on active spaceflight missions, so unfortunately we can only work with US nationals.

User Research

HI Group User Researchers direct efforts to ground development in observation and evidence. User researchers define critical unknowns, and develop plans to study, interpret and document user needs.

Interaction Design

HI Group Interaction Designers direct efforts to evaluate and translate user needs into actionable insights and system capabilities. They rapidly build and evaluate paper and software prototypes, simulations, and role-play exercises.

Visualization

HI Group Visualization Developers direct efforts to create software that allows scientists and engineers to move, parse, analyze, interact with, and share discoveries from the massive data that NASA instruments and spacecraft generate every day. They build systems that merge alien landscapes with ambiguous sensor data, create new kinds of interactive maps, and express complex robotic controls in a compact visual language.

Keep exploring!