Johns Hopkins Applied Physics Laboratory presents:
2019 xR Symposium

Kossiakoff Center, Laurel, MD
July 23 – 24, 2019

*This event is for APL sponsors, U.S. government employees, and health care industry professionals only. A limited number of presentations will require U.S. citizenship.

REGISTER NOW

Overview

Join us for the Johns Hopkins Applied Physics Laboratory’s inaugural xR Symposium, where we’ll showcase how JHU APL is exploring the use of virtual (VR), augmented (AR), and mixed reality (MR) technologies. Some of the many questions the Symposium will address are:

  • What are the differences between virtual, augmented, and mixed reality?
  • How do I know which is the right technology for my application?
  • What is needed to develop, deploy, and maintain an immersive solution?
  • How do I know which hardware or software products I should be using?

This event will cover technologies including:

  • Robotics and Brain Computer Interfaces
  • Social, Thermal, and Hyperspectral Imaging
  • Spacecraft Design and Integration
  • Wayfinding for Emergency Responders
  • Visualizing Finite Element Analysis
  • Prosthetics
  • Remote Collaboration
  • Visualizing Spacecraft Science Data
  • Emotion Recognition and Deception Perception
  • Haptic Feedback

Attendees will have the opportunity to view and experience demonstrations of various APL VR/AR/MR projects that are making critical contributions to our nation’s current and future challenges in the following application areas:

  • Military and Intelligence
  • Disaster Recovery
  • Space
  • Medical
  • Human Factors Engineering
  • Education

All presentations are geared toward government employees in the DOD, DOE, NASA, Homeland Security, Special Forces, and other departments and agencies.

Agenda

Day 1: Tuesday, July 23

7:00 a.m.–8:00 a.m.
Registration

8:00 a.m.–8:30 a.m.
Welcome and Overview

8:30 a.m.–11:30 a.m.
Session I (Auditorium)

10:00 a.m.–11:30 a.m.
Session I (Dining Area)

11:30 a.m.–1:00 p.m.
Break for Lunch

1:00 p.m.–3:00 p.m.
Session II (Auditorium)

3:00 p.m.–5:00 p.m.
Exhibits and Demonstrations

Day 2: Wednesday, July 24

8:30 a.m.–11:30 a.m.
Session I (Auditorium)

9:30 a.m.–11:30 a.m.
Session I (Dining Area)

11:30 a.m.–1:00 p.m.
Break for Lunch

1:00 p.m.–3:00 p.m.
Session II (Auditorium)

1:00 p.m.–3:00 p.m.
Session II (Classrooms 3 & 4) *U.S. Citizens Only

Presentation Abstracts

Scott Simpkins, Applied Physics Laboratory

We've come a long way! Virtual Reality, Augmented Reality, and Mixed Reality have grown markedly in capability over the last few years, with many new technologies and providers coming to market. This overview defines each XR technology, describes their current capabilities and how they differ, and includes short demos of each. We'll close with the critical challenges facing the future expansion and usability of XR technologies in general, and where we perceive we are on the "Gartner Hype Cycle." Specific examples and demonstrations of XR technologies will follow this overview.

James Dean, Applied Physics Laboratory

Proto-HEAD (Prototype Holograms Enabling Analysis of Data) is an initial proof of concept for visualizing and interacting with 3D data using the Microsoft HoloLens. The application renders 3D data as a point cloud contained within an interactable 3D wire cube, which can be placed onto physical surfaces and resized as needed.
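As a rough desktop analogue of that presentation style (a sketch only, not Proto-HEAD's HoloLens implementation), the open-source Open3D library can render a point cloud inside a wireframe bounding box; the data and color below are placeholders:

    import numpy as np
    import open3d as o3d  # assumed dependency for this sketch

    # Stand-in for real 3D data: 10,000 random points.
    xyz = np.random.rand(10_000, 3)
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz)

    # Wireframe cube around the data, analogous to Proto-HEAD's wire cube.
    box = pcd.get_axis_aligned_bounding_box()
    box.color = (0.2, 0.6, 1.0)

    # Opens an interactive viewer; the cloud can be rotated and zoomed.
    o3d.visualization.draw_geometries([pcd, box])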

Nicholas Vavalle, Applied Physics Laboratory

An anthropomorphic test device (ATD), or crash test dummy, was developed for the Army to provide better injury-prediction capabilities in under-body blast testing. The testing was enhanced with a finite element model to accelerate the design process and provide simulated injury prediction. A method was subsequently needed to let audiences experience the technology firsthand, and augmented reality provided it. This talk will describe how augmented reality lets users connect with the project in their own surroundings while receiving information about the various ATD parts at their own pace.

Blake Schreurs, Applied Physics Laboratory

Most Augmented Reality devices are designed to be used under highly controlled conditions. The AR analysis of alternatives (AoA) characterized the performance of Augmented Reality systems in potential real-world scenarios. How well do they work outdoors? In the dark? How long can an operator use a system in the field? Understanding the conditions under which particular devices can, or cannot, be used for various kinds of field work is instrumental in knowing which devices should ultimately be selected for which tasks.

*Please note: this presentation requires U.S. citizenship.

Kelles Gordge, Applied Physics Laboratory

This use case applies AR to validate a virtual 3D CAD design against a physical environment. The presentation will demonstrate how the design team leveraged the Microsoft HoloLens to give the sponsor a full-scale immersive visualization of a shipboard container outfitted with new equipment.

Arthur Tucker, Applied Physics Laboratory

ARMD is an APL application for enhancing the understanding of the 3D design space and rapidly generating spacecraft paths and mission plans in an immersive environment.

Robert Berardino, Applied Physics Laboratory
Arthur Tucker, Applied Physics Laboratory

ARMOUR X is an APL application for visualizing Earth-orbiting resident space objects (RSOs) and their associated attributes in an interactive and collaborative setting using the Microsoft HoloLens.

Blake Schreurs, Applied Physics Laboratory

This presentation will pose some thoughts about where xR, combined with other emerging technologies such as artificial intelligence (AI), machine learning (ML), and brain-computer interfaces (BCIs), may one day take us. Topics will include future user interfaces, human/machine collaboration, and potential social impacts.

Stephen Bailey, Applied Physics Laboratory

The goal of this project is to incorporate LiDAR data into the warfighter's field of view to effectively provide X-ray vision of their surroundings: in essence, letting them see the enemy behind physical obstacles in the environment. By feeding LiDAR data directly into the HoloLens, we are able to overlay a high-fidelity reconstruction of the warfighter's environment onto their physical surroundings. Attendees will learn how LiDAR and mixed reality can work together.
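A core step the abstract implies is registering LiDAR points into the headset's world frame so they line up with the wearer's surroundings. A minimal sketch of that rigid transform, with placeholder pose values rather than project data:

    import numpy as np

    def to_world(points_lidar: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
        # Apply rotation R and translation t to an Nx3 array of points.
        return points_lidar @ R.T + t

    R = np.eye(3)                    # sensor-to-world rotation (placeholder)
    t = np.array([0.0, 1.5, 0.0])    # sensor offset in meters (placeholder)
    scan = np.random.rand(1000, 3)   # stand-in for a LiDAR scan
    world_points = to_world(scan, R, t)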

David Garber, Applied Physics Laboratory
Jeff Garretson, Applied Physics Laboratory

Post-processing and immersive visualization of 3D geometry and data fields that are prohibitively large or unwieldy is a critical capability. This talk describes a method whereby large datasets are converted into binary formats and stored in memory so that real-time visualization and view manipulation become possible; a sketch of the general idea follows below.

*Please note: this presentation requires U.S. citizenship.
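As a minimal sketch of that general idea (the text-format source file and fixed column count are assumptions, not the talk's actual pipeline): parse the dataset once, dump it as raw binary, and memory-map it for viewing.

    import numpy as np

    def convert_to_binary(ascii_path: str, bin_path: str) -> None:
        # One-time offline conversion: parse the slow text format,
        # then dump a flat float32 block in native byte order.
        data = np.loadtxt(ascii_path, dtype=np.float32)
        data.tofile(bin_path)

    def open_for_viewing(bin_path: str, n_cols: int = 5) -> np.ndarray:
        # Memory-map instead of reading: the OS pages in only what the
        # renderer touches, so multi-gigabyte fields open near-instantly.
        flat = np.memmap(bin_path, dtype=np.float32, mode="r")
        return flat.reshape(-1, n_cols)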

Arthur Tucker, Applied Physics Laboratory
James Curbo, Applied Physics Laboratory

CAMVAS is an integrated, space mission-based, user-defined operating picture (UDOP) to support mission planning, situational awareness during mission execution, and course of action (CoA) analysis before, during, and after the mission.

Julie Marble, Applied Physics Laboratory

ESCAPE is a two-player cooperative virtual reality puzzle-platform game developed for the experimental study of trust. Players must coordinate non-verbally to jointly overcome challenges insurmountable when attempted alone.

Josh Steele, Applied Physics Laboratory

APL is proposing the Dragonfly mission to explore Saturn’s moon Titan and investigate its prebiotic organic chemistry. The mission would use a rotorcraft designed to fly in Titan’s dense atmosphere and low gravity. Instrumentation includes cameras to image the surface, including stereo data that will be used to create three-dimensional representations of Titan’s terrain. We describe a Virtual Reality (VR) environment that allows 3-D immersive operations planning for exploration of Titan. The current implementation simulates Titan with terrestrial data and imagery. As the rotorcraft flies over terrain, multiple perspectives are available, such as Dragonfly camera fields of view as well as third-person views from outside the rotorcraft. The VR environment can also be scripted to create sequences illustrating Dragonfly operations scenarios. The VR environment would be used by engineers and scientists for mission planning and science operations.

Ben Nuernberger, NASA Jet Propulsion Laboratory
Lucien Parsons, University of Maryland
Blake Schreurs, Applied Physics Laboratory
John Grant, Defense Intelligence Agency

Panel members will share their experiences regarding how their organizations began using xR, and discuss successes and challenges in implementing these technologies.

James Dean, Applied Physics Laboratory

This use case describes how virtual reality is a powerful tool for obtaining an initial suitability assessment of prospective installation sites. Using existing digital terrain data, installation specifications, and models of critical infrastructure, the application enables users to gain insight into the feasibility of site construction at locations of interest and to identify potentially superior locations in the surrounding area. Attendees will gain an understanding of the ways VR can benefit contractors in assessing new building locations and their feasibility.

Michael Boyle, Applied Physics Laboratory

This presentation will highlight how you can quickly and easily import a BIM file using Enscape to create a virtual tour of a facility during construction, prior to occupancy. The speaker will highlight benefits realized by the future occupants as well as the construction team. Using the HTC Vive Pro, users can navigate and tour the interior and exterior of a new APL building before construction is completed.

Victoria Scalfari, Applied Physics Laboratory

VR/AR solutions do not fit every use case. This presentation explores methods for matching suitable VR/AR applications to real-world problems, exploiting the strengths of AR/VR based on the driving need rather than adopting the technology because it is new and trendy.

Blake Schreurs, Applied Physics Laboratory

This is a proof-of-concept processing, exploitation, and dissemination (PED) system designed with a spatial-first ideology. The system allows analysts and warfighters to work with intelligence information in virtual reality, on the desktop, or on a mobile device, presenting data in the form most appropriate for a given client device. The project accelerates the analysis process for 3D data, improves the timeliness of dissemination, and seeks to establish a more effective visualization mechanism than is currently provided to today's analysts and warfighters.

Alexandra Gonzalez, Applied Physics Laboratory

Consumer-grade VR/AR technology has made it affordable to distribute immersive systems for complex environments. Research has been conducted on mixing VR components with mechanical components. Combining them is not trivial and presents a considerable learning curve; however, the real-world haptic feedback makes this an attractive offering.

Ariel Greenberg, Applied Physics Laboratory

IN:UREfACE (Investigating Nonverbals: Using mixed-Reality Environments for Augmented Consideration of Emotion) is a prototype mixed-reality system that accentuates the expressions of an emoting individual by overlaying real-time psychophysiological information on the subject’s face.

Ben Nuernberger, NASA Jet Propulsion Laboratory

The NASA Jet Propulsion Laboratory is improving procedure execution using augmented reality (AR) for various space applications. This talk will discuss a controlled user experiment comparing AR instructions to paper instructions, and will explore the use of AR in NASA's NEEMO missions at the underwater Aquarius habitat and by astronaut Scott Kelly on the International Space Station, among other efforts.

Billy Gallagher, NASA Goddard Space Flight Center
Thomas G. Grubb, NASA Goddard Space Flight Center

This presentation will briefly describe the work that NASA Goddard Space Flight Center's AR/VR Pilot Program Initiative has been doing to apply AR and VR to science and engineering. We will present a high-level overview of applications we are developing to visualize and interact with science data in Earth Science, Planetary Science, Astrophysics, and Heliophysics. We will also present our work on a mixed-reality toolkit targeted at engineering domains throughout the spacecraft and instrument lifecycle, discussing the toolkit's high-level features with a deeper dive into how it is being used for the Restore-L mission.

Ariel Greenberg, Applied Physics Laboratory

The Novel Perception project seeks to enhance users' awareness of the world through naturalistic access to sources of information beyond the basic human senses. The system collects signals from a variety of sensors, synchronizes them in real time, registers them in real space, and then overlays them onto the real world by displaying them through a mixed-reality headset. The team has implemented social, thermal, and hyperspectral overlays, with plans to develop physiological, radiological/nuclear, and radio frequency overlays. The team will further explore the use of artificial intelligence and brain-computer interfaces to manage the volume and velocity of the data streaming from these additional sensor modalities.
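One step in that pipeline, real-time synchronization of asynchronous streams, can be illustrated with nearest-timestamp alignment; the stream names, rates, and tolerance below are assumptions, not the project's schema:

    import pandas as pd

    thermal = pd.DataFrame({
        "t": pd.to_datetime([0.000, 0.033, 0.066], unit="s"),
        "thermal_frame": [101, 102, 103],
    })
    hyperspectral = pd.DataFrame({
        "t": pd.to_datetime([0.010, 0.050], unit="s"),
        "hs_frame": [7, 8],
    })

    # Pair each thermal frame with the most recent hyperspectral frame
    # no more than 40 ms older; unmatched rows get NaN.
    fused = pd.merge_asof(thermal, hyperspectral, on="t",
                          tolerance=pd.Timedelta("40ms"),
                          direction="backward")
    print(fused)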

James Howard, Applied Physics Laboratory

When disaster strikes, what was once isn't, and what remains is unrecognizable. From first responders to insurance claims agents, the ability to understand the way things were is critical for the response, rescue, and recovery phases of disaster management. We will build a system that uses data from Google Maps and official records to create an AR overlay for goggles showing the world as it was.

Jacob Hoffman, Applied Physics Laboratory

Using prefabs in Unity greatly increases code reuse and lowers the risk of having non-coders on a development team. This presentation will explore the benefits of prefabs and discuss some of the more advanced ways to manage them.

Devin Hahne, Applied Physics Laboratory
Felipe Ruiz, Applied Physics Laboratory

This talk centers on the concept of experiencing design intelligence. It explains how two years of applying the Jet Propulsion Laboratory's ProtoSpace AR application have benefited the Parker Solar Probe and Europa Clipper missions, and looks ahead to what could come next.

Brock Wester, Applied Physics Laboratory
Jordan Matelsky, Applied Physics Laboratory

The geographic movement of individuals and assets is a complicated maze of relational data, further complicated by the individuals' relationships or allegiances to organizations and regions. Understanding this depth and complexity of information is difficult even on purpose-built systems using conventional compute architectures. APL's MINARD project upgrades the "war room" to a virtual reality space. The system provides analysts with a collaborative and secure virtual environment in which they can interact with and study complex and noisy data, such as alliances, the transit of individuals or groups through 3D space, and the evolution of relationships through time. APL engineers designed intelligent visualization systems to bring the best of human intuition to state-of-the-art virtual reality, with human-machine teams interacting both through the VR headset and at a conventional computer terminal.

Blake Schreurs, Applied Physics Laboratory
Arthur Tucker, Applied Physics Laboratory

VESPAR is an APL educational tool for comprehending complex relationships of observational data by experiencing environmental data from the point of view of the Parker Solar Probe (PSP) spacecraft in AR.

Matthew Fifer, Applied Physics Laboratory
Brock Wester, Applied Physics Laboratory

Learn how APL has integrated VR and AR technology into its prosthetic and assistive technology efforts to facilitate development, patient training, and clinical testing. Several of these training scenarios have also been adapted for STEM outreach to middle and high school students as a way to discuss robotics, prosthetics, and neuroscience.

Blake Schreurs, Applied Physics Laboratory

Being able to visualize a constrained space at human scale on an immersive platform is extremely helpful when addressing human factors requirements. Using 3D CAD models and Unity enables rapid ideation and design changes that accommodate human factors such as height (reachability) and safety.
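For illustration only (the geometry and dimensions below are placeholders, not APL data), the kind of reachability question such a review answers can be reduced to testing whether a control point lies inside an operator's reach envelope:

    import numpy as np

    def reachable(shoulder: np.ndarray, control: np.ndarray, arm_reach_m: float) -> bool:
        # True if the control lies within a spherical reach envelope
        # centered on the shoulder point.
        return float(np.linalg.norm(control - shoulder)) <= arm_reach_m

    shoulder = np.array([0.0, 1.35, 0.0])    # shoulder point in cabin frame (m)
    control = np.array([0.45, 1.10, 0.50])   # control location (m)
    print(reachable(shoulder, control, arm_reach_m=0.64))  # -> False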

Victoria Cambell, Applied Physics Laboratory
Ashley Morse, Applied Physics Laboratory
Suman Sajjan-Woolums, Applied Physics Laboratory

Paratroopers must monitor the GPS strapped to their chest and the altimeter and compass on their wrist, all while maintaining situational awareness of their surroundings in mid-air. Focusing on these metrics while maneuvering creates a stressful environment that WARrIOR (Warfighter Augmented Reality for Operational Readiness) aims to mitigate. WARrIOR co-locates relevant user information, such as altitude, glide ratio, wind direction and speed, and target navigation, onto a GUI projected by an APL-built AR system (a sketch of one such metric follows below). The hardware's form factor is low-profile enough to fit under the user's existing eyewear, and its modular design allows sensors and tools to be upgraded or replaced. WARrIOR aims to bring the information the user needs into one place while fitting in with current sensors and eyewear.

*Please note: this presentation requires U.S. citizenship.
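As promised above, a hypothetical sketch of one of those metrics: glide ratio is horizontal speed divided by descent rate (the readings below are placeholders, not WARrIOR data):

    import math

    def glide_ratio(ground_speed_ms: float, descent_rate_ms: float) -> float:
        # Horizontal distance covered per unit of altitude lost.
        if descent_rate_ms <= 0:
            return math.inf  # level flight or climbing: no altitude lost
        return ground_speed_ms / descent_rate_ms

    # E.g., 12 m/s over the ground while sinking at 5 m/s.
    print(glide_ratio(ground_speed_ms=12.0, descent_rate_ms=5.0))  # -> 2.4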

Chad Smith, Space Telescope Science Institute

Recent advances in the field of XR, or cross reality, have created new opportunities for communicating complex information through virtual mediums. This talk will focus on WebbVR, a virtual reality public-outreach product created by the Space Telescope Science Institute to inform and educate the public about the James Webb Space Telescope and the astrophysics Webb is designed to explore. We will cover a variety of topics, from 3D user interface and experience design, targeted audiences, and learning objectives to the benefits and limitations of virtual reality in public communications.

*Please note: a limited number of presentations will require U.S. citizenship.

Venue

The Johns Hopkins University Applied Physics Laboratory
Kossiakoff Center
11100 Johns Hopkins Road
Laurel, MD 20723

Campus Map and Directions

Contact Info

Nick DeMatt
Nick.DeMatt@jhuapl.edu