*This event is for APL sponsors, U.S. government employees, and health care industry professionals only. A limited number of presentations require U.S. citizenship.
Join us for the Johns Hopkins Applied Physics Laboratory’s inaugural xR Symposium, where we’ll showcase how JHU APL is exploring the use of virtual reality (VR), augmented reality (AR), and mixed reality (MR) technologies. Some of the many questions the Symposium will address are:
This event will cover technologies including:
Attendees will have the opportunity to view and experience demonstrations of various APL VR/AR/MR projects that are making critical contributions to addressing our nation’s current and future challenges in the following application areas:
All presentations are geared toward government employees from the DOD, DOE, NASA, Homeland Security, Special Forces, and other departments and agencies.
Proto-HEAD (Prototype Holograms Enabling Analysis of Data) is an initial proof of concept for visualizing and interacting with 3D data using the Microsoft HoloLens. The application renders 3D data as a point cloud contained within an interactive 3D wireframe cube, which can be placed on physical surfaces and resized as needed.
An anthropomorphic test device (ATD), or crash test dummy, was developed for the Army to provide better injury-prediction capabilities in under-body blast testing. The testing was enhanced with a finite element model to accelerate the design process and provide simulated injury prediction. A new method was subsequently needed to let audiences experience the technology firsthand. This talk will describe how augmented reality allows users to connect with the project in their own surroundings while learning about the various ATD parts at their own pace.
Most augmented reality devices are designed to be used under highly controlled conditions. The AR Analysis of Alternatives (AoA) characterized the performance of AR systems in potential real-world scenarios. How well do they work outdoors? In the dark? How long can an operator use a system in the field? Understanding the conditions under which certain devices can, or cannot, be used for various kinds of field work is instrumental in knowing which devices should ultimately be selected for certain tasks.
*Please note: this presentation requires US citizenship.
Displaying augmented reality designs at 1:1 true scale adds a sense of realism and depth to a project, giving users a powerful tool for informing design decisions as if the final product were right in front of them. Gain insight into how APL’s mechanical design engineers are using augmented reality at every step of the design process, from early prototype brainstorming and modeling to displaying final products and capabilities in fully realized space.
ARMD is an APL application for enhancing the understanding of the 3D design space and rapidly generating spacecraft paths and mission plans in an immersive environment.
ARMOUR X is an APL application for visualizing Earth-orbiting resident space objects (RSOs) and their associated attributes in an interactive and collaborative setting using the Microsoft HoloLens.
AR takes many forms and is available on many devices, yet it remains challenging to accomplish, with many technical issues still to be solved. This presentation will give a few examples of those challenges and cover some (of the many) AR developments to date. Peddie believes AR will change the social structure of our lives.
The goal of this project is to incorporate LiDAR data into the warfighter’s field of view, effectively providing X-ray vision of their surroundings: in essence, letting them see the enemy behind physical obstacles in the environment. By feeding LiDAR data directly into the HoloLens, we are able to overlay a high-fidelity reconstruction of the warfighter’s environment onto their physical surroundings. Attendees will learn how LiDAR and mixed reality can work together.
The ability to post-process and immersively visualize 3D geometry and data fields that would otherwise be prohibitively large or unwieldy is critical. This talk describes a method whereby large datasets are converted into binary formats and stored in memory such that real-time visualization and view manipulation become possible.
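The abstract does not detail the speakers’ actual pipeline; as a minimal illustrative sketch of the general technique, assuming a whitespace-separated x/y/z point-cloud text file (the file and function names below are hypothetical), the data can be packed into a flat binary file and memory-mapped so a renderer pages in only the points it actually draws:

    import numpy as np

    def convert_to_binary(txt_path, bin_path):
        """Pack an x/y/z text point cloud into a flat float32 binary file."""
        points = np.loadtxt(txt_path, dtype=np.float32)  # parses rows of "x y z"
        points.tofile(bin_path)  # raw float32 bytes in native byte order

    def open_for_visualization(bin_path):
        """Memory-map the binary file; the OS loads only the pages that are touched."""
        flat = np.memmap(bin_path, dtype=np.float32, mode="r")
        return flat.reshape(-1, 3)  # (N, 3) view of the points, no copy

Loading the binary form skips text parsing entirely, which is what makes real-time view manipulation over very large datasets feasible in this kind of scheme.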
*Please note: this presentation requires US citizenship.
CAMVAS is an integrated, space mission-based, user-defined operating picture (UDOP) to support mission planning, situational awareness during mission execution, and course of action (CoA) analysis before, during, and after the mission.
ESCAPE is a two-player cooperative virtual reality puzzle-platform game developed for the experimental study of trust. Players must coordinate nonverbally to jointly overcome challenges that are insurmountable when attempted alone.
APL is proposing the Dragonfly mission to explore Saturn’s moon Titan and investigate its prebiotic organic chemistry. The mission would use a rotorcraft designed to fly in Titan’s dense atmosphere and low gravity. Instrumentation includes cameras to image the surface, including stereo data that will be used to create three-dimensional representations of Titan’s terrain. We describe a virtual reality (VR) environment that allows 3D immersive operations planning for exploration of Titan. The current implementation simulates Titan with terrestrial data and imagery. As the rotorcraft flies over terrain, multiple perspectives are available, such as Dragonfly camera fields of view as well as third-person views from outside the rotorcraft. The VR environment can also be scripted to create sequences illustrating Dragonfly operations scenarios. During the mission, the VR environment would be used by engineers and scientists for mission planning and science operations.
Members of the panel will share their experiences with how their organizations got started with xR and answer questions from the audience. Attendees can expect to hear tips on what worked as well as what didn't work.
Mobilizing staff across various portions of a large organization to create a cooperative community of practice doesn’t happen by accident. Though ultimately a grassroots effort, these communities require planning and intentional action. This presentation will discuss the process APL staff used to establish a community of practice, the impact it has had on the organization, how it drives value for sponsors, and the fundamental questions that need to be addressed to establish such a community.
This presentation will highlight how you can quickly and easily import a BIM file using Enscape to create a virtual tour of a facility that is still under construction, prior to occupancy. The speaker will highlight benefits realized by the future occupants as well as the construction team. Using the HTC Vive Pro, users can navigate and tour the interior and exterior of a new APL building before construction is completed.
VR/AR solutions do not fit every use case. This talk explores methods for matching suitable VR/AR applications to real-world problems, exploiting the strengths of AR/VR based on the driving need rather than using the technology because it’s new and trendy.
This is a proof-of-concept Processing, Exploitation, and Dissemination (PED) system designed with a spatial-first ideology. The system allows analysts and warfighters to work with intelligence information in virtual reality, on the desktop, or on a mobile device, presenting data in the form most appropriate for a given client device. The project accelerates the analysis process for 3D data, improves the timeliness of dissemination, and seeks to establish a more effective visualization mechanism than is currently provided to today’s analysts and warfighters.
Consumer-grade VR/AR technology has made it affordable to distribute immersive systems for complex environments. Research has been conducted on mixing VR components with mechanical components. The integration is not trivial and presents a considerable learning curve; however, the real-world haptic feedback makes this an attractive offering.
IN:UREfACE (Investigating Nonverbals: Using mixed-Reality Environments for Augmented Consideration of Emotion) is a prototype mixed-reality system that accentuates the expressions of an emoting individual by overlaying real-time psychophysiological information on the subject’s face.
The NASA Jet Propulsion Laboratory is improving procedure execution using augmented reality (AR) for various space applications. This talk will discuss a controlled user experiment comparing AR instructions to paper instructions, as well as explore the use of AR in NASA’s underwater NEEMO Aquarius habitat and by Astronaut Scott Kelly on the International Space Station, and more!
This presentation will briefly describe the work that NASA’s Goddard Space Flight Center AR/VR Pilot Program Initiative has been doing on applying AR and VR to science and engineering. We will present a high-level overview of applications that we are developing to visualize and interact with science data in Earth Science, Planetary Science, Astrophysics, and Heliophysics. We will also present our work on a mixed-reality toolkit targeted at engineering domains throughout the spacecraft and instrument lifecycle, discussing high-level features of the toolkit with a deeper dive into how it is being used for the Restore-L mission.
The Novel Perception project seeks to enhance users’ awareness of the world through naturalistic access to sources of information beyond the basic human senses. The system collects signals from a variety of sensors, synchronizes them in real time, registers them in real space, and then overlays them onto the real world by displaying them through a mixed-reality headset. The team has implemented social, thermal, and hyperspectral overlays, with plans to develop physiological, radiological/nuclear, and radio frequency overlays. The team will further explore the use of artificial intelligence and brain-computer interfaces to manage the volume and velocity of the data streaming from these additional sensor modalities.
We’ve come a long way! Virtual reality, augmented reality, and mixed reality have grown in capability over the last few years, with many new technologies and technology providers coming to market. This overview provides definitions of each XR technology, their current capabilities, and how they differ, as well as short demos of each. We close with the critical challenges facing the future expansion and usability of XR technologies in general, and where we perceive we are on the “Gartner Hype Cycle.” Specific examples and demonstrations of XR technologies will follow this overview.
When disaster strikes, what was once isn’t, and what remains is unrecognizable. From first responders to insurance claims agents, the ability to understand the way things were is critical for the response, rescue, and recovery phases of disaster management. We will build a system that uses data from Google Maps and official records to create an AR overlay for goggles showing the world as it was.
Using prefabs in Unity greatly increases code reuse and lowers the barrier to having non-coders on a development team. This talk will explore the benefits of prefabs and discuss some of the more advanced ways to manage them.
This talk accentuates the concept of experiencing design intelligence. It explains how two years of applying the Jet Propulsion Laboratory's ProtoSpace AR application have benefited Parker Solar Probe and Europa Clipper, and looks ahead to what could be next.
The geographic movement of individuals and assets is a complicated maze of relational data, further complicated by the individuals’ relationships or allegiances to organizations and regions. Understanding this depth and complexity of information is difficult even on purpose-built systems using conventional compute architectures. APL's MINARD project upgrades the “war room” to a virtual reality space. The system provides analysts with a collaborative and secure virtual environment in which they can interact with and study complex and noisy data, such as alliances, the transit of individuals or groups through 3D space, and the evolution of relationships through time. APL engineers designed intelligent visualization systems to bring the best of human intuition to state-of-the-art virtual reality, with human-machine teams interacting both through the VR headset and at a conventional computer terminal.
VESPAR is an APL educational tool for comprehending complex relationships of observational data by experiencing environmental data from the point of view of the Parker Solar Probe (PSP) spacecraft in AR.
Learn how APL has integrated VR and AR technology into its prosthetic and assistive technology developments to facilitate development, patient training, and clinical testing. Several of these training scenarios have also been adapted for STEM outreach efforts for middle- and high-school-aged students as a way to discuss robotics, prosthetics, and neuroscience.
Being able to visualize a constrained space at human scale on an immersive platform is extremely helpful when addressing human factors requirements. Using 3D CAD models and Unity enables rapid ideation and design changes to accommodate human factors such as height (reachability) and safety.
Paratroopers must monitor the GPS strapped to their chest and the altimeter and compass on their wrist, all while maintaining situational awareness of their surroundings in mid-air. Focusing on metrics while maneuvering in the air creates a stressful environment that can be mitigated by WARrIOR (Warfighter Augmented Reality for Operational Readiness). WARrIOR co-locates relevant user information, such as altitude, glide ratio, wind direction and speed, and target navigation, onto a GUI layout projected by an APL-built AR method. The display was designed with a form factor low-profile enough to fit under the user’s existing eyewear and is built modularly so that sensors and tools on the user can be upgraded and replaced. This presentation will review the optics that drive most near-eye displays and the low-profile approach of this concept. WARrIOR aims to bring useful information together in one place for the user while fitting in with current sensors and eyewear.
*Please note: this presentation requires US citizenship.
Recent advances in the field of XR, or cross reality, have created new opportunities for communicating complex information through virtual mediums. WebbVR is a virtual reality public outreach product created by the Space Telescope Science Institute that aims to inform and educate the public about the James Webb Space Telescope and the astrophysics that Webb is designed to explore.
The Johns Hopkins University Applied Physics Laboratory
Kossiakoff Center
11100 Johns Hopkins Road
Laurel, MD 20723
Nick DeMatt
Nick.DeMatt@jhuapl.edu