Our Contribution
In remote conflict zones, medics face the challenge of providing critical trauma care with limited resources and under constant threat, often far from medical facilities. APL researchers combined augmented reality, predictive anatomy visualization, artificial intelligence (AI), and real-time ultrasound to develop a tool that makes it possible to “see” where organs sit beneath the skin and to guide medical care accordingly.

The anatomy visualization aid relies on a statistical shape atlas: a detailed map of variation in human anatomy that the team built from hundreds of CT scans chosen to match the physical composition of warfighters in the U.S. military. Preliminary results showed that the models can predict the shapes of 66 anatomical structures within the thorax with high accuracy from an individual’s external anthropometry.

With the predictive model integrated into an augmented reality headset and a portable ultrasound scanner, medics can see a digital overlay of the patient’s predicted anatomy, registered to prominent external body landmarks. The visualization aid then provides the user with real-time guidance on how and where to perform a common medical scan, the Extended Focused Assessment with Sonography in Trauma (eFAST), a bedside ultrasound protocol that scans specific abdominal and thoracic organs to detect life-threatening injuries. This precision-guided approach enables early identification and treatment of battlefield injuries. APL is refining the technology for broader medical applications, enhancing its accuracy and usability across field conditions.
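To make the atlas idea concrete, the sketch below shows one common way such a model could work, assuming a point-distribution model: PCA over corresponded surface points, followed by a regression from external measurements to shape coefficients. The article does not specify APL’s exact formulation, and the measurement names and synthetic data here are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a training set: n_subjects thoracic surfaces, each
# represented by n_points corresponded 3-D points flattened into one row.
n_subjects, n_points = 200, 500
shapes = rng.normal(size=(n_subjects, n_points * 3))

# 1) Build the atlas: mean shape plus principal modes of variation.
mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
# SVD of the centered data yields the PCA modes (rows of Vt) without
# forming the covariance matrix explicitly.
_, _, modes = np.linalg.svd(centered, full_matrices=False)
n_modes = 10
modes = modes[:n_modes]                      # (n_modes, n_points*3)

# Per-subject shape coefficients: projections onto the modes.
coeffs = centered @ modes.T                  # (n_subjects, n_modes)

# 2) Predict internal shape from external anthropometry: a linear map from
# measurements (e.g., height, weight, chest circumference; names assumed)
# to shape coefficients, fit by least squares.
anthropometry = rng.normal(size=(n_subjects, 4))
A = np.hstack([anthropometry, np.ones((n_subjects, 1))])  # add intercept
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)

def predict_shape(measurements: np.ndarray) -> np.ndarray:
    """Predict a full thoracic surface from external measurements."""
    b = np.append(measurements, 1.0) @ W     # predicted shape coefficients
    return (mean_shape + b @ modes).reshape(n_points, 3)

predicted = predict_shape(rng.normal(size=4))
print(predicted.shape)                       # (500, 3)
```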
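Registering the predicted anatomy to the patient for the headset overlay is the second ingredient. A plausible approach, again an assumption rather than APL’s documented method, is rigid landmark registration (the Kabsch algorithm) between external body landmarks seen by the headset and their counterparts on the model; the specific landmarks named in the comments are illustrative.

```python
import numpy as np

def rigid_align(model_pts: np.ndarray, patient_pts: np.ndarray):
    """Return rotation R and translation t mapping model landmarks onto
    patient landmarks in a least-squares sense (Kabsch algorithm)."""
    mu_m, mu_p = model_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (patient_pts - mu_p)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_p - R @ mu_m
    return R, t

# Example: landmarks such as the suprasternal notch, xiphoid process, and
# clavicle heads (illustrative choices), in model and headset coordinates.
model_lms = np.array([[0.0, 0.0, 0.0], [0.0, -18.0, 1.0],
                      [-12.0, 2.0, 0.5], [12.0, 2.0, 0.5]])
# Simulated patient landmarks: the model rotated 90 degrees and translated.
R0 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
patient_lms = model_lms @ R0.T + np.array([5.0, 2.0, 80.0])

R, t = rigid_align(model_lms, patient_lms)
overlay = model_lms @ R.T + t            # transform the full mesh the same way
print(np.allclose(overlay, patient_lms, atol=1e-6))  # True
```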
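Finally, the eFAST guidance can be pictured as a mapping from each standard protocol view to the structures it interrogates, with probe aim points read off the patient-aligned predicted model. The view names below follow the standard eFAST protocol; the data structure and `probe_target` function are hypothetical.

```python
from typing import Dict
import numpy as np

# Standard eFAST views and the structures each one interrogates.
EFAST_VIEWS: Dict[str, list] = {
    "right_upper_quadrant": ["liver", "right_kidney"],   # Morison's pouch
    "left_upper_quadrant": ["spleen", "left_kidney"],
    "subxiphoid": ["heart", "pericardium"],
    "suprapubic": ["bladder"],
    "right_anterior_thorax": ["right_lung", "pleura"],   # pneumothorax check
    "left_anterior_thorax": ["left_lung", "pleura"],
}

def probe_target(view: str, predicted: Dict[str, np.ndarray]) -> np.ndarray:
    """Aim point for a view: centroid of its predicted target structures,
    in the patient-aligned frame produced by the registration step."""
    pts = [predicted[s] for s in EFAST_VIEWS[view] if s in predicted]
    return np.vstack(pts).mean(axis=0)

# Usage with stand-in predicted surfaces (random points for illustration).
rng = np.random.default_rng(1)
predicted = {"heart": rng.normal(size=(50, 3)),
             "pericardium": rng.normal(size=(50, 3))}
print(probe_target("subxiphoid", predicted))
```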