A top US Army general stationed in South Korea said he's been turning to an artificial intelligence chatbot to help him think through key command and personal decisions, the latest sign that even the Pentagon's senior leaders are experimenting with generative AI tools.
Maj. Gen. William "Hank" Taylor, commanding general of the Eighth Army, told reporters at the Association of the United States Army conference in Washington, DC, that he's been using ChatGPT to refine how he makes choices affecting thousands of troops.
"Chat and I have become really close lately," Taylor said during a media roundtable Monday, though he shied away from giving examples of personal use.
His remarks on ChatGPT, developed by OpenAI, were reported by Business Insider.
"I'm asking to build, trying to build models to help all of us," Taylor was quoted as saying.
He added that he's exploring how AI could support his decision-making processes, not in combat situations but in managing day-to-day leadership tasks.
"As a commander, I want to make better decisions," the general explained.
"I want to make sure that I make decisions at the right time to give me the advantage."
Taylor, who also serves as chief of staff for the United Nations Command in South Korea, said he views the technology as a potential tool for building analytical models and training his staff to think more efficiently.
The comments mark one of the most direct acknowledgments to date of a senior American military official using a commercial chatbot to assist in leadership or operational thinking.
The US military has been pushing to integrate artificial intelligence into its operations at every level, from logistics and surveillance to battlefield tactics, as rival nations like China and Russia race to do the same.
Officials say AI-driven systems could allow faster data processing and more precise targeting, though they have also raised concerns about reliability and accountability when software takes on roles traditionally reserved for human judgment.
The Pentagon has said future conflicts could unfold at "machine speed," requiring split-second decisions that exceed human capability.
Former Air Force Secretary Frank Kendall warned last year that rapid advances in autonomous weapons mean "response times to bring effects to bear are very short," and that commanders who fail to adapt "won't survive the next battlefield."
AI has already been tested in combat simulations, including an experiment by the Air Force and the Defense Advanced Research Projects Agency in which an algorithm piloted a modified F-16 jet during a mock dogfight.
Other programs are being used to sift through satellite data, track logistics and streamline administrative paperwork for units in the field.
The Army's Special Operations Forces have adopted similar tools to reduce what they call the "cognitive burden" on operators, using AI to draft reports, process mission data and analyze intelligence at scale.
At the same time, Pentagon officials are urging caution.
Defense leaders have warned that generative AI systems can leak sensitive information or produce faulty conclusions if the data is incomplete or manipulated.
Taylor acknowledged that one of the challenges of using the cutting-edge tech is keeping pace with the rapid evolution of AI tools, including ensuring they meet the military's strict security requirements.
ChatGPT has drawn global scrutiny as governments and companies rush to understand its promise and its pitfalls.
While newer versions of the program are capable of complex reasoning and analysis, they've also been shown to produce errors and fabrications.