AI-controlled brain implants designed to treat mood disorders have been tested on people for the first time, it’s been reported.
The research is funded by the US military’s research arm, the Defense Advanced Research Projects Agency (DARPA), with the eventual goal of treating soldiers and veterans with depression and post-traumatic stress disorder (PTSD).
The devices use artificial intelligence (AI) to detect patterns linked to mood disorders and then adapt to shock the brain back into a healthy state.
US veterans are considered at higher risk of suicide than civilians, and up to 20 percent of those who served in Iraq and Afghanistan are estimated to suffer from PTSD.
But the use of brain implants raises ethical concerns, such as giving researchers access to a person’s inner feelings.
There are also those who believe the military’s use of AI brain implants could be a slippery slope to something more sinister.
“Direct neurostimulation by deep brain implants is a potentially useful means for treating people with Parkinson’s disease,” wrote Dr Adam Henschke, an ethicist.
“But as Jens Clausen’s coverage of the neuroethics of deep brain stimulation shows, it has numerous unwanted side-effects: speech disturbance, memory impairment, increased aggression, hypomania, depression and suicide.
“It is important to recognize that the numbers vary across studies: 1.5-25% of research subjects displayed depression; increased aggression was observed in only 2% of the cases in one study.
“However, enhancement in the military context can directly impact when and how one decides to apply potentially lethal violence. The unwanted effects in this case are not merely side-effects: they demand primary consideration.
“Decisions made during war are literally matters of life and death, and any enhancement to moral decision-making in warfare would surely be a welcome development. But, if any cognitive enhancement technology were to undermine the capacity of a subject to follow the law of armed conflict, it would be a source of very serious concern indeed.”