Virtual social life is possible with brain-machine interfaces

The primary goal of the field of neuroprosthetics has been to improve the lives of paralyzed patients by restoring their lost real-world abilities.

One example is a 2012 paper by the neuroscientists Leigh Hochberg and John Donoghue at Brown University. Their team trained two people with long-term paralysis, a 58-year-old woman and a 66-year-old man, to use a brain-machine interface (BMI) that decoded signals from their motor cortex to direct a robotic arm to reach out and grasp objects. One subject was able to pick up a bottle and drink from it using the device.
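The decoding at the heart of such a system can be illustrated with a toy model. The sketch below, in Python with synthetic data, fits a ridge-regularized linear map from binned spike counts to intended 2D hand velocity; the actual BrainGate work used a Kalman filter and far richer calibration, so every name and number here is an illustrative assumption, not their pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: 96-channel spike counts in 100 ms bins,
# each channel linearly tuned to the intended 2D hand velocity plus noise.
n_bins, n_channels = 2000, 96
velocity = rng.standard_normal((n_bins, 2))           # intended (vx, vy)
tuning = rng.standard_normal((2, n_channels))         # per-channel tuning map
counts = velocity @ tuning + 0.5 * rng.standard_normal((n_bins, n_channels))

# Fit a ridge-regularized linear decoder W so that counts @ W approximates velocity.
lam = 1.0
W = np.linalg.solve(counts.T @ counts + lam * np.eye(n_channels),
                    counts.T @ velocity)

# At run time, each new bin of spike counts becomes a velocity command
# that nudges the endpoint of the (virtual or robotic) arm.
position = np.zeros(2)
dt = 0.1                                              # 100 ms per bin
for t in range(5):
    new_bin = velocity[t] @ tuning + 0.5 * rng.standard_normal(n_channels)
    position += (new_bin @ W) * dt
    print(f"bin {t}: position {position.round(3)}")
```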

More recently, in 2017, a French team from the University Hospital of Grenoble surgically implanted an epidural wireless brain-machine interface in a 28-year-old tetraplegic man. After two years of training, the patient was able to control some functions of an exoskeleton using his brain activity alone.

From advanced robotics to the delicate reinnervation of damaged peripheral nerves in patients’ arms and legs, these projects demand extraordinary medical and technological breakthroughs. Extensive development is still needed before these approaches reach real clinical application.

However, mastering the heart of the BMI itself, the precise translation of a brain signal into an intended action, may require a much simpler, cheaper, and safer technology: virtual reality. Indeed, in many BMI projects, initial training is based on virtual simulations; before trying to control a real robotic arm, for example, subjects first learn to control a virtual one.
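One reason training transfers from simulation to hardware is architectural: the decoder does not care what it is driving. Here is a minimal sketch of that design idea; the class and function names are hypothetical, not taken from any published system.

```python
from typing import Callable, Protocol
import numpy as np

class Effector(Protocol):
    """Anything the decoder can drive: a simulated arm or real hardware."""
    def apply_velocity(self, v: np.ndarray) -> None: ...

class VirtualArm:
    """On-screen arm used during early training; just integrates velocity."""
    def __init__(self) -> None:
        self.position = np.zeros(2)

    def apply_velocity(self, v: np.ndarray) -> None:
        self.position += 0.1 * v            # one 100 ms update step

class RoboticArm:
    """Real hardware behind the same interface; the backend differs, the loop does not."""
    def apply_velocity(self, v: np.ndarray) -> None:
        raise NotImplementedError("send the velocity command to the robot here")

def control_loop(read_bin: Callable[[], np.ndarray],
                 decode: Callable[[np.ndarray], np.ndarray],
                 effector: Effector, n_steps: int) -> None:
    # Identical in simulation and in the real world; only the effector is
    # swapped, which is why skills learned virtually can carry over.
    for _ in range(n_steps):
        effector.apply_velocity(decode(read_bin()))

# Demo: drive the virtual arm with random "neural" bins and a dummy decoder.
rng = np.random.default_rng(0)
arm = VirtualArm()
control_loop(lambda: rng.standard_normal(96), lambda b: b[:2], arm, 10)
print(arm.position)
```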

As gaming worlds and the metaverse evolve, the next big breakthroughs in BMI applications will take place in the virtual world before they are implemented in the real one. This has already been demonstrated by a team of researchers at Johns Hopkins University, who taught a paralyzed patient to fly a combat aircraft in a computer flight simulation using a BMI. According to their report, “From the subject’s point of view, this was one of the most exciting and entertaining experiments she had performed.”

In 2023, we will see many more BMI applications that allow people with disabilities to participate fully in virtual worlds: at first in simpler interactive spaces for communication, such as chat rooms, and later with full control of 3D avatars in virtual spaces where they can shop, socialize, or even play games.

This includes my own work at UC San Francisco, where we are building a BMI to restore verbal communication. We can already train patients to communicate via text chat and real-time messaging. Our next goal is real-time speech synthesis. We have previously shown that this is possible offline with good accuracy, but doing it in real time is a new challenge for paralyzed patients.
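To see why real time is a qualitatively new challenge, compare the structure of offline and streaming decoding. An offline decoder can examine a whole recording before emitting text; a streaming one must commit to output from a short trailing window, within a fixed latency budget. The sketch below is purely illustrative; none of its names or parameters come from our actual system.

```python
import numpy as np
from collections import deque

WINDOW_BINS = 10   # e.g., 10 x 100 ms of cortical features (assumed values)
HOP_BINS = 2       # emit a prediction every 200 ms

def decode_window(window: np.ndarray) -> str:
    """Stand-in for a trained model mapping a feature window to a text unit."""
    return "<unit>"                      # a word, phoneme, or synthesis frame

def stream(feature_source) -> None:
    buffer = deque(maxlen=WINDOW_BINS)   # trailing context only
    since_emit = 0
    for features in feature_source:      # bins arrive one at a time, live
        buffer.append(features)
        since_emit += 1
        # Unlike offline decoding, we must commit now, without future
        # context and within the latency a conversation can tolerate.
        if len(buffer) == WINDOW_BINS and since_emit >= HOP_BINS:
            print(decode_window(np.stack(buffer)), end=" ", flush=True)
            since_emit = 0

# Demo with random features standing in for neural data:
rng = np.random.default_rng(0)
stream(rng.standard_normal(128) for _ in range(30))
```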

We are now expanding our work to include control of facial avatars, which will enrich virtual social interactions. Watching the mouth and lips move while someone speaks greatly improves the perception and understanding of speech. The brain areas that control the vocal tract and mouth overlap with those that control non-verbal facial expressions, so facial avatars will be able to convey those expressions more fully as well.
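In rendering terms, driving a face avatar can be as simple as mapping a decoded feature vector to blendshape weights each frame. The following sketch is a hypothetical illustration, not our published method: the blendshape names and the linear readout are assumptions, but they show how a single decoded vector could animate speech articulators and expressions together.

```python
import numpy as np

# Hypothetical blendshape targets; real avatar rigs define their own sets.
BLENDSHAPES = ["jaw_open", "lip_pucker", "lip_corner_pull", "brow_raise"]

def features_to_blendshapes(features: np.ndarray, W: np.ndarray) -> dict:
    """Linear readout from decoded vocal-tract/face features to weights.

    Because the decoded features reflect cortical areas controlling both
    speech articulators (jaw, lips) and non-verbal expression (brows),
    one readout can animate speaking and emoting together.
    """
    weights = np.clip(features @ W, 0.0, 1.0)    # blendshape weights live in [0, 1]
    return dict(zip(BLENDSHAPES, weights.round(3)))

# Demo with random stand-ins for decoder output; W would come from calibration.
rng = np.random.default_rng(0)
W = rng.uniform(0.0, 0.5, size=(16, len(BLENDSHAPES)))
frame = features_to_blendshapes(rng.uniform(0.0, 1.0, 16), W)
print(frame)    # one animation frame handed to the avatar renderer
```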

As virtual reality and BMIs converge, it is no coincidence that tech companies are developing consumer applications for neural interfaces, both non-invasive and invasive. These interfaces will change not only how we interact with computers, but also how we interact with each other.

For paralyzed patients, however, the stakes are far more fundamental: the ability to participate in social life at all. One of the most damaging aspects of paralysis is social isolation. Yet as people’s social interactions increasingly take digital forms, text messages and email as well as virtual environments, we now have an opportunity that did not exist before. With brain-machine interfaces, we can finally meet this unmet need.
