Friday, March 13, 2015

Brain-machine interface – Science fiction or essential patient care?

We live in an increasingly technological age. From iPhone apps that let users track their pregnancy or menstrual cycles to alarm clocks that sense sleep patterns and wake the user feeling refreshed and invigorated, technology is becoming ever more integrated with our daily lives and health, and it has put the average person more in touch with their own biological rhythms.


Prosthetic limbs, cochlear implants, and pacemakers take this a step further, modifying the human body with technology. These advances have drastically improved the quality of life of millions of patients worldwide. As the power of modern technology and our understanding of human physiology grow, scientists continue to develop ways in which technology can compensate for a loss of biological function due to disease, genetic disorder, or injury.

With the development of brain-machine interface (BMI) devices, scientists have found a way to tap into the complex signaling of the human brain and integrate technology seamlessly into human function. These spectacular advances, which seem to bring the world of science fiction into reality, have applications beyond sending hands-free emails with the power of thought alone, or changing the television channel without searching for the remote. Researchers hope that the continued development of BMIs will assist patients suffering from paralysis by decoding movement-related neural signals; computer software can then translate those signals into commands that guide computer cursors or prosthetic limbs.

Jose Carmena and colleagues have been investigating brain-machine interfaces as a means of communication for patients with severe motor disabilities. One type of interface uses functional near-infrared spectroscopy (fNIRS) to measure the blood-flow changes that accompany neural firing. This method avoids the potentially dangerous procedure of implanting electrodes in the brain and cuts down on electrical noise. fNIRS uses light with wavelengths of roughly 650-1000 nm, emitted from a device placed on the scalp. This light diffuses through the skull and into the grey matter. Some of the light passing through the cortical region of the brain is absorbed by oxygenated and deoxygenated hemoglobin (HbO and HbR); the light that is not absorbed scatters back out and is picked up by detectors placed on the scalp nearby. Because HbO and HbR have different absorption coefficients, computer software can calculate their changing concentrations, and this information is used to map blood-flow changes in the capillary networks associated with neural firing.
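
To make that last step concrete, here is a minimal sketch in Python of how the concentration calculation can work, using the modified Beer-Lambert law. The extinction coefficients, source-detector spacing, and pathlength factors below are illustrative placeholders I chose for the example, not values from the study.

```python
import numpy as np

# Minimal sketch of the modified Beer-Lambert calculation used in fNIRS:
# convert light-intensity changes at two wavelengths into changes in
# oxygenated (HbO) and deoxygenated (HbR) hemoglobin concentration.

# Extinction coefficients [1/(mM*cm)]; rows = wavelengths, columns = (HbO, HbR).
# These are placeholder values for illustration only.
E = np.array([[1.49, 3.84],   # ~760 nm: HbR absorbs more strongly
              [2.53, 1.80]])  # ~850 nm: HbO absorbs more strongly

source_detector_distance_cm = 3.0   # assumed scalp source-detector spacing
dpf = np.array([6.0, 6.0])          # assumed differential pathlength factors

def hemoglobin_changes(I_baseline, I_current):
    """Estimate (delta_HbO, delta_HbR) in mM from detected light intensities
    at the two wavelengths."""
    # Change in optical density at each wavelength
    delta_od = -np.log10(np.asarray(I_current) / np.asarray(I_baseline))
    # Effective optical path length per wavelength
    path = source_detector_distance_cm * dpf
    # Solve E @ [dHbO, dHbR] = delta_od / path for the concentration changes
    return np.linalg.solve(E, delta_od / path)

# Example: the detected intensity drops slightly more at 850 nm than at 760 nm,
# which the model attributes mostly to an increase in oxygenated hemoglobin.
d_hbo, d_hbr = hemoglobin_changes(I_baseline=[1.00, 1.00], I_current=[0.99, 0.97])
print(f"delta HbO = {d_hbo:.4f} mM, delta HbR = {d_hbr:.4f} mM")
```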

This method of signal acquisition, which bypasses the peripheral nervous system, can be used to intercept and interpret neural signals in patients suffering from motor disabilities. In fact, fNIRS methods were able to decode the yes-or-no responses of 70% of ALS patients who had lost the ability to control all or most voluntary movement. However, a more accurate decoding system will be needed before a BMI can support more complicated communication or the control of a prosthetic limb.
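
As a rough illustration of what "decoding a yes-or-no response" can look like computationally, the sketch below trains a simple binary classifier on synthetic trial-averaged hemodynamic features. The channel layout, effect size, and classifier choice are all assumptions made for the example, not the method used in the studies described here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels = 80, 8

# Synthetic stand-ins for per-channel delta-HbO averages on each trial;
# "yes" trials get a slightly stronger response on a few channels.
labels = rng.integers(0, 2, size=n_trials)            # 0 = "no", 1 = "yes"
features = rng.normal(0.0, 1.0, size=(n_trials, n_channels))
features[labels == 1, :3] += 0.8                       # assumed effect size

decoder = LogisticRegression()
accuracy = cross_val_score(decoder, features, labels, cv=5).mean()
print(f"cross-validated yes/no decoding accuracy: {accuracy:.2f}")
```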

While scientists are working to better understand the brain’s signaling and apply it to the control of prosthetics, it seems that the brain is also adapting to communicate better with machines. Carmena and colleagues reported that when BMI decoders do not accurately interpret neural signals, the neural activity itself is modified. This was observed in trials in which human subjects used a BMI to control an on-screen cursor to complete tasks. When the decoder did not correctly model a subject’s neural activity, neural adaptation occurred: if a subject wanted to move the cursor but could not, their brain adapted to send the kind of signals that the decoder could model.
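
A toy simulation can make this idea tangible. In the sketch below, a fixed (and imperfect) linear decoder maps a simulated activity pattern to cursor velocity, and the "brain" gradually reshapes its activity until the decoder's output matches the intended movement. Every dimension, parameter, and the learning rule itself are assumptions for illustration, not the model used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units = 10
decoder = rng.normal(size=(2, n_units))      # fixed decoder the brain must work with
target_velocity = np.array([1.0, 0.0])       # intended cursor movement (rightward)

activity = rng.normal(scale=0.1, size=n_units)   # initial neural activity pattern
learning_rate = 0.02

for trial in range(500):
    cursor_velocity = decoder @ activity
    error = cursor_velocity - target_velocity
    # "Neural adaptation": shift the activity pattern down the error gradient
    # so the decoder's output moves toward what the user intends.
    activity -= learning_rate * decoder.T @ error

# After many trials the decoded velocity should be close to the target.
print("final cursor velocity:", np.round(decoder @ activity, 3))
```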

These findings are particularly interesting because they suggest that instead of creating highly advanced decoders capable of interpreting complicated neural signaling, researchers can focus on teaching the brain to use simpler decoders. There is still a great deal that researchers do not know about the brain and its vastly complicated signaling, yet we are beginning to find that it is not necessary to understand it all to harness its power. In the very near future, we can expect to see BMI technology revolutionizing patient care on a large scale, and making its way into our daily lives.

References:

Shenoy, K. V., & Carmena, J. M. (2014). Combining Decoder Design and Neural Adaptation in Brain-Machine Interfaces. Neuron, 84(4), 665-680.


6 comments:

  1. As you mentioned in your blog, technology is advancing at an exponential rate and the possibilities for new biomedical technology are ever-increasing. However, I wonder at what point humans will become uncomfortable with technology. Although humans integrate technology deeply into their lives, I feel as though humans still want to hold the power and control over technology. Will machines have the ability to take over the brain-machine interface?

  2. I think the idea of the brain adapting to better interface with machines is interesting. It seems like the brain has no "preference" for whether some desired function is executed by an organic or an artificial appendage; whatever neural circuits get the job done will be adopted. As brain-machine interfaces get better and better, I wonder whether processing functions themselves will start to be outsourced to the devices on the artificial side of the interface. Rather than brain activity triggering the movement of a prosthesis, why not let some impulse trigger a complex function? This kind of progression would let the limited number of thoughts possible in a given period of time control a greater number of simultaneous functions than would otherwise be possible.

  3. I think Catherine's title for this blog post is spot on; it really does seem that the realm of science fiction is becoming reality with these brain-machine interfaces (BMIs). As with Erin's comment, I agree that with these new advances in artificial intelligence, there must come a certain point where humans are not comfortable with more and more technological integration in their own lives. The use of BMIs highlighted in this study does seem to be beneficial for humans, and promising for individuals suffering from paralysis or similar conditions. This technology is constructive and may help patients lead more normal lives with a broader range of communicative functions. However, I can see that as research progresses, BMIs could be used in less constructive ways, from menial tasks such as changing the TV channel to more dangerous implications that blur the line between the individual and the technology they are connected to. This machine-brain takeover may also seem like science fiction, but BMIs demonstrate that science fiction is increasingly becoming our reality.

  4. As others have indicated, there is a fine line between using technology to help people who need it and letting it make us lazier. Technology could take over. I feel, however, that we cannot let our fears of technology becoming too powerful inhibit us from creating potentially very helpful and life-altering technologies. People are going to create technological innovations that consumers want, even if they are superfluous, simply because they are profitable, so we might as well also harness these new technologies and innovations to help those in need.

  5. I'm amazed that the brain is able to modify the signals it sends out in order to communicate effectively with the brain-machine interface. The BMI could possibly be used to discover more about how the brain functions, which is a really exciting prospect. Does anyone know whether stray thoughts can influence how this works? I worry that machines which connect directly to the brain might act on every stray thought I have, like saying something that comes to mind but that I don't actually want to say. Anyway, trivial doubts aside, I hope people can use BMIs to learn more about the brain, and I'm also happy to see that this approach is non-invasive. The thought of installing electrodes, computer chips, or something else in a person's brain is worrisome, especially considering how widespread the use of BMIs might be in the distant future.

  6. Living in the technological age is something we have to be wary about. Sure, we can use our technologies to help fix serious medical issues for people in need, but at what point does it become too much? Humans are not going to stop here; in all likelihood this technology will spread into commercial use for consumers who do not actually need it. These enhancements can corrupt and erode aspects that are uniquely human. The blog post talks about cochlear implants. Recently, I watched a documentary about how many deaf people were wary of cochlear implants. They said that hearing people do not understand their subculture, and that deaf people do not see their loss of hearing as a disability (they can still communicate through sign language). At what point are we infringing on people's identity and changing what is unique about them? These technologies can be useful for curing debilitating diseases, but humans will eventually go overboard.
