Direct Mind-Machine Interfaces Open Up New Computing Possibilities


When the computer mouse was introduced in the 1980s, much was made of this revolutionary technology.  We were told how intuitive and natural it was to point at something and click to indicate our choice.  Meanwhile, endless variations on the mouse were hastily developed to capitalize on this supposed advantage: touch pads appeared, followed by joysticks and numerous other gadgets for moving the cursor around the screen.

But soon, some users began to develop carpal tunnel syndrome.  And then people began to wake up to the fact that there was nothing intuitive or natural about a mouse and a cursor.  In fact, aging Baby Boomers, with their eyesight failing, found it harder and harder to hit the minuscule targets on the screen with the tiny moving arrow.

This will all be a thing of the past soon, however, as truly intuitive interfaces between man and computer reach the market.  Many of them are being tested right now and hold out promise not only for easier use of computers but for medical and therapeutic innovations as well.  Some are already available.

According to the New York Times,1 a new headset made by Emotiv Systems in San Francisco picks up electrical signals from the brain, facial muscles, and elsewhere, and can translate them into commands for controlling a computer.  The headset sells for $299.  It comes with software to train the user's brain. 

Another headset, called the Neural Impulse Actuator, is made by OCZ Technology Group in Sunnyvale, California, and costs $169.  It senses eye and facial movements and converts them into commands.

While Emotiv and OCZ are aiming mainly at the computer game player now, they represent the tip of a very large technological iceberg.

In June of 2008, for example, researchers at Keio University in Japan demonstrated a system that reads brain waves and uses signals from the sensory-motor cortex to control an avatar in the multi-player online game Second Life.  According to ScienceDaily, a completely disabled person was able to play the game, moving around and conversing with other avatars.2  This is only the most recent advance to come out of the marriage of neuroscience, computer technology, and the Internet. 
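To make the Keio demonstration concrete, systems like it generally follow the same loop: sample a signal from the scalp, extract a simple feature such as signal energy in a time window, and map the classified feature to a discrete avatar command. The sketch below is purely illustrative, not Emotiv's or Keio's actual pipeline; the threshold, command names, and sample values are all hypothetical.

```python
# A minimal brain-computer interface loop, sketched under stated assumptions:
# (1) sample a signal window, (2) extract a feature, (3) map it to a command.

def band_power(samples):
    """Crude proxy for signal energy: mean of squared amplitudes."""
    return sum(s * s for s in samples) / len(samples)

def classify(power, threshold=0.5):
    """Map the feature to an avatar command.
    Real systems use trained classifiers, not a fixed threshold."""
    return "move_forward" if power > threshold else "stand_still"

# Simulated readings from a sensory-motor-cortex channel (hypothetical values).
strong_intent = [0.9, -0.8, 1.1, -1.0]   # high-energy window
resting = [0.1, -0.2, 0.1, 0.0]          # low-energy window

print(classify(band_power(strong_intent)))  # → move_forward
print(classify(band_power(resting)))        # → stand_still
```

In a deployed system, the classifier's output would be fed to the game client as an input event, which is why a completely disabled user can steer an avatar without touching a controller.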

In another development, University of Florida researchers are working on a device that not only converts brain signals to computer instructions, but also evolves and learns as time goes on...
