While recent developments in brain-computer interface (BCI) technology have given humans the power to mentally control computers, nobody has used the technology in conjunction with the Second Life online virtual world — until now.
A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts — the avatar walks forward when the user thinks about moving his or her own feet, and it turns right or left when the user imagines moving the corresponding arm.
The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user’s imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle.
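The pipeline described above — classify the imagined movement from motor-cortex activity, then emulate a key press that Second Life interprets as avatar motion — can be sketched in a few lines. Everything here is an illustrative assumption: the channel labels, the toy strongest-channel classifier, and the key names stand in for the lab's actual brain-wave analysis algorithm and keyboard emulator, which are not described in detail in the article.

```python
# Hypothetical sketch of the BCI control loop: three motor-cortex
# channels -> imagined-movement classifier -> emulated arrow key.
# The mapping follows the article: feet = walk forward, right arm =
# turn right, left arm = turn left.

CHANNEL_LABELS = ["feet", "right_arm", "left_arm"]  # assumed electrode order

IMAGERY_TO_KEY = {
    "feet": "UP",          # imagined foot movement  -> avatar walks forward
    "right_arm": "RIGHT",  # imagined right-arm move -> avatar turns right
    "left_arm": "LEFT",    # imagined left-arm move  -> avatar turns left
}

def classify(features):
    """Toy classifier: pick the channel with the strongest activity score.

    `features` is a 3-element vector of preprocessed EEG features, one
    per motor-cortex channel (an assumption for illustration).
    """
    best = max(range(len(features)), key=lambda i: features[i])
    return CHANNEL_LABELS[best]

def imagery_to_keypress(features):
    """Map classified motor imagery to the key the emulator would send."""
    return IMAGERY_TO_KEY[classify(features)]

if __name__ == "__main__":
    # Strong activity on the "feet" channel drives the avatar forward.
    print(imagery_to_keypress([0.9, 0.2, 0.1]))
```

A real system would replace `classify` with the lab's brain-wave analysis algorithm and send the key press to the Second Life client through an OS-level keyboard emulator.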
Neuroscientists have advanced brain-machine interface (BMI) technology to the point where severely handicapped people who cannot contract a single arm or leg muscle can now independently compose and send e-mails and operate a TV in their homes, using only their thoughts to execute these actions.
Thanks to the rapid pace of BMI research, these and other individuals may one day be able to feed themselves with a robotic arm and hand that moves according to their mental commands.
In previous studies, the Pittsburgh lab developed the technology to tap a macaque monkey’s motor cortical neural activity, making it possible for the animal to use its thoughts to control a robotic arm and reach for food targets presented in 3D space.
In the Pittsburgh lab’s latest studies, macaque monkeys not only mentally guided a robotic arm to pieces of food but also opened and closed the robotic arm’s hand, or gripper, to retrieve them. Just by thinking about picking up and bringing the fruit to its mouth, the animal fed itself.
The monkey’s own arm and hand did not move while it manipulated the two-finger gripper at the end of the robotic arm. The animal used its own sight for feedback about the accuracy of the robotic arm’s actions as it mentally moved the gripper to within one-half centimeter of a piece of fruit.
“The monkey developed a great deal of skill using this physical device,” says Meel Velliste, PhD. “We are in the process of extending this type of control to a more sophisticated wrist and hand for the performance of dexterous tasks.”