In old movies we were going to improve society by making everything think like a computer. Now the goal is to make computers think like brains. Researchers at Missouri University of Science and Technology say they can make power network management more efficient by literally tapping brain cells grown on networks of electrodes.
The Missouri S&T group, working with researchers at Georgia Institute of Technology, plans to use the brain power to develop a new method for tracking and managing the constantly changing levels of power supply and demand.
Led by Dr. Ganesh Kumar Venayagamoorthy, associate professor of electrical and computer engineering, the researchers will use living neural networks composed of thousands of brain cells from laboratory rats to control simulated power grids in the lab. From those studies, the researchers hope to create a “biologically inspired” computer program to manage and control complex power grids in Mexico, Brazil, Nigeria and elsewhere.
“We want to develop a totally new architecture than what exists today,” says Venayagamoorthy, who also directs the Real-Time Power and Intelligent Systems Laboratory at Missouri S&T. “Power systems control is very complex, and the brain is a very flexible, very adaptable network. The brain is really good at handling uncertainties.”
Venayagamoorthy hopes to develop a system that is “inspired by the brain but not a replica. Nobody really understands completely how the brain works.”
The research is funded through a $2 million grant from the National Science Foundation’s Office of Emerging Frontiers in Research and Innovation.
The Missouri S&T team will work with researchers at Georgia Tech’s Laboratory for Neuroengineering, where the living neural networks have been developed and are housed and studied. A high-bandwidth Internet2 connection will connect those brain cells over 600 miles to Venayagamoorthy’s Real-Time Power and Intelligent Systems Laboratory. Missouri S&T researchers will transmit signals from that lab in Rolla, Mo., to the brain cells in the Atlanta lab, and will train those brain cells to recognize voltage signals and other information from Missouri S&T’s real-time simulator.
Venayagamoorthy’s lab is capable of simulating a power grid the size of Nigeria’s, or a portion of the combined New England and New York grid in the United States.
The Pentagon’s crash program to create an artificial brain is just about up and running. And, if it all goes as planned, we could see an electronic chip that mimics the “function, size, and power consumption” of a cat’s cortex some time in the next decade. Darpa, the Defense Department’s way-out research arm,
is in late-stage negotiations with Malibu’s HRL Laboratories to spearhead its Systems of Neuromorphic Adaptive Plastic Scalable Electronics (“SyNAPSE”) program. The goal: Build a chip with a “neuroscience-inspired architecture that can address a wide range of cognitive abilities — perception, planning, decision making, and motor control,” a company release notes.
The first nine-month phase of the program will focus on designing, fabricating, and characterizing synaptic and neural elements and combining them into a high-density, interconnecting microelectronic “fabric,” which will be incorporated into a more complex system-level fabric design…
In the following 15-month phase, HRL [a joint venture between Boeing and General Motors] will combine the synaptic and neural elements to fabricate and demonstrate “cortical microcircuits” that can model various lower-level brain functions and actually “learn” by interacting with the environment.
“The follow-on phases of the project will create a technology that functions like the brain of a cat, which comprises 10^8 neurons and 10^12 synapses,” Dr. Narayan Srinivasa, SyNAPSE Program Manager and Senior Scientist, said. “The human brain has roughly 10^11 neurons and 10^15 synapses.”
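To put those figures side by side, here is a quick back-of-the-envelope comparison using only the numbers quoted above:

```python
# Scales quoted above: a cat cortex (~10^8 neurons, ~10^12 synapses)
# versus a human brain (~10^11 neurons, ~10^15 synapses).

cat = {"neurons": 10**8, "synapses": 10**12}
human = {"neurons": 10**11, "synapses": 10**15}

for part in cat:
    ratio = human[part] // cat[part]
    print(f"human has {ratio:,}x the {part} of a cat cortex")
# In both cases the gap is a factor of about 1,000 — which is why a
# cat-scale chip would still be a long way from a human-scale one.
```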
The source article from Wired has plenty of interesting links in it.
This is only a few months after Henry Markram announced he wants to build an artificial rat brain and put it in a robot rat body in only two years.
Is your overflowing e-mail in-box a herald of the next stage in human evolution? Those e-mails represent just a small sample of the vast amount of digital information being generated by the gigabyte every minute. If we can cope with this rising flood of information, we are likely to be on track for using technology in the creation of superhuman intelligence, according to Vernor Vinge, futurist, best-selling science fiction author, and retired professor of computer science. Machines will become far more than just tools; they will physically merge with us, seamlessly endowing powers that are currently beyond our imagination. And all of this will happen in our lifetime, Vinge says.
DISCOVER asked Vinge about the consequences of living in a networked world that generates and distributes more and more data every day and how to cope with information overload.
Also see this article.
Imagine being trapped in your own body, aware of what’s going on around you but unable to move or even speak. Thanks to a modern technological innovation known as a neural interface — a direct link between the human brain and a computer — there may be hope for sufferers of what’s commonly known as “locked-in syndrome.”
As portrayed in the 2007 movie “The Diving Bell and the Butterfly,” locked-in patients are conscious, but fully paralyzed except for their eyes.
Thanks to advances in life-support technology and rising survival rates following brain-stem strokes, there may now be as many as 50,000 locked-in patients in the United States, the National Institutes of Health estimates.
No matter how hard you try, your mind can’t bend a spoon or channel the powers of a Jedi knight. Thanks to a new headset under development by neuroengineering company Emotiv Systems, however, you may soon be able to do this and more via the magic of video games.
By the end of this year, San Francisco–based Emotiv’s sensor-laden EPOC headset will enable gamers to use their own brain activity to interact with the virtual worlds where they play. The $299 headset’s 14 strategically placed head sensors are at the ends of what look like stretched, plastic fingers that detect patterns produced by the brain’s electrical activity. These neural signals are then narrowed down and interpreted in 30 possible ways as real-time intentions, emotions or facial expressions that are reflected in virtual world characters and actions in a way that a joystick or other type of controller could not hope to match.
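The pipeline the EPOC describes — detect a pattern in brain activity, attach a confidence to it, and map it to an in-game action — can be sketched roughly as follows. The detection names, confidence threshold, and action table are all invented for illustration; they are not Emotiv’s actual API:

```python
# Hypothetical mapping from classified EEG "detections" to game actions.
DETECTION_TO_ACTION = {
    "push": "move_forward",
    "pull": "move_backward",
    "smile": "avatar_smile",
    "raise_brow": "avatar_surprise",
}

def interpret(detections, threshold=0.6):
    """Keep only detections whose confidence clears the threshold,
    and translate each into an in-game action."""
    actions = []
    for name, confidence in detections:
        if confidence >= threshold and name in DETECTION_TO_ACTION:
            actions.append(DETECTION_TO_ACTION[name])
    return actions

# Two confident detections, one too weak to act on:
print(interpret([("push", 0.9), ("smile", 0.7), ("pull", 0.3)]))
# → ['move_forward', 'avatar_smile']
```

The confidence gate matters in practice: raw EEG is noisy, so acting only on strong, repeatable patterns is what keeps the avatar from twitching at every stray signal.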
Ambient Corporation has demonstrated a “voiceless” phone call. The call was made using a neckband called Audeo, which translates thoughts into speech by intercepting nerve signals. Although the device’s recognition abilities are currently limited to 150 words, the company predicts it will be fully functional by the end of the year. Possible applications range from helping the disabled to making discreet phone calls in public places.
In a recent conference held by microchip manufacturer Texas Instruments, the co-founder of Ambient Corporation, Michael Callahan, demonstrated the Audeo’s abilities. It seems that after careful training, a person can send nerve signals to his vocal cords, signals which can be ‘picked up’ by the Audeo and relayed wirelessly to a computer. The signals are converted into words, which are spoken by a computerized voice.
Users who might worry about the system voicing their inner thoughts can relax. Callahan says that producing nerve signals for the Audeo requires “a level above thinking,” meaning a conscious effort must be made. A user must think specifically about voicing his words, or the Audeo will not intercept the signals. The device has previously been used by handicapped people, who were able to control wheelchairs using their thoughts.
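A limited-vocabulary recognizer of the kind described — match an incoming nerve-signal feature vector against templates for a small set of words, and stay silent when nothing matches well — might be sketched like this. The feature vectors and word set are entirely made up; Ambient has not published how the Audeo actually encodes its signals:

```python
# Invented word templates: each word maps to a small feature vector.
WORD_TEMPLATES = {
    "hello": (0.9, 0.1, 0.2),
    "yes":   (0.2, 0.8, 0.1),
    "no":    (0.1, 0.2, 0.9),
}

def recognize(features, max_distance=0.5):
    """Return the closest word, or None if nothing is close enough —
    mirroring the 'conscious effort' gate: weak or ambiguous signals
    are simply ignored rather than guessed at."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    word, d = min(((w, dist(features, t)) for w, t in WORD_TEMPLATES.items()),
                  key=lambda pair: pair[1])
    return word if d <= max_distance else None

print(recognize((0.85, 0.15, 0.25)))  # close to the "hello" template
print(recognize((0.5, 0.5, 0.5)))     # ambiguous -> None
```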
While recent developments in brain-computer interface (BCI) technology have given humans the power to mentally control computers, nobody has used the technology in conjunction with the Second Life online virtual world — until now.
A research team led by professor Jun’ichi Ushiba of the Keio University Biomedical Engineering Laboratory has developed a BCI system that lets the user walk an avatar through the streets of Second Life while relying solely on the power of thought. To control the avatar on screen, the user simply thinks about moving various body parts — the avatar walks forward when the user thinks about moving his/her own feet, and it turns right and left when the user imagines moving his/her right and left arms.
The system consists of a headpiece equipped with electrodes that monitor activity in three areas of the motor cortex (the region of the brain involved in controlling the movement of the arms and legs). An EEG machine reads and graphs the data and relays it to the BCI, where a brain wave analysis algorithm interprets the user’s imagined movements. A keyboard emulator then converts this data into a signal and relays it to Second Life, causing the on-screen avatar to move. In this way, the user can exercise real-time control over the avatar in the 3D virtual world without moving a muscle.
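The last two stages of that pipeline — classify the imagined movement from motor-cortex activity, then emulate a key press — can be sketched as below. The channel names, band powers, and key bindings are made up for illustration; the Keio team’s actual signal processing is not described at this level of detail:

```python
# Hypothetical key bindings for the Second Life avatar.
KEY_FOR_IMAGERY = {"feet": "up", "left_arm": "left", "right_arm": "right"}

def classify_imagery(c3_power, cz_power, c4_power):
    """Pick the imagined movement from (hypothetical) signal power on
    three motor-cortex channels: Cz ~ feet, C3 ~ right arm, C4 ~ left arm."""
    powers = {"feet": cz_power, "right_arm": c3_power, "left_arm": c4_power}
    return max(powers, key=powers.get)

def to_keypress(c3, cz, c4):
    # The keyboard-emulator step: imagined movement -> arrow key.
    return KEY_FOR_IMAGERY[classify_imagery(c3, cz, c4)]

print(to_keypress(c3=1.2, cz=3.4, c4=0.8))  # strong Cz: feet imagery -> "up"
```

The design choice worth noting is the keyboard emulator itself: by presenting the BCI’s output as ordinary key presses, the system needs no modification to Second Life at all.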
Neuroscientists have significantly advanced brain-machine interface (BMI) technology to the point where severely handicapped people who cannot contract even one leg or arm muscle now can independently compose and send e-mails and operate a TV in their homes. They are using only their thoughts to execute these actions.
Thanks to the rapid pace of research on the BMI, one day these and other individuals may be able to feed themselves with a robotic arm and hand that moves according to their mental commands.
In previous studies, this lab developed the technology to tap a macaque monkey’s motor cortical neural activity, making it possible for the animal to use its thoughts to control a robotic arm to reach for food targets presented in 3D space.
In the Pittsburgh lab’s latest studies, macaque monkeys not only mentally guided a robotic arm to pieces of food but also opened and closed the robotic arm’s hand, or gripper, to retrieve them. Just by thinking about picking up and bringing the fruit to its mouth, the animal fed itself.
The monkey’s own arm and hand did not move while it manipulated the two-finger gripper at the end of the robotic arm. The animal used its own sight for feedback about the accuracy of the robotic arm’s actions as it mentally moved the gripper to within one-half centimeter of a piece of fruit.
“The monkey developed a great deal of skill using this physical device,” says Meel Velliste, PhD. “We are in the process of extending this type of control to a more sophisticated wrist and hand for the performance of dexterous tasks.”
The Department of Defense is planning to implant microchips in soldiers’ brains to monitor their health information, and has already awarded a $1.6 million contract to the Center for Bioelectronics, Biosensors and Biochips at Clemson University for the development of an implantable “biochip.” Soldiers fear that the biochip, about the size of a grain of rice, which measures and relays information on soldiers’ vital signs 24 hours a day, could be used to put them under surveillance even when they are off duty.
But Anthony Guiseppi-Elie, C3B director and Professor of Chemical and Biomolecular Engineering and Bioengineering, claims that the in vivo biosensors will save lives, as first responders at a trauma scene could inject the biochip into a wounded victim and gather data almost immediately.
He believes that the device has other long-term potential applications, such as monitoring astronauts’ vital signs during long-duration space flights and reading blood-sugar levels for diabetics.
SCIENTISTS have jump-started the consciousness of a man with severe brain injury in a world-first procedure in which electrodes were inserted deep into his brain. The 38-year-old, who had been in a minimally conscious state for six years after an assault, could only move his fingers or eyes occasionally and was fed through a tube.
Now he can chew, swallow and carry out movements like brushing his hair and drinking from a cup, say the US neuroscientists who carried out the procedure, known as deep brain stimulation.
I have here two movie clips that demonstrate the current state of affairs in the wacky world of mind control.
The first one shows a monkey moving a mechanical arm by thought.
The second one shows a guy lifting boxes in a virtual world, just by thinking about it. The technology holds great promise for disabled people.
Even though this technology is being developed with disabled people as the major motive, they won’t be the only ones that end up using the technology.
Imagine a day when people interact with their computers by thought alone. By the time the technology hits the mainstream (at least a decade from now), computers will be small enough to be practically invisible.
We will have smoothly hooked up the Internet to our brains… or our brains to the Internet. It all depends on how you wanna look at it.
It might seem harmless enough, but in fact it is the beginning of our transcendence into something bigger.