Abstract
Millions of people use computers every day, and nearly all of them interact with the machine through a mouse and keyboard. This interaction slows the user down and bottlenecks the speed of computers. People use computers to get things done faster, but the current interface is not sufficient. A new, faster interface needs to be developed and adopted as the standard. Much research is already being done in this area, and some companies have already built computers with new interfaces. Once these interfaces are refined further, a standard interface will emerge that does not slow down the user.
1. Introduction
As a shopper walks down the computer aisle, he is greeted by an employee advertising a new product: a computer that can be controlled by the brain. All that is required of the shopper is the purchase of the computer and of the surgery to implant on his brain the chip that lets him control it. The shopper, one of the few people who still rejects these new technologies, makes an angry gesture and runs out of the store.
This is the future of the computer industry. Eventually, people will control computers with their thoughts. Today, it is a radical concept. But in the future, it will be as commonplace as the keyboard and mouse.
Today, almost all of someone's life can be lived online through a computer. People can create avatars based upon themselves in massively multiplayer online video games, chat with friends on social networking sites, or watch videos and listen to music. Modern computers can load all of this onto the screen within seconds.
But once all of this is loaded, the whole system is bottlenecked by the time it takes the user to interact with the information on the screen. The user must move the mouse with their hand or press the arrow keys with their fingers in order to get to the information they desire. This is very inefficient.
Even typing papers is inefficient. Most people type their papers on a computer and print them out to be graded, physically typing in every letter of every word. For someone with arthritis this can be very difficult, and for someone with a more severe disability it can be impossible. People should not be conforming to interfaces; “interfaces should be conforming to us” [4].
Brain-computer interfaces (BCIs) allow people with such disabilities to use a computer. A BCI is essentially a computer chip placed either directly on the brain or on the outside of the skull that collects the signals a person's brain produces. The chip sends this information to the computer, which translates the signals into the actions the brain intended.
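The loop just described (capture brain signals, send them to the computer, carry out the intended action) can be sketched in a few lines of Python. Everything here is hypothetical and hugely simplified; a real BCI uses multi-channel amplifiers and trained classifiers. The sketch reduces a window of signal samples to a crude power estimate and thresholds it into a command.

```python
# Illustrative sketch of a BCI decoding loop. The feature (band power),
# the threshold, and the command names are all invented for illustration.

def bandpower(samples):
    """Crude signal strength: mean squared amplitude of the window."""
    return sum(s * s for s in samples) / len(samples)

def decode(samples, threshold=0.5):
    """Map a window of brain-signal samples to a computer command."""
    return "CLICK" if bandpower(samples) > threshold else "IDLE"

# A quiet window decodes to IDLE; a strong burst decodes to CLICK.
print(decode([0.1, -0.2, 0.1, -0.1]))
print(decode([1.0, -1.2, 1.1, -0.9]))
```

In practice the hard part is the decoder itself, which must be trained per user; the thresholding here stands in for that whole step.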
Touch screens are also a new interface that can put the keyboard and mouse out of business. With touch screens, people can use their hands to manipulate the data on the screen and control the computer. Some of these screens have support for multiple users to control the objects on the screen at any given time.
The way people interact with a computer should be fast and simple. Touch screens and brain-computer interfaces will help to make this happen. The age of the keyboard and mouse is over. A new computer interface will become the standard computer interface and it will change the way people use computers forever.
2. Two Groups
The root of the problem is the keyboard and mouse. Today it is hard to see why they became the standard interface, but a moment's thought shows that they were the simplest way to interact with a computer. With a keyboard and mouse, people could navigate the information on the screen and type in any command they wanted. Decades ago, they were also the cheapest option, and essentially the only one.
Many people now believe that the keyboard and mouse should no longer be the standard. These people are split into two groups with the same goal. One group believes that touch interfaces should become the standard; the other believes that brain-computer interfaces are the future. The two groups' approaches differ drastically.
3. Touch Screen Interfaces
Some people believe the touch screen is the future of computer interfaces, and there is evidence that this is already becoming reality. Apple's iPhone is controlled only by the user's hands. Tablet PCs have been around for years and let the user navigate everything on the screen with a special pen, but they have not become very popular and still require the user to type on a keyboard. These touch devices remain relatively new and expensive.
But much research is being done to make touch interfaces the future standard. Microsoft's Surface is evidence of this: it lets the user navigate digital content with their fingers. Microsoft expects to make Surface available in 2008 and to ship it with a 30-inch screen. Surface will at first offer only multimedia features, but Microsoft plans to make it the future of computing in general.
Fig. 1 A picture of Microsoft’s Surface computer.
A company called IO2 Technology has taken touch screens to a whole new level. Its product, the M3 Heliodisplay, projects the display into the air and allows the user to navigate the data with their fingers. The current version accepts a computer, DVD player, TV, or other video source and displays the image in the air; the image sits on a two-dimensional plane but looks three-dimensional. The Heliodisplay connects to any computer through a USB port, which makes it a new interface that can be used today.
A company called Perceptive Pixel, started by Jeff Han in 2006, is also working on touch screen interfaces. It is currently the furthest ahead in developing software for touch screens, and as that software improves, touch screens will become even more popular. Since the company focuses mostly on research, however, it will likely be one of the last to release a commercial product.
Touch screens are currently the favorite to become the next standard computer interface. They allow work to be done faster than with the keyboard and mouse, and the technology can be mass-produced today. The only problem is that they do not offer as much power as brain-computer interfaces.
4. Brain-Computer Interfaces
The other group developing a new interface believes that the brain-computer interface (BCI) is the solution. BCIs read the electrical signals sent through the brain, translate them into a form computers can understand, and convert them into actions. This technology is especially valuable for people with spinal cord injuries, allowing them to control computers, televisions, and other devices.
Research on BCIs first began in the early 1970s. Over the years, BCI sensors were placed in rats, mice, monkeys, and humans. In the 1990s, a sensor was implanted in a paralyzed man's brain, and he was able to control a computer cursor.
Fig. 2 An illustration of how a brain-computer interface works.
As with most technologies, there is more than one way to build a BCI. There are invasive techniques, in which the sensor is implanted directly on the brain, and noninvasive techniques, in which sensors are placed on a cap worn over the skull. The choice between the two is not clear-cut: invasive techniques are more effective but require surgery and can cause infections, while noninvasive techniques can read a wider range of brain activity.
Many BCIs have already been built using invasive techniques. A company called Cyberkinetics Neurotechnology Systems has made the BrainGate Neural Interface System, which gives patients with spinal cord injuries the ability to control a computer. Researchers at Brown University, on the other hand, are trying to learn how the brain turns thoughts into actions; they have been able to capture brain signals and convert them into a computer-readable format.
Some groups have taken the noninvasive approach. New York State's Wadsworth Center is using an electroencephalogram (EEG) cap worn on the outside of the skull to capture brain signals. To make the system more effective, subjects are taught to control their thought process. The machine allows people with speech problems to communicate: the patient is shown letters and images, and their brainwaves spike when they see something they want to say. The process is currently slow, producing only two to four words per minute, so the researchers are working on the machine's signal-analysis methods to design a faster system.
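The selection step just described, where brainwaves spike when the wanted item appears, amounts to ranking candidates by the strength of the brain's response. The sketch below is an illustrative guess at that idea, not the Wadsworth Center's actual algorithm; the letters, response windows, and scoring rule are all made up.

```python
# Hypothetical letter selection: each letter is flashed, the EEG window
# recorded at its flash is scored, and the strongest response wins.

def peak_amplitude(response):
    """Score a response window by its largest absolute deflection."""
    return max(abs(v) for v in response)

def pick_letter(responses):
    """responses: dict mapping letter -> EEG window recorded at its flash."""
    return max(responses, key=lambda ltr: peak_amplitude(responses[ltr]))

responses = {
    "A": [0.1, 0.2, -0.1],
    "B": [0.2, 1.5, 0.3],   # strong spike: the letter the user wanted
    "C": [0.0, 0.1, 0.2],
}
print(pick_letter(responses))
```

The two-to-four-words-per-minute rate quoted above reflects how many flashes and repetitions are needed before one letter can be chosen reliably.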
Fig. 3 Example of what an EEG cap looks like.
Controlling a computer is not the only goal of brain-computer interfaces. Japan's Honda Motor Corp. and ATR Computational Neuroscience Laboratories have used brain signals to control robot movements. Subjects are placed in an MRI scanner and move their hands and fingers; the MRI signals are sent to a computer, and the computer tells a robot hand how to move. Since the system requires an MRI machine, it is not very portable.
Other researchers are also focused on body movement. Researchers at Stanford University are trying to identify the signals the brain produces when it is planning to move the body. Knowing this could improve mathematical estimates of how the body moves, which in turn would allow faster BCI systems than any built so far.
A different type of research is being done at Columbia University, where scientists at the Laboratory for Intelligent Imaging and Neural Computing are building a rapid image-search tool. The system exploits the brain's ability to notice elements in images far faster than computers can. The user wears an EEG cap and is shown images in quick succession, and the system ranks the images by the brain's activity when each one was shown. Such a system would be vastly superior to today's computers, which still struggle to recognize objects in images.
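The ranking idea behind the Columbia system can be sketched in the same spirit. The image names and response strengths below are invented; in the real system each score would come from analyzing the EEG recorded at the moment the image flashed by.

```python
# Hypothetical EEG-driven image triage: sort images by the strength of
# the brain's response recorded when each was shown.

def rank_images(eeg_scores):
    """eeg_scores: list of (image_name, response_strength) pairs.
    Returns image names ordered from strongest to weakest response."""
    return [name for name, score in sorted(eeg_scores, key=lambda p: -p[1])]

scores = [("street.jpg", 0.2), ("target.jpg", 0.9), ("sky.jpg", 0.1)]
print(rank_images(scores))
```

The images the viewer found most interesting bubble to the top of the list, so a human analyst need only review the first few results.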
Scientists in the Cognitive Science and Technology Research Group at Finland's Helsinki University of Technology Laboratory of Computational Engineering are also using an EEG cap. They capture brain signals from a person thinking about hand movements and convert those signals into text, as if it had been typed on a keyboard. This is a major step toward getting rid of the keyboard and mouse interface.
Canada's Carleton University is exploring a BCI system to replace fingerprint and eye scans, based on the fact that the EEG signals a brain generates are unique to each person. The system has not yet been built, but in concept a user's password would be a thought: the system would recognize that thought and allow the user to enter. This would also let users change their password easily.
The company Neural Signals has released a BCI system aimed at speech restoration. An electrical chip is surgically placed in the part of the brain that controls speech; it captures the electrical signals sent when the user wants to talk, and the system converts those signals into the words the user wants to say.
Brain-computer interfaces offer people the ability to utilize all of the power a computer has; with the two connected, computers could finally be used as naturally as the human brain. BCIs are still very early in their development, however, and will not see commercial deployment for decades. But the world will be greatly impacted when their time comes.
5. Conclusion
A change in computer interfaces is imminent. Touch screens are already available to the public at large, but these early products have flaws. They are all marked by high prices, manufacturers have not yet had a chance to develop better ways to build touch screen systems, and much software still needs to be written for touch screens to be used to their full advantage. Although touch screen interface technology “has begun to receive serious attention from the systems analysis, design, research, and development community, its potential for adaptive aiding has not been realized” [7].
BCI technology is improving every day, but it is still not ready for widespread use. Since it is so new, researchers are still trying to adapt it to different patients. BCIs are also very expensive and very large, they are complex to use and require technicians to be present, and users must learn to control their thoughts, which can take months. Companies are not yet investing the time and money needed to make effective products, and the accuracy of the systems is not yet high enough.
None of this rules out new interfaces becoming commercial products. Major companies have already put their support behind touch screens, and in the next few years touch screens will become more prominent and cheaper to make, resulting in touch-screen-interfaced computers.
Brain-computer interfaces have a slightly different story. In the coming years they will continue to be refined and will help many disabled people control computers. Eventually the technology will be developed for everyone: people will have chips implanted on their brains and will control computers with their thoughts. That day is very far off, but it will come, because BCIs allow computers to be used at speeds unthought of in today's keyboard-and-mouse era. This is why new interfaces will become the standard and will revolutionize the world.
References
[1] Microsoft Corporation, “About Microsoft Surface,” Microsoft Surface: About Surface, 2008. [Online]. Available: http://www.microsoft.com/surface/about.html. [Accessed: April 1, 2008].
[2] IO2 Technology, “Overview,” IO2 Technology: Heliodisplay/Interactive Free Space Display, 2007. [Online]. Available: http://www.io2technology.com/technology/overview. [Accessed: April 1, 2008].
[3] E. Grabianowski, “How Brain-Computer Interfaces Work,” HowStuffWorks.com, Nov. 2, 2007. [Online]. Available: http://computer.howstuffworks.com/brain-computer-interface.htm. [Accessed: April 1, 2008].
[4] J. Han, Unveiling the Genius of Multi-Touch Interface Design. [Videorecording]. TED Conferences, 2006.
[5] “Method of the Month: EEG,” Sept. 4, 2007. [Online]. Available: http://brainvat.wordpress.com/2007/09/04/method-of-the-month-eeg/. [Accessed: April 10, 2008].
[6] S. Ortiz Jr., “Brain-Computer Interfaces: Where Human and Machine Meet,” Computer, Jan. 2007. [Online]. Available: http://csdl.computer.org/dl/mags/co/2007/01/r1017.htm.