Saturday, August 17, 2013

Your next GUI will be a BUI

When the common language of computers was first being established, engineers had to agree that a particular binary code, say 1101, stood for the letter A. This collaboration helped create a standard working model that current and future developers could build on. As computers continued to grow in sophistication, so did the standardized models that let developers push new software forward without having to reinvent the wheel.
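As a concrete (if simplified) illustration, ASCII, one of the earliest such agreements, fixes the letter A at decimal 65, binary 1000001. A few lines of Python show how any machine following the standard turns the same bits into the same letters:

    # ASCII assigns every letter a fixed code, so any standard-compliant
    # machine decodes the same bits to the same character.
    for ch in "ABC":
        print(ch, ord(ch), format(ord(ch), "07b"))
    # A 65 1000001
    # B 66 1000010
    # C 67 1000011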

After the standard model of computer architecture was firmly established in the computing community, a new method for operating computers started to manifest itself in the form of the Graphical User Interface (GUI). The GUI was a new and exciting development, and it was one of the major launching points of the personal computer. But it was the first GUI, and it would be consumed by a mass audience that more than likely had never seen a computer, let alone a GUI, before. Developers had to create a standard graphical model that gave end users, no matter which GUI they were operating, a shared subconscious model of how the system worked. They did this by giving users an idea of how one GUI relates to another and how to complete simple tasks with little cognitive strain.

As the GUI integrated itself into society and took a concrete form, new touch technologies began to launch for mass consumption, repeating the same process as the GUI. This created a standard model of touch interaction that set a precedent for which hand gestures represented which input commands to the device, allowing the standardized gestures to be adopted across all touch technology. For example, the thumb and index finger coming together represents a close or zoom-out command to the device.
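To make the idea of a shared gesture vocabulary concrete, here is a small, purely illustrative Python sketch; the gesture names and commands are hypothetical and not taken from any particular touch framework:

    # Hypothetical gesture vocabulary: any device that shares it maps the
    # same gesture to the same command.
    GESTURES = {
        "pinch_in": "zoom_out",
        "pinch_out": "zoom_in",
        "swipe_left": "next_page",
        "swipe_right": "previous_page",
    }

    def handle(gesture):
        # Unknown gestures are simply ignored.
        return GESTURES.get(gesture, "ignore")

    print(handle("pinch_in"))  # -> zoom_out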

With the development of EEGs and new brain-to-computer technologies, a new standardized model needs to be developed for how to think when operating these emerging technologies. For example, to operate the thought-guided helicopter that Professor Bin He and his team created, the EEG patterns, the thoughts you think to maneuver the device, need to be calibrated to the computer. So before you could begin to operate the helicopter, you would have the computer register what you are thinking for up, down, left, and right. Normally, people do this by thinking of an object or a color: green for take off, red for stop, and some other object or color for left and right. This calibration process needs to take place in order for the device to understand what you are trying to convey.

This is why a Brain User Interface (BUI) needs to be developed as a standard natural model for translating what we are trying to communicate to our devices. A standard operating procedure like this would help standardize how we think when controlling our technologies with our minds until true mind-reading technologies are developed. It would lay a foundation, similar to the gesture and GUI models, so that a mass audience could adopt and apply the same 'thought principles' to all EEG devices. It would create out-of-the-box devices that require little EEG calibration and operate on the same thought principles as every other EEG device.
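As a rough sketch of what a shared thought vocabulary might look like, the Python below calibrates one prototype EEG pattern per command and then decodes new brain activity against those prototypes. The feature extraction and matching here are deliberately crude stand-ins, not Professor He's actual method or any real BCI library:

    import numpy as np

    # Hypothetical 'thought vocabulary': four commands, each calibrated to a
    # prototype EEG pattern recorded while the user holds that thought.
    COMMANDS = ["up", "down", "left", "right"]

    def features(epoch):
        # Toy feature: mean signal power per channel.
        return (epoch ** 2).mean(axis=1)

    def calibrate(epochs_by_command):
        # Average the feature vectors recorded for each command.
        return {cmd: np.mean([features(e) for e in epochs], axis=0)
                for cmd, epochs in epochs_by_command.items()}

    def decode(prototypes, epoch):
        # Choose the command whose calibrated prototype is closest.
        f = features(epoch)
        return min(prototypes, key=lambda cmd: np.linalg.norm(prototypes[cmd] - f))

    # Toy usage: random arrays stand in for recorded 8-channel EEG epochs.
    rng = np.random.default_rng(0)
    training = {cmd: [rng.normal(size=(8, 250)) for _ in range(5)] for cmd in COMMANDS}
    prototypes = calibrate(training)
    print(decode(prototypes, rng.normal(size=(8, 250))))

A standard BUI would, in effect, fix the contents of that vocabulary in advance, so a new device could ship with sensible prototypes already in place rather than requiring every user to calibrate from scratch.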
