Interfaces Beyond Multitouch
23:30
The future of interface development may include using neurotransmitters to help translate thoughts into computing actions, face detection combined with eye tracking and speech recognition, and haptics technology that uses the sense of touch to communicate with the user.
For instance, the Nintendo Wii popularized the idea of translating natural, gestural actions into movements on screen. The Wii controller takes the swinging motion made by a user and translates it into a golf swing, or takes a forward thrust of the remote and turns it into a punch on the screen. At Drexel University’s RePlay Lab, students are working on taking that idea to the next level: they are trying to measure the level of neurotransmitters in a subject’s brain to create games where mere thought controls gameplay.
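To make the gesture-mapping idea concrete, here is a minimal sketch of how a swing or a thrust might be distinguished from raw accelerometer readings. This is not the Wii's actual algorithm; the thresholds, axis conventions, and function names are all invented for illustration.

```python
# Hypothetical gesture classifier: decide whether a burst of accelerometer
# samples looks like a vertical swing or a forward thrust.
# Thresholds and axes are assumptions, not the Wii's real logic.

def classify_gesture(samples):
    """samples: list of (x, y, z) acceleration readings in g."""
    peak_y = max(abs(y) for _, y, _ in samples)  # vertical (swing) component
    peak_z = max(abs(z) for _, _, z in samples)  # forward (thrust) component
    if peak_y > 2.0 and peak_y >= peak_z:
        return "golf_swing"   # broad vertical arc -> swing on screen
    if peak_z > 2.0:
        return "punch"        # sharp forward thrust -> punch on screen
    return "idle"

swing_samples = [(0.1, 2.5, 0.3), (0.0, 3.1, 0.4)]
punch_samples = [(0.2, 0.3, 2.8), (0.1, 0.2, 3.5)]
```

Real controllers do considerably more (gyroscope fusion, smoothing, per-game tuning), but the core idea is the same: map a motion signature to a discrete in-game action.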
The lab created a 3-D game called Lazybrains that connects a neuro-monitoring device to a gaming engine. At its core is the Functional Near-Infrared Imaging Device, which shines infrared light into the user’s forehead. It then records the amount of light that is transmitted back and looks for changes to deduce how much oxygen is in the blood. When a user concentrates, his or her frontal lobe needs more oxygen, and the device can detect the change. That means a gamer’s concentration level can be used to manipulate the height of platforms in the game, slow down the motion of objects, or even change the color of virtual puzzle pieces.
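The control loop described above can be sketched in a few lines: a smoothed oxygenation reading is normalized into a 0-to-1 "concentration level," which then drives the gameplay parameters the article mentions. The baseline, gain, and parameter ranges below are assumptions for illustration, not values from Lazybrains.

```python
# Hypothetical sketch of the Lazybrains-style mapping: an fNIR-derived
# oxygenation reading becomes a normalized concentration level, which
# drives platform height, object speed, and puzzle-piece color.
# All constants here are invented for illustration.

def normalize(reading, baseline, span):
    """Clamp an oxygenation reading to a concentration level in [0, 1]."""
    level = (reading - baseline) / span
    return max(0.0, min(1.0, level))

def game_params(level):
    """Map a concentration level onto in-game quantities."""
    return {
        "platform_height": 1.0 + 4.0 * level,  # harder focus raises platforms
        "time_scale": 1.0 - 0.5 * level,       # focus slows objects down
        "piece_hue": int(240 * level),         # recolor puzzle pieces
    }

params = game_params(normalize(0.75, baseline=0.5, span=0.5))
```

A real system would also calibrate the baseline per user and low-pass filter the signal, since blood-oxygenation changes arrive on the scale of seconds rather than frames.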