New Spatial Thinking

New spatial thinking projects are useful for understanding the new relationships being developed between the information in an interface and the users who control or interact with it.
Real-time customization is being applied to a growing number of new UI projects. It creates a distinctive experience because it lets users dynamically rearrange information without refreshing the page, so the experience is never interrupted.
The picture shows the ability to move modules on Facebook, giving the user freedom of control.
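As a rough, hypothetical sketch of how such real-time customization avoids a page refresh: the client sends the new module order asynchronously and the server simply stores it. The Flask routes, module names and in-memory store below are all invented for illustration.

```python
# Hypothetical sketch: persisting a user's drag-and-drop module order without a
# page refresh. A client would POST the new order via an asynchronous request.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for a real database: user id -> ordered list of module ids.
layouts = {"default": ["news_feed", "photos", "events", "groups"]}

@app.route("/layout/<user_id>", methods=["GET"])
def get_layout(user_id):
    # Render the page with modules in the user's saved arrangement.
    return jsonify(layouts.get(user_id, layouts["default"]))

@app.route("/layout/<user_id>", methods=["POST"])
def save_layout(user_id):
    # Called in the background whenever the user drags a module to a new position,
    # so the rest of the page is never interrupted.
    layouts[user_id] = request.get_json()
    return jsonify({"saved": True})

if __name__ == "__main__":
    app.run(debug=True)
```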

Personalised Targeting for Shoppers

New advertising techniques are using facial recognition software to identify a shopper’s gender (with 85-90% accuracy), ethnicity and approximate age. The attraction for marketers is obvious: shoppers can then be targeted with ads for appropriate products, such as perfumes for women.
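A minimal sketch of how such a pipeline might hang together, assuming OpenCV’s bundled face detector and a placeholder estimate_attributes() standing in for whatever proprietary demographic model supplies the estimates; the ad rules and image path are invented:

```python
# Illustrative only: detect a face, estimate demographics (stubbed), pick an ad.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_attributes(face_image):
    # Placeholder for the vendor's gender/age classifier mentioned above.
    return {"gender": "female", "age": 34}

def choose_ad(attrs):
    # Simple rule table mapping estimated demographics to an ad category.
    if attrs["gender"] == "female":
        return "perfume"
    return "aftershave" if attrs["age"] < 40 else "watches"

frame = cv2.imread("shopper.jpg")  # stand-in for a frame from the in-store camera
if frame is not None:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = frame[y:y + h, x:x + w]
        print("Show ad:", choose_ad(estimate_attributes(face)))
```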

Companies in Tokyo are also producing camera-equipped vending machines that suggest drinks to consumers according to their age and gender. Weather conditions and the temperature are taken into account too.
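The suggestion logic itself could be a simple rule table. The sketch below is an assumption about how such a machine might combine the camera’s demographic estimate with weather and temperature; the thresholds and drink names are made up:

```python
# Hedged sketch of rule-based drink suggestion for a camera-equipped vending machine.
def suggest_drink(gender, age, temperature_c, weather):
    if weather == "rain" or temperature_c < 10:
        return "hot green tea" if age >= 50 else "hot coffee"
    if temperature_c > 25:
        return "iced tea" if gender == "female" else "sports drink"
    # Mild conditions: fall back to a demographic default.
    return "canned coffee" if gender == "male" and age >= 30 else "fruit juice"

print(suggest_drink("female", 27, 29, "sunny"))  # -> iced tea
print(suggest_drink("male", 55, 6, "clear"))     # -> hot green tea
```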

Interfaces Beyond Multitouch

The future of interface development may include using neurotransmitters to help translate thoughts into computing actions, face detection combined with eye tracking and speech recognition, and haptic technology that uses the sense of touch to communicate with the user.

For instance, the Nintendo Wii made popular the idea of using natural, gestural actions and translating them into movements on screen. The Wii controller takes the swinging motion made by a user and translates it into a golf swing, or it takes a thrust with a remote and turns it into a punch on the screen. At Drexel University’s RePlay Lab, students are working on taking that idea to the next level. They are trying to measure the level of neurotransmitters in a subject’s brain to create games where mere thought controls gameplay.
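A toy sketch of the idea, not the Wii’s actual algorithm: treat a burst of accelerometer samples as a gesture and classify it by peak force and direction. The axis conventions and thresholds are invented for illustration.

```python
# Classify a short burst of (x, y, z) accelerometer readings (in g) as a gesture.
import math

def magnitude(sample):
    ax, ay, az = sample
    return math.sqrt(ax * ax + ay * ay + az * az)

def classify_gesture(samples):
    peak = max(magnitude(s) for s in samples)
    forward = sum(s[2] for s in samples)  # net acceleration along the pointing axis
    if peak < 1.5:
        return "none"                     # below ~1.5 g: treat as idle/noise
    # A thrust is mostly along the pointing axis; a swing is mostly lateral.
    return "punch" if forward > 0.7 * peak else "swing"

thrust = [(0.1, 0.0, 0.3), (0.2, 0.1, 1.8), (0.1, 0.0, 2.6), (0.0, 0.1, 1.1)]
print(classify_gesture(thrust))           # -> punch
```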

The lab created a 3-D game called Lazybrains that connects a neuro-monitoring device and a gaming engine. At its core is the Functional Near-Infrared Imaging Device, which shines infrared light into the user’s forehead. It then records the amount of light that gets transmitted back and looks for changes to deduce information about the amount of oxygen in the blood. When a user concentrates, his or her frontal lobe needs more oxygen and the device can detect the change. That means a gamer’s concentration level can be used to manipulate the height of platforms in the game, slow down the motion of objects, or even change the color of virtual puzzle pieces.
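The mapping from concentration to gameplay can be very direct. The sketch below assumes a stream of oxygenation readings (the numbers and scaling are invented; a real fNIR device would supply them, and the baseline would be calibrated per player) and turns the relative rise over baseline into a platform height:

```python
# Map a rise in frontal-lobe blood oxygenation (a proxy for concentration)
# to the height of an in-game platform, Lazybrains-style. Values are illustrative.
def concentration_level(oxygenation, baseline):
    rise = (oxygenation - baseline) / baseline   # relative rise over resting baseline
    return max(0.0, min(1.0, rise * 10))         # clamp to [0, 1]

def platform_height(level, min_h=1.0, max_h=5.0):
    return min_h + level * (max_h - min_h)       # higher concentration -> higher platform

baseline = 0.82                                  # resting reading (arbitrary units)
for reading in (0.82, 0.85, 0.90):               # player progressively concentrating
    lvl = concentration_level(reading, baseline)
    print(f"level={lvl:.2f}  height={platform_height(lvl):.2f}")
```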

Motion-Detecting Interface

Terry Gou, CEO of Hon Hai (which makes iPods for Apple), said that the next iPod will be released soon and that it’ll have a “none-touch” interface.

Gou didn’t say what a “none-touch” interface might be. To me, the only sensible possibility is an internal motion sensor somewhat similar to the one Apple already uses in its laptops, but more finely tuned, along the lines of the upcoming Nintendo Wii controller.
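Purely as speculation about what that could mean in practice, here is a tiny sketch that maps tilt readings from such an internal motion sensor to playback actions; the axes, thresholds and action names are all invented:

```python
# Speculative "none-touch" control: map device tilt (degrees) to playback actions.
def tilt_to_action(roll_deg, pitch_deg, threshold=25):
    if roll_deg > threshold:
        return "next_track"
    if roll_deg < -threshold:
        return "previous_track"
    if pitch_deg > threshold:
        return "volume_up"
    if pitch_deg < -threshold:
        return "volume_down"
    return "none"                                 # small tilts are ignored

for roll, pitch in [(30, 2), (-28, 5), (3, 40), (1, -1)]:
    print(tilt_to_action(roll, pitch))            # next_track, previous_track, volume_up, none
```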

Extension of Human Activity

The ubiquitous computing revolution was foreseen by Mark Weiser (1991), who predicted: ‘In the 21st century the technology revolution will move into the everyday, the small and the invisible … The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.’

Mark Weiser 1991, The Computer for the 21st Century

The goal of ambient intelligence is to make it possible for users to interact naturally with their environment. To be truly intelligent, these systems are being developed to learn the needs and preferences of their occupants, diagnose situations and then react to them in a context-specific way.

‘A digital environment that proactively, but sensibly, supports people in their daily lives’

Juan Carlos Augusto (2007)


This requires the system to be sensitive, which is achieved through the ability to classify complex human behaviour and respond to people’s needs with appropriate use of technology. For example, a heating system can detect if someone is present in the room, but to react appropriately it needs to consider a wider context: someone watching TV, for example, might require a warmer room than someone undertaking exercise.
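To make that wider-context point concrete, here is a minimal sketch of the heating decision, assuming presence comes from an occupancy sensor and the activity label from some behaviour classifier; the activities and setpoints are illustrative assumptions:

```python
# Context-aware heating: the same occupied room gets a different target temperature
# depending on what the occupant is doing. All values are illustrative assumptions.
SETPOINTS_C = {"watching_tv": 22.0, "exercising": 17.0, "sleeping": 18.0}

def target_temperature(present, activity, default_c=20.0, away_c=15.0):
    if not present:
        return away_c                      # unoccupied: drop to an energy-saving setback
    return SETPOINTS_C.get(activity, default_c)

print(target_temperature(True, "watching_tv"))   # 22.0: sedentary, wants it warmer
print(target_temperature(True, "exercising"))    # 17.0: active, wants it cooler
print(target_temperature(False, "watching_tv"))  # 15.0: nobody home
```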

The development of integrated technologies means that the home of tomorrow will be more like the home of yesterday than the home of today. The way we experience technological power is about to change. Technology and computers will no longer be seen as an intermediate step between a physical us and an outcome, but as intelligent systems operated through interfaces that are an intuitive extension of our natural speech and movements: touch panels, heat and weight sensors, and intelligent cameras that track our eye movements, all striving to accommodate our personal preferences with ease and provide extra convenience and help in everyday tasks.
