A natural user interface (NUI) is a computer interface defined by how a user interacts with it, i.e., naturally, without a keyboard or mouse. You’ll often hear such interfaces described as “effectively invisible.” For you science fiction fans, think Minority Report gadgetry (except for the time travel stuff).
Natural user interfaces are being developed to recognize:
- Gaze vectors
- Facial expression
- Handheld device movements
- Biometrics such as heart rate, body temperature, pupil size, and even sweat (aka skin impedance)
(Wiki NUI group, 2010)
NUI Design Considerations
Designers of natural user interfaces focus on how humans interact with their environments through their senses (touch, vision, speech) and consider how sensory interactions impact cognitive processing and creativity. For example, despite the development of many excellent paint programs, there’s something intrinsically more satisfying about using your hands and paint brushes/pencils to sketch something. There’s some sense of immediacy when you push an object on a screen and it moves out of your way.
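That sense of immediacy comes from direct manipulation: the on-screen object tracks the touch point with no intermediate command. A minimal sketch of the idea, with an object model and coordinates invented purely for illustration:

```python
# Sketch of direct manipulation: an on-screen object follows the touch point.
# The Box class and all coordinates here are illustrative, not from any real toolkit.

class Box:
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h

    def contains(self, px, py):
        """Hit test: is the touch point inside this box?"""
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def move_by(self, dx, dy):
        """Translate the box by the drag delta."""
        self.x += dx
        self.y += dy

box = Box(10, 10, 50, 50)
if box.contains(30, 30):   # the touch lands on the box...
    box.move_by(100, 0)    # ...so the box follows the drag
print(box.x, box.y)        # 110 10
```

The key point is that the user's gesture maps directly onto the object's state, with no command layer in between.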
Some design advice for creating interfaces is offered by Ergosign in this Slideshare presentation.
Despite the sensory richness of NUIs, good design focuses on enhancing interactivity while reducing cognitive load.
Where have we been? Where are we going?
While this is an area of extensive research, the future is already here in a basic sense with the advent of touch screen mobile devices, tablets, and netbooks.
Some more advanced NUI devices now on the table include:
This multi-touch platform by Nuiteq can be integrated into diverse user environments including public spaces, schools, business settings and immersive/simulation environments.
Perceptive Pixel provides multi-touch interfaces with a variety of applications for personal use and collaborations.
Microsoft Surface is an interface that senses the characteristics of objects placed on it and displays information about that object. It’s currently being used in certain AT&T retail stores.
It’s also been used to create a patient consultation interface.
Part of the Open Cobalt Metaverse Project, Edusim is designed to provide a 3D virtual immersive whiteboard experience.
Use of Edusim as a teaching tool in medical education is exemplified in this video.
Kinect for Xbox 360 (aka Project Natal)
This NUI is designed to be part of a video game platform that enables users to interact with the Xbox 360 via a webcam-like add-on using gestures, speech, and presented objects. A video demonstrating a very sophisticated implementation of the product is shown below.
In this video, XBox Kinect is used to teach learners how to engage in healthy behaviors by making exercise fun and interactive through the Your Shape: Fitness Evolved Platform.
Other health applications of Xbox Kinect technology are described in this article by Chris Niehaus.
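Gesture recognition of the kind Kinect performs typically works from tracked skeleton joints: the device reports where the user's head, hands, and other joints are, and the application tests those positions against gesture rules. A minimal sketch of one such rule, with joint names and sample data invented for illustration (a real SDK would supply the tracked skeleton):

```python
# Hypothetical sketch of skeleton-based gesture detection, in the spirit of
# Kinect-style NUIs. The joint names and sample frame below are invented for
# illustration; a real SDK supplies the tracked joint positions.

def is_hand_raised(skeleton, hand="right_hand", head="head"):
    """Return True if the hand joint is above the head joint.

    `skeleton` maps joint names to (x, y) screen coordinates,
    where a smaller y value means higher on screen.
    """
    _, hand_y = skeleton[hand]
    _, head_y = skeleton[head]
    return hand_y < head_y

# Illustrative frame: the right hand (y=80) is above the head (y=120).
frame = {"head": (320, 120), "right_hand": (400, 80), "left_hand": (250, 300)}
print(is_hand_raised(frame))  # True
```

Real systems layer many such rules (plus timing and smoothing) to distinguish deliberate gestures from incidental movement, which is part of why gesture interfaces are harder to build than they look.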
Lest you think that NUIs are all about gestural interactions, remember applications like Dragon NaturallySpeaking, speech-recognition software that converts spoken words into computer interactions such as typing and commands.
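Speech-driven interaction typically has two stages: a recognizer turns audio into a text transcript, and the transcript is then mapped to an action. The recognizer itself is beyond a short example, but the second stage can be sketched as a simple command dispatcher (all command names and handlers below are illustrative, not from any real product):

```python
# Minimal sketch of mapping recognized speech to computer actions. The
# recognition step is assumed to have already produced a text transcript;
# the command vocabulary and handlers here are purely illustrative.

def open_document(name):
    return f"opening {name}"

def type_text(text):
    return f"typed: {text}"

COMMANDS = {
    "open": open_document,  # "open report"      -> open_document("report")
    "type": type_text,      # "type hello world" -> type_text("hello world")
}

def dispatch(transcript):
    """Route a recognized utterance to its handler, or flag it as unknown."""
    verb, _, rest = transcript.partition(" ")
    handler = COMMANDS.get(verb.lower())
    if handler is None:
        return f"unrecognized command: {transcript}"
    return handler(rest)

print(dispatch("open report"))       # opening report
print(dispatch("type hello world"))  # typed: hello world
```

For learners with mobility challenges, this kind of mapping is exactly what lets spoken (or typed) commands stand in for mouse and keyboard interaction.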
Implications for education
As Don Norman notes in his article “Natural User Interfaces Are Not Natural,” the marketing rhetoric around NUIs is probably ahead of the reality. There are many complications still to be worked out, particularly in developing interfaces responsive to gestures. Designers should also remember that not all gestures are universal; there are cultural nuances to consider (e.g., head shakes and hand shakes can mean different things in different cultures).
However, NUIs are attractive tools for education and training because of the potential to create:
- A sense of immediacy and immersion in the learning environment
- A sense of individualized engagement
- An adaptive learning environment that responds as the learner interacts with it
- Collaboration environments that respond quickly to users’ intentions (e.g., via gestures)
- A virtual lab experience (e.g., allowing users to interact with virtual models and other objects)
- Educational games (for some of the reasons described above)
- Educational tools for learners who learn differently (e.g., learners with autism)
- Educational tools for learners with physical challenges (e.g., learners with mobility challenges might use text commands to drive computer interactions)
Given the enthusiasm for touchscreen devices like the iPad in education, it’s not hard to imagine the integration of more complex devices in schools and training environments.
Important factors influencing adoption will include:
- Cost-benefit ratios (very device/application specific)
- Mobility of a particular platform/device
- The learning curve needed to optimize use of the tool
- The ability to integrate the device into natural learning (and work) flows
Just as many useful tools can be used to produce horrendous instructional materials (think PowerPoint, for example), NUIs will be no exception. However, I do see this as an exciting, evolving space to watch.
Some Additional Resources