Post by Scot Teller with LA Core IT Pro
There's probably no debate in IT more intense than that surrounding user interfaces.
And, like so many other great technical (and, for that matter, social) issues, there is no single right answer. A UI is just
that, the connection between the human and the incarnation of a technology that does something of use for the human.
UIs should be, IMHO, simple, intuitive, and able to adapt over time to the needs of the specific user. Beyond that, they are at their
best when they are consistent, so as to minimize training and support costs.
One should not change UIs just to change
them, with the goal of creating the illusion of something new (Microsoft's Vista has been justifiably criticized for doing
precisely that), but one needs to realize that no single UI will work for everyone all the time and that UI technology continues to evolve.
Regardless, when I hear that someone doesn't like a product, more often than not the criticism is
directed at the UI, not the underlying technology itself.
Mobile devices represent a significant challenge to UI designers. Their
small physical stature means that traditional, successful UI strategies, most notably big keyboards, mice, and window-based
displays, will be problematic in highly-mobile (smaller than a notebook computer) packages.
We've tried it all
- pens and touch screens, Graffiti, cursor wheels a la BlackBerry, virtual
(on-screen) keyboards a la Windows Mobile, and on and on. It appears at this point that the touch screen and integral physical
micro-keyboard have won - although as a regular user of a Motorola Q, I must admit that I miss being able to use a pen as
a pointing device, but the otherwise one-handed operation does come in handy a lot of the time.
No matter, the user interface issue
is far from resolved, and innovation continues.
One of the
latest ideas is to use haptics as an element. If you've not heard that term before, it's from the Greek for touch, and in
this case touch feedback. I have a lot of experience with haptic interfaces from my days working with supercomputers.
We used a variety of (rather odd at the time) products to provide, for example, force feedback
to indicate the strength of numerically-represented materials or even molecular bonds being modeled. You can see an example
of a contemporary device for these applications here.
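To make the idea concrete, here is a minimal sketch (mine, not the actual supercomputer-era software described above) of the simplest way such force feedback is typically computed: treat the modeled material or bond as a spring and render a force proportional to how far the user pushes in. The function name, stiffness values, and the 3.3 N device limit are all illustrative assumptions.

```python
def feedback_force(stiffness: float, displacement: float, max_force: float = 3.3) -> float:
    """Return the force (in newtons) to render on a haptic device.

    stiffness    -- spring constant of the modeled material or bond (N/m)
    displacement -- how far the user has pushed into the virtual surface (m)
    max_force    -- clamp to the device's peak output (an assumed figure
                    for a small desktop haptic arm)
    """
    force = stiffness * displacement   # Hooke's law: F = k * x
    return min(force, max_force)       # never ask for more than the hardware can do

# A stiffer bond resists more than a softer one at the same displacement:
assert feedback_force(800.0, 0.002) > feedback_force(100.0, 0.002)
```

The clamp matters in practice: a naive unclamped mapping can command forces the hardware cannot produce, which either saturates ungracefully or destabilizes the device.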
Haptics can be
limited to simple vibrational feedback when pressing a virtual key on a touch screen, or, as a recent White Paper from haptics innovator Immersion Corporation notes, extended to such possibilities as identifying unique information on a Web page, providing location-based
services cues ("you're getting closer"), context-aware message notification alerts, and, of course, games and entertainment.
By the way, you can skim the White Paper; it's overly academic. The useful stuff starts on page 9.
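A cue like "you're getting closer" might be implemented along these lines: map distance-to-target onto a vibration pulse interval, so pulses speed up as the user approaches. This is my own hypothetical sketch, not Immersion's design; the function name, thresholds, and timing values are all assumptions.

```python
def pulse_interval_ms(distance_m: float,
                      near_m: float = 10.0,
                      far_m: float = 500.0,
                      fastest_ms: int = 200,
                      slowest_ms: int = 2000) -> int:
    """Milliseconds between vibration pulses for a given distance to target."""
    # Clamp distance into the [near_m, far_m] band, then interpolate linearly:
    # at near_m or closer, pulse as fast as allowed; at far_m or beyond, slowest.
    d = max(near_m, min(distance_m, far_m))
    frac = (d - near_m) / (far_m - near_m)   # 0.0 when near, 1.0 when far
    return int(fastest_ms + frac * (slowest_ms - fastest_ms))

# Closer targets pulse faster:
assert pulse_interval_ms(10.0) == 200     # at or inside near_m: fastest
assert pulse_interval_ms(500.0) == 2000   # at or beyond far_m: slowest
assert pulse_interval_ms(50.0) < pulse_interval_ms(400.0)
```

The appeal for mobile is exactly the point made above: the cue works in a pocket, eyes-free and silently, where a visual or audible indicator would not.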
My guess is that haptics could become
very important on mobile devices going forward. Think about it - we have five senses. Taste and smell are of little value
in mobile UIs, except for the very, very strange among us. Sound is of limited value because it can disturb others nearby.
So, apart from sight, touch is it, and I think touch is going to get a lot bigger. You can already find haptics
in some handsets today, like the LG VX10000 (Verizon Voyager). On a related note, I've just begun a project examining fingerprint
recognition technologies and their applications in mobile devices, and I'll have more on that for you in a month
or so on the Clear-Cloud Network Post forum.