by Michael Poltorak Nielsen, Statsbiblioteket/State and University Library, Aarhus, Denmark
Current Mobile Interaction Paradigm
You do a lot with your hands every day. Hands are a versatile tool, but current handheld interaction is glass-based: you trigger functions by sliding your fingers across a smooth screen, which gives no tactile feedback on what an action does, i.e. it is not intuitive.
- direct manipulation
- gesture driven
- Google’s Gesture Search, where you draw your search terms on the screen
- a project where buttons dimple when pressed, but this is costly
The near future may mean combining something like the Wiimote and the iPhone.
The idea was to build an HTML5 app for library data: search, mark favourites, view your own items, renew, and request. It is currently in beta, but will be published soon.
The search app can be augmented with gestures, combined with multi-touch interactions.
Possible interaction channels, with a focus on:
- keyboard – typing
- microphone – speech
- screen – touch, visuals
- camera – pattern, movement
- accelerometer – acceleration
- gyroscope – rotation
- compass – direction
- GPS – movement, position
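As a sketch of how some of these channels surface in an HTML5 app: the accelerometer and gyroscope are exposed to the browser through the devicemotion and deviceorientation events. The function below is my own illustration (the names and the 45° threshold are assumptions, not from the talk) of reducing the orientation angles to a coarse device attitude:

```javascript
// Reduce deviceorientation angles (in degrees) to a coarse device attitude.
// beta: front-back tilt, gamma: left-right tilt. Thresholds are illustrative.
function describeOrientation(beta, gamma, threshold = 45) {
  if (beta > threshold) return "tilted-up";
  if (beta < -threshold) return "tilted-down";
  if (gamma > threshold) return "tilted-right";
  if (gamma < -threshold) return "tilted-left";
  return "flat";
}

// In a browser, the angles come from the sensor event:
// window.addEventListener("deviceorientation", (e) => {
//   console.log(describeOrientation(e.beta, e.gamma));
// });
```

Keeping the classification in a pure function like this makes the gesture logic testable outside the browser, with the event wiring reduced to a thin listener.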
These might include simple gestures based on accelerometer data.
The problem is that gestures are only really supported by Firefox, and partially supported by Chrome. It was therefore decided to develop a native iPhone app with gestures, and an HTML5 web app without gestures (possibly added later, once supported). Implemented features include:
- Restart search – face down
- Scroll – tilt up and down
- Switch views – tilt
- Request items – touch and tilt left
- Favourites – touch and tilt right
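One way the five gesture bindings above could be dispatched is a small classifier over touch state and tilt angles. Everything here (the action names, the 30° thresholds, the faceDown flag) is a hypothetical sketch, not the app's actual code:

```javascript
// Map sensor/touch state to one of the app's five gesture-driven actions.
// Thresholds and action names are illustrative assumptions.
function gestureToAction({ faceDown, touching, beta, gamma }) {
  if (faceDown) return "restartSearch";                // face down
  if (touching && gamma < -30) return "requestItem";   // touch + tilt left
  if (touching && gamma > 30) return "addFavourite";   // touch + tilt right
  if (Math.abs(beta) > 30) return "scroll";            // tilt up/down
  if (Math.abs(gamma) > 30) return "switchView";       // tilt left/right
  return null;                                         // no gesture recognized
}
```

Ordering matters: face-down wins over everything, and touch-qualified tilts are checked before plain tilts so the same gamma angle is not misread as a view switch while the user is touching the screen.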
Check out the demo:
Remaining issues:
- there are no standard mobile gestures
- gestures may be individual to each user
- a gesture may not be appropriate at all
- sophisticated gestures are hard to code