Saturday, July 12, 2014

Top tips for touch-screen design

How to strip down features and design elements to increase usability and sales


To the untrained eye, a finger looks podgy and inaccurate in comparison to a mouse pointer.

How, you might wonder, can this flabby, clumsy little digit replace the pixel-perfect accuracy of a mouse?

Well, though human fingers may seem like bloated embarrassments, they are in fact more versatile. They can pinch, rotate, zoom, stroke, dab, slide, and flick, and many other gestures are within the finger’s repertoire. But, best of all, the finger allows something called Direct Manipulation.

What is Direct Manipulation?
Well, consider this: when you move a computer mouse, the motion travels along the wire connecting the mouse to the computer, and is translated into the movement of an on-screen pointer that mirrors the motion of your hand.

The action on the screen and the action of your mouse are connected, but indirectly: the motion of the input device (the mouse) causes the motion of the pointer. Contrast this with the way a user’s fingers manipulate targets on a touchscreen mobile device:

 
On a touchscreen you literally and directly ‘move’ the objects you are interacting with.

Touchscreen manipulations work almost exactly like our interactions with physical 3D objects in real life. The manner, direction, distance, and speed of a movement are all determined directly by your fingers: they specify where the object travels, how far it moves, and how fast. This is the first time in the history of computing that mass-market devices have allowed direct, literal manipulation.
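The contrast between indirect and direct input can be sketched in a few lines of code. This is an illustrative Python sketch only; the function names and coordinate scheme are invented for the example, not taken from any real input API.

```python
# Hypothetical sketch contrasting indirect (mouse) and direct (touch) input.

def indirect_move(pointer_x, pointer_y, dx, dy):
    """Mouse input: the device reports a relative delta; the pointer
    moves, and the object can only follow the pointer's new position."""
    return pointer_x + dx, pointer_y + dy

def direct_move(touch_x, touch_y):
    """Touch input: the finger's absolute screen position IS the
    object's new position -- there is no intermediate pointer."""
    return touch_x, touch_y

# A mouse delta of (5, -3) from pointer (100, 100) lands at (105, 97);
# a touch at (240, 360) puts the object exactly there.
```

The point of the sketch is the signatures: the mouse function needs a starting position plus a relative delta, while the touch function needs nothing but the finger's own coordinates.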

The touchscreen is totally unlike a mouse. With a mouse, you must first fish about on the screen to locate the pointer, then adjust your movements relative to that starting point. With a touchscreen, you simply see where you want to touch and home in on that point directly. No orientation step is needed. This is the magic of touchscreens. The other great advantage is that you have several fingers available at any one time; contrast this with the single point of a traditional mouse or stylus. The emergence of the finger as a new input instrument has helped popularise mobile devices.

It has often been said that the best camera is the one you have with you; the same is true of input devices. The best input device is the one you have with you, and for most people on the move, that is the human digit. You don’t have to rifle through your dusty backpack for a stylus, or peck about on an inefficient travel keyboard. The touchscreen has liberated us from decades of cumbersome computer input paraphernalia.

Understanding finger ergonomics

Fingers can be remarkably efficient at inputting complex information. Consider the efficiency of sign language, and how vivid and meaningful hand gestures can be. Many touchscreens respond to over eleven finger-touches simultaneously; this gives you enormous scope for different interaction methods in your apps.

You can create an almost unlimited number of potential gestures, combining up to around eleven touches in any way you like. As an analogy: if you think of the eleven touches as keys on a keyboard, then gestures are ‘words’, i.e. the result of combining letters (or, in our case, finger touches).
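The touches-as-letters analogy can be made concrete with a tiny lookup table that combines a touch count and a motion into a named gesture ‘word’. The vocabulary below is hypothetical, invented for illustration; it is not a platform standard.

```python
# Illustrative "gesture vocabulary": combinations of touch 'letters'
# (count + motion) map to gesture 'words'. All names are hypothetical.
GESTURES = {
    (1, "tap"): "select",
    (1, "drag"): "move",
    (1, "flick"): "scroll",
    (2, "drag"): "pan",
    (2, "spread"): "zoom-in",
    (2, "pinch"): "zoom-out",
    (2, "rotate"): "rotate",
}

def recognise(touch_count, motion):
    """Return the gesture 'word' for a combination of touch 'letters',
    or 'unrecognised' if the combination is not in the vocabulary."""
    return GESTURES.get((touch_count, motion), "unrecognised")
```

Even a table this small shows the combinatorial point: each new touch count or motion type multiplies the number of distinct ‘words’ your app can understand.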

Take a look at how the apps in this post’s screenshots use input gestures to let the user perform complex manipulations very simply. Just for starters, finger manipulation allows you to scale, spin, trim, and edit objects. Interestingly, developers have observed that some gestures are beginning to form a universal gesture vocabulary, in which the same gesture has a consistent effect across different apps. Make finger gestures an important consideration when planning and testing your app.
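The scale-and-spin manipulations mentioned above reduce to simple geometry. As a sketch (assuming screen coordinates in pixels; the function name is invented for this example), the scale factor of a two-finger gesture is the ratio of the distances between the fingers, and the rotation is the change in the angle of the line joining them:

```python
import math

def pinch_transform(p0, p1, q0, q1):
    """Given two touches at the start of a gesture (p0, p1) and their
    current positions (q0, q1), return (scale, rotation_degrees)."""
    def vec(a, b):
        # Vector from the first touch to the second.
        return (b[0] - a[0], b[1] - a[1])

    v_start, v_now = vec(p0, p1), vec(q0, q1)
    # Scale: ratio of finger separations now vs. at the start.
    scale = math.hypot(*v_now) / math.hypot(*v_start)
    # Rotation: change in the angle of the line joining the fingers.
    rotation = math.degrees(
        math.atan2(v_now[1], v_now[0]) - math.atan2(v_start[1], v_start[0])
    )
    return scale, rotation

# Fingers starting 100 px apart that spread to 200 px give scale 2.0;
# if the second finger swings from due east to due north, rotation is 90.
```

Applying the returned scale and rotation to the on-screen object each frame is what makes the manipulation feel direct: the object tracks the fingers exactly.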
