Tap is the New Click

Posted on 17-Aug-2014


Category: Design


DESCRIPTION

An introduction to designing for touchscreens and interactive gestures.

TRANSCRIPT

Tap is the New Click

Dan Saffer, Kicker Studio

DRC 2009 // Dan Saffer, Kicker Studio


We're using bodies evolved for hunting, gathering, and gratuitous violence for information-age tasks like word processing and spreadsheet tweaking. —David Liddle

We’re in the midst of an interaction design revolution.

How do we design for interactive gestures?

What we’re going to talk about

Sensors and touchscreen types

Kinesiology and physiology

Touch targets

Communicating

Choosing appropriate gestures

Case study: Canesta Entertainment Center

Gesture: any physical movement that can be sensed and responded to by a digital system without the aid of a traditional input device such as a mouse or stylus.


Two types of interactive gestures

Touchscreen

aka TUI

Single and multi-touch (MT)

Free-form

Wide variety of forms

Why not to have a gestural interface

Heavy data input

Relies heavily on the visual (for now)

Can be inappropriate for context

More physically demanding

Why have a gestural interface?

More flexible

Less visible hardware

Hardware fits context better

More “natural”

More fun

The secret sauce: sensors

Common sensors

Pressure

Light

Proximity

Acoustic

Tilt

Motion

Orientation

Types of touchscreens

Resistive: pressing two layers together creates the touch event

Surface wave: finger disrupts ultrasonic waves

Capacitive: finger conducts electricity

Infrared: finger breaks grid of infrared beams

Camera-based: looks for “blobs”; used with rear- and front-projection setups

Kinesiology & physiology

The ergonomics of human gestures

Avoid hyperextension or extreme stretches

Avoid repetition

Utilize relaxed, neutral positions

Avoid staying in a static position

No “Gorilla Arm”

Gorilla arm

Humans aren’t built to hold their arms out in front of their faces while making small gestures

Ok for short-term use, not so much for repeated, long-term use

Fun Fact: Telegraph operators had “glass arm”

Sorry, Minority Report-style UIs


Stephen Pheasant’s (via Rob Tannen) cardinal rules of anthropometrics

Reach

Clearance

Posture

Strength

The more challenging and complicated the gesture, the fewer people who will be able to perform it.

What about accessibility?

No good, clear answer

Improving via addition of haptics (and hopefully, eventually, speech)

Some touchscreen systems much better than traditional WIMP systems

Special care when designing touch targets

[Diagram of typical touch contact sizes: 16-20mm, 8-10mm, and 10-14mm]

Fingers

Fingernails: blessing and curse

Fake fingernails: evil

Finger oil

Fingerprints

(Left) Handedness

Wrist support

Gloves

Inaccurate (when compared to a cursor)

Attached to a hand (aka screen coverage)


Avoid putting essential features or information like a label below an interface element that can be touched, as it may become hidden by the user’s own hand.

Touch events and targets

Touch target size

Remember Fitts’ Law! (Movement time grows with the distance to a target and shrinks as the target gets larger: roughly MT = a + b log2(D/W + 1))

Place targets as close to the user as possible so users don’t cover the screen with their hands

Leave space between targets (when possible)

Create reasonably sized targets: no smaller than 1cm in diameter/square (the size of a finger pad); see the sizing sketch below

Touch target size comparisons

~25mm, ~18mm, ~13mm, ~8mm, 5mm
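To make these numbers concrete, here is a minimal Python sketch; the 165 ppi display density is an assumed value (not a figure from the talk). It converts the 1cm minimum target into pixels and uses the Shannon formulation of Fitts’ Law to compare how hard two target sizes are to hit.

import math

def mm_to_px(mm, ppi):
    # Convert a physical size in millimeters to pixels at a given screen density.
    return mm / 25.4 * ppi

def fitts_index_of_difficulty(distance_mm, width_mm):
    # Shannon formulation of Fitts' Law: ID = log2(D/W + 1), in bits.
    # Movement time is roughly a + b * ID, so closer/larger targets are faster to hit.
    return math.log2(distance_mm / width_mm + 1)

ppi = 165  # assumed density; substitute the real device value
print(round(mm_to_px(10, ppi)))           # a 1cm target is about 65 px at 165 ppi
print(fitts_index_of_difficulty(80, 10))  # ~3.17 bits: 10mm target, 80mm away
print(fitts_index_of_difficulty(80, 5))   # ~4.09 bits: a 5mm target at the same distance is harder to hit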

Two touch target tricks

Iceberg tips

Adaptive targets


Traditional UI elements to watch out for

Cursors

MouseOvers and hovers

Double-click

Right-click

Selected default buttons

Undo

Touchscreen patterns

Tap to open/activate

Tap to select

Drag to move object

Slide to scroll

Spin to scroll

Flick to nudge

Pinch to shrink and spread to enlarge (see the pinch-to-zoom sketch after this list)

Ghost fingers
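As an illustration of how the pinch/spread pattern is commonly computed, here is a small Python sketch; the PinchTracker class and its (x, y) touch-point format are my own assumptions, not something specified in the talk. The scale factor is simply the ratio of the current distance between the two fingers to the distance when the gesture began.

import math

def distance(p1, p2):
    # Euclidean distance between two (x, y) touch points, in pixels.
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

class PinchTracker:
    # scale > 1 means the fingers spread apart (enlarge);
    # scale < 1 means they pinched together (shrink).
    def __init__(self, p1, p2):
        self.start_distance = distance(p1, p2)

    def scale(self, p1, p2):
        return distance(p1, p2) / self.start_distance

tracker = PinchTracker((100, 200), (200, 200))  # fingers land 100 px apart
print(tracker.scale((75, 200), (225, 200)))     # they spread to 150 px apart -> 1.5x zoom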

Freeform patterns

Proximity activates/deactivates

Move body to activate

Point to select/activate

Wave to activate

Rotate to change state

Step to activate

Shake to change state

Prototyping gestures

Low-fidelity: Paper prototype

Low-fidelity: The “man behind the curtain”

Low-fidelity: Environments

High-fidelity: Exact

High-fidelity: Off-the-Shelf

High-fidelity: Do It Yourself

Turning gestures into code

Variables: what are you measuring?

Data: get the data in from the sensor

Computation: determine the differences between successive data points

Patterns: what do the computed values mean?

Action: if a pattern is matched, do something (a minimal sketch follows below)
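Here is a minimal sketch, in Python, of those five steps applied to a “shake to change state” gesture; the accelerometer sample format, the threshold values, and the on_shake callback are assumptions made for the example, not prescriptions from the talk.

from collections import deque
import time

class ShakeDetector:
    # Variables: we measure the change in acceleration magnitude over time.
    def __init__(self, on_shake, threshold=12.0, required_spikes=3, window_seconds=1.0):
        self.on_shake = on_shake                # Action to run when the pattern matches
        self.threshold = threshold              # minimum jolt (m/s^2) that counts as a spike
        self.required_spikes = required_spikes  # how many spikes within the window make a shake
        self.window_seconds = window_seconds
        self.last_magnitude = None
        self.spike_times = deque()

    def feed(self, ax, ay, az, timestamp=None):
        # Data: push one accelerometer sample (m/s^2 per axis) into the detector.
        timestamp = time.monotonic() if timestamp is None else timestamp
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        # Computation: difference between this sample and the previous one.
        if self.last_magnitude is not None and abs(magnitude - self.last_magnitude) > self.threshold:
            self.spike_times.append(timestamp)
        self.last_magnitude = magnitude
        # Patterns: enough spikes inside the time window means "the user shook the device".
        while self.spike_times and timestamp - self.spike_times[0] > self.window_seconds:
            self.spike_times.popleft()
        if len(self.spike_times) >= self.required_spikes:
            self.spike_times.clear()
            self.on_shake()                     # Action: the pattern matched, so do something

detector = ShakeDetector(on_shake=lambda: print("state changed"))
# feed() would be called from the sensor's sampling loop, e.g. detector.feed(0.1, 9.8, 0.2)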

Documenting gestures

Dance notation

Annotated wireframes still work

Architectural wireframes

[Architectural wireframe: a “Master UI” run by the presenter, “Individual UIs” used by show attendees, a live touchscreen projection area, and floor markings showing typical arm’s reach for a 6' tall user.]

Keyframes

Gestural modules

Gestural modules

Storyboards

Swim lanes framework

Animation

Movies

Communicating interactive gestures

Three zones of engagement

Attraction

Observation

Interaction

Attraction affordance

Written instruction

Illustration

Demonstration

Symbolic

Determining the appropriate gesture

Four-part equation

1. The task that needs to be performed

2. The available sensors and input devices

3. The physiology of the human body

4. The context

This can be pretty straightforward

Or not

Context

Different behaviors in different locations

Avoiding accidental emotional weight

Cultural issues

Usability issues

Avoid unintentional triggers via everyday actions!

Wide variation in performing gestures: need requisite variety

Pick one model: select then act, or selecting performs the action

Gestures as command keys: Provide a normal means of performing the action (buttons, etc.) but have “advanced” gestures as shortcuts

Case study: Canesta Entertainment Center


The complexity of the gesture should match the complexity of the task at hand.

The best designs are those that “dissolve into behavior.” (Naoto Fukasawa)

The best, most natural designs, then, are those that match the behavior of the system to the gesture humans might already do to enable that behavior.

Thanks.

http://www.kickerstudio.com

http://www.designinggesturalinterfaces.com

dan@kickerstudio.com

odannyboy on Twitter
