cogs160wk4summary 3

Upload: hanley-weng

Post on 05-Apr-2018




    There is a wealth of unexplored interaction techniques in the field of hand-gestural and pen interaction, from the
    representation and design of custom novel gestures (suitable for complex language-sets) to the abstraction of gestures
    from observations. With the advancement of hand-gestural interactions in consumer devices, there has been a call for
    research on different gestural and pen states and the implications of these states. Varying combinations of pen and touch
    interactions have also been explored in recent literature, many with the goal of streamlining the flow of the workspace.

    There are two predominant design choices used to determine emerging gestures: the closed, or individual, design of a
    gesture, and the abstraction of gestures through user observations in familiar contexts. In both (but more so in the
    former), a steeper learning curve is usually required to transition novice users to experts.

    Long et al. (1999) suggest the design of a tool that aids users in creating their own gestures. They note that
    their feedforward system (effectively letting the user know the accuracy of the machine's comprehension of their gesture)
    was important to the user, and required better visualisation than a table format. Since this paper, commercial multitouch-
    enabled devices have emerged that contradict some of the desires of the users this paper surveyed, namely a
    multitude of gestures for varying tasks and the ability to create custom gestures. However, a multitude of gestures
    can certainly benefit the communication of complex languages in narrower cases such as the creation of 3D models and
    sign languages. Current user-created gestures are efficient for power users, and have the potential to be collated to
    aid the design of new universal gestures. To learn such novel gestures, visualisation in feedback and feedforward
    systems has been shown to be an effective teacher (ShadowGuides, OctoPocus). Such dynamic feedback can also benefit
    other recognition systems (e.g. voice).
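As a rough sketch of what such accuracy feedback could compute (this is not the tool from Long et al.'s paper; the function names and the simple template-matching scheme are illustrative assumptions), a recogniser can score an in-progress stroke against stored gesture templates and surface per-gesture confidences for the UI to visualise:

```python
import math

def normalize(points):
    """Translate a gesture path so its centroid sits at the origin and
    scale it to a unit bounding box, so matching ignores position and size."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / span, (y - cy) / span) for x, y in points]

def match_scores(stroke, templates):
    """Return {gesture name: score in [0, 1]} comparing a stroke against
    each stored template; higher means a closer match. Paths are assumed
    to have equal point counts (a real recogniser would resample first)."""
    stroke = normalize(stroke)
    scores = {}
    for name, template in templates.items():
        template = normalize(template)
        dist = sum(math.dist(a, b) for a, b in zip(stroke, template)) / len(template)
        scores[name] = max(0.0, 1.0 - dist)
    return scores
```

A feedforward display in the spirit of OctoPocus would render these scores continuously while the stroke unfolds, rather than reporting accuracy only after the gesture completes.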

    The observation of common interactions has also been used to abstract gestures. Hover Widgets did this to test the
    accuracy of different pen gestures, whilst Ken Hinckley, Bill Buxton, et al. observed paper-notebook interactions to
    develop a scrapbook app that employed common functions such as page tearing, tracking, piling, etc. Pen + Touch
    advocates a "pen writes, touch manipulates" interaction, which aligns with our traditional assignment of pen and
    touch interactions.
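One minimal way to read the "pen writes, touch manipulates" division is as event routing by input modality. The sketch below is a hypothetical API, not Hinckley et al.'s implementation: pen input is sent to inking, touch input to view manipulation.

```python
class Canvas:
    """Minimal stand-in for a drawing surface (hypothetical API)."""
    def __init__(self):
        self.strokes = []      # marks left by the pen
        self.offset = [0, 0]   # view translation driven by touch

    def ink(self, pos):
        self.strokes.append(pos)

    def pan(self, delta):
        self.offset[0] += delta[0]
        self.offset[1] += delta[1]

def route_event(event, canvas):
    """Dispatch by modality: the pen writes, touch manipulates."""
    if event["device"] == "pen":
        canvas.ink(event["pos"])
    elif event["device"] == "touch":
        canvas.pan(event["delta"])
```

Keeping the two modalities on separate code paths like this is what lets the same screen location mean "draw here" for the pen and "move the page" for a finger, without mode switches.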

    Pen states have also been explored, resulting in interesting implications regarding the hovering state (Hover
    Widgets), haptic feedback (Haptic Pen), and grip mode (H. Song et al., 2011).

    The combination of pen and touch interactions has also been explored, covering unimodal and multimodal input,
    sequences of interactions, and bimanual input.

    Together, through the design of gestures, pen states, and combined interactions, the workspace can be extended with
    both localized and fixed (bezel menus, Pen + Touch) user interfaces. This allows better workflow as users acquire
    learned and abstracted interaction paradigms more intuitively.