PRINCIPLES OF HUMAN-COMPUTER INTERFACE DR NOORMAIZATUL AKMAR ISHAK SCHOOL OF HUMAN DEVELOPMENT AND TECHNOCOMMUNICATION (iKOM)



PART 2: DESIGN PROCESS

CHAPTER 4: DESIGN RULES AND IMPLEMENTATION SUPPORT


CONTENT

4.1 Principles to support usability

4.2 Standards

4.3 Guidelines

4.4 Golden rules and heuristics

4.5 HCI patterns

4.6 Elements of windowing systems

4.7 Programming the application

4.8 Using toolkits

4.9 User interface management systems

DESIGN RULES

INTRODUCTION

• A designer follows design rules to increase usability of the eventual software product.

• Design rules for interactive systems can be supported by psychological, cognitive, ergonomic, sociological, economic or computational theory, and may or may not have empirical evidence behind them.

• The more general a design rule is, the greater the likelihood that it will conflict with other rules and the greater the need for the designer to understand the theory behind it.

PRINCIPLES TO SUPPORT USABILITY

• Derivation of principles for interaction has usually arisen out of a need to explain why a paradigm is successful and when it might not be.

• The principles are divided into three main categories:

– Learnability – the ease with which new users can begin effective interaction and achieve maximal performance.

– Flexibility – the multiplicity of ways in which the user and system exchange information.

– Robustness – the level of support provided to the user in determining successful achievement and assessment of goals.

SUMMARY OF PRINCIPLES AFFECTING LEARNABILITY

Predictability – Support for the user to determine the effect of future action based on past interaction history. (Related principles: operation visibility)

Synthesizability – Support for the user to assess the effect of past operations on the current state. (Related principles: immediate/eventual honesty)

Familiarity – The extent to which a user’s knowledge and experience in other real-world or computer-based domains can be applied when interacting with a new system. (Related principles: guessability, affordance)

Generalizability – Support for the user to extend knowledge of specific interaction within and across applications to other similar situations.

Consistency – Likeness in input–output behaviour arising from similar situations or similar task objectives.

SUMMARY OF PRINCIPLES AFFECTING FLEXIBILITY

Dialog initiative – Allowing the user freedom from artificial constraints on the input dialog imposed by the system. (Related principles: system/user pre-emptiveness)

Multi-threading – Ability of the system to support user interaction pertaining to more than one task at a time. (Related principles: concurrent vs interleaving, multi-modality)

Task migratability – The ability to pass control for the execution of a given task so that it becomes either internalized by the user or the system, or shared between them.

Substitutivity – Allowing equivalent values of input and output to be arbitrarily substituted for each other. (Related principles: representation multiplicity, equal opportunity)

Customizability – Modifiability of the user interface by the user or the system. (Related principles: adaptivity, adaptability)

SUMMARY OF PRINCIPLES AFFECTING ROBUSTNESS

Observability – Ability of the user to evaluate the internal state of the system from its perceivable representation. (Related principles: browsability, static/dynamic defaults, reachability, persistence, operation visibility)

Recoverability – Ability of the user to take corrective action once an error has been recognized. (Related principles: reachability, forward/backward recovery, commensurate effort)

Responsiveness – How the user perceives the rate of communication with the system. (Related principles: stability)

Task conformance – The degree to which the system services support all of the tasks the user wishes to perform, and in the way that the user understands them. (Related principles: task completeness, task adequacy)

STANDARDS

• Standards for interactive system design are usually set by national or international bodies to ensure compliance with a set of design rules by a large community.

• Standards can apply specifically to either the hardware or the software used to build the interactive system.

• Since standards are relatively stable, they are more suitable for hardware than software.

STANDARDS
Differing characteristics between hardware and software

Underlying theory

Hardware: based on an understanding of physiology or ergonomics/human factors, the results of which are relatively well known, fixed and readily adaptable to the design of hardware. Hardware standards relate directly to a hardware specification and reflect the underlying theory.

Software: based on theories from psychology or cognitive science, which are less well formed, still evolving and not easy to interpret in the language of software design.

Change

Hardware: change is more difficult and expensive than for software, so relatively fixed standards suit it.

Software: requires change more frequently than hardware.

STANDARDS

ISO 9241 – a usability specification that applies equally to both hardware and software design

Usability The effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments.

Effectiveness The accuracy and completeness with which specified users can achieve specified goals in particular environments.

Efficiency The resources expended in relation to the accuracy and completeness of goals achieved.

Satisfaction The comfort and acceptability of the work system to its users and other people affected by its use.

• The authority of a standard (or a guideline) can only be determined from its use in practice.

• Some software products become de facto standards long before any formal standards document is published, eg the X Window System.

GUIDELINES

• The incompleteness of theories underlying the design of interactive software makes it difficult to produce authoritative and specific standards.

• As a result, the majority of design rules for interactive systems are suggestive and more general guidelines.

• Guidelines:

– Abstract guidelines are suited to requirements specification.

– Specific guidelines are suited to detailed design.

– Guidelines can be automated to provide a direct means of translating detailed design specifications into actual implementation.

GUIDELINES

• The basic categories of the Smith and Mosier guidelines:

– Data Entry

– Data Display

– Sequence Control

– User Guidance

– Data Transmission

– Data Protection

• Each of the above categories is further broken down into more specific subcategories which contain the particular guidelines.

GOLDEN RULES AND HEURISTICS

• All the abstract principles and detailed guidelines require a certain amount of commitment of the designer – either to track down appropriate guidelines or to interpret principles.

• User-centred designs have ‘golden rules’ or heuristics.

• These form a useful checklist of design rules; designers who apply them tend to produce better systems than those who ignore them.

• There are many sets of heuristics, but the most widely used are:

– Nielsen’s Ten Heuristics, used in evaluation,

– Shneiderman’s Eight Golden Rules of Interface Design,

– Norman’s Seven Principles for transforming difficult tasks into simple ones.

SHNEIDERMAN’S EIGHT GOLDEN RULES OF INTERFACE DESIGN

Used during design, but can also be applied to evaluate a system.

1 Strive for consistency in action sequences, layout, terminology, command use etc.

2 Enable frequent users to use shortcuts eg abbreviations, special keys to perform regular, familiar actions more quickly.

3 Offer informative feedback for every user action, at a level appropriate to the magnitude of the action.

4 Design dialogs to yield closure so that the user knows when they have completed a task.

5 Offer error prevention and simple error handling.

6 Permit easy reversal of actions to relieve anxiety and encourage exploration.

7 Support internal locus of control so user is in control of the system that responds to his actions.

8 Reduce short-term memory load by keeping displays simple, consolidating multiple page displays and providing time for learning action sequences.

NORMAN’S 7 PRINCIPLES FOR TRANSFORMING DIFFICULT TASKS INTO SIMPLE ONES

Please read THE DESIGN OF EVERYDAY THINGS to gain the full picture

1 Use both knowledge in the world and knowledge in the head. Systems should provide the necessary knowledge within the environment, and their operation must be transparent, to support the user in building an appropriate mental model.

2 Simplify the structure of tasks. Provide mental aids to help the user keep track of stages in a complex task, and provide information to help the user.

3 Make things visible. The interface should make clear what the system can do and how this is achieved.

4 Get the mappings right. Controls, sliders and dials should reflect the task.

5 Exploit the power of constraints. Both natural and artificial constraints should be designed to guide the user to the correct action.

6 Design for error. Design recovery into the system.

7 When all else fails, standardize. Standardize the critical controls, eg accelerator, brake, clutch, steering; less critical controls, such as lights and windscreen wipers, are not fixed in position.

HCI PATTERNS

• One way to approach design is to learn from examples that have proven to be successful in the past - to reuse the knowledge of what made a system – or paradigm –successful.

• Patterns are an approach to capturing and reusing this knowledge – of abstracting the essential details of successful design so that these can be applied again and again in new situations.

• Originated and successfully been used in architecture, and widely used in software development to capture solutions to common programming problems.

HCI PATTERNS – what distinguishes them from other design rules

They capture design practice and embody knowledge about successful solutions –practice rather than psychological theory.

They capture the essential common properties of good design – they do not tell the designer how to do something but what needs to be done and why.

They represent design knowledge at varying levels, ranging from social and organizational issues through conceptual design to detailed widget design.

They are not neutral but embody values within their rationale. HCI patterns can express values about what is humane in interface design.

The concept of a pattern language is generative and can therefore assist in the development of complete designs.

They are generally intuitive and readable and can therefore be used for communication between all stakeholders.

SUMMARY

• Design rules can be used to provide direction for the design process, although the more general the design rule, the less precise the direction it gives for promoting usability.

• The most abstract design rules are principles which represent generic knowledge about good design practice.

• Standards have the highest authority, being set by national or international bodies to ensure compliance by a large community.

• Guidelines are less authoritative but offer specific contextual advice, which can inform detailed design.

• Heuristics and ‘golden rules’ are succinct collections of design principles and advice that are easily assimilated by any designer.

• Patterns capture design practice and attempt to provide a generative structure to support the design process.

EXERCISE #7

• It has been suggested in this chapter that consistency could be considered a major category of interactive principles, on the same level as learnability, flexibility and robustness. If this was the case, which principles discussed in this chapter would appear in support of consistency?

IMPLEMENTATION SUPPORT

INTRODUCTION

• The detailed specification tells the programmer what the interactive application must do. The programmer must translate that into machine-executable instructions determining how it will be achieved on the available hardware devices.

• The emphasis is on how building levels of abstraction on top of the essential hardware and software services allows the programmer to build the system in terms of its desired interaction techniques – the relationship between input and output.

• The distinction between input and output devices, present in the hardware and at the lowest software level, can be removed at the programming level with the right abstractions and hiding of detail.

ELEMENTS OF WINDOWING SYSTEMS

• We have discussed the elements of the WIMP interface and how they enhance interaction with the end-user.

• Now we will see more details of windowing systems used to build the WIMP interface.

• Windowing systems:

– Provide the programmer with independence from the specifics of the hardware devices – a workstation will involve a visual display screen, a keyboard and some pointing device, eg a mouse.

• Any variety of these hardware devices can be used in any interactive system, and they all differ in terms of the data they communicate and the commands used to instruct them.

ELEMENTS OF WINDOWING SYSTEMS

• It is imperative to be able to program an application that will run on a wide range of devices. The programmer directs commands to an abstract terminal, which understands a more generic language that can be translated to the language of many different specific devices.

• The abstract terminal:

– Makes portability of application programs possible;

– Means only one program – the device driver – needs to be written for a particular hardware device, after which any application program can access it.

• The fixed generic language for the abstract terminal is called its imaging model. Imaging models are sufficiently powerful to describe arbitrary images.
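The abstract-terminal idea can be sketched in a few lines of Python. This is a hypothetical illustration – the class and method names are invented, not from any real windowing system: the application draws through a generic interface, and one driver per device translates the generic commands.

```python
# Hypothetical sketch of an abstract terminal: the application issues
# generic drawing commands; each device driver translates them into
# device-specific instructions.

class AbstractTerminal:
    """Generic imaging language the application programs against."""
    def draw_line(self, x1, y1, x2, y2):
        raise NotImplementedError

class PixelDriver(AbstractTerminal):
    """Driver for a pixel-based display: records low-level pixel ops."""
    def __init__(self):
        self.ops = []
    def draw_line(self, x1, y1, x2, y2):
        self.ops.append(("set_pixels_along", (x1, y1), (x2, y2)))

class PostScriptDriver(AbstractTerminal):
    """Driver for a path-based (PostScript-like) imaging model."""
    def __init__(self):
        self.program = []
    def draw_line(self, x1, y1, x2, y2):
        self.program.append(f"{x1} {y1} moveto {x2} {y2} lineto stroke")

def draw_box(term):
    """The application: written once, against the abstract terminal."""
    term.draw_line(0, 0, 10, 0)
    term.draw_line(10, 0, 10, 10)
    term.draw_line(10, 10, 0, 10)
    term.draw_line(0, 10, 0, 0)

# The same application code runs unchanged on either device.
pix, ps = PixelDriver(), PostScriptDriver()
draw_box(pix)
draw_box(ps)
```

The portability claim above is exactly this: draw_box never mentions a device, so adding a new device means writing one new driver, not changing any application.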

EXAMPLES OF IMAGING MODELS

Pixels The display screen is represented as a series of columns and rows of points – or pixels – which can be explicitly turned on or off, or given a colour. This is a common imaging model for personal computers and is also used by the X Window System.

Graphical kernel system (GKS)

An international standard which models the screen as a collection of connected segments, each of which is a macro of elementary graphics commands.

Programmer’s hierarchical interface to graphics (PHIGS)

Another international standard, based on GKS but with an extension to model the screen as editable segments.

PostScript A programming language developed by Adobe Corporation which models the screen as a collection of paths which serve as infinitely thin boundaries or stencils which can be filled in with various colours or textured patterns and images.

ELEMENTS OF WINDOWING SYSTEMS

• We have discussed the WIMP interface as an interaction paradigm and pointed out its ability to support several separate user tasks simultaneously.

• In summary, the role of a windowing system is to provide:

– Independence from the specifics of programming separate hardware devices;

– Management of multiple, independent but simultaneously active applications.

ARCHITECTURES OF WINDOWING SYSTEMS

• Bass and Coutaz identify three possible architectures for the software to implement the roles of a windowing system.

• All of them assume that device drivers are separate from the application programs.

• Option 1:

– Implement and replicate the management of the multiple processes within each of the separate applications.

– This forces each application to consider the difficult problems of resolving synchronization conflicts with the shared hardware devices.

– It reduces the portability of the separate applications.

ARCHITECTURES OF WINDOWING SYSTEMS

• Option 2:

– Implement the management role within the kernel of the operating system, centralizing the management task and freeing it from the individual applications.

– Applications must still be developed with the specifics of the particular operating system in mind.

• Option 3:

– Provides the most portability.

– The management function is written as a separate application in its own right and can provide an interface to other application programs that is generic across all operating systems.

– This is referred to as the client–server architecture.

THE ROLES OF A WINDOWING SYSTEM

ARCHITECTURES OF WINDOWING SYSTEMS

• In practice, there are no clear boundaries among the three options.

• Any actual interactive application or set of applications operating within a window system may share features with any one of these three conceptual architectures.

• A classic example of a window system based on the client-server architecture is the industry-standard X Window System (Release 11), developed at the Massachusetts Institute of Technology (MIT) in the mid 1980s.

• X is based on a network protocol which clearly defines the server-client communication.

• The X Protocol can be implemented on different computers and operating systems, making X more device independent.

• The client and server need not be on the same system in order to communicate.

The X Server Performs The Following Tasks:

• Allows (or denies) access to the display from multiple client applications;

• Interprets requests from clients to perform screen operations or provide other information;

• Demultiplexes the stream of physical input events from the user and passes them to the appropriate client;

• Minimizes the traffic along the network by relieving the clients from having to keep track of certain display information, like fonts, in complex data structures that the clients can access by ID numbers.

THE CLIENT-SERVER ARCHITECTURES

THE X WINDOW SYSTEM (RELEASE 11) ARCHITECTURE – CLASSIC EXAMPLE, MIT 1980s

PROGRAMMING THE APPLICATION

• Interactive applications are generally user driven in the sense that the action the application takes is determined by the input received from the user.

• Two programming paradigms can be used to organize the flow of control within the application.

• The windowing system does not necessarily determine which of these two paradigms is to be followed.

– Read-evaluation loop

– Notification based

THE READ-EVALUATE LOOP PARADIGM

THE NOTIFICATION-BASED PROGRAMMING PARADIGM
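The two paradigms above can be contrasted with a minimal sketch. This is a hypothetical illustration, not tied to any real windowing system: in the read-evaluation loop the application itself reads each event and dispatches on it, whereas in the notification-based style the application registers callbacks and a central notifier owns the loop.

```python
# Hypothetical sketch contrasting the two control-flow paradigms.

# 1. Read-evaluation loop: the application owns the loop and
#    dispatches on each event itself.
def read_eval_loop(events):
    handled = []
    for event in events:            # stands in for "read next event"
        if event == "click":
            handled.append("handled click")
        elif event == "key":
            handled.append("handled key press")
    return handled

# 2. Notification-based: the application registers callbacks with a
#    notifier, which owns the loop and calls back on each event.
class Notifier:
    def __init__(self):
        self.callbacks = {}
    def register(self, event_type, callback):
        self.callbacks[event_type] = callback
    def run(self, events):
        for event in events:
            if event in self.callbacks:
                self.callbacks[event]()

log = []
notifier = Notifier()
notifier.register("click", lambda: log.append("handled click"))
notifier.register("key", lambda: log.append("handled key press"))
notifier.run(["click", "key", "click"])
```

The visible difference is where control lives: in the first sketch the application calls out for events; in the second the notifier calls in to the application.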

USING TOOLKITS

• A key feature of WIMP interfaces from the user’s perspective is that input and output behaviours are intrinsically linked to independent entities on the display screen.

• The input coming from the hardware device is separate from the output of the mouse cursor on the display screen.

• A toolkit provides the programmer with a set of ready-made interaction objects – alternatively called interaction techniques, gadgets or widgets – which she can use to create her application programs.

• Toolkits exist for all windowing environments – eg OSF/Motif and XView for the X Window System, the Macintosh Toolbox, and the Software Development Toolkit for Microsoft Windows.
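The idea of an interaction object that bundles input and output behaviour can be sketched as follows. This is a hypothetical miniature, not the API of any real toolkit: the widget carries both its display behaviour and its response to input.

```python
# Hypothetical miniature of a toolkit widget: output behaviour (a
# label to draw) and input behaviour (a callback on press) are
# bundled into one interaction object.

class Button:
    def __init__(self, label, on_press):
        self.label = label          # output: what the widget displays
        self.on_press = on_press    # input: what happens when pressed
    def draw(self):
        return f"[ {self.label} ]"
    def press(self):
        self.on_press()

# The application composes widgets instead of handling raw events.
saved = []
save_button = Button("Save", on_press=lambda: saved.append("file.txt"))
save_button.press()
```

Because input and output live in the same object, the programmer builds the interface from ready-made pieces rather than wiring device events to screen updates by hand.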

THE SINGLE INHERITANCE CLASS HIERARCHY OF THE XVIEW TOOLKIT

Window
  Frame
  Tty
  Openwin
    Text
    Canvas
    Panel
  Scrollbar
  Icon

USER INTERFACE MANAGEMENT SYSTEMS

• Toolkits provide only a limited range of interaction objects, limiting the kinds of interactive behaviour allowed between user and system.

• Toolkits are expensive to create and are still very difficult for non-programmers to use.

• The set of programming and design techniques which are supposed to add another level of services for interactive system design beyond the toolkit level are user interface management systems (UIMS).

USER INTERFACE MANAGEMENT SYSTEMS

• The main concerns of a UIMS for our purposes are:

– A conceptual architecture for the structure of an interactive system which concentrates on a separation between application semantics and presentation;

– Techniques for implementing a separated application and presentation whilst preserving the intended connection between them;

– Support techniques for managing, implementing and evaluating a run-time interactive environment.

UIMS AS A CONCEPTUAL ARCHITECTURE

• The major issue in this area of research is one of separation between the semantics of the application and the interface provided for the user to make use of that semantics.

• Portability – allows the same application to be used on different systems.

• Reusability – separation increases the likelihood that components can be reused in order to cut development costs.

• Multiple interfaces – enhance the interactive flexibility of an application, several different interfaces can be developed to access the same functionality.

• Customization – the user interface can be customized by both the designer and the user to increase its effectiveness without having to alter the underlying application.

THE SEEHEIM MODEL OF THE LOGICAL COMPONENTS OF A UIMS

[Figure: the Seeheim model – the intended user interacts with the presentation component, which communicates via dialog control with the application interface model, which in turn connects to the intended application; the three components correspond to the lexical, syntactic and semantic levels.]

The Logical Components Of A UIMS By Seeheim

• Presentation

– The component responsible for the appearance of the interface, including what output and input is available to the user.

• Dialog control

– The component which regulates the communication between the presentation and the application.

• Application interface

– The view of the application semantics that is provided as the interface.

THE MODEL-VIEW-CONTROLLER TRIAD IN SMALLTALK

[Figure: the model–view–controller triad – the view renders the model on the display; the controller receives mouse and keyboard input from the user and updates the model.]
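The triad can be sketched as a minimal Python program. This is a hypothetical illustration of the three roles – the class and method names are invented, not Smalltalk's actual API: the controller interprets user input and updates the model, and the model notifies its views, which re-render.

```python
# Hypothetical sketch of the model-view-controller roles.

class Model:
    """Application state; broadcasts changes to dependent views."""
    def __init__(self):
        self.value = 0
        self.views = []
    def set_value(self, v):
        self.value = v
        for view in self.views:     # model notifies its dependents
            view.update()

class View:
    """Renders the model's state to a (simulated) display."""
    def __init__(self, model):
        self.model = model
        model.views.append(self)
        self.display = ""
        self.update()
    def update(self):
        self.display = f"value = {self.model.value}"

class Controller:
    """Interprets (simulated) user input and updates the model."""
    def __init__(self, model):
        self.model = model
    def key_pressed(self, digit):
        self.model.set_value(int(digit))

model = Model()
view = View(model)
controller = Controller(model)
controller.key_pressed("7")   # user input flows through the controller
```

Note that the model never names a concrete view or controller, which is what allows several views of the same model to coexist.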

THE PRESENTATION-ABSTRACTION-CONTROL MODEL OF COUTAZ

[Figure: the presentation–abstraction–control model – the control component mediates between the abstraction and the presentation, through which the user interacts.]

IMPLEMENTATION CONSIDERATIONS

• Implementations based on the Seeheim model determine how the separate components of presentation, dialog controller and application interface are realized.

• Window systems and toolkits provide the separation between application and presentation.

• Communication from the model to either view or controller, or between a view and a controller, occurs through the normal method calls of object-oriented programming.

• Ultimately, the programmer would want access to a variety of these techniques in any one UIMS.

• Components of a UIMS which allow the description of the application separate from the presentation are advantageous from a software engineering perspective, but there is no proof that they are as desirable when designing for usability.

Techniques used in dialog modelling in UIMS

Menu networks The communication between application and presentation is modeled as a network of menus and submenus. Menus can be represented as graphical items or buttons that users can select with a pointing device.

Grammar notations

The dialog between application and presentation can be treated as a grammar of actions and responses. These are good for describing command-based interfaces, but are not so good for more graphically-based interaction techniques.

State transition diagrams

Can be used as a graphical means of expressing dialog. The difficulty with these notations lies in linking dialog events with the corresponding presentation or application events.

Event languages

Are similar to grammar notations, except that they can be modified to express directionality and support some semantic feedback.
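A dialog described by a state transition diagram can equally be written down as a transition table. The sketch below is a hypothetical example – the states and events are invented for illustration, loosely modelled on a file-save dialog:

```python
# Hypothetical sketch: a dialog expressed as a state transition table.
# Each entry maps (current state, input event) -> next state.

transitions = {
    ("start", "type_name"):   "named",
    ("named", "click_save"):  "confirming",
    ("confirming", "ok"):     "saved",
    ("confirming", "cancel"): "named",
}

def run_dialog(events):
    """Step through the dialog; unrecognised events leave the state
    unchanged."""
    state = "start"
    for event in events:
        state = transitions.get((state, event), state)
    return state

final = run_dialog(["type_name", "click_save", "ok"])
```

The table makes legal sequences explicit, but – as noted above – it says nothing about which presentation or application actions each transition should trigger; that linkage is exactly the notation's weak point.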

Techniques used in dialog modelling in UIMS

Declarative languages

All of the above techniques (except for menu networks) are poor for describing the correspondence between application and presentation, because they are unable to describe effectively how information flows between the two.

Constraints A special subset of declarative languages. Can be used to make explicit the connection between independent information of the presentation and the application. Constraints embody dependencies between different values that must always be maintained.

Graphical specification

Techniques that allow the dialog specification to be programmed graphically in terms of the presentation language itself. The major advantage of this graphical technique is that it opens up the dialog specification to the non-programmer, which is a very significant contribution.
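The constraint idea – a dependency between values that is always re-established whenever one of them changes – can be sketched as follows. This is a hypothetical illustration, not the notation of any real UIMS:

```python
# Hypothetical sketch of a one-way constraint: whenever the
# application value changes, a dependent presentation value is
# recomputed so the stated relationship always holds.

class Constrained:
    def __init__(self, initial, formula):
        self._value = initial
        self.formula = formula           # relation to maintain
        self.derived = formula(initial)  # constraint holds from the start
    def set(self, v):
        self._value = v
        self.derived = self.formula(v)   # re-establish the constraint
    def get(self):
        return self._value

# The application holds a temperature in Celsius; the presentation
# must always show the Fahrenheit equivalent.
temp = Constrained(0, lambda c: c * 9 / 5 + 32)
temp.set(100)
```

The point of the declarative style is visible here: the programmer states the relationship once, and the system (not the application code) is responsible for keeping the two values in step.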

SUMMARY

• Window systems provide only the crudest level of abstraction for the programmer, allowing her to gain device independence and multiple application control.

• However, they do not provide a means of separating the control of presentation and application dialog.

• Two paradigms for interactive programming are related to two means of controlling that dialog – either internal to the application by means of a read-evaluation loop, or external to the application by means of notification-based programming.

• Toolkits used with particular windowing systems add another level of abstraction by combining input and output behaviours to provide the programmer with access to interaction objects from which to build the components of the interactive systems.

• UIMS provide a conceptual architecture for dividing up the relationship between application and presentation, and various techniques were described to implement the logical components of a UIMS.

EXERCISE #8

• A designer described the following interface for a save operation.

The user initially sees a screen with a box where they can type the file name (Screen 1). The screen also has a ‘list’ button that they can use to obtain a listing of all the files in the current directory (folder). This list appears in a different window. When the user clicks the save button, the system presents a dialog box to ask the user to confirm the save (see Screen 2).

EXERCISE #8

[Screen 1: a ‘File name’ text box with ‘list’ and ‘save’ buttons. Screen 2: the same screen with the file name ‘fred’ entered, plus a dialog box reading ‘Confirm save of file’ with OK and Cancel buttons.]

EXERCISE #8

• Two programmers independently coded the interface using two different window managers. Programmer A used an event-loop style of program whereas programmer B used a notifier (call-back) style.

a) Sketch out the general structure of each program.

b) Highlight any potential interface problems you expect from each programmer and how they could attempt to correct them.

END OF CHAPTER 4