RCSEd + University of Bath, MSc Health Informatics Unit 9: Human Computer Interaction; Final Assignment July 2006 1 Tutor: KW Lam; Student: Sanjoy Sanyal
Comparative Usability Analysis of Two e-Learning Browser Interfaces: A Multi-tiered Methodology
INTRODUCTION
Electronic aids to medical education represent a quantum leap over traditional chalk-and-blackboard teaching. Interactivity holds students' attention longer, aids understanding, and its proactive nature fosters self-learning.[1] Creating simulation models that marry human anatomy with computed 3D imaging entails collaboration between anatomists, computer engineers, physicians and educators.[2] Visual displays and direct-manipulation interfaces enable users to undertake ambitious tasks; with such designs, the chaotic mass of data and flood of information can be streamlined into a productive river of knowledge.[3]
The anatomy of the human brain is the Waterloo of most medical students. We therefore decided to critically evaluate and compare two e-Learning interfaces for studying 3D simulations of the human brain.[4] The mini-study was conducted at the University of Seychelles, American Institute of Medicine (USAIM) [https://web.usaim.edu] from May 2006 to June 2006.
MATERIALS
Two interfaces were selected from projects related to the Visible Human Dataset of the National Library of Medicine.[4] Both are e-Learning tools for studying brain anatomy from a 3D perspective. The first interface, an application for viewing 3D images, is the Interactive Atlas (brought by AstraZeneca) from the Visible Human Experience (VHE) project of the Center for Human Simulation (CHS), University of Colorado.[5] It covers whole-body anatomy, but for comparison with the second browser in this study only its brain interface was selected. The second is an award-winning 3D browser of the head/brain by Tom Conlin of the University of Oregon.[6] Both use dynamic Web pages, where the server executes code to dynamically deliver HTML-based content to the client browser.[7,8]
Colorado browser interface
This interface was tested first. It was accessed through the VHE link on the CHS homepage. The VHE page[5] opened in a new window, which had to remain open throughout the proceedings.
The link "Interactive Atlas" led to the dynamic webpage in the same window. Finally, the "Launch the Interactive Atlas" link on that page initiated the Java applet (infra), which loaded the applet windows [Figure-1].
Non-payment registration
Java installation
The Interactive Atlas required a Java-enabled computer and GL4Java. First, Java (JRE 1.5.0_06 for <applet>) was downloaded from Sun's Java website (http://www.java.com), installed and enabled [Figure-2].
Figure-1: Composite screenshots showing opening of the Interactive Atlas browser in Visible Human Experience website, from the CHS website. See also Java.
Java details
Figure-2: Composite screenshots showing Java download, installation and enabling in the computer. This is an essential pre-requisite for the browsers.
Next, GL4Java was installed according to the instructions on the VHE website and run on Windows. Each time the 3D Interactive Atlas browser was launched, the status bar showed the sequence "Applet web3d loaded", "Applet web3d inited", "Applet web3d started", before the 3-in-1 Java applet windows simultaneously opened across the whole screen [Figure-3].
Applet windows
The upper-right window gives a comprehensive list of 3D images. Under "Model Available", "All" was selected from the drop-down list. Double-clicking the "Brain" option opened a 3D interactive brain simulation in the upper-left window through a "Building Brain" sequence. This is the actual browser interface.
Figure-3: Opening of initial Interactive Atlas 3-in-1 applet window.
Model list / Oblique section window
3D model window; the actual browser
Tools window for manipulating above
This provides for rotation/visualisation of the brain model about any axis/plane. It also has a virtual "plane of section" to "slice" the brain in any plane/axis. Under "Display" in the bottom "Tools" window, the "3D and Oblique" option was selected from the drop-down list. This generated a "Getting oblique slice" sequence in the upper-right window and depicted "slices" of the brain selected through the upper-left window. The bottom window is the control panel, containing radio buttons/list boxes to customise the user's interactivity choices [Figure-4].
Oregon browser interface
The 3D brain browser from the University of Oregon was tested next. This application required a Java 1.1-enabled client for online viewing of the webpage; this was downloaded, installed and enabled over about 45 minutes. While the page opens, it goes through an applet-loading sequence indicated by a progress bar, and the status bar shows "Applet Sushi loaded". Once the applet had read the data, three sectional images of the brain appeared in the same window, indicated by "Applet Sushi started" in the status bar. The window was activated by clicking anywhere in it [Figure-5].
The window has three interactive squares, depicting an axial/transverse, a coronal and a sagittal section of the brain, enclosed by red, green and blue lines respectively. Each square contains crosshairs of orthogonal gridlines, coloured to match the linings of the other two squares. Moving any crosshair in any square dynamically updates the figures in the other two squares to show the appearance of the brain in those sections. There is a fourth, optional square for viewing any arbitrary "slice" of the brain, selected by checking the "Arb
Figure-4: The final appearance of the browser and output windows. These windows provided the interfaces for the study.
Virtual brain model with virtual plane of section; this is for manipulation
Alpha server output in response to queries sent through upper-left window
Control tools for manipulating browser
Figure-5: Oregon 3D brain browser applet loading sequence; note the indication on the status bar
Progress bar
Java applet loading indicator
slice" check-box. Another check-box enables "depth cuing" of images. Different radio buttons allow visualisation in black-and-white (not shown), MRI-image style and infrared colour schemes [Figures 6-9].
All applets are stored in a special folder for quick viewing later [Figure-10].
METHODS
We adopted a multi-tiered methodology[9-11] to analyse and compare the two browser interfaces. The underpinning principle was to check the interfaces against the following healthcare user-interface design principles: effectiveness; ease of use, learning and understanding; predictability; user control; adaptability; input flexibility; robustness; appropriateness of output; adequacy of help; error prevention; and response
Fig-6: Axial, coronal, sagittal brain sections (counter-clockwise), enclosed in red, green, blue squares, respectively. Cross-hairs in each square are of other two colours.
Fig-7: Showing arbitrary slice, enclosed in cyan and magenta
Fig-9: Showing Infrared type of appearance
Fig-8: Showing MRI-type of appearance.
At start-up, clicking anywhere in window activates the controls
Figure-10: Screenshot of Java applet cache, where all applets are stored for quick viewing
times. These principles are enshrined in the 17 documents of ISO-9241,[12] in Nielsen's usability engineering[13] and in TechDis accessibility/usability precepts.[14]
Usability inquiry
The first tier was a usability inquiry approach[15] applied to students of USAIM, Seychelles. We followed the first six phases of usability testing as described by Kushniruk et al.[16-18] The evaluation objectives were to test the usability and usefulness of the two interfaces, both individually and comparatively. Students from Pre-clinical-1 through 5 were recruited through bulletin-board and class announcements. Both browser interfaces were opened online on a computer that had been prepared by loading/enabling the Java applets. Students were shown how to use both interfaces, in small groups and individually. Each was then given 30-45 minutes to work on the interfaces in the students' library. In some cases pairs of students worked together, as in co-discovery learning.[15] They were also given some mock information-finding tasks, viz. locating the caudate nucleus. The entire proceedings used a wireless IEEE 802.11g 54 Mbps Internet connection at the 2.4 GHz ISM frequency. The students were then given a questionnaire to fill in and return. [Appendix]
Questionnaire
We modified an existing HCM questionnaire from Boulos,[19] incorporating some principles from the NIH website,[20] while adhering to standard questionnaire-design practice.[21,22] It contained twenty-seven closed-ended questions covering interface usability (effectiveness, efficiency, satisfaction)[23] and usefulness issues, both individually and comparatively.[24] Most were on a 5-point rating scale, with some on a 3-point scale.[22] The data were analysed, tabulated and represented graphically.[9,21] The last six questions were open-ended, qualitative types.[22] The responses were analysed and categorised according to the main themes: usability and usefulness issues.
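Tabulating the 5-point responses into the percentages reported below is straightforward; the sketch below is illustrative only — the `tabulate` helper and the answer list are hypothetical, not the study's instrument or raw data:

```python
from collections import Counter

def tabulate(responses, scale=("Strongly disagree", "Disagree", "Ambiguous",
                               "Agree", "Strongly agree")):
    """Count 5-point Likert responses and express each level as a
    percentage of all respondents (hypothetical helper)."""
    counts = Counter(responses)
    total = len(responses)
    return {level: round(100 * counts.get(level, 0) / total) for level in scale}

# Illustrative data only -- not the study's actual responses.
answers = ["Agree"] * 19 + ["Ambiguous"] * 6 + ["Disagree"] * 5
print(tabulate(answers))  # "Agree" comes out as 63% of 30 respondents
```

The same counting applies unchanged to the 3-point items, by passing a shorter `scale`.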
Under these themes, we searched for patterns[25] pertaining to the ISO principles of design.[12]
Usability inspection
The second tier involved a heuristic evaluation under the usability inspection approach.[15,16,26] The author acted as usability specialist (user-interface "heuristic expert"), judging the user interface and system functionality against a set of heuristics to see whether they conformed to established principles of usability and good design.[10,15,16] The underlying rationale was to counterbalance the usability inquiry approach, which used relatively inexperienced students. Nielsen's ten heuristics[15,27,28] were augmented with five more from Barber's project.[29] [Appendix] For each interface the 15 heuristics were applied and usability was scored as 0 or 1 (No=0; N/A=0; Yes=1).[27] Next, depending on the frequency, impact and persistence of each usability problem, a level of problem severity was assigned according to the following rating scale.[30] (Box-1)
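The severity ratings are averaged over the 15 heuristics to give the figures reported in the Results. A minimal sketch of that aggregation (the rating list here is hypothetical, chosen only to illustrate how a mean such as the 2.07 reported later arises):

```python
def average_severity(ratings):
    """Mean severity over the 15 heuristics; each rating runs from
    0 (no usability problem) to 4 (usability catastrophe)."""
    return round(sum(ratings) / len(ratings), 2)

# Hypothetical ratings for the 15 heuristics -- not the study's raw data.
example = [2, 2, 3, 2, 2, 2, 2, 1, 2, 3, 2, 2, 2, 0, 4]
print(average_severity(example))  # -> 2.07
```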
Automated testing
In the third tier we obtained objective scores from automated online tools: LIDA,[31] the Markup Validation Service[32] and WebXACT.[33] These tools use automated "Web crawlers" to check webpages/stylesheets for errors in the underlying code and for accessibility issues. We used the main page of each resource for the tests.[8]
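The kind of machine check these crawlers perform can be sketched in a few lines. As a toy example in the same spirit (not any of these tools' actual logic), the following flags `<img>` elements lacking an `alt` attribute, a classic Priority-1 accessibility checkpoint:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Toy accessibility check: count <img> tags without an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt += 1

checker = AltChecker()
checker.feed('<p><img src="brain.gif"><img src="atlas.gif" alt="3D atlas"></p>')
print(checker.missing_alt)  # -> 1
```

Real validators apply hundreds of such checkpoints and grade each by priority.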
Box-1
LIDA [Figure-11] is a validation package from Minervation, a company specialising in accessible, usable and reliable healthcare information resources.[34] It checks these parameters of Web pages under 3, 4 and 3 subheadings respectively, each of which contains several sub-subheadings.[31] We ran LIDA v1.2 [www.minervation.com/validation] to generate the accessibility scores automatically. The usability and reliability scores were calculated "by hand" and tabulated.
The Markup Validation Service [Figure-12] from W3C checks HTML/XHTML documents for conformance to W3C recommendations/standards and W3C WAI guidelines.[32] The WCAG attaches a three-point priority level to each checkpoint according to its impact on Web accessibility: Priority-1 checkpoints demand mandatory compliance, while Priority-3 checkpoints are optional.[8] We ran Validator Service v0.7.2 [http://validator.w3.org/detailed.html] against our test sites and generated reports on HTML violations.
Bobby was originally developed by CAST and is now maintained by Watchfire Corporation under the name WebXACT [Figure-13]. This automated tool examines single Web pages for quality, accessibility and privacy issues. It reports on WCAG A, AA and AAA accessibility compliance, and on conformance with Section 508 guidelines.[33,35,36] It generates an XML report from which violation data can be extracted.[8] It is well suited to checking accessibility for people with disabilities.[8,37] The Bobby logo is also a kite-mark indicating that the site has been "endorsed" in some way by another organisation. [Figure-13]
WebXACT requires JavaScript and works on IE v5.5+. We enabled scripting in our browser (IE v6.0 SP2), ran WebXACT (http://webxact.watchfire.com/) on our test pages and generated reports on general, quality, accessibility and privacy issues. We simplified the technique described by Zeng to calculate the Web Accessibility Barrier (WAB) score.[8] The steps are summarised in Box-2.
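Box-2 is not reproduced here, but the general shape of Zeng's calculation — weight each violation group by the reciprocal of its WCAG priority (so Priority-1 barriers count most) and normalise by the potential violation points — can be sketched as follows. This is a rough single-page illustration of the approach, not the study's exact modification, and the figures are invented:

```python
def wab_score(violations):
    """Simplified single-page WAB score: each group contributes
    (actual violations / potential violation points) weighted by
    1/priority. Higher scores mean more accessibility barriers."""
    return sum((n / max(potential, 1)) * (1 / priority)
               for priority, n, potential in violations)

# (priority, violations found, potential violation points) -- illustrative.
page = [(1, 3, 10), (2, 4, 20), (3, 5, 25)]
print(round(wab_score(page), 2))  # -> 0.47
```

Averaging such per-page scores over all pages of a site gives a site-level barrier measure.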
Figure-11: Screenshot of Minervation site, showing LIDA validation tool
Figure-12: Screenshot of W3C site, showing Markup Validation Service
Figure-13: Screenshot of Watchfire site, showing WebXACT validation tool. Inset: Sample of Bobby approved kitemark
Bobby-approved kite-mark, taken from BDA website: http://www.bda-dyslexia.org.uk
Box-2: Simplified steps for calculating WAB
Colour testing
Finally, a Vischeck analysis was performed to determine the appearance of the outputs to chromatically challenged individuals (protanopes, deuteranopes and tritanopes). Vischeck shows how coloured objects appear to colour-blind individuals; it is based on SCIELAB from the Wandell lab at Stanford University.[38] VischeckPS-Win v1.01 was downloaded [http://www.vischeck.com/downloads/] as a .zip file, extracted and installed as a plug-in for Adobe Photoshop 6.0. For each display from the two browsers, the corresponding "colour-blind appearance" was noted and displayed for comparison purposes.
RESULTS
Questionnaire analysis
User demographics
Thirty usability inquiry respondents filled in the questionnaire, equally divided between the genders [Appendix-Table-1a; Figure-14]. Their ages ranged from 18 to 22+ (mean = 19.2 years). There were proportionately more females (86% vs 53%) in the 18-19 age groups. Eighty-three percent (25/30) had a PC at home; 67% (20/30) had used computers for >2 years and averaged 1.7 hours of Internet usage per day. All used the Windows OS; 37% (11/30) had 1024x768 pixel resolution; 93% (28/30) used the Microsoft IE web browser; the majority (57%; 17/30) used always-connected broadband Internet; and 80% (24/30) considered the Internet reliable for medical information. [Appendix-Table-1b]
[Chart: Gender-based age distribution — % of students at each age (18 to 22 or above), stacked by gender]
Searchability
Sixty-seven percent (20/30) found it easy/very easy to search through the Colorado interface, as opposed to 50% (15/30) through the Oregon interface. Nearly four times as many students found searching the latter difficult/very difficult (37% vs 10%). More females than males experienced difficulty in searching (M:F = 27%:40% for Colorado; 33%:67% for Oregon). [Appendix-Table-1c; Figure-15]
Figure-14: 100% Stacked Column showing age-gender distribution of respondents.
[Chart: Searchability — % of respondents (male, female, both) per interface rating it easy/very easy, of acceptable difficulty, or difficult/very difficult]
Speed
Eighty-seven percent (26/30) found the Colorado browser moderately fast, compared to 50% (15/30) for the Oregon browser. However, almost four times as many students felt the Oregon browser was very fast (37% vs 10%). There was no appreciable gender difference [Appendix-Table-1d; Figure-16].
[Pie charts: Perception of browser speed — Colorado: very fast 10%, moderately fast 87%, moderately slow 3%, very slow 0%; Oregon: very fast 37%, moderately fast 50%, moderately slow 13%, very slow 0%]
Success rate
Success in finding the required information/"slice" of brain was taken as a resultant of interface effectiveness, reliability, arrangement of information and output. There were no failures with the Colorado browser, while 30% (9/30) failed with the Oregon browser. Seventy percent (21/30) succeeded with the Colorado browser after one or more attempts, compared to 43% (13/30) with the Oregon browser. With the latter browser, 47% (7/15) of females failed compared to 13% (2/15) of males [Appendix-Table-1e; Figures-17a,b].
[Pie chart (Fig 17a): Success rate, Colorado — from 1st attempt 30%, after 1+ failure 70%, not successful 0%]
Figure-15: 100% 3D Stacked Column showing ease of search for information through either interface, divided gender-wise.
Figure-16: Exploded 3D pie charts show comparative browser speeds of both interfaces, irrespective of gender.
[Pie chart (Fig 17b): Success rate, Oregon — from 1st attempt 27%, after 1+ failure 43%, not successful 30%]
Ease of use
Hardly anybody (3%; 1/30) needed extra help with the Colorado interface, while 43% (13/30) required more help than the Oregon interface provided. Almost all (97%; 29/30) found the former interface easy, while 57% (17/30) felt the same about the Oregon browser. With the latter browser, 60% (9/15) of females needed more help, compared to 27% (4/15) of males [Appendix-Table-1f; Figure-18].
[Chart: Ease of use and help requirements — % of respondents (male, female, both) per interface: easy with no help needed, easy with instructions useful, or needing more help]
Information quality
Information quality is an indication of usefulness. Eighty-three percent (25/30) felt the Colorado output was useful, vs 63% (19/30) for the Oregon output. Females were evenly divided on the Oregon output, with equal proportions (47%; 7/15) finding it useful and useless. [Appendix-Table-1g; Figure-19]
[Chart: Good information quality — % of respondents (male, female, both) per interface: (strongly) agree, ambiguous, or (strongly) disagree]
Figures-17a,b: 3D exploded pie charts showing success / failure rates with either interface, irrespective of gender.
Figure-18: 100% 3D Stacked Column showing gender-wise distribution of ease of use and help requirements with either interface.
Figure-19: 100% 3D Stacked Column showing gender-wise distribution of opinion about information quality.
Information overload
Thirty percent (9/30) felt moderately/severely overloaded by the information provided through the Colorado interface, while 37% (11/30) felt the same with the Oregon interface. More females (47%; 7/15) than males (27%; 4/15) felt overwhelmed by the Oregon information, while the reverse was true of the Colorado output (M:F = 47%:13%). [Appendix-Table-1h; Figure-20]
[Chart: Information overload — % of respondents (male, female, both) per interface: no/slight, moderate, or significant/extreme problem]
Overall usefulness
Similar proportions of students found both interfaces very much/extremely useful (Colorado:Oregon = 47%:43%). Forty-seven percent (7/15) of each gender opined that the Colorado browser was very much/extremely useful. For the Oregon browser, 60% (9/15) of males felt it was highly useful, against 27% (4/15) of females. [Appendix-Table-1i; Figure-21]
[Chart: Comparative usefulness of both browser interfaces — % of respondents (male, female, both): very much/extremely, somewhat, or not at all/slightly useful]
Definitive resource
Regarding the usefulness of either as a definitive resource for studying neuroanatomy, 64% (19/30) stated that they would use them as definitive resources (M:F = 80%:47%). [Appendix-Table-1j; Figure-22]
Figure-20: 100% 3D Stacked Column showing gender-wise distribution of perception of information overload
Figure-21: 100% 3D Stacked Column showing gender-wise distribution of perception of overall usefulness of either interface.
[Pie chart: Perceived usefulness of either/both as a definitive resource — agree/(strongly) 64%, ambiguous 33%, (strongly)/disagree 3%]
Actual usage
Which browser the students actually used to carry out their task provided an estimate of the two interfaces' combined usability and usefulness. Forty-four percent (13/30) predominantly used the Colorado browser and 33% (10/30) the Oregon browser to carry out their task; 23% (7/30) used both [Appendix-Table-1k; Figure-23].
[Pie chart: Actual usage proportions — Interactive 3D atlas (Colorado) 44%, 3D brain browser (Oregon) 33%, both interfaces equally 23%]
Future prospects
Students' opinions of the future prospects of these interfaces considered aspects such as usability, usefulness, robustness, reliability and cost. Sixty-seven percent (20/30) felt the Colorado browser interface had very good future prospects, as opposed to 43% (13/30) who felt the same about the Oregon browser. More females than males felt the Colorado interface had good future prospects (M:F = 47%:86%); the opposite applied to the Oregon browser (M:F = 53%:33%). [Appendix-Table-1l; Figure-24]
[Chart: Perceived future prospects — % of respondents (male, female, both) per interface: very/extreme, somewhat, or no/slight]
Questionnaire qualitative analysis
Figure-22: 3D exploded pie chart showing overall distribution of opinion about using either or both browser interface as a definitive Neuroanatomy resource.
Figure-23: 3D exploded pie showing overall distribution of users who actually used either / both interface(s) for performing a task.
Figure-24: 100% 3D Stacked Column showing gender-wise distribution of perception of future prospects of either interface.
Appropriate sample user comments (both positive and negative) about each browser interface, and the corresponding patterns they fit based on the usability/usefulness themes, are given in Appendix-Table-2. There were constructive criticisms of both, but more of the Oregon browser. Generally, respondents across the gender divide showed a greater preference for the Colorado browser interface.
Heuristic violation severity
The average heuristic violation severity rating for the Oregon interface was three times that of the Colorado interface (2.07 vs 0.67) (Appendix-Tables-3a,b). Accessibility for colour-blind individuals was severely compromised in the Oregon interface, which earned a violation rating of 4 in that category. [Figure-25]
[Chart: Usability severity rating (0-4) for each of the 15 heuristics — visibility of system status; match between system and real world; user control and freedom; consistency and standards; error prevention; recognition rather than recall; flexibility and efficiency of use; aesthetic and minimalist design; help users recover from errors; help and documentation; navigation; use of modes; structure of information; physical constraints; extraordinary users — for the Interactive 3-D Atlas (Colorado) and the 3-D Brain Browser (Oregon)]
Automated test results
LIDA
Both browser interfaces failed validation, as quantitatively determined by LIDA. [Figure-26]
Detailed results of the LIDA analysis for accessibility, usability and reliability are given in Appendix-Table-4. There was no significant difference in the overall results between the Colorado and Oregon interfaces (72% vs 67%), with comparable means and standard deviations. The probability associated with Student's t test (2-tailed
Figure-25: Clustered Column showing severity of heuristic violation for each of 15 heuristics, in each browser interface.
Figure-26: Composite screenshots from LIDA tests showing failure of both interface sites to meet UK legal standards.
distribution, unpaired 2-sample with unequal variance) = 0.92 [Figure-27]. However, the break-up showed substantial differences (Colorado:Oregon — Accessibility: 70%:80%; Usability: 72%:48%) [Figures-28,29].
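The unequal-variance (Welch) comparison behind that p-value can be sketched with the standard library alone; the helper below computes the Welch t statistic and Welch–Satterthwaite degrees of freedom (the p-value would then come from the t distribution, e.g. via scipy's `ttest_ind(..., equal_var=False)`). The score lists are invented for illustration, not the study's raw LIDA data:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's unequal-variance t statistic and its
    Welch-Satterthwaite degrees of freedom."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Illustrative percentage scores only -- not the actual LIDA results.
colorado = [70, 72, 74]
oregon = [80, 48, 73]
t, df = welch_t(colorado, oregon)
print(round(t, 2))  # -> 0.51
```

A small t with few degrees of freedom, as here, yields a large two-tailed p-value, consistent with the reported non-significance.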
[Charts: LIDA results — Accessibility, Usability, Reliability and Overall score for each interface; break-up of accessibility (page setup, access restrictions, outdated code, Dublin Core tags, registration, overall accessibility); break-up of usability (clarity, consistency, functionality, engagibility, overall usability)]
Validation Service
Both sites failed W3C validation, with 14 and 18 errors for the Colorado and Oregon sites respectively. Additionally, the former was not valid HTML 4.01 Strict, while in the latter no DOCTYPE was found [Figures-30,31].
Figure-27: Clustered Column showing accessibility, usability and reliability results of both websites, as analysed by the LIDA tool. Overall results show no apparent significant difference.
Figure-28: Clustered 3D Column showing break-up of Accessibility results. This was automatically generated by LIDA tool, except the last parameter. Differences between two sites are more apparent.
Figure-29: Clustered 3D Column showing break-up of Usability results. Differences between two sites are even more apparent.
This page is not Valid HTML 4.01 Strict!
This page is not Valid (no Doctype found)!
WebXACT
Both interface sites had no metadata description, non-serious quality issues and warnings, a non-serious page encryption level, no P3P compact policy, and issues with third-party content. Additionally, the Colorado browser site had no author or keywords in its metadata summary, and elements missing height/width attributes (page efficiency). [Appendix-Table-5]
WAB score
There were several instances (Colorado = 9; Oregon = 2) of Priority 2/3 automatic checkpoint errors, and several instances (Colorado = 36; Oregon = 35) of Priority 1/2/3 manual checkpoint warnings [Figure-32]. The Colorado and Oregon pages had modified WAB scores of 86 and 72 respectively. [Appendix-Table-6]
Vischeck results
The appearance of each output under normal vision and under red/green/blue blindness is demonstrated in Figures 33-35. The red, green and blue borders and cross-hairs in the Oregon output are invisible to protanopes, deuteranopes and tritanopes respectively; its infrared-type output, which uses the same colour combinations, is likewise imperceptible to the colour-blind.
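Why the red elements vanish for a protanope can be illustrated with a very crude simulation: collapsing the red channel onto the green one makes colours that differ only in red nearly indistinguishable. Vischeck itself uses a full LMS-space model; the mixing weights below are assumed for illustration only:

```python
def simulate_protanopia(rgb):
    """Very crude protanopia approximation: merge red into green so that
    red/green distinctions disappear. (Vischeck uses a proper LMS model;
    the 0.567/0.433 weights here are assumed, illustrative values.)"""
    r, g, b = rgb
    merged = round(0.567 * r + 0.433 * g)
    return (merged, merged, b)

red, green = (255, 0, 0), (0, 255, 0)
print(simulate_protanopia(red), simulate_protanopia(green))
```

Pure red and pure green both map to similar muddy yellows, which is why the Oregon interface's colour-coded borders lose their meaning for colour-blind users.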
Figure-32: Composite screenshots of the Colorado and Oregon pages showing Priority 1, 2, 3 automatic and manual checkpoint errors and warnings in both Web pages, as determined by WebXACT. There is no significant difference between them.
Figure-30: Screenshot from W3C Markup Validation Service showing result for Colorado site.
Figure-31: Screenshot from W3C Markup Validation Service showing result for Oregon site.
A: Normal Oregon browser window B: Protanopic appearance (red missing)
C: Deuteranopic appearance (green missing) D: Tritanopic appearance (blue missing)
A: Normal appearance (infrared type) B: Protanopic appearance
Figure-33: Composite screenshots showing appearance of Colorado applet windows under normal and colour-deficit visions; from left to right, clockwise � Normal, Protanopic and Tritanopic appearances; Deuteranopic appearance is almost same as protanopic
Figure-34: A-D show screenshots of normal and the other 3 forms of colour blindness. For each type of blindness, the outer square lines and internal cross-hairs of that particular colour are invisible. Colours of squares and cross-hairs are essential components of the interface.
C: Deuteranopic appearance D: Tritanopic appearance
Summary of results
All test results are comparatively summarised in Appendix-Table-7 and Figure-36.
[Chart: All-result summary — Colorado vs Oregon contributions across the questionnaire, heuristic, LIDA, W3C, WebXACT and WAB test categories: very easy searchability, very difficult searchability, moderately fast, very fast, task failure rate, task success rate, extra help required, ease of use, information usefulness, information overload, very useful, task-based usage, future prospects, heuristic violation, LIDA accessibility, LIDA usability, LIDA reliability, LIDA overall score, W3C errors, automatic checkpoint errors, manual checkpoint warnings, WAB score]
DISCUSSION
Questionnaires are time-tested usability inquiry methods for evaluating user interfaces.[15,39] Since our interfaces are e-Learning tools, using questionnaires to evaluate their usability and usefulness to students was the most appropriate first step. When an appropriate questionnaire already exists, adapting it for the current study is better than creating one from scratch.[40] That was our rationale for adapting Boulos' questionnaire.[19]
Figure-35: A-D show screenshots of Oregon interface with the infrared type of settings, as seen normally and in the 3 forms of colour blindness. For each type of blindness, that particular colour is replaced by a different colour-scheme.
Figure-36: 100% Stacked Column comparing the percentage that Colorado and Oregon contribute to the total of each score in each test category. The first 13 are questionnaire results, the next is the heuristic violation score, categories 15-18 are LIDA results, the next is the W3C result, the two before last are WebXACT results, and the last is the Web Accessibility Barrier score.
We secured exactly 30 respondents, the minimum stipulated for statistically valid data.[21] However, a larger number would have been ideal. We followed all precepts of a good questionnaire[22] except that it ran to seven pages instead of two. Our last six open-ended questions provided valuable qualitative input on users' perceptions of the usability and usefulness of the two interfaces. This played a significant role in recommending practical changes to the interfaces (infra). QDA software (QSR NUD*IST4, Sage, Berkeley, CA) to review and index the patterns and themes would have made the analysis of our qualitative data more efficient.[25]
The rationale for conducting a heuristic evaluation was to evaluate the two interfaces from a heuristic "expert's" perspective, namely this author's, as opposed to the users' (the students').[10,15,16,26] Moreover, heuristic evaluation is a very efficient usability engineering method.[26,30] It can be conducted remotely and indicates the effectiveness and efficiency of the interface, but not user satisfaction.[15] The ideal heuristic evaluation requires 3-5 (average 4) independent heuristic experts;[15,26] that was not possible in our "mini" study.
Implications of automated tests
Automated tools are designed to validate Web pages with respect to their underlying code and to check their accessibility,[14] rather than to determine end-user usability/usefulness. Thus they may give misleading findings compared with usability testing/inspection/inquiry methods. LIDA and WebXACT/WAB scores showed Colorado's accessibility was poorer and its usability better than Oregon's. However, most students found the Colorado interface superior in most categories, and heuristic evaluation also demonstrated three times higher heuristic violation in the Oregon interface. Nevertheless, the automated tests served two purposes: they provided a means of triangulation (infra), and they formed the basis for suggesting improvements to the sites, discussed later.
Four-legged table model
Our study reinforced an established principle of evaluation studies: triangulation by several methods is better than one, because no single method gives a complete evaluation.[9] The ideal usability evaluation can be likened to a four-legged table. Usability testing methods (viz. usability labs) and usability inquiry approaches (viz. questionnaires) constitute the first two legs of the table, enabling one to assess end-user usability/usefulness.[15] Usability inspection methods, viz. cognitive walkthrough (psychology/cognitive experts) and heuristic evaluation (heuristic experts),[16] provide usability from the 'expert's' perspective; they constitute the third leg. The automated methods give numerical figures for accessibility, usability and reliability, and constitute the fourth leg. One method thus complements the others synergistically, identifying areas of deficiency that have slipped through the cracks of the other methods, besides cross-checking each other's validity. We tried to fit this model as closely as possible by employing a multi-tiered methodology.[9-11]
Lessons learned from the study
End-user characteristics
Technological excellence does not necessarily correlate with usability/usefulness. The award-winning 3D Oregon brain browser had ingeniously coded applets allowing users to perform stunning manipulations. However, as an e-Learning tool for studying brain anatomy it left much to be desired: images were too small, with no zoom facility, no guiding hints/explanations and no search facility. Our pre-clinical undergraduates, though reasonably computer/Internet-savvy [Appendix-Table-1b], needed instructions and hints both for manipulating the interfaces and for the medical content. Thus it was a nifty tool for playing with, but not for serious Neuroanatomy study. This was the finding both from the end-user perspective and from heuristic analysis.
Gender differences
Most usability studies do not explicitly consider gender differences; we did, and this provided valuable insight [Box-3].
In general terms this relates to improving searchability, providing more help functions, improving information quality, reducing information overload and improving the interface as a whole. These points apply more to female students and more to the Oregon interface, but also to the Colorado interface. The proposed improvements are considered more explicitly below.
Colour-blind students
Approximately 8-10% of males and 0.5% of females suffer from some form of colour deficit; more may have temporary alterations in the perception of blue [Box-4].[38,41,42]
The Oregon interface had red, green and blue as essential components. Our Vischeck simulation exercise showed that such an interface would be useless to the colour-blind. Our school of approximately 300 students has about 180 males (M:F = 60:40). This translates to 15-16 colour-blind males and 0-1 colour-blind females, so the impact is likely to be substantial.
Implications for user interfaces
Colour: e-Learning tools with multimedia and colour graphics should provide for colour-blind students. Ideally, red-green colour combinations (the most common form of colour-blindness)[42] should be avoided. Alternatively, there should be provision to Daltonize the images (projecting red/green variations into the lightness/darkness and blue/yellow dimensions), so that they are at least somewhat visible to the colour-blind.[38] One should also use secondary cues to convey information to the chromatically challenged: subtle grey-scale differentiation, a different graphic, or a different text label associated with each colour.[42]
Browser compatibility: Two respondents used browsers other than MSIE. Web designs should therefore be tested to see how they appear in different browsers; Browsershots [http://v03.browsershots.org/] is an online tool for this purpose.[43]
Implications for evaluation exercises
All accessibility/usability evaluation exercises should mandatorily check colour-deficient accessibility through colour-checking engines such as Vischeck. The systems should be Java-enabled.[38]
Practical recommendations
Box-4: Spectrum of colour-deficits in the population
Box-3: Gender-based differences gleaned from study
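The expected impact on our student body follows from simple arithmetic; a minimal sketch (Python), using the prevalence figures cited above (the 9% male rate is an assumed midpoint of the quoted 8-10% range):

```python
def expected_colour_blind(males: int, females: int,
                          male_rate: float = 0.09,
                          female_rate: float = 0.005) -> tuple:
    """Expected number of colour-deficient students per gender,
    using the prevalence figures quoted in the text."""
    return round(males * male_rate), round(females * female_rate)

# School of ~300 students, M:F = 60:40 -> 180 males, 120 females
m, f = expected_colour_blind(180, 120)  # ~16 males, ~1 female
```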
Colorado interface
The recommendations, based on user feedback and heuristic evaluation, are indicated in Figure-37.
The following recommendations are based on results of automated tests:
Improving accessibility/usability[31]
The user-based and heuristic recommendations annotated in Figure-37 are:
- Render the cumbersome Help function user-friendly
- Add clinical correlations and anatomical/functional connections between structures
- Make search blocks and labelled diagrams
- Fill up blank areas of labelling
- Correct the errors given by the slices while locating a particular area
- Give audio help (like a doctor speaking when a part is clicked)
- Make the applet window larger; menu-item fonts should be at least 10 points
- Render the ornamental Zoom function functional
- Load all three applet windows in under 10 seconds, or provide a page-loading progress bar
- Give notes/explanations for each item
- Give a right-click "What's This?" type of help function for each menu button
- Provide functionality
- Provide a function button as an alternative to the table list
- "You could always improve anything" (user comment)
Figure-37: Composite screenshots showing the recommendations for improvements to the Colorado browser, based on user comments and heuristic evaluation studies.
-Incorporate an HTTP-equivalent content-type declaration in the header
-Insert table summaries for visually-impaired visitors
-Increase font size to at least 10 points
-Incorporate a search facility
Priority-1/2/3 checkpoints[33]
-Provide extended descriptions for images conveying important information
-Ensure that pages are still readable/usable in spite of unsupported style sheets
-Add a descriptive title to links
-Provide alternative searches for different skills/preferences
Oregon interface
Figure-38 highlights the recommendations, based on user feedback and heuristic evaluation.
Image size
This was the most common complaint by students. Fitts's law states that the time to point to a target grows with the distance to be moved and shrinks with target size (formally, with the logarithm of the distance-to-width ratio).[42,44] Increasing image size would therefore reduce effort, time and cognitive load.
The following recommendations are based on results of automated tests:
Improving accessibility/usability[31]
-Eliminate body background colour
-Include a clear purpose statement at the beginning
-Make blocks of text scannable, in short easy-to-understand paragraphs
-Include navigation tools for moving through text
-Reduce user cognitive load
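The Fitts's-law argument for larger images can be made concrete with the Shannon formulation T = a + b * log2(D/W + 1); a sketch (Python) with illustrative, assumed device constants a and b:

```python
import math

def fitts_time(distance: float, width: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Shannon formulation of Fitts's law: T = a + b * log2(D/W + 1).
    a and b are illustrative device/user constants (seconds), not measured values."""
    return a + b * math.log2(distance / width + 1)

# Same pointing distance, target made four times wider:
t_small = fitts_time(distance=400, width=20)
t_large = fitts_time(distance=400, width=80)
# t_large < t_small: enlarging the images lowers the index of
# difficulty and hence the predicted pointing time.
```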
W3C markup validation[32]
-Place a DOCTYPE declaration [Box-5].
The recommendations annotated in Figure-38 are:
1. Provide right-click information
2. Include under
3. Add the following items: Save, Search, Run Daltonize!
4. Give explanations for items
5. Provide good labelling
6. Give better views
7. Enlarge images (Fitts's law)
8. Colour-blind feature (see text)
Box-5: Document Type Definition
Figure-38: All the recommendations for improvements to the Oregon browser are based on user comments and heuristic evaluation studies.
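The 'Run Daltonize!' recommendation can be illustrated with a toy opponent-channel sketch (Python) of the idea described earlier: fold red/green variation into lightness and the blue/yellow dimension. This is an illustrative simplification, not Vischeck's actual LMS-space algorithm; in practice the output would also be clipped to the valid colour range.

```python
def daltonize_pixel(r: float, g: float, b: float) -> tuple:
    """Toy Daltonize sketch: re-express a pixel in opponent channels,
    move red/green contrast into lightness and the blue/yellow axis,
    then rebuild RGB with the red/green channel zeroed out."""
    lum = (r + g + b) / 3.0        # lightness
    rg = r - g                     # red/green opponent channel
    yb = (r + g) / 2.0 - b         # yellow/blue opponent channel
    lum += 0.5 * rg                # fold red/green contrast into lightness...
    yb += 0.5 * rg                 # ...and into the yellow/blue axis
    # Invert the transform with the red/green channel set to zero (r == g)
    r2 = g2 = lum + yb / 3.0
    b2 = r2 - yb
    return (r2, g2, b2)

# A pure red pixel keeps an identity (brighter, yellower) even though
# its red/green contrast - invisible to the red-green-blind - is gone:
shifted = daltonize_pixel(1.0, 0.0, 0.0)
```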
Priority-1/2/3 checkpoints[33]
-Ensure usability of Web pages even if programmatic objects do not function
-Provide accessible alternatives to information in the Java 1.1 applet
-Use CSS to control layout/presentation
-Avoid obsolete language features
Both interfaces
The following recommendations are based on results of automated tests:
Improving accessibility/usability[31]
-Add an HTML language definition
-Add Dublin Core title tags
-Present material without necessitating plug-ins
Priority-1/2/3 checkpoints[33]
-Use simpler, more straightforward language
-Identify the language of the text
-Ensure foreground and background colours contrast
-Validate documents against formal published grammars
-Provide a description of the general site layout, access features and usage instructions
-Allow user customisation
-Provide metadata that identifies the document's location in the collection
Conducting better evaluation techniques
Using a Perlman-type Web-based CGI-scripted questionnaire would enable wider capture.[39] Given the resources of a formal usability lab (viz. Microsoft's)[45,46] we would adopt a combined Usability Testing and Inquiry approach. The former would include Performance Measurement of the user combined with the Question-Asking Protocol (which is better than the Think-Aloud Protocol per se).[15] The latter would include automatically Logging Actual Use.[15] Hardware requirements and other details[16,18] are in Figure-39. This combined methodology requires one usability expert and 4-6 users. All three usability issues (effectiveness, efficiency and satisfaction) are covered; we can obtain both quantitative and qualitative data, and the process can be conducted remotely.[15]
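The 'Logging Actual Use' leg of Figure-39 can be sketched minimally (Python); the event names and summary statistic are hypothetical examples, not the actual instrumentation:

```python
import time
from collections import Counter

class UseLogger:
    """Minimal sketch of automatic interface logging: timestamp each
    user-interface event, then summarise event frequencies."""
    def __init__(self):
        self.events = []            # list of (timestamp, event-name) pairs

    def log(self, event: str) -> None:
        self.events.append((time.time(), event))

    def summary(self) -> Counter:
        return Counter(name for _, name in self.events)

# Hypothetical session: which controls did the student actually use?
log = UseLogger()
for e in ["zoom", "rotate", "zoom", "help"]:
    log.log(e)
```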
CONCLUSION
Multi-tiered evaluation testing methods and colour-checking/correction facilities are mandatory for all interfaces and evaluation procedures. Both interfaces failed validation. The majority of respondents found the Colorado interface much easier to search with than the Oregon interface, and the former moderately faster than the latter. Nobody failed to perform the required task with the Colorado browser, very few required extra help with it, and the majority found its information useful. More students utilised the former for performing the task than the latter. Subjectively, most students could not understand the Oregon interface very well. The Oregon interface violated heuristics three times more severely than Colorado. Overall LIDA scores were similar for both, but Oregon usability was significantly lower than Colorado's. The Colorado site demonstrated a substantially higher accessibility barrier on the LIDA and WebXACT tests. Thus, the Colorado interface had higher usability from the users' perspective and heuristic evaluation, and lower accessibility by automated testing. Colorado output was not a significant handicap to the colour-blind, but Oregon graphic output was partially invisible to various types of chromatically challenged individuals.
ACKNOWLEDGEMENTS
The President and Dean of the University of Seychelles American Institute of Medicine kindly permitted this study, and the infectious enthusiasm of the students of USAIM made it possible.
CONFLICTS OF INTEREST
The author is employed by USAIM.
Figure-39: Composite usability testing and inquiry method, incorporating features of Performance Measurement, Question-Asking Protocol and Logging Actual Use. Components shown: users at the user's computer; usability tester; video camera and VCR (recording the user's facial expressions and reactions); two-way microphone with pre-amplifier/sound mixer; PC-VCR converter (producing an audio-video tape of the computer screen plus the Q-A protocol conversation); and an interface log (keyboard, mouse driver, etc.) automatically collecting statistics about detailed use of the system.
REFERENCES
1. Bearman M. Centre of Medical Informatics, Monash University [homepage on the Internet]. Monash, Au: Monash University; © 1997 [cited 2006 July 1]. Why use technology?; [about 3 pages]. Available from: http://archive.bibalex.org/web/20010504064004/med.monash.edu.au/informatics/techme/whyuse.htm.
2. University of Colorado Health Science Center [homepage on the Internet]. Colorado: UCHSC; [cited 2006 July 1]. Overview; [about 2 screens]. Available from: http://www.uchsc.edu/sm/chs/overview/overview.html.
3. Computer Science, University of Maryland [homepage on the Internet]. Bethesda, MD: UMD; [cited 2006 July 1]. Visualization; [about 1 screen]. Available from: http://www.cs.umd.edu/hcil/research/visualization.shtml.
4. National Library of Medicine, National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH; [updated 2003 September 11; cited 2006 July 1]. The Visible Human Project - Overview; [about 1 page]. Available from: http://www.nlm.nih.gov/research/visible/visible_human.html.
5. Center for Human Simulation, University of Colorado. Visible Human Experience [homepage on the Internet]. Denver, CO: University of Colorado; [cited 2006 July 1]. Available from: http://www.visiblehumanexperience.com/.
6. Conlin T. Sushi Applet. University of Oregon; [modified 2003 September 19; cited 2006 July 1]. Available from: http://www.cs.uoregon.edu/~tomc/jquest/SushiPlugin.html.
7. Boulos MNK. Internet in Health and Healthcare. Bath, UK: University of Bath; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt.
8. Zeng X, Parmanto B. Web Content Accessibility of Consumer Health Information Web Sites for People with Disabilities: A Cross Sectional Evaluation. J Med Internet Res [serial on the Internet]. 2004 June 21 [last update 2006 February 11, cited 2006 July 1]; 6(2):e19: [about 20 pages]. Available from: http://www.jmir.org/2004/2/e19/index.htm.
9. Boulos MNK. A two-method evaluation approach for Web-based health information services: The HealthCyberMap experience. MEDNET-2003; 2003 December 5; University Hospital of Geneva; [cited 2006 July 1]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/KamelBoulos_MEDNET2003.ppt.
10. Beuscart-Zéphir M-C, Anceaux F, Menu H, Guerlinger S, Watbled L, Evrard F. User-centred, multidimensional assessment method of Clinical Information Systems: a case-study in anaesthesiology. Int J Med Inform [serial on the Internet]. 2004 September 15 [cited 2006 July 1]; [about 10 pages]. Available from: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID20041021151346705.
11. Curé O. Evaluation methodology for a medical e-education patient-oriented information system. Med Inform Internet Med [serial on the Internet]. 2003 March [cited 2006 July 1]; 28(1):1-5 [about 5 pages]. Available from: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=12851053.
12. International standards for HCI and usability. UsabilityNet; © 2006 [cited 2006 July 1]. Available from: http://www.usabilitynet.org/tools/r_international.htm.
13. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Available from: http://www.useit.com/.
14. TechDis [homepage on the Internet]. Sussex, UK: University of Sussex Institute of Education; © 2000-2002 [last major update 2002 Oct 26; cited 2006 July 1]. Web Accessibility & Usability Resource. Available from: http://www.techdis.ac.uk/seven/.
15. Zhang Z. Usability Evaluation [homepage on the Internet]. US: Drexel University; [cited 2006 July 1]. Available from: http://www.usabilityhome.com/.
16. Kushniruk AW, Patel VL. Cognitive and usability engineering methods for the evaluation of clinical information systems. J Biomed Inform [serial on the Internet]. 2004 Feb; [published online 2004 Feb 21; cited 2006 July 1]; 37:56-76: [about 20 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.
17. Kaufman DR, Patel VL, Hilliman C, Morin PC, Pevzner J, Weinstock RS, Goland R, Shea S, Starren J. Usability in the real world: assessing medical information technologies in patients' homes. J Biomed Inform [serial on the Internet]. 2003 Feb-Apr; [published online 2003 Sept 4; cited 2006 July 1]; 36(1-2):45-60: [about 16 pages]. Available from: http://www.sciencedirect.com/science/journal/15320464.
18. Kushniruk AW, Triola MM, Borycki EM, Stein B, Kannry JL. Technology induced error and usability: The relationship between usability problems and prescription errors when using a handheld application. Int J Med Inf [serial on the Internet]. 2005 August; [available online 2005 April 8; cited 2006 July 1]; 74(7-8):519-26: [about 8 pages]. Available from: http://www.sciencedirect.com/science?_ob=GatewayURL&_origin=CONTENTS&_method=citationSearch&_piikey=S1386505605000110&_version=1&md5=e950841f1dbf4dd207d9a5d47d311908.
19. Boulos MNK. HealthCyberMap [homepage on the Internet]. HealthCyberMap.org; © 2001, 2002 [last revised 2002 April 17; cited 2006 July 1]. Formative Evaluation Questionnaire of HealthCyberMap Pilot Implementation; [about 6 pages]. Available from: http://healthcybermap.semanticweb.org/questionnaire.asp.
20. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH [cited 2006 July 1]. APPENDIX A-2. SAMPLE SURVEY OF WEBMASTERS; [about 15 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a2.htm.
21. Boulos MNK. Royal College of Surgeons of Edinburgh [homepage on the Internet]. Edinburgh, UK: RCSED; [published 2004 June 16; cited 2006 July 1]. Notes on Evaluation Methods (Including User Questionnaires and Server Transaction Logs) for Web-based Medical/Health Information and Knowledge Services; [about 6 screens]. Available from: http://www.e-courses.rcsed.ac.uk/mschi/unit9/MNKB_evaluation.pdf.
22. Bonharme E, White I. Napier University [homepage on the Internet]. Marble; [last update 1996 June 18; cited 2006 July 1]. Questionnaires; [about 1 screen]. Available from: http://web.archive.org/web/20040228081205/www.dcs.napier.ac.uk/marble/Usability/Questionnaires.html.
23. Bailey B. Usability Updates from HHS. Usability.gov; 2006 March [cited 2006 July 1]. Getting the Complete Picture with Usability Testing; [about 1 screen]. Available from: http://www.usability.gov/pubs/030106news.html.
24. Kuter U, Yilmaz C. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. Survey Methods: Questionnaires and Interviews; [about 6 screens]. Available from: http://www.otal.umd.edu/hci-rm/survey.html.
25. Ash JS, Gorman PN, Lavelle M, Payne TH, Massaro TA, Frantz GL, Lyman JA. A Cross-site Qualitative Study of Physician Order Entry. J Am Med Inform Assoc [serial on the Internet]. 2003 Mar-Apr [cited 2006 July 1]; 10(2): [about 13 pages]. Available from: http://www.jamia.rcsed.ac.uk/cgi/reprint/10/2/188.pdf.
26. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. How to Conduct a Heuristic Evaluation; [about 6 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_evaluation.html.
27. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH [cited 2006 July 1]. APPENDIX A-3. HEURISTIC GUIDELINES FOR EXPERT CRITIQUE OF A WEB SITE; [about 5 pages]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm.
28. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 July 1]. Ten Usability Heuristics; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/heuristic_list.html.
29. Barber C. Interaction Design [homepage on the Internet]. Sussex, UK; [cited 2006 July 1]. Interactive Heuristic Evaluation Toolkit; [about 9 pages]. Available from: http://www.id-book.com/catherb/index.htm.
30. Nielsen J. useit.com: Jakob Nielsen's Website [homepage on the Internet]. Fremont, CA: Nielsen Norman Group; [cited 2006 June 14]. Characteristics of Usability Problems Found by Heuristic Evaluation; [about 2 pages]. Available from: http://www.useit.com/papers/heuristic/usability_problems.html.
31. Minervation [homepage on the Internet]. Oxford, UK: Minervation Ltd; © 2005 [modified 2005 June 6; cited 2006 July 1]. The LIDA Instrument; [about 13 pages]. Available from: http://www.minervation.com/mod_lida/minervalidation.pdf.
32. World Wide Web Consortium [homepage on the Internet]. W3C®; © 1994-2006 [updated 2006 Feb 20; cited 2006 June 14]. W3C Markup Validation Service v0.7.2; [about 3 screens]. Available from: http://validator.w3.org/.
33. Watchfire Corporation. WebXACT [homepage on the Internet]. Watchfire Corporation; © 2003-2004 [cited 2006 July 1]. Available from: http://webxact.watchfire.com/.
34. Badenoch D, Tomlin A. How electronic communication is changing health care. BMJ [serial on the Internet]. 2004 June 26 [cited 2006 July 1]; 328:1564: [about 2 screens]. Available from: http://bmj.bmjjournals.com/cgi/content/full/328/7455/1564.
35. World Wide Web Consortium [homepage on the Internet]. W3C; © 1999 [cited 2006 July 1]. Web Content Accessibility Guidelines 1.0 - W3C Recommendation 5-May-1999; [about 24 pages]. Available from: http://www.w3.org/TR/WCAG10/.
36. The Access Board [homepage on the Internet]. The Access Board; [updated 2001 June 21; cited 2006 July 1]. Web-based Intranet and Internet Information and Applications (1194.22); [about 15 pages]. Available from: http://www.access-board.gov/sec508/guide/1194.22.htm.
37. Ceaparu I, Thakkar P. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; [last updated 2001 October 28; cited 2006 July 1]. Logging & Automated Metrics; [about 8 screens]. Available from: http://www.otal.umd.edu/hci-rm/logmetric.html.
38. Vischeck [homepage on the Internet]. Stanford, CA: Stanford University; [last modified 2006 Mar 8; cited 2006 July 1]. Information & Links; [about 7 pages]. Available from: http://www.vischeck.com/info/.
39. Perlman G. ACM; [cited 2006 July 1]. Web-Based User Interface Evaluation with Questionnaires; [about 4 pages]. Available from: http://www.acm.org/~perlman/question.html.
40. National Institutes of Health [homepage on the Internet]. Bethesda, MD: NIH [cited 2006 July 1]. APPENDIX A-9: IMPLEMENTATION DETAILS OF WEB SITE EVALUATION METHODOLOGIES; [about 1 page]. Available from: http://irm.cit.nih.gov/itmra/weptest/app_a9.htm.
41. Hess R. Microsoft Corporation [homepage on the Internet]. Redmond, Wash: Microsoft Corp; © 2006 [published 2000 October 9; cited 2006 July 1]. Can Color-Blind Users See Your Site?; [about 7 pages]. Available from: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnhess/html/hess10092000.asp.
42. Tognazzini B. AskTog; © 2003 [cited 2006 July 1]. First Principles of Interaction Design; [about 7 pages]. Available from: http://www.asktog.com/basics/firstPrinciples.html.
43. Browsershots.org [homepage on the Internet]. Browsershots.org; [cited 2006 July 1]. Test your web design in different browsers; [about 1 page]. Available from: http://v03.browsershots.org/.
44. Giacoppo SA. CHARM: Choosing Human-Computer Interaction (HCI) Appropriate Research Methods [homepage on the Internet]. College Park, MD: University of Maryland; 2001 November 2 [cited 2006 July 1]. The Role of Theory in HCI; [about 11 screens]. Available from: http://www.otal.umd.edu/hci-rm/theory.html.
45. Usability.gov. Methods for Designing Usable Web Sites. Usability.gov; 2006 March [cited 2006 July 1]. Conducting and Using Usability Tests; [about 3 screens]. Available from: http://www.usability.gov/methods/usability_testing.html.
46. Berkun S. Microsoft Corporation [homepage on the Internet]. Redmond, Wash: Microsoft Corporation; © 2006 [published 1999 Nov-Dec; cited 2006 July 1]. The Power of the Usability Lab; [about 3 printed pages]. Available from: http://msdn.microsoft.com/library/en-us/dnhfact/html/hfactor8_6.asp.
LIST OF ABBREVIATIONS
3D: Three-Dimensional
CAST: Center for Applied Special Technology
CGI: Common Gateway Interface
CHS: Center for Human Simulation (University of Colorado)
CSS: Cascading Style Sheets
GHz: Gigahertz
HCM: HealthCyberMap
HTML: HyperText Markup Language
MS: Microsoft
IE: Internet Explorer
IEEE: Institute of Electrical and Electronics Engineers
ISM: Industrial, Scientific and Medical
ISO: International Organization for Standardization
NIH: National Institutes of Health, Bethesda, Maryland
QDA: Qualitative Data Analysis
QSR: Qualitative Solutions and Research
SP: Service Pack
UCHSC: University of Colorado Health Science Center
USAIM: University of Seychelles American Institute of Medicine
v: Version
VHE: Visible Human Experience
W3C: World Wide Web Consortium
W3CAG: W3C Accessibility Guidelines
WAB: Web Accessibility Barrier
WAI: Web Accessibility Initiative
XHTML: eXtensible HTML (a hybrid of HTML and XML)
XML: eXtensible Markup Language

APPENDIX

Appendix-Table-1a: Gender-based age distribution of respondents
Age (years)    Male (%)    Female (%)
18             5 (33%)     8 (53%)
19             3 (20%)     5 (33%)
20             2 (13%)     1 (7%)
21             1 (7%)      0
22 or above    4 (27%)     1 (7%)
Total          15          15
Appendix-Table-1b: Students' computer/Internet knowledge, skills and experience

                                          Male    Female
PC at home
  Yes                                     12      13
  No                                      3       2
Duration of computer usage
  Not at all                              1       0
  Few weeks                               1       3
  2-6 months                              0       2
  6-24 months                             1       2
  >2 years                                12      8
Hours/day Internet usage
  <1 hour                                 8       8
  >1-<2 hours                             4       6
  >2-<3 hours                             1       1
  >3-<4 hours                             0       0
  >4 hours                                2       0
Type of Internet connection               (2 non-responders / 1 non-responder)
  Dialup/modem                            5       5
  Broadband/always connected              8       9
Web browser
  Microsoft Internet Explorer             13      15
  Netscape Navigator                      0       0
  Mozilla Firefox                         1       0
  Other(s)                                1       0
Desktop screen resolution                 (1 non-responder / 3 non-responders)
  640x480 pixels                          4       3
  800x600 pixels                          7       1
  1024x768 pixels                         3       8
Operating system
  Windows                                 15      15
  Others (Mac/Unix/Linux/WebTV)           0       0
Internet as reliable source of medical information
  Not at all                              1       0
  To some extent                          1       4
  Definitely                              13      11
Internet a reliable source of medical information: Yes, definitely 24 (80%); No, not always 6 (20%)
Prefer conventional search engine 26 (86.7%); prefer either/both 4 (13.3%)
Neuroanatomy browser as definitive knowledge resource: 29 (96.7%) vs 1 (3.3%)

Appendix-Table-1c: Searchability - including navigation, input flexibility, output
                         Interactive 3D atlas (Colorado)       3D brain browser (Oregon)
                         Male       Female     Both            Male       Female      Both
(Very) / Difficult       1 (7%)     2 (13%)    3 (10%)         3 (20%)    8 (53.3%)   11 (37%)
Acceptable difficulty    3 (20%)    4 (27%)    7 (23%)         2 (13%)    2 (13.3%)   4 (13%)
Easy / (Very)            11 (73%)   9 (60%)    20 (67%)        10 (67%)   5 (33.3%)   15 (50%)
Total                    15         15         30              15         15          30
Appendix-Table-1d: Browser interface speed - page loading, response times
                         Interactive 3D atlas (Colorado)       3D brain browser (Oregon)
                         Male       Female     Both            Male       Female      Both
Very fast                1 (6.5%)   2 (13%)    3 (10%)         3 (20%)    8 (53%)     11 (37%)
Moderately fast          13 (87%)   13 (87%)   26 (87%)        9 (60%)    6 (40%)     15 (50%)
Moderately slow          1 (6.5%)   0          1 (3%)          3 (20%)    1 (7%)      4 (13%)
Very slow                0          0          0               0          0           0
Total                    15         15         30              15         15          30
Appendix-Table-1e: Success rate - Effectiveness, output, logical arrangement of info, reliability
                         Interactive 3D atlas (Colorado)       3D brain browser (Oregon)
                         Male       Female     Both            Male        Female     Both
From 1st attempt         5 (33%)    4 (27%)    9 (30%)         5 (33.3%)   3 (20%)    8 (27%)
After 1+ failure         10 (67%)   11 (73%)   21 (70%)        8 (53.3%)   5 (33%)    13 (43%)
Not successful           0          0          0               2 (13.3%)   7 (47%)    9 (30%)
Total                    15         15         30              15          15         30
Appendix-Table-1f: Adequacy of user help/instructions/hints, ease of learning, understanding, predictability, error recovery/correction/prevention
                            Interactive 3D atlas (Colorado)      3D brain browser (Oregon)
                            Male      Female     Both            Male       Female     Both
Easy, no help needed        5 (33%)   5 (33%)    10 (33.3%)      1 (7%)     1 (7%)     2 (7%)
Easy, instructions useful   9 (60%)   10 (67%)   19 (63.3%)      10 (67%)   5 (33%)    15 (50%)
Need more help              1 (7%)    0          1 (3.3%)        4 (26%)    9 (60%)    13 (43%)
Total                       15        15         30              15         15         30
Appendix-Table-1g: Usefulness - Information Quality
                         Interactive 3D atlas (Colorado)       3D brain browser (Oregon)
                         Male       Female     Both            Male       Female      Both
(Strongly) / Agree       13 (87%)   12 (80%)   25 (83.3%)      12 (80%)   7 (46.5%)   19 (63.3%)
Ambiguous                2 (13%)    2 (13%)    4 (13.3%)       0          1 (7%)      1 (3.3%)
Disagree / (Strongly)    0          1 (7%)     1 (3.3%)        3 (20%)    7 (46.5%)   10 (33.3%)
Total                    15         15         30              15         15          30
Appendix-Table-1h: Usefulness - Info overload
                                Interactive 3D atlas (Colorado)      3D brain browser (Oregon)
                                Male      Female     Both            Male       Female      Both
No / Slight problem             8 (53%)   13 (87%)   21 (70%)        11 (73%)   8 (53.3%)   19 (63%)
Moderate problem                6 (40%)   2 (13%)    8 (27%)         1 (7%)     2 (13.3%)   3 (10%)
Significant / Extreme problem   1 (7%)    0          1 (3%)          3 (20%)    5 (33.3%)   8 (27%)
Total                           15        15         30              15         15          30
Appendix-Table-1i: Comparison of overall usefulness
                         Interactive 3D atlas (Colorado)      3D brain browser (Oregon)
                         Male      Female    Both             Male      Female    Both
Not at all / slightly    2 (13%)   2 (13%)   4 (13%)          4 (27%)   8 (53%)   12 (40%)
Somewhat                 6 (40%)   6 (40%)   12 (40%)         2 (13%)   3 (20%)   5 (17%)
Very much / extremely    7 (47%)   7 (47%)   14 (47%)         9 (60%)   4 (27%)   13 (43%)
Total                    15        15        30               15        15        30
Appendix-Table-1j: Usefulness of either/both as definitive resource
                         Male (%)   Female (%)   Both (%)
(Strongly) / Disagree    1 (7%)     0            1 (3%)
Ambiguous                2 (13%)    8 (53%)      10 (33%)
Agree / (Strongly)       12 (80%)   7 (47%)      19 (64%)
Total                    15         15           30
Appendix-Table-1k: Actual task-based usage - Overall usability and usefulness
                                   Male (%)   Female (%)   Both (%)
Interactive 3D atlas (Colorado)    6 (40%)    7 (47%)      13 (44%)
3D brain browser (Oregon)          4 (27%)    6 (40%)      10 (33%)
Both interfaces equally            5 (33%)    2 (13%)      7 (23%)
Total                              15         15           30
Appendix-Table-1l: Perceived future prospects - robustness, reliability, cost
                         Interactive 3D atlas (Colorado)      3D brain browser (Oregon)
                         Male      Female     Both            Male      Female    Both
No / Slight              5 (33%)   1 (7%)     6 (20%)         6 (40%)   6 (40%)   12 (40%)
Somewhat                 3 (20%)   1 (7%)     4 (13%)         1 (7%)    4 (27%)   5 (17%)
Very / Extreme           7 (47%)   13 (86%)   20 (67%)        8 (53%)   5 (33%)   13 (43%)
Total                    15        15         30              15        15        30
Appendix-Table-2: Subjective opinions of users pertaining to both interfaces, and their comparisons, with user comments grouped by usability and usefulness themes and patterns

Ease of use
"Interface 1 (was) so very user-friendly..."; "Interface 1 handling was not so easy"; "Second was better as it was more user-friendly"

Effectiveness, fitness for purpose
"...helps students to learn better and teachers to teach better."; "Helps student in self-study"; "...it is useful, helps us to understand better..."; "...it gives us the actual 3-D images which could not be obtained from textbooks."; "Very informative"; "Can relate Anatomy with clinical lesions"; "I could not identify exactly where the brain stem is located"; "Tracts were bit difficult to understand"

Ease of learning (learnability), predictability, adaptation to user levels and styles
"Only the first time was difficult..."

Ease of understanding
"Most of the difficulty to understand (is because we don't) know the basics about the topic, before we go to the program"; "Interface 2: I really didn't understand anything!!"

Performance, robustness and reliability
"Image planes not always in synch with direction of view"; "...very difficult for orientation"

Adequate user help, error correction/recovery (fault tolerance)
"No HELP!"; "No info provided"

Estimated user satisfaction
"Interface 1 shows good images"; "I didn't like it at all"; "Good imagination!"; "...brain slice pieces are very good"; "Great images"; "For both sites, I liked the color 3-D images"

Adequate response times
"Interface 1 takes some time to load"; "Interface 2 quick to load"

Appropriate amount of output
"...more explanation needed; needs to be updated."; "Not accurately labeled"; "...too small for visualization"; "Small, not very informative"; "...no proper explanation"; "no proper labeling, can't identify the structures"; "...(had) CT MRI options..."; "...does not give a proper view"

User control, input flexibility
"Zoom did not work"
Appendix-Table-3a: Comparison of heuristic violation severity and rating (Interactive 3-D Atlas (Colorado) vs 3-D Brain Browser Interface (Oregon))

Visibility of system status: 1 vs 2
Match between system and real world: 0 vs 3
User control and freedom: 0 vs 0
Consistency and standards: 0 vs 0
Error prevention: 0 vs 2
Recognition rather than recall: 0 vs 3
Flexibility and efficiency of use: 1 vs 3
Aesthetic and minimalist design: 0 vs 3
Help users recover from errors: 0 vs 0
Help and documentation: 1 vs 3
Navigation: 1 vs 2
Use of modes: 0 vs 0
Structure of information: 0 vs 3
Physical constraints: 3 vs 3
Extraordinary users: 3 vs 4
Average: 0.67 vs 2.07
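The two "Average" figures in Appendix-Table-3a follow directly from the fifteen per-heuristic severity ratings above (rated on the usual 0-4 severity scale, 0 = no problem). A quick arithmetic check:

```python
# Reproduce the per-interface averages in Appendix-Table-3a by averaging
# the 15 heuristic severity ratings listed in the table.
colorado = [1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 3, 3]
oregon   = [2, 3, 0, 0, 2, 3, 3, 3, 0, 3, 2, 0, 3, 3, 4]

mean_colorado = sum(colorado) / len(colorado)   # 10/15
mean_oregon   = sum(oregon) / len(oregon)       # 31/15

print(f"Colorado: {mean_colorado:.2f}")  # 0.67
print(f"Oregon:   {mean_oregon:.2f}")    # 2.07
```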
Appendix-Table-3b: Heuristic evaluation detailed results (answers given as Interactive 3-D Atlas (Colorado) / 3-D Brain Browser Interface (Oregon))

Visibility of system status (Max score = 3)
Is status feedback provided continuously (e.g. progress indicators/messages)? No / Yes
Are warning messages displayed for long enough? Yes / No
Is there provision for some form of feedback? Yes / No
Usability score: 2/3 vs 1/3; severity rating: 1 vs 2

Match between system and real world (Max score = 9)
Are the words, phrases and concepts used familiar to the user? Yes / No
Does the task sequence parallel the user's work processes? Yes / No
Is information presented in a simple, natural and logical order? Yes / No
Is the use of metaphors easily understandable by the user? Yes / No
Does the system cater for users with no prior experience of electronic devices? No / No
Does the system make the user's work easier and quicker than without the system? Yes / Yes
Does the system fit in with the environment in which the user's tasks are carried out? Yes / Yes
Can the system realistically reflect real world situations and appear to respond to the user? Yes / Yes
Are important controls represented onscreen; is there obvious mapping with real ones? Yes / Yes
Usability score: 8/9 vs 4/9; severity rating: 0 vs 3

User control and freedom (Max score = 3)
Are facilities provided to "undo" (or "cancel") and "redo" actions? Yes / Yes
Are there clearly marked exits (when users find themselves somewhere unexpected)? Yes / Yes
Are facilities provided to return to the top level at any stage? N/A / N/A
Usability score: 2/3 vs 2/3; severity rating: 0 vs 0

Consistency and standards (Max score = 8)
Is use of terminology, controls, graphics and menus consistent throughout the system? Yes / Yes
Is there a consistent look and feel to the system interface? Yes / Yes
Is there consistency between data entry and data display? N/A / N/A
Is the interface consistent with any platform conventions? Yes / Yes
Have ambiguous phrases/actions been avoided? Yes / No
Is the interface consistent with standard PC conventions? Yes / Yes
Is interactive TV consistent with the related TV programmes? N/A / N/A
Have colour and style conventions been followed for links (and no other text)? Yes / Yes
Usability score: 6/8 vs 5/8; severity rating: 0 vs 0

Error prevention (Max score = 8)
Is a selection method provided (e.g. from a list) as an alternative to direct entry of information? Yes / No
Is user confirmation required before deleting something? N/A / N/A
Does the system ensure work is not lost either by user or system error? Yes / No
Does the system prevent calls being accidentally made? No / No
Are the options given in dialog boxes obvious? N/A / N/A
Does the system provide foolproof synchronization with a PC? Yes / Yes
Has the possibility of the user making errors been removed? Yes / Yes
Is the system robust and safe enough for its surroundings? Yes / Yes
Usability score: 5/8 vs 3/8; severity rating: 0 vs 2

Recognition rather than recall (Max score = 6)
Are help and instructions visible or easily accessible when needed? Yes / No
Is the relationship between controls and their actions obvious? Yes / No
Is it possible to search for information (e.g. a phone number) rather than entering the information directly? N/A / N/A
Is the functionality of the buttons on the device obvious from their labels? Yes / No
Are input formats (e.g. dates or lengths of names) and units of values indicated? N/A / N/A
Is the functionality of novel device controls (e.g. thumbwheels) obvious? N/A / N/A
Usability score: 3/6 vs 0/6; severity rating: 0 vs 3
Flexibility and efficiency of use (Max score = 8)
Does the system allow for a range of user expertise? Yes / No
Does the system guide novice users sufficiently? Yes / No
Is it possible for expert users to use shortcuts and to tailor frequent actions? Yes / No
Is it possible to access and re-use a recent history of instructions? Yes / No
Does the system allow for a range of user goals and interaction styles? Yes / Yes
Does the system allow all functionality to be accessed either using function buttons or using the stylus? Yes / Yes
Is it possible to replace and restore default settings easily? Yes / Yes
Have unnecessary registrations been avoided? No / Yes
Usability score: 7/8 vs 4/8; severity rating: 1 vs 3

Aesthetic and minimalist design (Max score = 8)
Is the design simple, intuitive, easy to learn and pleasing? Yes / No
Is the system free from irrelevant, unnecessary and distracting information? Yes / Yes
Are icons clear and buttons labelled, and is the use of graphic controls obvious? Yes / No
Is the information displayed at any one time kept to a minimum? Yes / Yes
Is the number of applications provided appropriate (has 'featuritis' been avoided)? Yes / No
Has the need to scroll been minimized and, where necessary, are navigation facilities repeated at the bottom of the screen? Yes / Yes
Is the system easy to remember how to use? Yes / Yes
Have excessive scripts, applets, movies, graphics and images been avoided? No / No
Usability score: 7/8 vs 4/8; severity rating: 0 vs 3

Help users recover from errors (Max score = 3)
Do error messages describe problems sufficiently, assist in their diagnosis and suggest ways of recovery in a constructive way? N/A / N/A
Are error messages written in a non-derisory tone, refraining from attributing blame to the user? N/A / N/A
Is it clear how the user can recover from errors? Yes / Yes
Usability score: 1/3 vs 1/3; severity rating: 0 vs 0

Help and documentation (Max score = 3)
Is help clear and direct, simply expressed in plain English, free from jargon? Yes / No
Is help provided in a series of steps that can be easily followed? No / No
Is it easy for the user to search, understand and apply help text? Yes / No
Usability score: 2/3 vs 0/3; severity rating: 1 vs 3

Navigation (Max score = 4)
Is navigational feedback provided (e.g. showing a user's current and initial states, where they've been and what options they have for where to go)? No / No
Are any navigational aids provided (e.g. find facilities)? Yes / No
Does the system track where the user was in the last session? Yes / Yes
Has opening unnecessary new browser windows been avoided? Yes / Yes
Usability score: 3/4 vs 2/4; severity rating: 1 vs 2

Use of modes (Max score = 2)
Does the system use different modes appropriately and effectively? Yes / Yes
Is it easy to exit from each mode of use? Yes / Yes
Usability score: 2/2 vs 2/2; severity rating: 0 vs 0

Structure of information (Max score = 10)
Is there a hierarchical organisation of information from general to specific? Yes / No
Are related pieces of information clustered together? Yes / No
Is the length of a piece of text appropriate to the display size and interaction device? Yes / Yes
Has the number of screens required per task been minimized? Yes / Yes
Does each screen comprise one document on one topic, with the most important information appearing at the top? Yes / Yes
Has hypertext been used appropriately to structure content, and are links intuitive and descriptive? Yes / Yes
Have pages been structured to facilitate scanning by the reader? Yes / No
Are the URLs, page titles and headlines straightforward, short and descriptive? Yes / No
Has excessive use of white space been avoided? Yes / No
Has textual content been kept to a maximum of two columns? Yes / No
Usability score: 10/10 vs 4/10; severity rating: 0 vs 3

Physical constraints (Max score = 5)
Are function buttons large enough to be usable? No / No
Is the screen visible at a range of distances and in various types of lighting? No / No
Does the touch-screen cater for users touching the screen quickly and slowly? N/A / N/A
Is the distance between targets (e.g. icons) and the size of targets appropriate (size should be proportional to distance)? Yes / No
Has the use of text in images and large or irregular imagemaps been avoided? Yes / Yes
Usability score: 2/5 vs 1/5; severity rating: 3 vs 3

Extraordinary users (Max score = 6)
Is the use of colour restricted appropriately (and suitable for colour-blind users)? No / No
Do the buttons allow for use by older, less agile fingers or people wearing gloves? No / No
Do buttons give tactile feedback when selected? No / No
Is the touch-screen usable by people of all heights and those in wheelchairs? N/A / N/A
Are equivalent alternatives provided for visual and auditory content? No / No
Have accessibility and internationalization guidelines been applied if appropriate? Yes / Yes
Usability score: 1/6 vs 1/6; severity rating: 3 vs 4

Appendix-Table-4: LIDA detailed results (scores given as Interactive 3-D Atlas (Colorado) vs 3-D Brain Browser (Oregon))

Accessibility (Max = 60): 70% (42/60) vs 80% (48/60)
- Automated tests (Max = 57): 72% (41/57) vs 79% (45/57)
- Different browsers (Max = 3): N/A vs N/A
- Registration (Max = 3): 33% (1/3) vs 100% (3/3)
Usability (Max = 54): 72% (39/54) vs 48% (26/54)
- Clarity (Max = 18): 72% (13/18) vs 28% (5/18)
- Consistency (Max = 9): 100% (9/9) vs 100% (9/9)
- Functionality (Max = 15): 53% (8/15) vs 20% (3/15)
- Engagibility (Max = 12): 100% (9/9) vs 100% (9/9)
Reliability (Max = 27): 78% (21/27) vs 74% (20/27)
- Currency (Max = 9): 33% (3/9) vs 22% (2/9)
- Conflicts of interest (Max = 9): 100% (9/9) vs 100% (9/9)
- Content production (Max = 9): 100% (9/9) vs 100% (9/9)
Overall score (Max = 141): 72% (102/141) vs 67% (94/141)
Mean: 73.7 vs 72.1; SD: 28.54 vs 37.26
Probability associated with Student's t test (two-tailed distribution, unpaired two-sample with unequal variance): 0.92

Level 1 Accessibility (Max = 57)
1. Page setup (Max = 15): 40% (6/15) vs 60% (9/15)
   Document Type Definition: 3 vs 0; HTTP-Equiv Content-Type (in header): 0 vs 3; HTML language definition: 0 vs 0; page title: 3 vs 3; meta tag keywords: 0 vs 3
2. Access restrictions (Max = 12): 66% (8/12) vs 100% (12/12)
   Image alt tags: 3 vs 3; specified image widths: 2 vs 3; table summaries: 0 vs 3; frames: 3 vs 3
3. Outdated code (Max = 27): 100% (27/27) vs 88% (24/27)
   Body background colour: 3 vs 0; body topmargin: 3 vs 3; body margin height: 3 vs 3; table background colour: 3 vs 3; table column height: 3 vs 3; table row height: 3 vs 3; font color: 3 vs 3; font size: 3 vs 3; align (non style sheet): 3 vs 3
4. Dublin Core tags (Max = 3): 0% (0/3) vs 0% (0/3)
   Dublin Core title tag: 0 vs 0
ACCESSIBILITY RATING: 72% (41/57) vs 79% (45/57)

Level 2 Usability (Max = 54)
1. Clarity (Max = 18): 72% (13/18) vs 28% (5/18)
   Clear statement: 3 vs 0; detail level: 2 vs 1; "block of content": 2 (font too small) vs 0 (not scannable, small graphics); navigation: 3 vs 1; site location: 3 vs 3; colour scheme: 2 (colour-blind unfriendly)* vs 0 (significantly so)*
2. Consistency (Max = 9): 100% (9/9) vs 100% (9/9)
   Page layout: 3 vs 3; navigation link: 3 vs 3; site organisation: 3 vs 3
3. Functionality (Max = 15): 53% (8/15) vs 20% (3/15)
   Search facility: 0 vs 0; browsing facility: 3 vs 0; cognitive overhead: 2 vs 0; browser navigational tools: 3 vs 3; plug-ins: 0 vs 0
4. Engagibility (Max = 12): 100% (9/9) vs 100% (9/9)
   Effective judgment: 3 vs 3; interactive: 3 vs 3; personalise: 3 vs 3
USABILITY RATING: 72% (39/54) vs 48% (26/54)
*See Vischeck results for colour-blind accessibility
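The reported p = 0.92 comes from an unpaired two-sample t test with unequal variances, i.e. Welch's test, applied to the two columns of LIDA sub-scores. The table does not state the sample size, so the sketch below assumes a hypothetical n = 13 sub-scores per interface and computes the Welch t statistic and Welch-Satterthwaite degrees of freedom from the reported means and SDs; converting t to a two-tailed p requires the t-distribution CDF (e.g. `scipy.stats.t.sf`), but a t statistic this small corresponds to p near 0.9, consistent with the reported value.

```python
import math

def welch_t(m1: float, s1: float, n1: int, m2: float, s2: float, n2: int):
    """Welch's unpaired two-sample t statistic and Welch-Satterthwaite df."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2          # squared standard errors
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Reported LIDA summary statistics (mean, SD); n = 13 is an assumption.
t, df = welch_t(73.7, 28.54, 13, 72.1, 37.26, 13)
print(f"t = {t:.3f}, df = {df:.1f}")   # a very small |t| implies p close to 1
```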
Level 3 Reliability (Max = 27)
1. Currency (Max = 9): 33% (3/9) vs 22% (2/9)
   Recent events: 1 vs 1; user comments: 0 vs 0; updated: 2 vs 1
2. Conflicts of interest (Max = 9): 100% (9/9) vs 100% (9/9)
   Who runs site: 3 vs 3; who pays for site: 3 vs 3; objective: 3 vs 3
3. Content production (Max = 9): 100% (9/9) vs 100% (9/9)
   Clear method: 3 vs 3; robust method: 3 vs 3; original source check: 3 vs 3
RELIABILITY RATING: 78% (21/27) vs 74% (20/27)

Appendix-Table-5: WebXACT automated test results (results given as Interactive 3-D Atlas (Colorado) vs 3-D Brain Browser (Oregon))

General issues
Metadata summary - author: no author vs yes
Description: no description vs no description
Keywords: no keywords vs yes
Page content - images, server-side image maps: 0 vs 0

Quality issues
Content defects - broken links: 0 vs 0; broken anchors: 0 vs 0; links to local files: 0 vs 0
Search and navigation - elements missing alt text: 0 vs 0
Page efficiency - elements missing height and width attributes: 1 vs 0
Warnings when accessing this page: 0 vs 0
Browser compatibility - first-party cookies denied for default Internet Explorer privacy setting: 0 vs 0

Accessibility issues
Both interfaces: do not comply with all automatic and manual checkpoints of W3C WCAG; require repairs and manual verification
Priority 1 automatic checkpoints: 0 errors, 0 instances on page vs 0 errors, 0 instances on page
Priority 1 manual checkpoints: 7 test warnings, 11 instances on page vs 6 test warnings, 8 instances on page
Priority 2 automatic checkpoints: 2 test errors, 4 instances on page vs 1 test error, 1 instance on page
Priority 2 manual checkpoints: 13 test warnings, 15 instances on page vs 14 test warnings, 18 instances on page
Priority 3 automatic checkpoints: 4 test errors, 5 instances on page vs 1 test error, 1 instance on page
Priority 3 manual checkpoints: 10 test warnings, 10 instances on page vs 9 test warnings, 9 instances on page
Privacy issues
Data collection - page encryption level: 0 bit vs 0 bit; forms using GET: 0 vs 0
Visitor tracking and P3P compliance - first-party cookies denied for default Internet Explorer privacy setting: 0 vs 0
Third-party cookies: 0 vs 0
P3P compact policy: no P3P compact policy vs no P3P compact policy
Web beacons (graphics from external sites): 0 vs 0
Third-party content - third-party links: 2 vs 3

Key to WebXACT result symbols:
= Item or issue that is not serious; need not be addressed immediately
= Success of an item or issue
= Quality issues; but not serious, need not be addressed immediately
= Failure of an item/issue in a number of situations

Appendix-Table-6: Calculation of Web Accessibility Barrier (WAB) score

                                   Colorado webpage                Oregon webpage
Priority 1 manual checkpoints      11 instances x Wf 3 = 33        8 instances x Wf 3 = 24
Priority 2 automatic checkpoints   4 instances x Wf 2 = 8          1 instance x Wf 2 = 2
Priority 2 manual checkpoints      15 instances x Wf 2 = 30        18 instances x Wf 2 = 36
Priority 3 automatic checkpoints   5 instances x Wf 1 = 5          1 instance x Wf 1 = 1
Priority 3 manual checkpoints      10 instances x Wf 1 = 10        9 instances x Wf 1 = 9
WAB score                          86                              72

WCAG attaches a three-point priority level to each checkpoint according to its impact on Web accessibility. Priority 1 checkpoints mandate the highest level of compliance, while Priority 3 checkpoints are optional for Web content developers. In weighting the calculation of the WAB score, we used the priority levels in reverse order: the weighting factor (Wf) is 3 for Priority 1 violations, 2 for Priority 2 violations, and 1 for Priority 3 violations.

Appendix-Table-7: Summary of results (Colorado browser vs Oregon browser)

Questionnaire
Very easy searchability: 67% vs 50%
Very difficult searchability: 10% vs 37%
Moderately fast browser speed: 87% vs 50%
Very fast browser speed: 10% vs 37%
Task failure rate: 0% vs 30%
Task success rate (1+ attempt): 70% vs 43% (M:F = 13%:47%)
Extra help requirement: 3% vs 43%
Ease of use: 97% vs 57%
Information usefulness: 83% vs 63%
Information overload: 30% vs 37%
Very useful: 47% (M:F = 47%:47%) vs 43% (M:F = 60%:27%)
Actual task-based usage: 44% vs 33%
Future prospects: 67% (M:F = 47%:87%) vs 43% (M:F = 53%:33%)
Heuristic violation rating: 0.67 vs 2.07

LIDA
Accessibility: failed validation; 70% vs failed validation; 80%
Usability: 72% vs 48%
Reliability: 78% vs 74%
Overall score: 72% vs 67%

W3C validation service
Failed validation; 14 errors (not valid HTML 4.01 Strict) vs failed validation; 18 errors (not valid; no Doctype)

WebXACT
Automatic checkpoints: 6 errors, 9 instances vs 2 errors, 2 instances
Manual checkpoints: 30 warnings, 36 instances vs 29 warnings, 35 instances
WAB score: 86 vs 72

Vischeck
Mild difficulty for colour-blind users vs useless to colour-blind users

Evaluation Questionnaire for Browser Interface Usability of Neuroanatomy Applications

This is a questionnaire to evaluate the two Neuroanatomy browser interfaces that you have worked with. Your personal details and opinions will be strictly anonymised and compiled only for statistical calculations. It should take about 30 minutes to complete the questionnaire. Thank you very much for your cooperation.

Questions (you may leave blank any question you don't want to answer)

Question 1: Gender:
Male Female
Question 2: Your age is in which of the following ranges? 17 18 19 20 21 22 or more
Question 3: Which class/semester are you studying? (PC=Pre-clinical) PC 1 PC 2 PC 3 PC 4 PC 5
Question 4: How many hours per day, including college and home, do you currently use the Internet: Less than 1 hour Between 1 and 2 hours Between 2 and 3 hours Between 3 and 4 hours More than 4 hours
Question 5: For how long have you been regularly using a computer? Not at all Few weeks 2-6 months 6-24 months >2 years
Question 6: Do you have a working personal computer (PC) in your home? Yes No
Question 7: From which location are you most likely to use the Web? Home Internet café School/College Library Other
Question 8: What type of Internet connection do you have currently (if you use more than one type of connection in different places, select the one you use most):
Modem ISDN/ADSL/Broadband/Always connected
Question 9: What web browser do you use most frequently? Microsoft Internet Explorer Netscape Navigator / Communicator Mozilla Firefox Other
Question 10: The Desktop Area/Screen Resolution you most commonly use is: 640 by 480 pixels 800 by 600 pixels 1024 by 768 pixels or more
Question 11: Which operating system are you currently running on your most frequently used PC? Windows Macintosh Unix/Linux WebTV Other
Question 12: Do you consider the Internet an important source of reliable medical information: Not at all To some extent Definitely
Question 13: Based on what you have seen of the two 3-D Neuroanatomy browsers so far, would you use either or both of them as definitive Neuroanatomy knowledge resources?
Yes No
Question 14: The two browser interfaces meet my information retrieval and navigation needs better than other medical information portals/gateways:
Strongly disagree Disagree Neither disagree nor agree Agree Strongly agree
Question 15: I prefer to use a conventional search engine (e.g., Google) to gather information or to have a librarian, staff member, or family member gather information for me:
True False
Question 16: How well does each of the two browser interfaces that you tried perform its intended purpose? (Please rate each interface from "not at all well" to "extremely well", considering aspects like ease of access, use and navigation, and logical arrangement of information on the page(s))
Question 16.1: The interactive Neuroanatomy 3-D browser interface from Visible Human Experience (#1):
Not at all well Slightly well Somewhat well Very well Extremely well
Question 16.2: The Java plug-in 3-D browser from University of Oregon (#2): Not at all well Slightly well Somewhat well Very well Extremely well
Question 17.1: Did you find the medical information factually correct in the Website # 1? Yes Somewhat (not always) No
Question 17.2: Did you find the medical information factually correct in the Website # 2? Yes Somewhat (not always) No
Question 18.1: Searching for the right parts of the brain, and finding the right areas through browser # 1 was: Very difficult Difficult Of moderate (acceptable) difficulty Easy Very easy
Question 18.2: Searching for the right parts of the brain, and finding the right areas through browser # 2 was: Very difficult Difficult Of moderate (acceptable) difficulty
Easy Very easy
Question 19: How important/useful to you is each of the two browser interfaces? (Please rate each interface from "not at all important" to "extremely important")
Question 19.1: Browser interface # 1:
Not at all important Slightly important Somewhat important Very important Extremely important
Question 19.2: Browser interface # 2: Not at all important Slightly important Somewhat important Very important Extremely important
Question 20.1: Did you find the interface "Help" (and other instructions/hints) around browser interface # 1 adequate?
Yes, it is easy to use once you understand how it works; the online help and instructions provided were useful
It is very easy to use; I didn't need any help (or made very little use of it)
No, I need more help to locate the information I need on the Website
Question 20.2: Did you find the interface "Help" (and other instructions/hints) around browser interface # 2 adequate?
Yes, it is easy to use once you understand how it works; the online help and instructions provided were useful
It is very easy to use; I didn't need any help (or made very little use of it)
No, I need more help to locate the information I need on the Website
Question 21.1: The speed at which browser interface # 1 loads on my Internet connection is: Very fast Moderately fast Moderately slow Very slow
Question 21.2: The speed at which browser interface # 2 loads on my Internet connection is: Very fast Moderately fast Moderately slow Very slow
Question 22.1: You tried to find some structures in the brain images using browser interface # 1. Generally speaking, how successful were you in completing this task?
Successful from the first attempt Successful after one / more failed attempts Not successful
Question 22.2: You tried to find some structures in the brain images using browser interface # 2. Generally speaking, how successful were you in completing this task?
Successful from the first attempt Successful after one / more failed attempts Not successful
Question 23: What interface did you use most of the time to accomplish the above task? Browser interface # 1 Browser interface # 2 Both were equally used
Question 24.1: Interface # 1 pointers to information are of good quality, accurate, up-to-date and useful: Strongly agree Agree Neither agree nor disagree Disagree Strongly disagree
Question 24.2: Interface # 2 pointers to information are of good quality, accurate, up-to-date and useful: Strongly agree Agree Neither agree nor disagree Disagree Strongly disagree
Question 25.1: When I looked for information in site # 1 I got overloaded quickly with too much detail: Not at all a problem
A slight problem A moderate problem A significant problem An extreme problem
Question 25.2: When I looked for information in site # 2 I got overloaded quickly with too much detail: Not at all a problem A slight problem A moderate problem A significant problem An extreme problem
Question 26: Do you think the two 3-D visual sites are useful additions to / improvements over conventional text-based Web portal interfaces?
Yes, definitely Could be an improvement (but not always) No, visual interfaces are useless
Question 27: What do you think are the future prospects of the two interfaces? Question 27.1: Browser interface # 1
Not at all important Slightly important Somewhat important Very important Extremely important
Question 27.2: Browser interface # 2 Not at all important Slightly important Somewhat important Very important Extremely important
Question 28: Were there any parts of the service that you found especially helpful? What do you like most about these sites and why? (Mention each site separately) Question 29: Were there any parts of the service that you found especially difficult to use or understand? What do you dislike most about them and why? (Mention each site separately) Question 30: Should money be invested to continue developing and implementing these interfaces? Why/why not? Question 31: What are your suggestions or comments about what would make these browser interfaces better? (e.g., I would like the following added to, changed in, or deleted from them) Question 32: Could you tell us your thoughts about this questionnaire? Are we asking the right questions? Are we asking the questions in the right way? Your feedback will help us design better questionnaires in the future. Thank you once again very much for your feedback.