
Fall 2004 Comprehensive Exams – Doctor of Philosophy in Information Studies
Day 3 Notes – Thomas Virgona

Long Island University / C.W. Post
Human Computer Interaction Notes

Author / Document # / Annotation

(Quesenbery, Whitney 2001)

1 Good technical writing focuses on the user's goals. Likeability: effective, efficient, engaging, error tolerant, easy to learn. In planning usability evaluations, be sure the most important characteristics are included.

(Hildreth, Charles class notes 2004)

2 User-centered design and usability testing

(Perfetti, Christine 2003)

3 CUE (Comparative Usability Studies). Another problem with heuristic inspection is that it is based solely on opinions: not systematic, a rough list. Test reports (keep them short, include a one-page executive summary, include positive findings, classify the comments). She does not define contextual inquiry.

(Jacko and Sears 2003)

4 In HCI we attempt to understand the needs of some audience and then formulate system designs to meet those needs. The focus has been how well someone can complete a specified task using the technology being evaluated. Shift in focus: from people interacting with computers, through people interacting with information, toward people interacting with people through technology. Technology is moving out of the workplace and into homes and everyday lives. Computer learning is unique: leaning forward versus leaning back. A criticism of HCI evaluation is that it oversimplifies context and can lead to incorrect evaluations. The standard terminology is to talk about users of systems, user goals, context of use, and usability. Design represents a synthesis of data observed in the world. Quality of content is not generally explored within task-based HCI, but is accommodated within behavioral methods employed in HCI. A final point: the consideration of impact on the overall quality of life. Going to the movies is part of enjoying a movie; the value does not always come from the content alone. We can measure the time someone spends looking at something as a measure of interest. People are social creatures who do value being around other people at times. Males enjoy "hunting games"; females prefer social play as free-time activities.

(Verenikina and Gould 1998)

5 HCI treats the computer and its operator as equals. Vygotskian activity theory: in this approach the main feature of the psyche is the active position of human beings toward the world in which they live. Humans are continually changing objects and creating artifacts - tools. This complex interaction of individuals with their surroundings has been called activity and is regarded as the fundamental unit of analysis of the human psyche (Leontiev 1978 and Tolman 1988). Karpatschof 1992: the machine has many attributes we do not find in a human being; that is the reason for making it in the first place. Cognitive science asks about thought and thinking, about consciousness and computations. Bansler and Bødker found a gap between the way systems development is represented in structured analysis by Yourdon and DeMarco and the way in which it was carried out. Designers did not follow the prescribed design procedures but, in general, had a very pragmatic attitude toward using them. Tools can be external (physical, technical) such as artifacts, instruments, and machines, or internal (psychological) such as laws, signs, procedures, methods, and language. Actions are goal-oriented. Activity theory provides a paradigm for describing and understanding the way humans deal with computers in the context of the user's environment.

(Kaptelinin, Nardi 1997)

6 Activity theory: hierarchy (activities are composed of goal-oriented actions), object-orientedness (things constitute reality), internalization/externalization (internal activities cannot be understood if they are analyzed separately from external activities), mediation, development (it is a general research methodology). Activity Checklist: focus on the structure of the user's activity, the structure of the environment, the dynamics of interaction, and development.

(Nardi - No date on class handout but post 1994)

7 The object of activity theory is to understand the unity of consciousness and activity. The concern is that activity theory is hard to learn, and, because we have not seen its actual benefits realized in specific empirical studies, the time spent learning it would be of dubious benefit. Artifacts are mediators of a system; they do not occupy the same ontological space.


(Verenikina and Gould 1997)

8 Despite this, evidence exists that even when sophisticated efforts are made to extend cognitive psychology in the direction of HCI, they have had no effect on system design. Attempts to apply traditional cognitive psychology have had limited success due to the narrow focus of this discipline.

(Uden and Willis - No date on class handout but post 1999)

9 Researchers in recent years have criticized the gap between research results and practical design in HCI. There is an emerging consensus among researchers that the cognitive approach to HCI may be limited. From Kuutti: activities can be considered as having three hierarchical levels: activity, action, and operation. An activity must have an object. Two interfaces are to be considered: the human-computer interface and the computer-environment interface. According to Brooks, the hardest single part of building a system is deciding precisely what to build. We believe that most systems fail because they do not include analysis of the motives or goals that stakeholders might have, or the context in which the system exists. One of the limitations of activity theory is that it is not operationalized enough: there are not enough methods and techniques that can be directly utilized to solve specific problems.

(Gray and Altmann 1999).

10 GOMS: Goals, simple Operators, Methods for accomplishing a goal, Selection rules for choosing among alternatives. Information in the world is useful only if we can find it when we need it. Models vary in their concern with generality versus realism. Cognitive modeling is the application of cognitive theory to applied problems.
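Since GOMS recurs below (Card et al., document 14), a minimal keystroke-level sketch in Python may help; the operator times are the classic Card, Moran and Newell averages, and the task encoded here is hypothetical, not from Gray and Altmann:

    # Keystroke-Level Model (KLM): predict expert task time by summing
    # standard operator times (values are the classic CMN averages).
    KLM_TIMES = {
        "K": 0.2,   # keystroke or button press (average skilled typist)
        "P": 1.1,   # point at a target with the mouse
        "H": 0.4,   # home hands between keyboard and mouse
        "M": 1.35,  # mental preparation
    }

    def predict_seconds(operators: str) -> float:
        """Sum operator times for a sequence such as 'HMPK'."""
        return sum(KLM_TIMES[op] for op in operators)

    # Hypothetical task: home to mouse, mentally prepare, point at a menu,
    # click (K), home back to keyboard, type six characters.
    print(predict_seconds("HMPK" + "H" + "K" * 6))  # ~4.65 s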

(Rodriguez 1998)

11 Actions are goal directed processes.

(Aboulafia, Gould, Spyrou - No date on class handout but post 1994)

12 Artificial intelligence aims at creating computer software and hardware that imitates the human mind or functions of the human brain. Simon's (1969) hypothesis is that the human psyche is quite simple, with any complexity due to the environment. Humans are goal-oriented creatures.

(Landauer - No date on class handout but post 1994)

13 Useful theory is impossible, because the behavior of human-computer systems is chaotic or worse: highly complex, dependent on many unpredictable variables, or just too hard to understand. Theories have minor impact. The keystroke model is successful because it deals with aspects of behavior that are relatively simple and well understood. Let us be realistic: even the best examples of theory application have produced only small quantitative and/or local gains in productivity. The few successful computer and HCI inventions to date have come from lucky hunches.

(Barnard Chapter 8 - No date on class handout)

14 In HCI, two highly complex information processors are effectively conducting a dialogue: the user and the computer system. People can and do use a variety of names or descriptions when referring to the same concept ("think aloud"). Data suggested the "training wheels" approach saved time and facilitated learning. GOMS is from Card, et al. Card distinguished between different processors and memory systems. Fitts's law governs movement time as a function of target size and distance. Hick's law governs the relation between choice time and the number of alternatives. From a purely scientific perspective, it is clear that there are many difficulties with our ideas, methodologies, and evidence. In many respects applied research occupies the uncomfortable position of being squarely placed between the deep blue sea of science and the devil of practical application. User cognition is complex.
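For reference, the standard textbook forms of these two laws (the handout does not spell them out; a and b are empirically fitted constants, D is the distance to the target, W the target width, and n the number of equally likely alternatives):

    MT = a + b \log_2(D/W + 1)    (Fitts's law, Shannon formulation)
    T  = b \log_2(n + 1)          (Hick's law)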

(Hollan, Hutchins, Kirsh 2000)

15 Computation is becoming ubiquitous. Distributed cognition: it extends the reach of cognition beyond the individual to encompass interactions between people and with their resources and materials in the environment. Cognitive processes may be distributed across members of a social community, may involve coordination of internal and external structures, and may be distributed through time. Culture is a process that accumulates partial solutions to frequently encountered problems. Participant observation is an important component of cognitive ethnography. The newest airline cockpit contains replications of the old electromechanical instruments, now rendered in digital displays. Distributed cognition theory principles: people establish and coordinate different types of structure in their environment; it takes effort to maintain coordination; people off-load cognitive effort to the environment whenever possible; there are improved dynamics of cognitive load balancing available in the social organization. Cross-modal representation: pilots detecting errors without instruments. Screen space often has no natural correlate in physical space; we manipulate icon space. We are spatially located creatures: we must always be facing some direction, have only certain objects in view, be within reach of others. We found subjects using space to simplify choice by creating arrangements that served as heuristic cues, helped find relevant items, and made it easier for the visual system to track.

Hildreth 16 Assorted class notes

(Karat and Karat 2003)

17 The maxim: "Easy to use". Usability is the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use. Acceptability is dependent on the way the system fits into the context. Know the users and their tasks - necessary for design. UCD: user-centered design - akin to "family values in nature", a concept to which everyone subscribes, but for which there seems to be no agreed-upon definition. UCD includes: early user focus, early evaluation, iterative design and development, user control of design (my note - what??). HCI research and practice has moved from interface to interaction. Design has emerged as the central focus within HCI. Developing new systems is always done within a context of design trade-offs and limited resources.

(Preece - No date on class handout but post 1991)

18 Cognitive psychology: explain how human beings achieve the goals they set. Such goal-directed activity is comprised of performing cognitive tasks that involve information processing. Multi-store memory: sensory store, short-term (working) memory, and permanent long-term memory. Miller 1956: the magical number 7 ± 2. The model human processor has three systems: perceptual, motor, and cognitive.

(Myers 1996)

19 A Brief History of HCI Technology.

(Myers, Hollan, Cruz 1996)

20 Strategic directions in HCI

(Pew - date unknown)

21 Hopper: coining the term "bug" (moth on a relay contact). Nelson: created the hypertext concept. Licklider wanted to think of computers as aids to human thinking. Engelbart: NLS - the oNLine System - and the mouse. Sutherland: computer games. Kay: parallel processing, windows, OO. Frederick Taylor: scientific management, time and motion studies. Hanson's user engineering principles: know the user, minimize memorization, optimize operations, engineer for errors. IBM established usability labs. Without a doubt, this should be characterized as the era of the Internet. DNS came in 1984. Berners-Lee created Mesh, the precursor to the WWW. The first successful speech recognition program was used by UPS in the 1980s. Heuristics: experts exercising the software. Email started in 1972. We appear to be heading into the much-heralded epoch of ubiquitous (invisible) computing.

(Hewett, Baecker, Card, Carey, Gasen, Mantei, Perlman, Strong and Verplank 1996 ACM SIGCHI)

22 HCI: computer science (application design and engineering of human interfaces), psychology (the application of theories of cognitive processes and the empirical analysis of user behavior), sociology and anthropology (interactions between technology, work, and organization), and industrial design (interactive products). Future of HCI: ubiquitous communication, high-functionality systems, mass availability of computer graphics, mixed media, high-bandwidth interaction, large and thin displays, embedded computation, group interfaces, user tailorability, and information utilities.

(Unknown: HCI in-class Handout "A Historical and Intellectual Perspective".)

23 Bush: transforming thought and human creative activity (Memex). Symbiosis: living together in intimate association, or even close union, of two dissimilar organisms. Engelbart and Nelson realized most information would be stored digitally, not in microfilm. The discipline that focuses on enhancing the quality of our use of artifacts is called human factors or ergonomics. James Martin's landmark work in 1973: Design of Man-Computer Dialogues. The PC really started with Xerox. Every message is, in one sense or another, a simulation of some idea.

(Carroll, Introduction: Human-Computer Interaction, the past and the present - date unknown)

24 The software crisis led to the emergence of software engineering as a discipline. Software designers should always "plan to throw one away". The WWW is a vast collection of information, but it is not a library.


(Virgona 2004)

25 Major Figures in HCI History

(Virgona 2004)

26 How Cognitive Science Has Influenced the Applied Science of HCI

(Chu, Hyperlinks: How well do they represent the intellectual content of Digital Collections?)

KO #29 Hyperlinks are measured by exhaustivity (indexing) and specificity (richness, vagueness, context-dependence). The human mind operates by association (Bush). With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails. That kind of association is today vividly and thoroughly reflected in one Internet application, the World Wide Web, via its implementation of hyperlinks. Building quality hyperlinks: choose link names carefully, create vertical and horizontal structures, be exhaustive and specific.

(Roger W. Harris, UNIMAS, 2000)

SAD #84 Schools of thought in research into end-user computing

(Dowell and Long, 1998)

HCI #86 Response: Prospects for consensus and advancing cognitive engineering. Use a discipline matrix adapted from Kuhn as a framework within which to evaluate the new discipline. The four dimensions of this matrix are: ontology, values, symbolic generalizations, and exemplars.

(Flach 1998) HCI #87 Although cognitive systems engineering is a young field, it is more developed than Dowell and Long would suggest. The field places a high value on external or ecological validity and on naturalistic observations, where cognition is studied in rich semantic contexts. Cause operates from the past forward - this leads to a linear, reductionist view in which the future is determined by the summation of forces arising from fundamental particles.

(Ayres, Nielsen, Ridley, 1999)

HCI #89 BOPAC2: A New Concept in OPAC Design and Bibliographic Control

(Olson and Boll 2001, Subject Analysis in Online Catalogs)

Text The cognitive paradigm for information retrieval: Page 266.

The Constructions paradigm: Page 268.

User characteristics: Page 269.

User systems interfaces: Page 275.

(Olson and Boll 2001, Subject Analysis in Online Catalogs)

Text The user of an online catalog is a mythical figure.

Display Guidelines: Page 295.

(Rowley and Farrow 2000; Organizing Knowledge)

Text Usability: ease of learning, ease of use, flexibility, attitude.

Cognitive modeling: the mental model (the user's mental representation of the system); the designer's conceptual framework for the description of the system - the user model; the image of the system to be presented to the users - the system model; the psychologist's conceptual model of the user's mental model - the conceptual model.

The human information-processing model: perception, attention, information processing, memory, learning strategies.

(Irina Ceaparu and Ben Shneiderman. "Finding Governmental Statistical Data on the Web: A Study of Categorically Organized Links for the FedStats Topics Page.")

HCI #99 We believe that dissemination of statistical information should be governed by at least the following design principles. Universal usability: not only experts, but first-time and one-time users. Easy navigation: the information available should be presented in a structured way. Common language: the terminology used to present the information available should be easy to understand. Comparative search and data tools: allow a comparative search and other common-use ways of viewing and analyzing statistical data; for example, easily comparing housing costs in two cities. Advanced search: it should support a comprehensive search. Data granularity: allow users to choose the granularity of the information searched in terms of geography and time.

Myers, Brad A. "A Brief History of Human-Computer Interaction Technology". December 1996.

http://www-2.cs.cmu.edu/~amulet/papers/uihistory.tr.html

Research in Human-Computer Interaction (HCI) has been spectacularly successful, and has fundamentally changed computing. Just one example is the ubiquitous graphical interface used by Microsoft Windows 95, which is based on the Macintosh, which is based on work at Xerox PARC, which in turn is based on early research at the Stanford Research Laboratory (now SRI) and at the Massachusetts Institute of Technology. Another example is that virtually all software written today employs user interface toolkits and interface builders, concepts which were developed first at universities. Even the spectacular growth of the World-Wide Web is a direct result of HCI research: applying hypertext technology to browsers allows one to traverse a link across the world with a click of the mouse. Interface improvements more than anything else have triggered this explosive growth. Furthermore, the research that will lead to the user interfaces for the computers of tomorrow is happening at universities and a few corporate research labs.

Basic Interactions: direct manipulation of graphical objects, the mouse, windows.

Application Types: drawing programs, text editing, spreadsheets, hypertext, computer-aided design (CAD), video games.

Up-and-Coming Areas: gesture recognition, multimedia, 3-D, virtual reality and "augmented reality", computer-supported cooperative work, natural language and speech.

Software Tools and Architectures: UIMSs and toolkits, interface builders, component architectures.

Discussion: It is clear that all of the most important innovations in human-computer interaction have benefited from research at both corporate research labs and universities, much of it funded by the government. The conventional style of graphical user interface that uses windows, icons, menus, and a mouse is in a phase of standardization, where almost everyone is using the same, standard technology and just making minute, incremental changes. Therefore, it is important that university, corporate, and government-supported research continue, so that we can develop the science and technology needed for the user interfaces of the future.

Another important argument in favor of HCI research in universities is that computer science students need to know about user interface issues. User interfaces are likely to be one of the main value-added competitive advantages of the future, as both hardware and basic software become commodities. If students do not know about user interfaces, they will not serve industry needs. It seems that only through computer science does HCI research disseminate out into products. Furthermore, without appropriate levels of funding of academic HCI research, there will be fewer PhD graduates in HCI to perform research in corporate labs, and fewer top-notch graduates in this area will be interested in being professors, so the needed user interface courses will not be offered.

As computers get faster, more of the processing power is being devoted to the user interface. The interfaces of the future will use gesture recognition, speech recognition and generation, "intelligent agents," adaptive interfaces, video, and many other technologies now being investigated by research groups at universities and corporate labs [35]. It is imperative that this research continue and be well-supported.

PowerPoint HCI #100 History of HCI: key people, events, ideas.


Nielsen, Jakob. "Guidelines for Multimedia on the Web". 1995. http://www.useit.com/alertbox/9512.html

Animation

Moving images have an overpowering effect on human peripheral vision. This is a survival instinct from the time when it was of supreme importance to be aware of any saber-toothed tigers before they could sneak up on you. These days, tiger avoidance is less of an issue, but anything that moves in your peripheral vision still dominates your awareness: it is very hard to, say, concentrate on reading text in the middle of a page if there is a spinning logo up in the corner. Never include a permanently moving animation on a web page, since it will make it very hard for your users to concentrate on reading the text. Animation is good for:

Showing continuity in transitions.

Indicating dimensionality in transitions.

Illustrating change over time.

Multiplexing the display.

Enriching graphical representations.

Visualizing three-dimensional structures.

Attracting attention.

Video

Currently, video is good for:

Promoting television shows, films, or other non-computer media that traditionally have used trailers in their advertising.

Giving users an impression of a speaker's personality. Unfortunately, most corporate executives project a lot less personality than, say, Captain Janeway from Star Trek, so it is not necessarily a good idea to show a talking head unless the video clip truly adds to the user's experience.

Showing things that move. For example, a clip from a ballet. Product demos of physical products (e.g., a coin counter) are also well suited for video, whereas software demos are often better presented as a series of full-sized screendumps where the potential customer can study the features at length.

Audio

The main benefit of audio is that it provides a channel that is separate from that of the display. Speech can be used to offer commentary or help without obscuring information on the screen. Audio can also be used to provide a sense of place or mood, as done to perfection in the game Myst. Mood-setting audio should employ very quiet background sounds in order not to compete with the main information for the user's attention.

Music is probably the most obvious use of sound. Whenever you need to inform the user about a certain work of music, it makes much more sense to simply play it than to show the notes or to try to describe it in words. For example, if you are out to sell seats to the La Scala opera in Milan, Italy, it is an obvious ploy to allow users to hear a snippet of the opera: yes, Verdi really could write a good tune (AU file, 1.4 MB), so maybe I will go and hear the opera next time I am over there. In fact, the audio clip is superior to the video clip from the same opera, which is too fidgety to impress the user and yet takes much too long to download (QuickTime, 3.6 MB). Voice recordings can be used instead of video to provide a sense of the speaker's personality (AU file, 1.4 MB): the benefits are smaller files, easier production, and the fact that people often sound good even if they would look dull on television. Speech is also perfect for teaching users the pronunciation of words, as done by the French wine site: it used to be the case that you could buy good wine cheaply by going for chateaus that were hard to pronounce (because nobody dared ask for them in shops or restaurants) -- no more in the webbed world.

Non-speech sound effects can be used as an extra dimension in the user interface to inform users about background events: for example, the arrival of new information could be signaled by the sound of a newspaper dropping on the floor, and the progress of a file download could be indicated by the sound of water pouring into a glass that gradually fills up. These kinds of background sounds have to be very quiet and nonintrusive, and there always needs to be a user preference setting to turn them off. Good-quality sound is known to enhance the user experience substantially, so it is well worth investing in professional-quality sound production. The classic example is the video game study where users claimed that the graphics were better when the sound was improved, even though the exact same graphics were used for the poor-quality sound and the good-quality sound experiments. Simple examples from web user interfaces are the use of a low-key clicking sound to emphasize when users click a button and the use of opposing sounds (cheeeek chooook) when moving in different directions through a navigation space.

Response Time

Many multimedia elements are big and take a long time to download with the horribly low bandwidth available to most users. It is recommended that the file format and size are indicated in parentheses after the link whenever you point to a file that would take more than 15 seconds to download with the bandwidth available to most of your users. If you don't know what bandwidth your users are using, you should do a survey to find out, since this information is important for many other page design issues. At this time, most home users have at most 28.8 Kbps, meaning that files larger than 50 KB need a size warning. Business users often have higher bandwidth, but you should probably still mark files larger than about 200 KB. The 15-second guideline in the previous paragraph was derived from the basic set of response time values that have been known since around 1968. System response needs to happen within about 10 seconds to keep the user's attention, so users should be warned before slower operations. On the web, current users have been trained to endure so much suffering that it may be acceptable to increase the limit value to 15 seconds. If we ever want the general population to start treating the web as more than a novelty, though, we will have to provide response times within the acceptable ranges. Design of client-side multimedia effects has to consider the other two response time limits also:

The feeling of directly manipulating objects on the screen requires 0.1-second response times. Thus, the time from when the user types a key on the keyboard or moves the mouse until the desired effect happens has to be faster than 0.1 seconds if the goal is to let the user control a screen object (e.g., rotate a 3D figure or get pop-ups while moving over an imagemap).

If users do not need to feel a direct physical connection between their actions and the changes on the screen, then response times of about 1.0 second become acceptable. Any slower response and the user will start feeling that he or she is waiting for the computer instead of operating freely on the data. So, for example, jumping to a new page or recalculating a spreadsheet should happen within a second. When response times surpass a second, users start changing their behavior to a more restricted use of the system (for example, they won't try out as many options or go to as many pages).
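The 50 KB and 200 KB thresholds above follow from simple download arithmetic; a minimal sketch in Python (the 28.8 Kbps rate and the file sizes are the article's own figures; the helper name is mine):

    # Download-time arithmetic behind the size-warning guideline.
    MODEM_BPS = 28_800  # 28.8 Kbps home modem, in bits per second

    def download_seconds(kilobytes: float) -> float:
        return kilobytes * 1024 * 8 / MODEM_BPS  # KB -> bytes -> bits -> s

    print(download_seconds(50))   # ~14.2 s: right at the 15-second limit
    print(download_seconds(200))  # ~56.9 s: needs a warning even on fast links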

http://www.useit.com/alertbox/9611.html

Page Titles

As part of the HTML standard, every web page should have a <TITLE> defined in its header. Page titles are important for navigation support, since they are normally the default way to refer to pages in various navigation support mechanisms such as bookmark lists, history lists, overview diagrams, etc. Titles are also often used as the best way of listing retrieved pages in search engines. Many of these important uses of the page <TITLE> are taken out of context, and it is therefore important that the title has enough words to stand on its own and be meaningful when read in a menu or a search listing. On the other hand, overly long titles slow down users, so as a guideline aim at titles between four and ten words. Different pages need different titles: it is very unpleasant to visit, say, seven pages with the same title and then try to go back to a specific page from the history list. Also, of course, bookmarking more than one page from such a site is a guaranteed usability problem, since the bookmark/favorites menu will contain several identical entries with different results. A final point is to optimize titles for quick scanning. This implies moving information-carrying terms toward the beginning of the title and preferably starting with a word that will match the user's needs when scanning down a menu or listing of titles. A classic mistake is to use a title like Welcome to MyCompany. It would be much better to call the page MyCompany - Home Page. Similarly, eliminate articles like The, A, and An from the beginning of the title. Doing so is particularly important because some title listings are alphabetized. In addition to titles, other ways of referring to web pages include verbal and visual summaries. Normally, such summaries are very difficult to produce algorithmically. The main exception is the miniature: a miniature of a page, generated by scaling it to 15 percent of full size. In general, page miniatures are only good as representations for highly graphic pages or pages with very characteristic layout. Two of the better uses of miniatures are for building a site map and for the history list when navigating sites that focus on visual arts.
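A hypothetical checker that applies these title guidelines (the function name and checks are mine, taken straight from the rules above):

    # Sketch: flag page titles that break the title guidelines.
    LEADING_ARTICLES = ("the ", "a ", "an ")

    def title_problems(title: str) -> list[str]:
        problems = []
        if not 4 <= len(title.split()) <= 10:
            problems.append("aim for four to ten words")
        if title.lower().startswith(LEADING_ARTICLES):
            problems.append("drop the leading article; listings are alphabetized")
        if title.lower().startswith("welcome to"):
            problems.append("lead with information-carrying terms")
        return problems

    print(title_problems("Welcome to MyCompany"))   # flags the classic mistake
    print(title_problems("MyCompany - Home Page"))  # no problems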

Text Size and Color

Tell me, did you feel like clicking on the word "Color" in the above heading? Most likely, the word appeared in blue type, and if so you probably thought it was a hypertext link. On the web, blue text equals clickable text, so never make text blue if it is not clickable. It is also bad, though not quite as bad, to make text red or purple, since these two colors are often used to denote hypertext links that have already been visited. Another commonly seen mistake in text design is the use of large or small font sizes as the body text of a page. Page designers sometimes think that the default text in their browser is wrong for the effect they want to achieve, and it is certainly acceptable to make a small percentage of the text on a given page large or small, as appropriate. It is not recommended to change the font size of all the text on a page, since the user must be assumed to have set the default font size in his or her browser to exactly the size that is most comfortable for that user on his or her monitor. Any other font size is thus by definition suboptimal for reading body text.

Relevance-Enhanced Image Reduction: Better Thumbnails

It is quite common to use thumbnail versions to represent images that are too large to be downloaded without a specific user request. Thumbnails are smaller, meaning that more can fit on a page and that download times are minimized. Unfortunately, the two most common ways of reducing images, scaling and cropping, both result in thumbnails that can be hard to interpret, as shown in the figure.

[Figure: three different ways of making thumbnails.] Scaling reduces the image so much that pictures with extensive detail wash out and become too crowded to be meaningful. Cropping preserves those details that are within the new viewport, but at the cost of losing the context of the image as a whole. My recommendation is to use a combination of cropping and scaling, resulting in a technique I call relevance-enhanced image reduction. For example, to get a thumbnail that is 10 percent of the original image, first crop the image to 32 percent of the original size and then scale the result to 32 percent. The final image will be 0.32 x 0.32 ≈ 0.10 of the original. As shown in the figure, relevance-enhanced image reduction results in a pleasant balance between presenting discernible detail and conserving context.
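A minimal sketch of this crop-then-scale technique, assuming Pillow is available and that the relevant region is simply the image's center (in practice you would choose the crop region by hand):

    # Relevance-enhanced image reduction: crop to 32% of each dimension,
    # then scale the crop by 32%, for roughly 10% of the original size.
    from PIL import Image

    CROP, SCALE = 0.32, 0.32

    def relevance_enhanced_thumbnail(img: Image.Image) -> Image.Image:
        w, h = img.size
        cw, ch = int(w * CROP), int(h * CROP)
        left, top = (w - cw) // 2, (h - ch) // 2  # center crop (assumption)
        region = img.crop((left, top, left + cw, top + ch))
        return region.resize((max(1, int(cw * SCALE)), max(1, int(ch * SCALE))))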

(Baecker, Ronald; Grudin, Jonathan; Buxton, William; Greenberg, Saul. "Readings in Human-Computer Interaction: Toward the Year 2000". 2nd edition. Morgan Kaufmann, January 15, 1995).

Introduction to Human-Computer Interaction

Norman introduces several concepts that he uses in his analysis of both good and bad design: affordances (buttons are for pushing, menus are for choosing), constraints, conceptual models, mappings, visibility, and feedback.

Key concepts of user-centered design (Norman):

Make it easy to determine what actions are possible at any moment.

Make things visible, including the conceptual model.

Make it easy to evaluate the current state of the system.

Follow natural mappings between intentions and the required actions.

It has been said that those who do not study the past are condemned to repeat it. Well, there is no chance that this field will return to its past.

The Apple Lisa and Xerox Star GUIs helped to inspire the rapid growth of human-computer interaction in the 1980s.

(Baecker, Ronald; Grudin, Jonathan; Buxton, William; Greenberg, Saul. "Readings in Human-Computer Interaction: Toward the Year 2000". 2nd edition. Morgan Kaufmann, January 15, 1995).

The Process of Developing Interactive Systems

It is easy for developers to lose track of the larger context in which users operate. The greatest challenge in developing a good system is often not in knowing what to do, but in finding the time, place and resources to do it.

(Baecker, Ronald; Grudin, Jonathan; Buxton, William; Greenberg, Saul. "Readings in Human-Computer Interaction: Toward the Year 2000". 2nd edition. Morgan Kaufmann, January 15, 1995).

Interacting with Computers

Why have we relied overwhelmingly on one sensory modality, the visual modality, for communication in one direction, and on a different modality, manual operation, for communicating in the other direction?

Speech recognition is another difficult Artificial Intelligence frontier. There has been some progress, but the adoption of new technology has been slow.

Shneiderman describes the advantages and disadvantages of interaction styles:

Menu selection. Advantages: shortens training, reduces keystrokes, structures decision making, easy to support error handling. Disadvantages: danger of many menus, may slow frequent users, requires screen space.

Form fill-in. Advantages: simplifies data entry, requires modest training, shows context for activity. Disadvantages: requires typing skills, requires screen space.

Command languages. Advantages: flexible, appeals to power users, potentially rapid for complex tasks. Disadvantages: requires substantial training, poor error handling.

Natural language interfaces. Advantages: relieves burden of learning syntax. Disadvantages: unpredictable.

Direct manipulation/graphical interfaces. Advantages: easy to learn, visually presents tasks, encourages exploration. Disadvantages: more programming effort, may require graphics display/pointing devices.

One trend worth mentioning: the coordinated use of multiple sensory modalities in systems, commonly known as multimedia.

(Baecker, Ronald; Grudin, Jonathan; Buxton, William; Greenberg, Saul. "Readings in Human-Computer Interaction: Toward the Year 2000". 2nd edition. Morgan Kaufmann, January 15, 1995).

Psychology and Human Factors

Human factors discovers and applies information about human behavior, abilities, limitations, and other characteristics to the design of tools, machines, systems, tasks, jobs, and environments for productive, safe, comfortable, and effective human use (Sanders and McCormick 1987).

Menu guidelines: how many items should be in each menu, how many levels deep menu trees should go, and what the optimum trade-off is between breadth and depth in the design of menus. Unfortunately, all experiments have assumed that each menu item is chosen equally often. In real use, people generally use some menu items frequently and others rarely or never. Such patterns of use will influence the optimal depth, breadth, and organization of a menu, so we cannot use the guidelines with confidence. Innovations such as pie menus or voice selection menus have also typically not been considered in the experiments done to date.

An alternative to relying on empirical results is to develop analytical models that make quantitative predictions about a user’s performance before an interface is built. Such models can minimize the need to build multiple versions and carry out lengthy experiments.

Human error is crucial to understanding how to improve the user's experience. Building systems that conform or adapt to individual differences is always an issue, and interfaces for the disabled present a special case of designing for human capabilities.

Consider the physical environment and health concerns that arise with intense use of computer keyboards and displays. Problems with vision, hearing, posture-related back problems, and carpal tunnel or repetitive stress syndrome, to cite prominent examples, seem to be on the rise as users increase in numbers and age; dealing with such problems is essential if we are to foster the productive and humane use of computer technology.

(Baecker, Ronald; Grudin, Jonathan; Buxton, William; Greenberg, Saul. "Readings in Human-Computer Interaction: Toward the Year 2000". 2nd edition. Morgan Kaufmann, January 15, 1995).

Research Frontiers in Human-Computer Interaction

When a new land is being explored, the frontier is everywhere.

Despite adding novel input and output devices, the focus has remained on individual work. As networks link more and more users, however, this perspective is inadequate. Yet application developers accustomed to thinking in terms of individual users may not be sensitive to the issues surrounding cooperative work and may therefore be ill prepared for the new challenges presented by groupware. (Virgona note: despite no studies, this appears to be a very negative bias against programmers.)

When the phone company had insufficient operators to serve its growing community of users, it solved the problem by using technology to turn each user into an operator. Likewise, many envision ‘end-user programming’ as the solution to application development. With the proper tools, this argument goes, users can create their preferred environments without having to learn formal programming languages. Others feel that the remaining developers will be put out of business by introducing artificial intelligence into interfaces, interface agents that will adapt to our needs and do our bidding.

Ubiquitous computing represents a profound shift in perspective. Computers are no longer boxes with displays and keyboards, but specialized and often invisible computational elements populating many niches in our environment and acting synergistically with one another.


Systems Analysis and Design

Author / Document # / Annotation

Kim, Daniel H. 1999. Introduction to Systems Thinking. Pegasus Communications.

1999 Systems thinking is a way of seeing and talking about reality that helps us better understand and work with systems to influence the quality of our lives. System levels: events (occurrences encountered), patterns (accumulated memories of events), systemic structures (ways a system is organized). Mental models are the beliefs and assumptions we hold about how the world works. Balancing loops are continually trying to keep a system at some desired level of performance. Every link in a system contains a delay. Working on the system, not in the system (designers vs. operators).
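A tiny Python simulation makes the balancing-loop-plus-delay point concrete; the target, gain, and delay length here are illustrative assumptions, not Kim's numbers:

    # Balancing loop with a delayed link: each step acts on the gap between
    # target and level, but the correction arrives DELAY steps late.
    TARGET, GAIN, DELAY = 100.0, 0.3, 2

    level = 0.0
    in_transit = [0.0] * DELAY  # corrections still traveling through the link
    for step in range(25):
        in_transit.append(GAIN * (TARGET - level))  # act on the current gap
        level += in_transit.pop(0)                  # delayed correction lands
        print(f"step {step:2d}: level = {level:6.1f}")
    # With these numbers the level overshoots to about 126 before damping
    # back toward 100: the delay, not the goal, produces the oscillation.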

Osborne, Larry N., and Margaret Nakamura. 2000. Systems Analysis for Librarians and Information Professionals, 2nd edition. Libraries Unlimited.

2000 Systems analysis is a means of viewing circumstances realistically and designing practical solutions. Steps in modern systems analysis: problem definition, data collection and analysis, analysis of alternatives, feasibility determination, systems proposal, system design (my note: WRONG), pilot study, implementation (my note: WRONG), system review and evaluation. There is no guarantee that a solution will be found. It is not ethical to hide issues. Sources of problems: response time, throughput, economy, validity, reliability, security, quality of information, efficiency. Taxonomy of problems: organization-wide, functional, operational, activity. Use flowcharts, data dictionaries, decision tables, Nassi-Shneiderman charts, ER diagrams, UML, cost-benefit analysis, plan networks, Gantt charts. Data flows from Yourdon and Gane & Sarson. Object-oriented: reusable, reliable, seamless integration with GUIs, speedier design. Plan for implementation, security, DR/COB, and vendor management.

Richmond, Barry. 2000. The "Thinking" in Systems Thinking: Seven Essential Skills. Pegasus Communications.

2000 Systems thinking: a set of tools, a unique perspective on reality, a specific vocabulary. Steps in systems thinking: specify a problem/issue, construct hypotheses, test hypotheses, implement changes (includes looping feedback). Systems thinking skills: dynamic thinking (framing a problem in terms of behavior over time), system-as-cause thinking (placing responsibility for a behavior on the internal actors who manage the policies and plumbing of a system), forest thinking (believing that, to know something, you must understand the context of relationships), operational thinking (concentrating on getting at causality and understanding how a behavior is actually generated), closed-loop thinking (viewing causality as an ongoing process, not a one-time event, with the effect feeding back to influence the causes, and the causes affecting each other), quantitative thinking (accepting that you can always quantify, but you can't always measure), scientific thinking (recognizing that all models are working hypotheses that always have limited applicability).

Whitten, Jeffrey L., Lonnie D. Bentley, et al. 2001. Systems Analysis and Design Methods, 5th ed. New York: McGraw-Hill/Irwin.

2001  

Introduction to Systems Thinking. http://www.thinking.net/Systems_Thinking/systems_thinking.html.

Austrian biologist Ludwig von Bertalanffy: a system is an entity which maintains its existence through the mutual interaction of its parts. (Daniel Aronson) Systems thinking is extremely effective on the most difficult types of problems: those involving complex issues, those that depend a great deal on the past or on the actions of others, and those stemming from ineffective coordination among those involved. Traditional analysis focuses on separating the individual pieces of what is being studied; analysis comes from the root meaning "break into pieces". Systems thinking, by contrast, focuses on how the thing being studied interacts with the other constituents of the system of which it is a part. Gene Bellinger takes model to mean: a simplification of reality intended to promote understanding. Simulation: the manipulation of a model in such a way that it operates on time and/or space to compress it, thus enabling one to perceive the interactions that would otherwise not be apparent because of their separation in time or space. This compression also provides a perspective on what happens within the system which, because of the complexity of the system, would probably otherwise not be evident. Archetypes: interaction structures of the universe.

Introduction to Systems Sciences: Online Course. http://www.online-education.ch/course/Seiten/intro.htm.

The page cannot be found.

Principia Cybernetica Web. http://pespmc1.vub.ac.be/CYBSWHAT.html.

J. de Rosnay Norbert Wiener had been teaching mathematics at MIT since 1919. Soon after his arrival there he had become acquainted with the neurophysiologist Arturo Rosenblueth, onetime collaborator of Walter B. Cannon (who gave homeostasis its name) and now at Harvard Medical School. Out of this new friendship would be born, twenty years later, cybernetics. With Wiener's help Rosenblueth set up small interdisciplinary teams to explore the no man's land between the established sciences. One man working with Rosenblueth in getting these seminars under way was the neurophysiologist Warren McCulloch, who was to play a considerable role in the new field of cybernetics. In 1948 two basic publications marked an epoch already fertile with new ideas: Norbert Wiener's Cybernetics, or Control and Communication in the Animal and the Machine, and The Mathematical Theory of Communication by Claude Shannon and Warren Weaver. The latter work founded information theory. As head of the Lincoln Laboratory, Forrester was assigned by the Air Force in 1952 to coordinate the implementation of an alert and defense system, the SAGE system, using radar and computers for the first time. Its mission was to detect and prevent possible attack on American territory by enemy rockets. Forrester realized the importance of the systemic approach in the conception and control of complex organizations involving men and machines in "real time": the machines had to be capable of making vital decisions as the information arrived.

Information Systems Analysis (Sauter). http://www.umsl.edu/~sauter/analysis/intro/system.htm


In order to understand the relationship between inputs, outputs and processes, you need to understand the environment in which all of this occurs.

The environment represents everything that is important to understanding the functioning of the system, but is not part of the system. The environment is that part of the world that can be ignored in the analysis except for its interaction with the system. It includes: competition, people, technology, capital, raw materials, data, regulation, and opportunities. The boundary defines the difference between the environment and the system; the correct boundary is a function of the problem under consideration.

Thinking About Organizations as Systems (McNamara). http://www.mapnp.org/library/org_thry/org_sytm.htm.

Carter McNamara, PhD

Very simply, a system is a collection of parts (or subsystems) integrated to accomplish an overall goal (a system of people is an organization). Systems have inputs, processes, outputs, and outcomes, with ongoing feedback among these various parts. If one part of the system is removed, the nature of the system is changed. There are numerous other systems principles, e.g.: systems tend to seek balance with their environments; systems that do not interact with their environment (e.g., get feedback from customers) tend to reach limits.

A circular relationship exists between the overall system and its parts. Ever notice how an organization seems to experience the same kinds of problems over and over again? The problems seem to cycle through the organization. Over time, members of the organization come to recognize the pattern of events in the cycle, rather than the cycle itself. Parents notice this as they mature as parents: over time, they recognize the various phases their children go through and consider these phases when dealing with the specific behaviors of their children.

Systems Thinking Press. http://www.systemsthinkingpress.com/.

Cybernetics (Norbert Wiener) Chaos Theory (Jack Cohen, Margaret Wheatley) Gestalt Therapy (Fritz Perls) General Systems Theory (Ludwig von Bertalanffy) Complexity Theory (M. Mitchell Waldrop, Stuart Kauffman) Socio-Technical Systems Theory (Eric Trist) Project Managers (various)

TQM (Deming, Juran, etc.) Operations Research (U.S. Navy) Geodesic Domes (Buckminster Fuller) and lots of his other interdisciplinary works


Tao of Physics (Fritjof Capra) Mind and Nature (Gregory Bateson) Systems Thinking versus Analytic Thinking (Russ Ackoff) through his numerous books The Structure of Scientific Revolutions (Thomas Kuhn) Organization Development (Barry Oshry) Human Resource Management (Robert Brinkerhoff) Biology (David Wann) Physics (Murray Gell-Mann) Mathematicians (Jay Forrester and others)

Astronomers

Neuroscientists

Philosophers (Russell Ackoff) Economists (Roger Terry, Michael Rothschild, Kenneth Boulding) Futurists (Joel Barker, John Naisbitt) Educators (Richard Herrnstein and others) Modern Artists (Tyler Volk and others) Architects Mythology (James Moore) Leadership (Warren Blank) Business/Management (Peter Drucker and others) Atmospheric and Oceanographic Sciences Strategic Planning (Steve Haines) Government (Alice Rivlin) Psychology (Steven Covey) Community Development (Don Eberly) Spiritual (various) System Dynamics (Jay Forrester) Soft Systems Methodology (Peter Checkland) Accelerated Learning (Dr. Georgi Lozanov)

International Society for the Systems Sciences (ISSS). http://www.isss.org/.

The founders of ISSS felt strongly that the systemic (wholistic) aspect of reality was being overlooked or downgraded by the conventional disciplines, which emphasize specialization and reductionist approaches to science. The founders stressed the need for more general principles and theories, and sought to create a professional organization that would transcend the tendency toward fragmentation in the scientific enterprise. In the half century since the founding of ISSS, humanity has achieved a remarkable synthesis of science and technology. Some fragmentation has been overcome, enabling us to apply science and technology to the construction of our physical, social, and cultural reality on a massive scale. However, significant and deep-rooted fragmentation remains. We have been less successful in establishing a graceful, or even workable, relationship between humanity, nature, science, and technology.

Information Research: Digital Publications and Resources. http://InformationR.net.

Name: Thomas Daniel Wilson. Nationality: British. Qualifications: 1961 Fellow of the Library Association; 1970 B.Sc.(Econ), University of London; 1975 Ph.D., University of Sheffield; 1993 Honorary Fellow of the Institute of Information Scientists. Present position: Professor Emeritus in Information Management, University of Sheffield.

http://pespmc1.vub.ac.be/CYBSWHAT.html

F. Heylighen, C. Joslyn, V. Turchin (1999)

Cybernetics and Systems Science (also: "(General) Systems Theory" or "Systems Research") constitute a somewhat fuzzily defined academic domain that touches virtually all traditional disciplines, from mathematics, technology and biology to philosophy and the social sciences. It is more specifically related to the recently developing "sciences of complexity", including AI, neural networks, dynamical systems, chaos, and complex adaptive systems. Its history dates back to the 1940s and 1950s, when thinkers such as Wiener, von Bertalanffy, Ashby and von Foerster founded the domain through a series of interdisciplinary meetings. Systems theory or systems science argues that however complex or diverse the world that we experience, we will always find different types of organization in it, and such organization can be described by concepts and principles which are independent from the specific domain at which we are looking. Hence, if we could uncover those general laws, we would be able to analyse and solve problems in any domain, pertaining to any type of system. The systems approach distinguishes itself from the more traditional analytic approach by emphasizing the interactions and connectedness of the different components of a system. Although the systems approach in principle considers all types of systems, in practice it focuses on the more complex, adaptive, self-regulating systems which we might call "cybernetic".

http://www.thinking.net/Systems_Thinking/st_innovation_990401.pdf

Daniel Aronson: Systems thinking focuses on the feedback relationships between the thing being studied and other parts of the system.

Introduction to Systems Thinking. http://www.thinking.net/Systems_Thinking/systems_thinking.html.

Gene Bellinger: Organizations often have conflicting goals, not as easily realized as those in a heating system. Consider the differing emphases caused by short-term profits and long-term growth and development. Internal factors include the goals and objectives of individuals, established policies and procedures, the structure of the organization, job responsibilities, appraisal systems, reward systems, and management and leadership styles. External factors consist of market conditions, competition, politics, economic conditions, technological change, sociocultural factors, and imposed rules.
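A minimal sketch (my own illustration, in Python) of the heating-system feedback loop Bellinger uses as the easy case: a single explicit goal, a measured output, and corrective action on the error.

def thermostat(goal: float, temp: float, steps: int = 12, gain: float = 0.5) -> float:
    for _ in range(steps):
        error = goal - temp   # feedback: compare the measured state against the goal
        temp += gain * error  # corrective action proportional to the error
        temp -= 0.3           # environmental disturbance: constant heat loss
    return temp

print(round(thermostat(goal=20.0, temp=15.0), 2))  # settles near the set point

Organizations are harder precisely because, unlike this loop, their goals conflict and the corrective "gain" is shaped by all the internal and external factors listed above.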

http://pespmc1.vub.ac.be/CSTHINK.html

Cybernetics and Systems Thinkers

- W. Ross Ashby: psychiatrist; one of the founding fathers of cybernetics; developed the homeostat, the law of requisite variety, the principle of self-organization, and the law of regulating models.
- Henri Atlan: studied self-organization in networks and cells.
- Gregory Bateson: anthropologist; developed double bind theory and looked at parallels between mind and natural evolution.
- Stafford Beer: management cyberneticist; creator of the Viable System Model (VSM).
- Kenneth E. Boulding: economist; one of the founding fathers of general system theory.
- Donald T. Campbell: social scientist; founded evolutionary epistemology and quasi-experimental methodology.
- Peter Checkland: creator of soft systems methodology.
- Jay Forrester: engineer; creator of system dynamics, with applications to the modelling of industry development, cities and the world.
- George Klir: mathematical systems theorist; creator of the General Systems Problem Solver methodology for modelling.
- Niklas Luhmann: sociologist; applied the theory of autopoiesis to social systems.
- Warren McCulloch: neurophysiologist; first to develop mathematical models of neural networks.
- James Grier Miller: biologist; creator of Living Systems Theory (LST).
- Edgar Morin: sociologist; developed a general transdisciplinary "method".
- Howard T. Odum: creator of systems ecology.
- Gordon Pask: creator of conversation theory; second-order cybernetic concepts and applications to education.
- Howard Pattee: theoretical biologist; studied hierarchy and semantic closure in organisms.
- William T. Powers: engineer; creator of perceptual control theory.
- Ilya Prigogine: Nobel Prize in chemistry; studied thermodynamic self-organization, irreversibility and dissipative structures.
- Robert Rosen: theoretical biologist; first studied anticipatory systems; proposed a category-theoretic, non-mechanistic model of living systems.


- Claude Shannon: founder of information theory.
- Herbert A. Simon: Nobel prize in economics; made fundamental contributions to artificial intelligence, cognitive psychology, management, philosophy of science, and complex systems.
- Francisco Varela: biologist; creator, together with H. Maturana, of the theory of autopoiesis.
- Ludwig von Bertalanffy: biologist; founder of General System Theory.
- Ernst von Glasersfeld: psychologist; proponent of radical constructivism.
- Heinz von Foerster: one of the founding fathers of cybernetics; first to study self-organization, self-reference and other circularities; creator of second-order cybernetics.
- John von Neumann: mathematician; founding father in the domains of ergodic theory, game theory, quantum logic, the axioms of quantum mechanics, the digital computer, cellular automata and self-reproducing systems.
- Paul Watzlawick: psychiatrist; studied the role of paradoxes in communication.
- Norbert Wiener: mathematician; founder of cybernetics.

http://pespmc1.vub.ac.be/SYSTHEOR.html

F. Heylighen, C. Joslyn

Systems theory was proposed in the 1940s by the biologist Ludwig von Bertalanffy (General Systems Theory, 1968), and furthered by Ross Ashby (Introduction to Cybernetics, 1956). Von Bertalanffy was both reacting against reductionism and attempting to revive the unity of science. He emphasized that real systems are open to, and interact with, their environments, and that they can acquire qualitatively new properties through emergence, resulting in continual evolution. Rather than reducing an entity (e.g. the human body) to the properties of its parts or elements (e.g. organs or cells), systems theory focuses on the arrangement of and relations between the parts which connect them into a whole (cf. holism). This particular organization determines a system, which is independent of the concrete substance of the elements (e.g. particles, cells, transistors, people, etc.). Thus, the same concepts and principles of organization underlie the different disciplines (physics, biology, technology, sociology, etc.), providing a basis for their unification. Systems concepts include: system-environment boundary, input, output, process, state, hierarchy, goal-directedness, and information. The developments of systems theory are diverse (Klir, Facets of Systems Science, 1991), including conceptual foundations and philosophy (e.g. the philosophies of Bunge, Bahm and Laszlo); mathematical modeling and information theory (e.g. the work of Mesarovic and Klir); and practical applications. Mathematical systems theory arose from the development of isomorphies between the models of electrical circuits and other systems. Applications include engineering, computing, ecology, management, and family psychotherapy. Systems analysis, developed independently of systems theory, applies systems principles to aid a decision-maker with problems of identifying, reconstructing, optimizing, and controlling a system (usually a socio-technical organization), while taking into account multiple objectives, constraints and resources. It aims to specify possible courses of action, together with their risks, costs and benefits. Systems theory is closely connected to cybernetics, and also to system dynamics, which models changes in a network of coupled variables (e.g. the "world dynamics" models of Jay Forrester and the Club of Rome). Related ideas are used in the emerging "sciences of complexity", studying self-organization and heterogeneous networks of interacting actors, and associated domains such as far-from-equilibrium thermodynamics, chaotic dynamics, artificial life, artificial intelligence, neural networks, and computer modeling and simulation.
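A minimal system-dynamics sketch (my own illustration, in Python; the rates are arbitrary): one stock whose inflow and outflow are coupled variables, in the spirit of the Forrester-style models mentioned above.

def simulate(stock: float = 100.0, birth_rate: float = 0.04,
             death_rate: float = 0.02, capacity: float = 500.0, years: int = 100) -> float:
    for _ in range(years):
        inflow = birth_rate * stock * (1 - stock / capacity)  # balancing feedback near capacity
        outflow = death_rate * stock
        stock += inflow - outflow  # the stock integrates the net flow each step
    return stock

print(round(simulate(), 1))  # the coupled feedbacks drive the stock toward an equilibrium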

Thinking About Organizations as Systems (McNamara). http://www.mapnp.org/library/org_thry/org_sytm.htm

Carter McNamara, PhD: Benefits of Systems Thinking for Leaders and Supervisors in Organizations. Management sciences have learned a great deal lately about organizations and how they work. Much of this learning has come from adopting the perspective that organizations are systems, much like people, plants and animals. There are many benefits to leaders who adopt this systems view of their organizations.

1. More effective problem solving
2. More effective leadership
3. More effective communications
4. More effective planning
5. More effective organizational development
6. Avoiding Founder's Syndrome

Founder's Syndrome occurs when an organization operates primarily according to the personality of one of the members of the organization (usually the founder), rather than according to the mission (purpose) of the organization. When first starting their organizations, founders often have to do whatever it takes to get the organizations off the ground, including making seat-of-the-pants decisions in order to deal with frequent crises that suddenly arise in the workplace. As a result, founders often struggle to see the larger picture and to plan effectively in order to make more proactive decisions. Consequently, the organization gets stuck in a highly reactive mode characterized by lack of funds and having to deal with one major crisis after another. The best "cure" for this syndrome is developing a broader understanding of the structures and processes of an organization, including an appreciation for the importance of planning.

Thinking About Organizations as Systems (McNamara). http://www.mapnp.org/library/org_thry/org_sytm.htm.

Carter McNamara, PhD

Much of the work involving mental models comes from Chris Argyris and his colleagues at Harvard University. A mental model is one's way of looking at the world. It is a framework for the cognitive processes of our mind. In other words, it determines how we think and act.

(Verenikina and Gould 1998)

HCI Note #5 Bansler and Bødker found a gap between the way systems development is represented in structured analysis by Yourdon and DeMarco and the way in which it was carried out. Designers did not follow the prescribed design procedures but, in general, had a very pragmatic attitude towards using them.

(Uden and Willis - No date on class handout but post 1999)

HCI Note #9 Activity theory helps us with the identification of users and stakeholders of a system.

(Aboulafia, Gould, Spyrou - No date on class handout but post 1994)

HCI Note #12 The basic idea of structured analysis is to model organizations and work processes as information processing systems with the aim of producing detailed functional descriptions of tasks and operations.

(Barnard Chapter 8 - No date on class handout)

HCI Note #14 If the designer finds it difficult to program the task, then by implication, users are also likely to find the task difficult.

(Hollan, Hutchins, Kirsh 2000)

HCI Note #15 Ethnography uncovers clever ways of getting things done that can be incorporated into new designs.

(Karat and Karat 2003)

HCI Note #17 Accessibility depends on the way the system fits into the context. Know the users and their tasks - necessary for design. Approaches have been developed to enable users to take active roles in many design decisions. Good design: fitting a purpose that can be empirically validated.

(Carroll, Introduction: Human-Computer Interaction, the Past and the Present - date unknown)

HCI Note #24 Software designers should always "plan to throw one away". An example of recognition that the earliest point for impact is requirements analysis.

(Trudi Bellardo Hahn 1996)

ISR Note 22 Bayer: It was common practice to fake "representative" service by kicking users off the system to improve response time during demonstrations for important potential LEXIS customers.

(Ravichandran and Rai, 2000)

#1 Quality Management in Systems Development: An Organizational System Perspective. All elements of the organization need to be developed in order to attain quality goals; piecemeal adoption of selected quality management practices is unlikely to be effective.

(Ravichandran and Rai, 2000)

#1 Research on IS quality has focused on four main areas: software quality measurement and control, development infrastructure (methodologies and tools), software process management, and participative design. Process improvement, software quality assurance and customer satisfaction are important TQM concepts. The CMM is now popular and has been effective in emphasizing the importance of process improvement.

(Ravichandran and Rai, 2000)

#1 Deming (1986) asserts that without senior management's leadership and visible signaling of their commitment to quality improvement, an organization will not be able to change the practices that lead to poor quality.

(Ravichandran and Rai, 2000)

#1 Poor quality is largely attributed to design problems (Cole 1981), which can be avoided when attention is paid to quality problems during design, customer requirements are understood, and the design is modularized for reuse. Participation of users, vendors, and developers in the core design and development process promotes mutual understanding of the issues and constraints to be addressed to improve quality.

(Middleton 1997) #2 The conclusions are that prescriptive information system methodologies are unlikely to cope well with strategic uncertainty, user communication or staff development. The recommendations are to focus more on soft organizational issues and to use approaches tailored to each project.

(Middleton 1997) #2 Project problems: users did not know what they wanted, users did not know the possibilities of the technology, users' perceptions changed while the system was being developed, the developers did not understand the intricacies of the users' work, and there were constant changes in the external environment that were not anticipated.

(Middleton 1997) #2 The drawbacks of the 'waterfall' approach have been well documented; the difficulties confirmed by this research were those of managing ever-shifting requirements, poor relationships with the users, and the emergence of serious problems late in a project.

(Middleton 1997) #2 Recommendations: staff development, a project register for funds to be released, process maturity, user communication, tailored life cycles, and project managers.

(Middleton 1997) #2 Sauer et al. make a distinction between traditional and new mind-sets; in the traditional mind-set, knowledge is seen as 'well-defined, explicit, articulate', compared to the new mind-set where it is seen as 'ill-defined, tacit, diffuse, embedded'.

(Morley, Petrie, O'Neill and McNally 1998)

SAD #83 Design suggestions for web browsers relying on auditory navigation: involve users throughout design; keep interactions with devices simple, consistent, and supported by effective feedback; use non-speech sounds to provide information; make user configuration easy; structure the material carefully; etc.

(Roger W. Harris, UNIMAS, 2000)

SAD #84 Schools of thought in research into end-user computing

(Post, Kagan, Leim, 1997)

SAD #85 Support for data flow diagrams and the data dictionary was revealed as a key factor for improving productivity. CASE tool evaluation criteria: graphics features, prototyping, data dictionary, design analysis, code generation, and general features. Vessey: CASE tools are used to enforce support for a defined methodology. You must always consider the infrastructure, organization, management, and people costs of using CASE tools. Lewis argues that CASE tools do not deliver their intended software development productivity.

(Post, Kagan, Leim, 1997)

SAD #85 In short, CASE tools are being used for two purposes: larger firms are emphasizing the prototyping and code generation facilities and using them to build completed systems. Smaller firms are primarily using the tools for analysis, design and to share development work across teams.

(Nesi and Querci, 1998)

SAD #88 The introduction of OOP is not immediate because it involves managers, analysts, designers, developers, etc. Software complexity evaluation is useful for predicting maintenance costs, comparing productivity among different projects, and understanding development process efficiency and parameters. From the cognitive point of view, observable complexity can be regarded as the effort to understand subsystem/class behavior and functionality.

(Nesi and Querci, 1998)

SAD #88 Thomas and Joaconsom have suggested estimating class complexity as the sum of attribute and method complexities.
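As a rough illustration of that suggestion (a sketch in Python; weighting a method by its parameter count is my own placeholder, not the metric from the paper):

import inspect

def class_complexity(cls) -> int:
    members = vars(cls)
    methods = {name: m for name, m in members.items() if inspect.isfunction(m)}
    attributes = [name for name in members
                  if name not in methods and not name.startswith("__")]
    # placeholder method complexity: 1 + number of parameters
    method_cost = sum(1 + len(inspect.signature(m).parameters) for m in methods.values())
    return len(attributes) + method_cost

class Account:
    currency = "USD"                 # one attribute
    def deposit(self, amount): ...   # 1 + 2 parameters = 3
    def withdraw(self, amount): ...  # 1 + 2 parameters = 3

print(class_complexity(Account))     # 1 + 3 + 3 = 7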

(Marty 2004) KO #93 As webmaster I have to manage the flow of information from server to workstation, managing and coordinating the review policies and review committees. This demonstrates the diverse skills and capabilities required of museum webmasters. This places great importance on requirements analysis and highlights the inherent difficulties of assessing and meeting user needs.

(Williams, Howard. “Emerging Content Requirements for News Products”. ASIST Bulletin. August/September 2004. Volume 31, Number 6.)

 SAD #89 Content requirements for news products: Use of headlines, titles, headers and categories, to be visually interesting and intuitive, summarization of essential points, attention to the continuity of a story over time, inclusion of highlighted and causal links, and explicit identification of explanatory information.

TREC content requirements: information extraction; text categorization; topic detection, event detection and novelty detection; profiling and filtering; question answering; summarization.

XML should be viewed as an enabling technology for addressing emerging content requirements for news products.
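A small sketch of what that enabling role looks like (Python; the element names are invented for illustration and are not a real news-industry schema):

import xml.etree.ElementTree as ET

item = ET.Element("newsItem")
ET.SubElement(item, "headline").text = "Example headline"
ET.SubElement(item, "summary").text = "Summarization of the story's essential points."
ET.SubElement(item, "category").text = "technology"
ET.SubElement(item, "link", rel="background").text = "http://example.org/earlier-story"

print(ET.tostring(item, encoding="unicode"))

Because each requirement (headline, summary, category, continuity link) is an explicit element, downstream products can extract, filter, and re-present them without re-parsing prose.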

(Van de Sompel, Herbert, Payette, Sandy, Erickson, John, Lagoze, Carl, Warner, Simeon. “Rethinking Scholarly Communication: Building the System that Scholars Deserve.” D-Lib Magazine, September 2004, Volume 10, Number 9.)

Introduction

There is growing dissatisfaction with the established scholarly communication system. This dissatisfaction is the result of a variety of factors, including rapidly rising subscription prices, concerns about copyright, latency between results and their actual publication, and restrictions on what can be published and how it can be disseminated. The result is a global debate on how to remedy the system's deficiencies, and that debate has inspired concrete initiatives aimed at reforming the process. These are concerned mainly with access issues and seek to alleviate two longstanding problems. The first, known as the "serials crisis," addresses the often prohibitive prices of journal publications that impede access to scholarly materials. The second, known as the "permissions crisis," addresses the restrictions on use of publications once access has been obtained. The "Open Access" movement focuses primarily on these two problems with two different strategies. The self-archiving school strives for a scholar's right to make traditional journal publications freely available in an open repository. The journal-reform school promotes the emergence of new types of journals that are free at the point of use.

While the open availability of the results of scholarly endeavors is indeed of fundamental importance to the future of scholarship, it is only one dimension of how the scholarly communication process can be transformed. As Geneva Henry [Henry 2003] has observed, opportunities abound in the world of 21st century publishing and the discussion on transforming scholarly communication must move beyond the debate of subscription-based vs. open access publication. In this article we consider the changing nature of scholarly research, the demands these changes place on the scholarly communication system, and our technical proposals to meet these demands.

The changing nature of scholarly research

The manner in which scholarly research is conducted is changing rapidly. This is most evident in Science and Engineering [Atkins et al. 2003], but similar revolutionary trends are becoming apparent across disciplines [Waters 2003] [note 1]. Improvements in computing and network technologies, digital data capture techniques, and powerful data mining techniques enable research practices that are highly collaborative, network-based, and data-intensive. These dramatic changes in the nature of scholarly research require corresponding fundamental changes in scholarly communication. Scholars deserve an innately digital scholarly communication system that is able to capture the digital scholarly record, make it accessible, and preserve it over time.

The established scholarly communication system has not kept pace with these revolutionary changes in research practice. Changes thus far have mainly been small technological improvements. For example, a system that offers interoperability across publishing venues has yet to be realized. Admittedly, there is some level of interoperability, but it is relatively modest. Most publishers support PDF [note 2] as a standard interchange format, achieving a level of interoperability comparable to agreeing to print on paper in the pre-digital era. Some publishers have bought into the idea of assigning unique persistent identifiers to publications, and some have jointly chosen to use the DOI [note 3] for that purpose. Some publishers support the OpenURL [note 4] to allow users to more easily navigate across publishing venues, and a few publishers use the OAI-PMH [note 5] to support metadata sharing. While these efforts represent progress, their limited scope demonstrates that the scholarly communication system is still in an early phase of absorbing the digital technologies that have disrupted the paper-based status quo. Interoperability is one dimension of a larger technical challenge involved in designing a natively digital scholarly communication system. Other challenges include issues of workflow, service sharing, and information modeling. We propose a more fundamental re-engineering to a network-based system that addresses these challenges and provides interoperability across participating nodes.
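As a concrete illustration of how lightweight this metadata-sharing layer is, here is a hedged sketch (Python) of an OAI-PMH request; the endpoint URL is an example, and any OAI-PMH-compliant repository exposes the same verbs:

from urllib.request import urlopen
from urllib.parse import urlencode

BASE = "http://export.arxiv.org/oai2"  # example endpoint (assumption; may change)
params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}

with urlopen(BASE + "?" + urlencode(params)) as response:
    xml = response.read().decode("utf-8")

print(xml[:300])  # Dublin Core records, harvestable from any compliant repository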

Our vision is based on our belief that the future scholarly communication system should closely resemble—and be intertwined with—the scholarly endeavor itself, rather than being its after-thought or annex. We consider in this article the aspects of the established system that constrain the scholarly endeavor. Based on those considerations, we describe the desired technological characteristics of a future system of scholarly communication. We argue for a scholarly communication system composed of an interoperability substrate allowing flexible composition of the value-adding services that up to now have been vertically locked in the journal publication milieu. In this loosely coupled system, the units of scholarly communication (i.e., data, simulations, informal results, preprints, etc.) could follow a variety of scholarly value chains in which each hub provides a service such as registering results, certifying their validity, alerting scholars to new claims and findings, preserving the scholarly record, and ultimately rewarding scholars for their work.

New units of scholarly communication

In the established scholarly communication system, the concept of a journal publication dominates our definition of a unit of communication. Such publications come with well-known characteristics, some of which are unattractive in light of the changing nature of research. For example, publications are unable to adequately deal with non-textual materials, which are generally regarded to be add-ons rather than essential parts of the publication [Lynch 2003], let alone be publications in their own right. Furthermore, significant communication delays are introduced as the result of the integration of peer-review in the publication process.

These problems suggest a revised perspective on what constitutes a unit of communication in a future scholarly communication system:

- The system should consider datasets, simulations, software, and dynamic knowledge representations as units of communication in their own right.
- The system should accommodate complex documents that flexibly aggregate the products of the scholarly endeavor, regardless of their format or location. These compound objects must themselves be considered units of communication and, therefore, be recursively available for inclusion into other compound units. Such technology would provide for the reuse and derivation of existing results that is an integral part of the scholarly process.
- The system must facilitate the early registration (and ultimately preservation) of all units in the system, regardless of their nature or stage of development. This would facilitate collaborative network-based endeavors and increase the speed of discovery. Preprints, raw datasets, prototype simulations, and the like should be afforded the ability to proceed through the scholarly value chain in the same manner that only journal publications are afforded in the current system.

Hence, our proposal is to revise the notion of a unit of communication in both a technological and a systems sense. In a technological sense, a future unit of communication should not discriminate between media types and should recognize the compound nature of what is being communicated. Such a revision would allow for conveying multiple heterogeneous data streams as a single communication unit, as well as recognizing references to previously communicated units as formal components of a new unit.
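A minimal sketch (my own modeling in Python, not the authors' specification) of such a compound unit: it carries heterogeneous media types and can recursively include other units, so aggregates are themselves units.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Unit:
    identifier: str                 # a persistent identifier (illustrative)
    media_type: str                 # dataset, simulation, text, ...
    parts: List["Unit"] = field(default_factory=list)

    def aggregate(self, other: "Unit") -> None:
        # compound objects are units too, so inclusion is recursive
        self.parts.append(other)

    def flatten(self) -> List[str]:
        return [self.identifier] + [i for p in self.parts for i in p.flatten()]

dataset = Unit("unit:dataset-1", "dataset")
article = Unit("unit:article-9", "text")
article.aggregate(dataset)          # reuse an earlier result inside a new unit
overview = Unit("unit:overview-3", "text")
overview.aggregate(article)
print(overview.flatten())           # all transitively included units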

From a systems perspective, the concept of registering a communication unit in the scholarly communication process remains in place. However, we propose that a new system allow for more flexibility regarding the moment at which a unit can enter the communication process. We anticipate that such flexibility would empower individual scholarly communities to decide which actions constitute registering a unit of communication, as well as what the community deems acceptable with respect to the timing of registration and how that relates to the quality of what is to be registered. Apart from facilitating an increased speed of discovery, we feel a more flexible environment would allow scholars to officially incorporate materials in the system of communication that are currently largely living in a grey literature area.

New ways to combine the functions of scholarly communication

Based on an analysis of formal scholarly communication since its emergence in the 18th century, Roosendaal and Geurts distinguish the following functions that must be fulfilled by every system of scholarly communication regardless of its actual implementation [Roosendaal and Geurts 1997]:

- Registration, which allows claims of precedence for a scholarly finding.
- Certification, which establishes the validity of a registered scholarly claim.
- Awareness, which allows actors in the scholarly system to remain aware of new claims and findings.
- Archiving, which preserves the scholarly record over time.
- Rewarding, which rewards actors for their performance in the communication system based on metrics derived from that system.

By linking these functions together we adopt a value chain perspective of the scholarly communication system. In the established system, this value chain has largely been implemented in a vertically-integrated manner through the traditional publication process, in particular through journal publication. The registration date is recorded by a journal publisher as the date the manuscript was received. The peer-review process, conducted under the auspices of the journal publisher, certifies the claims made in the manuscript. The eventual published journal article, supported by the availability of secondary finding aids, fulfills the awareness function. Rewarding is based on the mere fact of publishing in a certain class of journals and on being referenced in articles by other scholars, both metrics directly derived from the scholarly communication system itself. In the paper-based era the published article itself, bundled into a journal issue, was archived in an ad hoc fashion as it was shelved by libraries across the world.

It is worth pointing out that archiving is the only function of scholarly communication that, in the paper-based system, is implemented by many parties at the same time. With this exception, the paper-based nature of scholarly communication does not provide the flexibility for the functions of scholarly communication to be fulfilled by separate parties, nor for the same function of scholarly communication to be implemented in different ways by different parties for the same unit of communication.

The digital, networked environment has fewer restrictions. As an illustration of this argument, let us examine the scholarly ecology that has already emerged around arXiv [note 6] since its inception in 1991, and let us speculate about things we may expect to emerge in due course. Figure 1 depicts the information flow of a unit of communication—an electronic manuscript—as it enters the arXiv and proceeds through multiple services hubs that fulfill functions of the scholarly communication process. Each step in the information flow is shown as a numbered arrow. The directionality of the arrows depicts the evolution of the communication unit through one or more pathways in the system.

Figure 1: arXiv ecology and the emergence of service pathways

The arXiv itself provides an implementation for most of the functions of the scholarly communication process, as can be seen from the Figure 1 pathway through arXiv, which covers registration, certification, awareness, and archiving.


Some scholarly functions are implemented in other ways by other service hubs, resulting in alternative or parallel pathways, as highlighted in the discussion below.

arXiv is a hub in the scholarly communication system that allows scientists to deposit manuscripts and, in doing so, to register a claim. In the diagram, an e-print enters the arXiv and is processed by the arXiv registration service. arXiv provides a basic form of certification via the endorsement of potential submitters by peers, and lightweight filtering by volunteers [note 7]. For a more thorough certification of submitted materials, arXiv relies on other hubs in the scholarly communication system. For example, the established physics journals frequently publish versions of manuscripts previously posted to arXiv that have been certified in a traditional peer-review process. Also, a few overlay journals provide another type of certification by selecting manuscripts from arXiv. The result is multiple parallel certification methods for the same registered unit of communication, each of which has its own characteristics, and each of which may or may not satisfy the needs of a potential reader. In the diagram, we see two alternative pathways for certification.

arXiv fulfills the awareness function by making manuscripts freely available via the network, by allowing search engines to index content, and by sending alerts to interested scholars. The awareness function is also fulfilled by physics journals, by overlay journals, and by citation services, each through different means. For example, in the Figure 1 diagram, we can see that a journal overlay provides an enhanced implementation of the awareness function that results from monitoring arXiv registrations, providing an alternative certification approach, and listing the results of this certification. This can be seen in the corresponding pathway in Figure 1.

arXiv's archiving strategy is largely based on ensuring adequate redundancy through the operation of a network of separately controlled mirror systems. The archiving strategy of physics journals can be considered more elaborate, as they typically transfer published digital content to national libraries and rely on the national libraries' services for long-term digital preservation. Although there is no evidence of this currently happening, overlay journals could rely on the LOCKSS framework [Reich and Rosenthal 2001] [note 8] to ensure redundancy. And one can imagine that both arXiv and the LOCKSS framework itself would eventually rely on the services of yet other hubs in the system for the fulfillment of tasks such as digital format migration, which will be an essential part of the archiving function in the digital realm. In the diagram in Figure 1, potential LOCKSS-based service nodes are depicted as shaded boxes connected to the journal overlay hub. The availability of these new service nodes offers the prospect of a new preservation solution for documents via the pathways shown in Figure 1.

In the current environment, as reflected in academic policy, rewarding of scientists is largely based on their performance in the journal system. Important dimensions of this system include in which journals the scholar publishes and how many times the scholar is referenced in ISI-selected journals. As far as we are aware, no formal rewarding is currently based on submission to arXiv, on citations to arXiv, on the number of downloads of a manuscript from arXiv, or on the number of downloads of a version of that manuscript from a physics journal. But it is possible to imagine the emergence of hubs that would collect such metrics to support fulfilling the rewarding function in novel ways. CiteBase [Brody et al. 2003] [note 9] is an indication of the possible emergence of such alternative rewarding hubs. In the diagram, CiteBase fulfills the rewarding function by monitoring both arXiv registrations and the actual use of arXiv's communication units, followed by distilling metrics from citation and usage information. This can be seen in the corresponding pathways in Figure 1.

This example demonstrates how the basic functions of scholarly communication can potentially be implemented by multiple parties in different ways, and then offered together as alternative or companion services. As illustrated by means of arXiv, existing hubs are already devising loose, informal connections among services within the constraints of the existing scholarly communication system.

Other recent developments are changing the technical and social landscape of the scholarly communication process, and at least suggest a trend that parallels arXiv. The "institutional repository" movement [Lynch 2003, Van de Sompel 1999] is leading to the creation of many new hubs for scholarly content. Universities, libraries, research institutions, and scholarly societies are employing systems such as DSpace [Smith et al. 2003] [note 10], EPrints.org [note 11], Fedora [Payette and Staples 2002, Staples et al. 2003] [note 12], and others to register, disseminate, and preserve documents, datasets, and other media as valuable scholarly assets. At the same time, Grid technologies are being developed to provide network-based services for data sharing and information integration [Frey et al. 2002, Williams et al. 2003]. As materials in those heterogeneous repositories become openly accessible, the emergence of a variety of value chains with those materials at their starting point is quite predictable. Indeed, in the Grid environment, units of communication of a very different nature—say datasets—already proceed through value chains in which hubs fulfill functions such as quality control (certification), discovery (awareness), and archiving.

Therefore, we can imagine a future scholarly communication system in which many distributed hubs exist, and where each hub is a service that performs a specific scholarly communication function in a particular way. These hubs may then be composed in multiple combinations to form different pathways through which a unit of scholarly communication may proceed. Each pathway consists of a sequence of distributed service hubs implementing the required functions of scholarly communication in a different way. In such an environment, a single unit of scholarly communication may proceed simultaneously through different value chains implemented across the network.

We argue that in order for a distributed service approach to be worthy of the name scholarly communication "system" (rather than scholarly "chaos"), the service hubs need to be interconnected, as if they were part of a global scholarly communication workflow system. Such a workflow system would allow the construction of macro-level workflows for streamlining and concatenating the fulfillment of the various implementations of the functions of scholarly communication. That is, it would allow the chaining of specific implementations of the registration, certification, etc. functions into a pathway that could be followed by a unit of communication.
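A minimal sketch (my own reading of the argument, in Python) of such a macro-level workflow: each hub implements one function, and a pathway is just a particular chaining of hubs that a unit follows.

from typing import Callable, Dict, List

Hub = Callable[[Dict], Dict]

def registration(unit: Dict) -> Dict:
    unit["registered"] = True
    return unit

def certification(unit: Dict) -> Dict:
    unit["certified_by"] = "overlay-journal-review"  # one of several possible implementations
    return unit

def awareness(unit: Dict) -> Dict:
    unit["alerts_sent"] = True
    return unit

def archiving(unit: Dict) -> Dict:
    unit["archived_at"] = ["mirror-a", "mirror-b"]
    return unit

def pathway(hubs: List[Hub], unit: Dict) -> Dict:
    # a different hub sequence is a different pathway through the system
    for hub in hubs:
        unit = hub(unit)
    return unit

print(pathway([registration, certification, awareness, archiving], {"id": "unit:preprint-1"}))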

This workflow system could also be implemented at the micro level for streamlining and concatenating the different steps involved in the fulfillment of a given function of scholarly communication by a specific hub. For example, a micro-level workflow could chain a set of migration tasks to fulfill the digital preservation requirements of the archiving function. Or a micro-level workflow could chain tasks involved in an open peer-review implementation of the certification function: make a unit of communication available for review, interactively discuss the paper, propose resolution by the editor, etc. [Pöschl 2004]

We believe that a next-generation network-based communication system designed to accommodate these flexible combinations of the functions of scholarly communication will provide the following benefits:

- Innovation: With more flexibility in how and where services are implemented, there can be more experimentation with new ways of fulfilling the functions of scholarly communication.
- Adaptability: More innovation may result in alternative solutions to fulfilling key functions, which in turn may help the scholarly communication system to evolve as the scholarly process itself evolves.
- Democratization: As multiple service providers implement functions of the scholarly communication process, we may see the emergence of competition in a largely monopolized market. The traditional vertically-integrated system may give way to a distributed, loosely coupled system of alternative and complementary services.

Recording the dynamics of scholarship


The established scholarly communication system does not record an unambiguous and visible trace of the evolution of a unit of communication through the system, nor of the nature of that evolution. Consider the following simple example:

At a certain point, a scholarly manuscript makes its public appearance in the system as an electronic preprint. Next, it is peer-reviewed and published in a journal. Then some secondary publishers create and publish a metadata record describing the paper. Some scholars discover and read the paper, build on it and hence cite it. Later, services need to go to enormous pains to computationally derive the relationships between the preprint, the journal publication, the metadata records, and the citations. The problem addressed in the above example can be misread as one of computing power, algorithms and access rights. In actuality, the problem is one of relationships among units of scholarly communication. Many important relationships are known at the moment a communication unit goes through a step in a value chain, but these relationships are not recorded in the existing scholarly communication system. The result is that the very dynamics of scholarship—the interaction and connection between communication units, authors, readers, quality assessments about communication units, scholarly research areas, etc.—are lost and are extremely hard or impossible to recover after the fact.

We feel this loss needs to be remedied in a future scholarly communication system by natively embedding the capability to record and expose such dynamics, relationships, and interactions in the scholarly communication infrastructure. Recording this body of information is synonymous with recording the evolution of scholarship at a fine granularity. This will allow tracing the origins of specific ideas to their roots, analyzing trends at a specific moment in time, and forecasting future research directions. It will also provide the means to start defining and extracting new metrics to assess the quality of scholarly assets and to evaluate the performance of actors in the scholarly system. Such metrics are crucial to avoid information overload and to pave the way toward acceptance of a new scholarly communication system at the socio-political level.
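A minimal sketch (my own illustration, in Python) of what natively recording relationships could mean: assert each relationship at the moment the step happens, so lineage becomes a lookup rather than a mining problem. The identifiers and predicates are invented for illustration.

from collections import defaultdict

relations = defaultdict(list)  # subject -> [(predicate, object)]

def record(subject: str, predicate: str, obj: str) -> None:
    relations[subject].append((predicate, obj))

# asserted as the unit moves through the value chain:
record("article:123", "isVersionOf", "preprint:abc")
record("record:xyz", "describes", "article:123")
record("article:456", "cites", "article:123")

def trace(unit: str) -> list:
    # everything asserted about a unit, recovered without any text mining
    return [(s, p, o) for s, pairs in relations.items() for (p, o) in pairs
            if s == unit or o == unit]

print(trace("article:123"))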

Conclusion

By considering the changing nature of research, exploring characteristics of the established scholarly communication system, and observing emerging trends, we have tried to distill some core characteristics of a future scholarly communication system. We have argued for a revised notion of the unit of communication so that in a new scholarly communication system the unit more accurately reflects the changing nature of the information assets produced and consumed in scholarly endeavors. We have argued that the system should allow for—though not mandate—the early registration of scholarly assets in the system to support collaborative and network-based endeavors, and to increase the speed of discovery. We have argued for technology that allows units to follow a variety of pathways through the system, with distributed nodes fulfilling the different functions of the value chain. We have also argued for technology that records the flow of units through the system.

In a spirit similar to the one that led to the creation of the Open Archives Initiative [note 13], our proposals are mainly technical and architectural, but with wide-ranging social and organizational implications. As with any technology, success will depend not only on technical soundness but on the willingness of the participants in the system—publishers, scholars, academic institutions, funding institutions, and others—to adopt new tools and develop new organizational models on top of them.

Although the proposals described here indeed challenge existing models, we believe that they also provide novel opportunities for all participants in the system. The changes we propose will permit experimentation with novel ways to implement the functions of scholarly communication, for the system to evolve as the scholarly process itself evolves, and for the emergence of competition in a largely monopolized market. The changes will also create a body of information that can be reused, mined, and analyzed, forming a foundation from which new knowledge can be generated.

The task of implementing a new scholarly communication system holds many complex technical and organizational challenges. While many new systems are emerging, they tend to offer little or no interoperability among them at this time. There exists no generally accepted information model for the domain of scholarly publishing. In terms of the vision of distributed services that can act as hubs in a future system, there is no common workflow model to build upon. A necessary technical step is the development of information models, process models, and related protocols to enable interoperability among existing repositories, information stores, and services. The NSF has recently recommended funding the authors of this paper to investigate these problems, building on our collective research and development. In a future article we will discuss our current work in moving toward a network overlay that promotes interoperability among heterogeneous data models and system implementations. We will describe our architectural vision for addressing the fundamental technical requirements of a next generation system for scholarly communication.

Sauter, Vicki. Information Systems Analysis: Systems Theory. http://www.umsl.edu/~sauter/analysis/intro/system.htm (last modified 08/14/2000).


A system is composed of interacting parts that operate together to achieve some objective or purpose.

A system is intended to "absorb" inputs, process them in some way, and produce outputs. Outputs are defined by goals, objectives or common purposes.


One way to define a system is to work backwards from the definition of outputs to that of inputs.

     

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Amos A. Lakos: Web portals are seen as positive potential frameworks for achieving order out of chaos. For many library customers, if what they need is not on the web, it does not exist. If information is difficult to find using library tools and services, customers are looking for alternative sources – if they even think of libraries at all.

A library needs to leverage its expertise and also be a part of the owning institution's "enterprise".

A portal is a customized learning and transaction Web environment, designed purposely to enable an individual end-user to ‘personalize’ the content and look of the website for his/her individual preference. It is a service environment and should be designed from the customer perspective.

Key principles govern any portal rollout: Simplicity, dependability, quantifiable value, personalization, and systematic management.

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Robert H. McDonald: Portalization/Web access can be thought of in three distinct categories: customizable e-resource portals; integrated e-resource portals; and metasearch systems that contain portal features.

The service will need to interact with other systems or portals within the infrastructure of the larger parent organization.

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Krisellen Maloney: How many users are deterred by each barrier or how many users make the optimal selection?

p-portals: provide integrated access to content at the user-interface level. They ease the burden on the user by providing a place to organize important information resources. http://my.lib.ncsu.edu/

m-portals: integrate information at the programming level. They offer the promise of simplifying the multi-step user process by providing integrated and aggregated access to information at the programming level. www.google.com

The challenge is to identify user-level services (applications) that are important to the users and to identify generalizations of common programming-level services that can be developed to support those applications.
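
As a sketch of what integration "at the programming level" can mean, the following Python fragment fans a single query out to several sources and merges the results before anything reaches the user interface; the connector functions and result formats are hypothetical placeholders, not any particular m-portal's API.

from concurrent.futures import ThreadPoolExecutor

def search_catalog(query):
    # Hypothetical connector; a real m-portal would call a catalog API here.
    return ["catalog: " + query + " (hit 1)"]

def search_article_db(query):
    # Hypothetical connector for a licensed article database.
    return ["articles: " + query + " (hit 1)"]

def metasearch(query, connectors):
    """Send the query to every source in parallel and merge the result lists."""
    with ThreadPoolExecutor() as pool:
        result_lists = list(pool.map(lambda connector: connector(query), connectors))
    merged = []
    for results in result_lists:
        merged.extend(results)  # aggregation happens below the user interface
    return merged

print(metasearch("usability", [search_catalog, search_article_db]))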

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Sarah Michalak and Mary E. Jackson: Seven members of the Association of Research Libraries implemented Zportal from Fretwell-Downing Inc. Each site has the ability to configure and customize the software to meet local needs. Since authentication is a key component of the implementation, several variations have emerged (Virgona note: Support/upgrades/versioning then becomes a nightmare). Dartmouth adopted PKI and USC adopted Shibboleth.

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Marianne Afifi: USC’s reasons for slow progress are as follows:
Internal resource allocation.
Delays due to the time lag inherent in the managed service contract.
Getting the Shibboleth authentication to work.
Not releasing the software to the public due to authentication development time.
Wanting to perform a user evaluation study in a controlled environment.

(Virgona Note: just plain and simple poor project planning)

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Eric Lease Morgan: In the phrase controlled vocabulary, the operative word is not control but quality.

The system needs to know a bit about the users before useful services can be provided.

Privacy is more a legal and self-imposed philosophical issue than a technical one.

(“Bulletin of the American Society for Information Science and Technology; Special Section; Portals in Libraries”. October/November 2004. Volume 31. Number 1.)

KO #95 Roy Tennant and Sarah Michalak: Problems and delays arising from internal politics and the “war for screen real estate”.

Key concepts: usability, self-navigation, self-sufficiency, personalization, and identifying content that is vital to the users.

Integration requires extensive configuration of multiple kinds of software, but the benefits to users are worth the investment.

(Large, Andrew, Beheshti, Jamshid, Nesset, Valerie, Bowler, Leanne. “Designing Web Portals in International Teams: Two Prototype Portals for Elementary School Students”. In JASIST 2004. Volume 55. Number 13)

Children like to retrieve from the Web images, animation and sound sequences, but they do not want such multimedia features to be integrated into a portal if this means sacrificing operational clarity and response time.

The name of the portal is important. Team and session designs: 1 hour sessions and no more than 12 sessions.

Collaboration and Teamwork: Students take pleasure in developing a new portal as a member of an intergenerational team.

(Cyr, Diane, Trevor-Smith, Haizley. “Localization of Web Design: An Empirical Comparison of German, Japanese, and United States Web Site Characteristics”. In JASIST 2004. Volume 55. Number 13)

Language and script use vary across cultures. All three writing-style variables (headlines, point form, and paragraphs) differ significantly among the three countries. The Japanese have a strong preference for point form. Paragraph format is used almost twice as much in Germany as in the United States. Japanese sites show a much higher occurrence of banners at the top and left. The United States and Germany place menus on the left and bottom of the page. Germany and Japan used a “return to home” button twice as much as the U.S. sites. There was considerable variation in the use of color (Japan used twice as much red as Germany/US).

(Fidel, R. & Pejtersen, A.M. (2004) From information behaviour research to the design of information systems: the Cognitive Work Analysis framework. Information Research, 10(1) paper 210 [Available at http://InformationR.net/ir/10-1/paper210.html])

Human information behaviour is a highly active area of research within Information Science and other fields. Indeed, the significant body of research that has been carried out to date has contributed greatly to our understanding of human-information interaction. Yet, very few studies have generated results that are directly relevant to the design of information systems.

Clearly, information systems would be most effective if their design were informed by an understanding of the human-information interaction of their intended users. Yet information systems have been designed, and widely used, almost completely unaffected by the results of studies in human information behaviour. It is important, therefore, to examine how human information behaviour research could inform design. A variety of reasons have probably motivated systems designers to ignore this research, such as pressure to design systems quickly, no obvious relevance of research results to design, and a lack of appreciation of soft research. Instead of analysing these reasons, it might be useful to examine how the results of human information behaviour research projects can increase their applicability to systems design. This will address a standing concern: bridging the gap between designers and researchers, and increasing the relevance of academic research to the practitioners' work (Dervin, 2003).

Cognitive Work Analysis dimensions for analysis would be:

The work environment. Investigates the environments in which the school operates. Examples of questions: What are the federal, state, and school district regulations under which the school operates? What is the state policy and standards for the school's curriculum? What is the population from which the school can recruit students?

Work-domain analysis. Studies the work that is done at the school and the school library. Examples of questions: What are the goals of each organization? What are the constraints within which it has to operate? What are the activities in which each organization is involved? What tools and technologies does it use to perform these activities?

Task analysis. Looks at specific tasks and analyses them with the same questions. Examples of questions: What are a teacher's goals for lessons? What are the constraints a teacher faces in preparing and delivering a lesson? What information sources does a teacher consult?

Organizational analysis. Examines the management style, the organizational culture, the social conventions, and how roles are allocated. Examples of questions: How does the teacher communicate with the principal? Why was the teacher allocated to teach a course? Who decides whether or not the librarian should give a presentation in a class session? What procedure does this process follow?

Decision analysis. Provides a more specific analysis of individual decisions. Example: for a librarian's decision about whether certain images would be relevant for a lesson, the issues involved might be: What information does a school librarian need to make this decision? What information sources are available to her? What sources are desirable but not available?

Strategies analysis. For each task and decision, examines which strategies are possible. Examples of questions: How can a teacher who is looking for an image to use in her lesson find it? For instance, can she ask a colleague to think about an image? Can she browse in a book in the library? Can she go to a site she knows on the Web? Can she search art databases?

User's resources and values analysis. Identifies characteristics of each group of users. Examples: What is the experience a teacher has in looking for visual information? What is the knowledge a teacher has of the arts requirements standards? What are the most important values a teacher holds? What is the knowledge of a school librarian about art? What is the degree of importance a school librarian attributes to including art in the curriculum?


(“Pioneer’s Reminiscences”. in Bellardo Hahn, Trudi and Williams, Robert V. – editors. “Proceedings of the 1998 Conference on the History and Heritage of Science Information Systems”. ASIST. 1998).

Dale Baker: Chemical Titles was the first periodical to be organized, indexed, and composed by a computer.

Everett Brenner: “Brenner’s Law”: Determine the best system you can foresee before designing the system you can afford.

Helen L. Brownson: Organizing Scientific information after Sputnik.

Melvin S. Day: World War II gave the technical report a life of its own, even within an environment of controlled access to security-classified information. Nowhere was this limitation more pronounced than on the nation’s Manhattan Project (atomic bomb project) from 1942 to 1946. Compartmentalization of information was a way of life for all on the project. I knew the details of what I was doing, and I knew what my staff was doing. I did not know, nor was I supposed to know, what my immediate management was doing or what my colleagues in other laboratories were doing.

Allen Kent: Early in February 1958 my colleagues and I organized a national meeting at Western Reserve University in Cleveland to discuss a proposal to establish a national center for scientific and technical information. The stimulus for this proposal was the Soviets’ launching of Sputnik in October 1957. Many U.S. scientists suggested that one of the reasons for the Soviets’ taking the lead in the space race was the existence of their Institute of Scientific Information, which was characterized by a British scientist who visited it as “really shattering… No other agency in the world is doing this job.”

David A. Kronick: One of the conclusions I reached in my dissertation was that the scientific journal, as it was invented, fulfilled two distinct and different functions. First, it served as a vehicle to disseminate information, and second it served as a depository, from which relevant items could be retrieved on demand.

J. Mills: We cannot remember the two most important events in our lives: our birth and our death. My favorite professional aphorism has long been Jesse Shera’s observation that two things distinguish the librarian’s job: bibliography and retrieval. Bibliography stands for all of the problems relating to the information-bearing materials themselves. Retrieval summarizes the central problem in the use of materials (the information store), which is to find relevant items.

Roger Kent Summit: With the rapid growth of the Web, some have been predicting the demise of traditional online services. I don’t agree. Recently, I was doing some research in preparation for a speech I presented in Stockholm. I determined that DIALOG contains more than twenty times the amount of information accessible through the web. Furthermore, the two have grown at roughly the same rate over the past year, based on Alta Vista statistics.

Robert S. Taylor: Our professional responsibility is to understand the technologies and to use them effectively to help people in whatever setting. Without people at the center we become another technology-driven vocation.

Brian Campbell Vickery: delivered an inspiring paper to an Aslib conference, asserting, “information service is essential to the progress of science.”

Herbert S. White: I am reminded of the fact that we presently face a more modern version of the same dilemma. E-mail and list-servs allow us to communicate meaningful information rapidly and to a large audience. They allow us to communicate trivia and garbage in the same manner. The limitation in all of this is that while technology has progressed rapidly, people have remained pretty much the same.

(ASIST annual Conference, 2004)

(Tim Berners-Lee, ASIST 2004)
Tension: conflict arises between groups.
Culture happens in a group.
Barriers: language.
Semantic web: blending without barriers.
Myth: top-down system.
Declaration of Independence – outward; Constitution – inward.
W3C: interoperability, international, standards.
Semantic web: what can a computer do for you?

e.g., digital photos with timestamps go into calendars. Attendees of an event automatically have access to pictures.

Web changes topology: computers out of the way, independent, focus on information. Spam is a problem. “Oh yeah” for trust. There is plenty of room for industry to charge for value. Internet service providers: platinum partners with political parties and companies.

(HCI ASIST 2004)
Herbert Simon: satisficing and bounded rationality.
Agosto: satisficing behavior; reduction and termination.
Picard: wearable biometrics.
Daniel Goleman: changed think and feel; examine actions.
Seligman: optimism is a learned skill.
Bandura: coping-efficacy; mentioned sports.
Isen: positive affect and cognitive organization.
Triad of information seeking: cognitive, physical, and affective.
Norman: emotional design.
Bilal: 7th graders are very persistent.
Allison Druin: emotional metadata.

Books that make kids happy. Books that can be sad and happy. Metadata: color and feeling.

Erdelez: information exploration; serendipity. Fischer: emotional design of everyday life.

People turn to other people for information (accessible, convenient, emotional support).

People get upset if you don’t take their advice.

(Hjorland, ASIST 2004) Knowledge in minds and brains of living creatures. Clear terminology is important.

(Hunter, ASIST 2004) – Record: regardless of physical form, created or received, transaction, organic (flowing), informational values, memory aid, trust.

Digital: physical to logical, non-linear, de-centralized, time independent, breakdown of control, disappear. Part of social memory. SOX.

Special Notes (Carroll, John M. “Introduction: Human-Computer Interaction, the past and present”). The software crisis was never resolved per se. Rather, it helped establish design and development methods as a central topic in computing. Early approaches emphasized the structural decomposition and representation of requirements and specifications, and a disciplined workflow of stages and hand-offs called the “waterfall”. Brooks (1975/1995) observed that critical requirements often emerge during system development and cannot be anticipated. He concluded that software designers should plan to throw one away. That lesson continues, and design is now seen as opportunistic, concrete, and necessarily iterative.

(Yourdon 1989)

The waterfall model of system development: System requirements, software requirements, analysis, program design, coding, testing, operations.

Prototyping life cycle: Definition of the system occurs through gradual and evolutionary discovery as opposed to omniscient foresight.

(Pew, Richard W. “Evolution of human-computer interaction: from Memex to Bluetooth and beyond”).

1970s: Alan Kay's PhD thesis introduced some of the concepts of parallel processing, windows, and message passing that are the foundation of object-oriented software. His own object-oriented approach creates structures whose complexity could be built up incrementally, using a language that emphasized symbolic logic rather than computation. He cast all computation in Smalltalk in the form of self-contained objects, each with its own interface to the world, which allowed partitioning into local and public information.
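
A rough Python sketch of the idea Pew describes: self-contained objects that respond to symbolic messages through a public interface while keeping local state private. The message-dispatch scheme and the Account example are illustrative assumptions, not Kay's actual Smalltalk design.

class Account:
    """A self-contained object: local state plus a public message interface."""

    def __init__(self):
        self._balance = 0  # local information, private by convention

    def receive(self, message, *args):
        # Dispatch a symbolic message to a handler, loosely Smalltalk-style.
        handler = getattr(self, "_on_" + message, None)
        if handler is None:
            raise ValueError("does not understand: " + message)
        return handler(*args)

    def _on_deposit(self, amount):
        self._balance += amount
        return self._balance

    def _on_balance(self):
        return self._balance

acct = Account()
acct.receive("deposit", 100)
print(acct.receive("balance"))  # 100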

(Flank, class notes 2004, Object Oriented Analysis and Design)

The purpose of this document is to present how the Unified Modeling Language (UML) can be used to support Object Oriented Project Development initiatives. This document first identifies the Object Oriented Project Development Phases and the associated artifacts that are used in each phase. Each artifact is then detailed presenting the artifact’s purpose, use and dependencies. It is important to note that the UML is a modeling language, not a methodology. The UML has no notion of process, which is an important part of a methodology.

This document is not an attempt to represent an Object Oriented Project Development standard. Rather, this document presents a perspective that can be applied in its entirety or in parts, based on the size and complexity of a specific project. Also, specific analysis and design artifacts described in this document have the potential to be used to support non-distributed mainframe project initiatives.

It is also important for the reader to note that this document is not intended to be a substitute for a source book or class on the UML or Object Oriented Development Methodology. This document is a high-level survey of these subjects, intended to give the reader an overview of and appreciation for the Object Oriented Development Methodology and to identify the artifacts that one can employ when using the UML. Therefore, an individual should perform the following prerequisite steps prior to using the Object Oriented Development Methodology or the UML and any of its artifacts:

Attendance at an Object Oriented Analysis and Design course.
Reading a book on the subject (see glossary for a list of titles).
Working with an experienced mentor.
Choosing a small, non-complex project for one’s first attempt at using the methodology.

(Fowler, Martin. “UML Distilled”. 2004).

The Unified Modeling Language (UML) is a family of graphical notations, backed by a single meta-model, that helps in describing and designing software systems, particularly software systems built using the object-oriented (OO) style.

The UML is a relatively open standard, controlled by the Object Management Group (OMG).

The waterfall style breaks down a project based on activity. The iterative style breaks down a project by subsets of functionality. With iteration, you may not put the system into production at the end of each iteration.

You can have hybrid approaches also. McConnell describes the staged delivery life cycle, where analysis and high-level design are done first, in a waterfall style, and then the coding and testing are done in iterations.

Requirements churn: Changes in requirements in the later stages of the project.

Many people draw UML diagrams on whiteboards only during a meeting to help communicate their ideas.

In general, the OO style is to use a lot of little objects with a lot of little methods that can give us a lot of plug points for overriding and variations. This style is very confusing to people used to long procedures; indeed, this change is the heart of the paradigm shift of object orientation.
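
A small hypothetical Python illustration of that style: one method is composed from several tiny methods, each a plug point a subclass can override without touching the rest. The report classes are invented for illustration, not taken from Fowler.

class ReportGenerator:
    """A behavior built from small overridable steps (plug points)."""

    def generate(self, records):
        rows = [self.format_row(r) for r in self.select(records)]
        return self.header() + "\n" + "\n".join(rows)

    def select(self, records):      # plug point: which records to include
        return records

    def format_row(self, record):   # plug point: how to render one record
        return record["title"]

    def header(self):               # plug point: report title
        return "REPORT"

class OverdueReport(ReportGenerator):
    def select(self, records):      # override a single little method
        return [r for r in records if r["overdue"]]

    def header(self):
        return "OVERDUE ITEMS"

books = [{"title": "UML Distilled", "overdue": True},
         {"title": "Code Complete", "overdue": False}]
print(OverdueReport().generate(books))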

(Osborne, Larry N., Nakamura, Margaret. “Systems Analysis for Librarians and Professionals”. 2000).

The SDLC consists of 5 steps: preliminary investigation, systems analysis, systems design, systems development, systems implementation and evaluation (Shelley 1991).

There are two basic styles of symbols used in Data Flow Diagrams: Yourdon and Gane/Sarson. Semantically, these styles are the same, but many analysts use both styles at different stages in the project.


Shlaer and Mellor (1988) define an object as an “abstraction of a set of real-world things such that all of the real-world things in the set – the instances – have the same characteristics and all instances are subject to the same set of rules”.

The technique of keeping data with objects and, if necessary, providing methods for making it available is called encapsulation and has been part of OO since its inception. Another core concept of OOP is polymorphism, the idea that a super-class defines a generic behavior, while specific instances of that behavior are refined in the sub-classes that inherit from it.
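
A brief Python sketch of the two concepts just named, using invented library-item classes: state is encapsulated behind methods, and a super-class defines generic behavior that a sub-class refines polymorphically.

class Item:
    """Super-class defining the generic behavior."""

    def __init__(self, title):
        self._title = title  # encapsulated: reached through describe(), not directly

    def loan_period_days(self):
        return 21  # generic default

    def describe(self):
        return self._title + ": " + str(self.loan_period_days()) + "-day loan"

class DVD(Item):
    def loan_period_days(self):
        return 7  # polymorphic refinement of the generic behavior

for item in (Item("Systems Analysis"), DVD("Metropolis")):
    print(item.describe())  # same message, class-specific behavior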

Object Oriented Analysis (Coad and Yourdon 1990) = Objects + Classification + Inheritance + Communication with messages.

Benefits of object oriented analysis and design: reusability, reliability, seamless integration with GUIs, and speedier design.

PhD Program focus: Analysis and Design. While researchers are vigorously pursuing the other elements of the SDLC, such as coding, testing, deployment, and post-project reviews, time and space considerations dictate that the response focus on the analysis and design phases of the SDLC. Object Oriented analysis and design is not as mature as other structured techniques.

Steps in Object Oriented Analysis and Design: prototyping, diagramming tools, UML.

It is not the technique that matters, but the analyst. Individual techniques are merely ways of thinking, not the thoughts themselves.

Designing the system: functional specification, determination of alternatives, conceptual design (inputs, outputs, processes, files), system integration.


Theoretical SDLC deliverables and approvers

Approvals by Phase Summary

Initiation Deliverables:
Business Case – Project Sponsor; Technology Area/Department Manager
Feasibility Study – Business Sponsor(s); Technology Management
Level 0 Estimate – Project Sponsor(s)

Definition Deliverables:
Business Requirements Document (BRD) – Business Sponsor(s); Stakeholder(s); Technology Management
Level 1 Estimate (and Project Plan) – Business Sponsor(s); Technology Management
Functional Requirements Document (FRD) – Business Sponsors (for business-related functionality only, such as screen design and report layout); Stakeholders; Technology Management
Function Point Count (for eligible projects) – Function Point Coordinator
Level 2 Estimate (and Project Plan) – Business Sponsor(s); Technology Management
Software Project Plan (SPP), as appropriate – Business Sponsor(s); Stakeholders; Technology Management, as appropriate
Information Security Process Standard Deliverables Strategy – GISO/BISO

Technical System Design Deliverables:
Technical System Design (TSD) Document – Technology Management
Development Test Plan – Technology Management
User Acceptance Test Plan – Technology Management; Business Sponsor (Client Manager); Stakeholders; relevant departments (data center, network, business operations, etc.)
Contingency Approach/Disaster Recovery – COB Coordinator; Business Head; Project Manager
Peer Review (Technical Design), as appropriate – Architect
Level 3 Estimate (and updated Project Plan) – Business Sponsor(s); Technology Management

Construction Deliverables:
Development Testing – Technology Management
Release/Backout Plan – Business Sponsors; Stakeholders; Technology Management
User Acceptance Test Plan – Project Sponsor(s); Stakeholders

Validation Deliverables:
User Acceptance Test – Project Sponsor(s); Stakeholders
Parallel or Production Assurance Test (PAT), as appropriate – Project Sponsor(s); Stakeholders; Technology Management
Information Security Review Process (ISRP) – GISO/BISO
Continuity of Business (COB) Contingency Test, as appropriate – Senior Client Manager; Senior Technology Management

Implementation Deliverables:
User Sign-off / Implementation Approval – Business Sponsor(s); Stakeholders; Technology Management
Change Request – as required by Change Management Policy

Post-Project Review Deliverables:
Post Project Review Document – Business Sponsor(s); Stakeholders; Technology Management
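
The approvals summary above is essentially a mapping from phase and deliverable to required approvers. As a hypothetical sketch (the structure and function names are invented, not part of any actual SDLC tooling), it could be encoded so a script reports outstanding sign-offs.

# Hypothetical encoding of part of the approvals-by-phase summary.
APPROVALS = {
    "Initiation": {
        "Business Case": ["Project Sponsor", "Technology Area/Department Manager"],
        "Feasibility Study": ["Business Sponsor(s)", "Technology Management"],
        "Level 0 Estimate": ["Project Sponsor(s)"],
    },
    # ... remaining phases would be encoded the same way ...
}

def missing_approvers(phase, deliverable, signed_off):
    """Return the approvers who still need to sign off on a deliverable."""
    required = APPROVALS[phase][deliverable]
    return [approver for approver in required if approver not in signed_off]

print(missing_approvers("Initiation", "Business Case", {"Project Sponsor"}))
# -> ['Technology Area/Department Manager']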