
Version 37/1/2006

TECHNICAL MANUAL

Howard Ebmeier, Ph.D.
American Association of School Personnel Administrators
533-B North Mur-len
Olathe, Kansas 66062
913-829-2007
www.aaspa.org

[email protected]

ICIS
INTERACTIVE COMPUTER INTERVIEW SYSTEM


Table of Contents

Table of Contents
List of Tables and Figures
Introduction
Description of the Instrument
    Development
    Administration
    Reports
Technical Qualities
    Validity
        Content Validity
        Construct Validity
        Criterion Related Validity
            Concurrent Validity
            Predictive Validity
    Reliability
        Reliability of Instrument
        Reliability of Interviewer
    Bias Analysis
        Rater Gender Bias
        Experience and Age Bias at the Elementary Level
        Gender Bias in Practice at the Elementary Level
        Level of Education Bias at the Elementary Level
How the Training Program Works
    Advantages
Suggestions for Training
    Installation
    Using the Interview System
    Using the Training Module
Cautions


List of Tables and Figures

Table 11 Interview Instrument Question Allocation
Table 12 Example Summary Report Displayed On-Screen
Table 13 Example of Information Written to Hard Drive
Table 14 Coefficient Alpha Reliabilities of Various Sub-Scales
Table 15 Mean Scores and Frequency Counts Across Actors and Raters
Table 16 Tests of Between Subject Effects
Figure 1 Scatterplot of Interviewee Years of Experience and Corresponding Total ICIS Score
Table 17 Pearson Correlation Coefficients for Years of Experience and ICIS Scores
Figure 2 Scatterplot of Interviewee Age and Corresponding Total ICIS Score
Table 18 Pearson Correlation Coefficients for Age of Candidate and ICIS Score
Table 19 Regression Model Summary of Age and Experience
Table 19 Regression Model Coefficients of Age and Experience
Table 20 Solution to Common Problems
Table 1 Question Alignment to Teachers of the Future Document
Table 2 Question Alignment to Praxis III Document
Table 3 Levels of Development--Working with Others
Table 4 Levels of Development--Knowledge of Content
Table 5 Levels of Development--Knowledge of Instruction (Delivery of Instruction)
Table 6 Levels of Development--Knowledge of Instruction (Planning)
Table 7 Levels of Development--Knowledge of Instruction (Climate Development)
Table 8 Levels of Development--Knowledge of Instruction (Assessment)
Table 9 Levels of Development--Knowledge of Instruction (Interactions)
Table 10 Levels of Development--Knowledge of Students


Introduction

The employment interview has long been one of the most important aids in employee selection. Indeed, it is the most commonly used method of gathering data about prospective employees in both industry and government. Most interview systems consist of a defined set of questions that are asked in a sequential manner with responses recorded manually on paper. While this system has been satisfactory for many years, limitations are also apparent. First, these interview systems are not adaptive. The same set of questions must be asked of everyone, whether appropriate or not, and in the same order. Additional questions covering a specific topic cannot be added without great difficulty. The paper system may not ask enough questions to gather reliable indicators of a candidate's abilities and interests in some areas and may ask too many questions in other areas. Second, because the recording of responses is completed manually, grouping various questions into scales and computing means and standard deviations is not possible. If these tabulations are to occur, they must be done manually after the interview has been completed and the candidate is no longer available for follow-up questions.

Third, the manual nature of the interview recording process makes it difficult and time consuming to digitize the results for additional analysis and efficient record keeping. Fourth, when questions are limited in number and always asked sequentially, teacher candidates can work together to duplicate all the questions in a district's interview protocol. This means the security of the interview system can be jeopardized. Lastly, most existing instruments were developed prior to the process-product research initiated in the 1980s, which spent considerable time and money carefully categorizing and defining elements of effective instruction. Much of our understanding of effective classroom practices has been derived from these investigations.1 Theoretically and practically these sets of studies should serve as the basis of any selection process since they have great job validity.

To overcome some of these identified deficiencies, Dr. Howard Ebmeier from the University of Kansas developed a computer-assisted interviewing process. This system, designed for use in K-12 schools, was developed in 2002-03 through many interactions with practitioners and field trials and is described in this document. This interview system uses the power of a laptop computer to allow the interviewer to focus on evaluating the candidate's responses while the computer tracks response patterns, suggests potential questions based on these response patterns, and constructs detailed summary reports to capture various aspects of the interview. The advantages of this new instrument over typical paper-based systems are described below.

1. The ICIS computer interview system is adaptive in nature, which helps assure reliability and conserves time. Questions covering the same theme are not asked if unnecessary. Conversely, if additional questions are needed to gain reliability, they are added on an "as needed" basis. It is a very efficient process.

2. Because the computer is able to perform many calculations very quickly, sub-scales (Knowledge of Teaching, Knowledge of Students, Knowledge of Content, and Work with Others) are included on reporting forms. This allows a more detailed analysis of a candidate's interview responses than a simple total score common to paper instruments. It may be, for example, that the overall score is less useful to a principal than how the candidate scored on the "Works with Others" scale. In these cases,

1 Exemplars of this work can be found in Enhancing Professional Practice: A Framework for Teaching, Charlotte Danielson, ASCD Alexandria, VA, 1996; Praxis Series, Educational Testing Service, Princeton, NJ (see www.ets.com for a list of the many publications); Effective Schooling Practices: A Research Synthesis 1995 Update. Northwest Regional Educational Laboratory, Portland, Oregon (www.nwrel.org); Teachers of the Future: A Continuous Cycle of Improvement (1997), American Association of School Personnel Administrators 533-B North Mur-len, Olathe, Kansas 66062; What Teachers Should Know and Be Able to Do, (1997). The National Board for Professional Teaching Standards (www.nbpts.org); and Florida Performance Measurement System: Domains, Knowledge Base, (1992). Division of Human Resource Development, Florida Department of Education, Tallahassee, FL.


the principal can examine the individual scale scores separately. In addition, the standard deviation is included with all the scale scores indicating the consistency of the candidate's response pattern.

3. All the information is recorded digitally; therefore, it is available for future reference and downloading to a database. Scores need not be entered at a later time. This allows the development of an extensive historical database, which might be useful in justifying selection practices to the EEOC or other governmental bodies. In addition, it allows HR departments to better evaluate their selection practices.

4. A computer-based training module is available to help prepare administrators to use the interview system. This system presents video clips of the interview process using questions included in the computer interview system and cycles administrators through various training modules until mastery is achieved. This is the only computer-based training system available for any teacher interview process. It can be conducted either individually or in a group setting.

5. The large bank of questions included in this system and the random presentation of those questions enhance question security. In addition, an interviewer is less likely to leave a laptop computer on the table at the end of a college recruitment day than a few pieces of paper with the district's interview questions. Even if the computer were accidentally left behind, an individual would need to know the district's unique password to gain access to the secured questions.

Description of the Instrument

Development

Two documents derived from national studies served as the basis of question selection and development. The first publication, titled Teachers of the Future, was derived from the work of a national commission of school personnel officers. After an extensive two-year review of the existing literature and practitioner advice, the commission identified nine areas of knowledge and eleven areas of skills needed by all teachers. These knowledge and skill areas are listed on the right side of Table 1. Interview questions were then constructed to measure the essence of these knowledge and skill areas.

The second major document that served as the basis for construction of interview questions was the Praxis III: Classroom Performance Assessments. This set of instruments and procedures was developed to use with beginning teachers for a variety of evaluation purposes, including licensure and professional development. It builds on a base of knowledge of general principles, particular students and their backgrounds, content and its organization, and other specific knowledge and skills necessary for effective teaching. The research and development activities that led to the creation of these assessments were carried out by Educational Testing Service in collaboration with a group of practicing teachers over a ten-year period and under the direction of a National Advisory Committee. Nineteen assessment criteria, organized into four interrelated domains, form the content core of Praxis III. Table 2 lists these 19 assessment criteria along with the questions from the interview instrument constructed to measure each dimension of the Praxis III.

Importantly, to ensure content validity, questions constructed for use in the ICIS instrument were required to measure constructs important to both the Praxis III and Teacher of the Future documents. Representation in both documents assured that two national commissions had identified each concept measured by the interview instrument as critical to teaching excellence. After all the questions had been constructed, they were grouped into four clusters (Working with Others, Knowledge of Teaching, Knowledge of the Content Field, and Knowledge of Students) with each question being assigned uniquely to one cluster.2

2 The cluster representing Knowledge of Teaching was composed of questions from five domains: planning, delivering instruction, assessment, student interactions, and climate development.


To assist with reliability, scoring rubrics for each question were then devised. Before the actual writing proceeded, however, an overall framework for effectiveness/ineffectiveness in each of the four cluster areas was first constructed. Tables 3-10 provide this framework in each area. Descriptions of effective/ineffective practice were primarily based on the process-product research conducted over the last 30 years.3 Using each of the tables plus guidance from the two documents described above, scoring rubrics were constructed for each question as illustrated below.

What are the main ingredients for creating positive classroom relationships between teachers and students?

Level 3 Applicant stresses a genuine concern for caring and respect for individual students. Applicant cites several methods for developing mutual respect and rapport among the teacher, student, and peers.

Level 2 Applicant mentions that care and respect between the teacher and student are important but is less certain about how to achieve these ends.

Level 1 Applicant states that relationships between the teacher and students are less important than control and order.

Upon completion of the construction of the questions and the scoring rubrics, a computer program was written using Authorware software from Macromedia to present and manipulate the questions and responses. One unique feature of the computer program is its ability to remember how the candidate answered previous questions and to use that information to select new questions from the bank.4 For example, if the candidate provided very strong answers to six questions in a row on the Knowledge of Subject Area scale, it is very likely additional questioning in this area would only yield additional strong answers. Rather than gathering redundant information, the computer program (based on an analysis of the standard deviation) moves to questions from a different theme area. This saves precious interviewer time and results in a more reliable interview system. If, however, the candidate's responses are varied with some strong answers and some weak answers, the computer program will continue to generate questions in that theme area until stability is obtained or a fixed number of questions has been reached. Technically, this process is a variation of adaptive testing and has been used for a number of years by ETS on the SAT, GRE, LSAT, and MCAT college admission tests. Several reports are generated during the interview process and presented visually on the computer monitor and written to a file on the computer's hard drive accessible through Microsoft Word (or any other word processing program that can read text files).
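The branching rule itself is simple to express. The following Python sketch illustrates the kind of stability check described above and detailed in footnote 4 (branch when the standard deviation falls below 0.575 after a minimum number of rated responses). The function name and the minimum-response constant are illustrative assumptions, not part of the actual ICIS (Authorware) implementation; the true minimum depends on the version selected (see Table 11).

import statistics

STABILITY_THRESHOLD = 0.575  # per footnote 4: branch when the SD drops below this value
MIN_RESPONSES = 4            # illustrative only; the actual minimum depends on the version

def should_branch(ratings):
    """Return True when enough questions have been rated on the current theme
    and the ratings are consistent enough (low standard deviation) to move on."""
    if len(ratings) < MIN_RESPONSES:
        return False
    return statistics.stdev(ratings) < STABILITY_THRESHOLD

# Example: six consistently strong answers trigger a branch to the next theme,
# while mixed answers keep the interviewer on the same theme.
print(should_branch([3, 3, 3, 3, 3, 2]))  # True  (SD is about 0.41)
print(should_branch([3, 1, 3, 1, 2, 3]))  # False (SD is about 0.98)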

3 Exemplars of this work can be found in Jere Brophy, (1987). Research on Teacher Effects: Uses and Abuses, Institute for Research on Teaching, College of Education, Michigan State University, East Lansing, MI; Jere Brophy, Evertson, Anderson, Baum, and Crawford (1981). Student Characteristics and Teaching, Longman, New York; Jere Brophy and Thomas Good, (1984). Teacher Behavior and Student Achievement, Institute for Research on Teaching, College of Education, Michigan State University, East Lansing, MI; Michael Dunkin and Bruce Biddle (1974). The Study of Teaching, Holt, Rinehart, and Winston, New York; Andrew Porter and Jere Brophy, (1988). Synthesis of Research on Good Teaching: Insights from the Work of the Institute for Research on Teaching, Educational Leadership, 45(8), 74-85; Carol Dwyer, (1991). Toward High and Rigorous Standards for the Teaching Profession (3rd ed.), National Board for Professional Teaching Standards, Detroit, MI.

4 After each response the computer calculates the standard deviation for the series of questions on a particular scale. After a fixed number of responses, depending on the version selected, if the standard deviation is less than 0.575, the program branches to the next theme area.

Administration

Using this computer-based interview system is very similar to using a paper-based system, with a couple of exceptions. Upon entry into the computer interview system, the user is asked for an identification password. This protects the security of the system should the laptop computer be stolen or should a candidate simply want to peruse the questions when the interviewer is absent. Next, the computer asks the interviewer to select the desired version of the interview (short, normal, or long), as indicated in Table 11 below. The program then asks the user whether they want questions appropriate for experienced or novice teachers. The questions in these two banks are identical with the exception of tense: questions for experienced teachers typically ask the candidate how they have handled a particular situation, while questions for novice teachers inquire about how the teacher would handle the situation. Finally, the interviewer begins the interview in the normal fashion and reads the first question from the computer screen (instead of a paper document). After carefully listening to the candidate's response and comparing it to the scoring rubric, the interviewer selects the level that best matches the candidate's response and enters his or her evaluation via the keyboard. If the interviewer believes the question is not applicable or the candidate's response does not address the question, the interviewer can call for another question. Once the answer is evaluated, another question is randomly selected from the bank of questions measuring that theme. Questions are, however, never repeated.

Table 11 Interview Instrument Question Allocation

Scale                      Short Version (Min/Max)   Normal Version (Min/Max)   Long Version (Min/Max)
Working with Others        3 / 5                     4 / 6                      6 / 8
Knowledge of Content       3 / 5                     4 / 6                      6 / 8
Knowledge of Teaching(1)   6 / 10                    8 / 14                     12 / 20
Knowledge of Students      3 / 5                     4 / 6                      6 / 8
Total                      15 / 25                   20 / 32                    30 / 44

(1) Questions in this field represent knowledge of teaching in five distinct areas: planning, delivering instruction, assessment, interactions with students, and classroom climate development.

Available Questions from Bank
Knowledge of the Content    12
Knowledge of the Students   12
Knowledge of Teaching       44
Working with Others         12
Total                       80

This process continues until the minimum number of questions has been answered and stability has been achieved. If the candidate's answers are still varied after the minimum number of questions has been answered, the program continues to ask questions in that same theme area until stability is attained or the maximum number of questions has been reached (depending on the version selected initially). It then branches to the next theme area and repeats the process. The interviewer can enter any information he or she wishes via the keyboard at any time. These comments will be included as part of the printout and future database. All the information from the interview session is recorded on the laptop's hard drive in text format (readable by Microsoft Word, Microsoft Excel, Microsoft Access, and most word processors) and presented visually on the computer's monitor.
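As a rough illustration of this flow, the sketch below combines the normal-version allocations from Table 11 with the stability rule: each theme is questioned until the minimum is met and the ratings stabilize, or until the maximum for that theme is reached. This is a Python sketch with hypothetical function and variable names, not the actual Authorware implementation; the `ask` callback stands in for the interviewer entering a rating at the keyboard.

import random
import statistics

STABILITY_THRESHOLD = 0.575  # branch when the standard deviation falls below this value

# Minimum/maximum questions per theme for the normal version (Table 11)
NORMAL_VERSION = {
    "Working with Others":   (4, 6),
    "Knowledge of Content":  (4, 6),
    "Knowledge of Teaching": (8, 14),
    "Knowledge of Students": (4, 6),
}

def interview_theme(theme, question_bank, ask, limits=NORMAL_VERSION):
    """Ask randomly selected, never-repeated questions from one theme until the
    ratings stabilize (after the minimum) or the maximum is reached.
    ask(question) represents the interviewer entering a 1-3 rating or "NA" to skip."""
    minimum, maximum = limits[theme]
    questions = random.sample(question_bank[theme], maximum)
    ratings = []
    for question in questions:
        rating = ask(question)
        if rating == "NA":            # skipped questions are ignored in the tabulations
            continue
        ratings.append(rating)
        if len(ratings) >= minimum and statistics.stdev(ratings) < STABILITY_THRESHOLD:
            break                     # stable: branch to the next theme
    return ratings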


Reports

Three distinct reports are associated with this interview system. First, as questions are posed to the candidate, the computer displays on the monitor the name of the theme from which the questions are drawn, the current number of questions asked in that theme, and the current standard deviation. Second, upon completion of the interview, the computer displays the number of questions asked, the mean, and the standard deviation for each theme, as well as a weighted average and standard deviation for all questions asked during the interview. Questions that are skipped during the interview process are ignored in the tabulations. An example of the on-screen display appears below.

Table 12 Example Summary Report Displayed On-Screen

Candidate's Name: Tom Smith

Interviewer: Joe Bladden

Scale                           Number of Questions Asked   Scale Mean   Scale Standard Deviation

Knowledge of the Subject Area 6 2.9 0.30

Knowledge of Teaching 10 1.8 0.58

Knowledge of Students 6 2.2 0.57

Ability to Work with Others 7 2.8 0.51

Total/Average 29 2.6 0.55
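The summary figures in Table 12 are straightforward to reproduce. The Python sketch below is a generic illustration (the function name and input format are assumptions, not the ICIS code): it computes the per-scale mean and standard deviation and the overall average, ignoring skipped ("NA") questions as the manual describes.

import statistics

def summarize(responses):
    """responses: dict mapping scale name -> list of 1-3 ratings, with "NA" for skipped items."""
    rows = []
    all_ratings = []
    for scale, ratings in responses.items():
        scored = [r for r in ratings if r != "NA"]   # skipped questions are ignored
        rows.append((scale, len(scored),
                     round(statistics.mean(scored), 2),
                     round(statistics.stdev(scored), 2)))
        all_ratings.extend(scored)
    # Weighting by the number of questions asked per scale is equivalent to
    # averaging over every rated question.
    total = ("Total/Average", len(all_ratings),
             round(statistics.mean(all_ratings), 2),
             round(statistics.stdev(all_ratings), 2))
    return rows, total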

Third, the above information, plus the interviewer's numerical evaluation of the candidate's responses to each question and the actual questions asked, is written to a file on the user's hard drive for a permanent record. This external file uses the name of the candidate as its label and is located in the same directory as the ICIS program itself. This means that while you can run the interview program directly from the CD distribution disk, if you want a permanent record of the results you will need to first drag the folder containing the ICIS program to your hard drive (or desktop) and run the program from that location. (It is not possible to write to the CD distribution disk.) An example of this printout appears below. Please note that you can manipulate the external file in your word processor to obtain the exact appearance you wish. Experience indicates that setting the word processor to the Times font at 8 pt will probably produce the most satisfying appearance. Another alternative is to use landscape mode.

Table 13 Example of Information Written to Hard Drive
_________________________________________________________________________
INTERVIEW RESULTS for Smith, Jill
_________________________________________________________________________
2/13/03  1:51 PM
INTERVIEWER: Ebmeier
POSITION: High School English and possibly coaching basketball
Normal Interview Form
________________________________________________________________________
SCORE  QUESTION
________________________________________________________________________
WORKING WITH OTHERS
1   What kinds of information do you provide to parents?
2   How do you resolve peer conflicts?
3   If you believe a given school policy is not in the best interest of some students in your class, what would you do?
2   How should parents be involved in the instructional process?
2   If you were given one community volunteer to work with your class for two hours each week, what would you do?
3   Describe how you incorporate the contributions of people from diverse backgrounds into your lessons.

KNOWLEDGE OF CONTENT
1   Identify a particular lesson you have taught. What prerequisite knowledge would students need?
2   How do you provide your students with opportunities to use life experiences in the classroom?
NA  What are the major concepts students should learn in your class?
NA  Describe how you provide examples from life experiences to your students.
2   What concepts are not important to teach in your area and why?
2   How do the content recommendations in the teacher's guide influence your decisions about what to teach?

KNOWLEDGE OF TEACHING
3   How do you decide what examples or metaphors to use to illustrate points in the lesson?
3   How confident are you in your ability to teach all children?
2   What kind of assessment feedback do you provide to students?
NA  What factors do you take into account when planning daily, weekly, and course lesson plans?
2   How do you decide how fast or slow to move through the lesson?
2   How do you choose the activities that you will use to teach a particular concept?
2   A variety of approaches to instruction may be utilized. What approaches have you used to deliver instruction?
3   What resources do you plan to use and/or make available to the students in your classrooms?
2   Describe the teaching strategies you have used to present subject matter. How do you know which one to select?

KNOWLEDGE OF STUDENTS
2   What kinds of skills would you expect typical students in your class to possess?
2   What do you feel is important for you to know about the students with whom you work?
3   What specific student activities and teacher strategies are needed to help special education students?
3   What kinds of teaching models or methods would work best for typical students in your classes?
________________________________________________________________________
SUMMARY REPORT for Smith, Jill
________________________________________________________________________
Scale                    Responses   Mean   S.D.
Working With Others      6           2.17   .75
Knowledge of Content     4           1.75   .5
Knowledge of Teaching    8           2.38   .52
Knowledge of Students    4           2.5    .58
Total/Weighted Average   22          2.23   .59
________________________________________________________________________
COMMENTS

Bright and energetic individual. Possibility of sponsoring the drill squad as well as the senior English position. Her e-mail has changed to [email protected] so the district office needs to make a note here. Jill has some family connections to this area and a brother teaching in District 200.

Technical Qualities

Validity

Validity refers to whether an instrument measures what it claims to measure. There are several different approaches to determining the overall validity of an interview instrument: content validity, construct validity, and criterion-related validity.5 Evidence for each is discussed below.

5 Face validity (the instrument looks as if it is measuring the desired trait) is often cited in measurement textbooks as another form of validity. Appearances are no substitute for evidence, however, so many measurement experts regard face validity as no indication of validity at all.


Content Validity

Content validity refers to the match between the questions on the instrument and the underlying domain that the instrument purports to measure. For example, if the instrument was designed to assess the suitability of an individual for teaching, then one would assume the questions asked during the interview were directly related to the future duties of the teacher as opposed to general personality characteristics or the duties of a principal. To obtain a broad view of the knowledge base and skills needed by classroom teachers, documents generated by the National Board for Professional Teaching Standards, National Council for Accreditation of Teacher Education, National Commission on Teaching and America's Future, Interstate New Teacher Assessment and Support Consortium, Association of Teacher Educators, National Association of State Boards of Education, American Federation of Teachers, National Education Association, Holmes Group, Association for Supervision and Curriculum Development, Florida Domain of Teaching Competencies, Praxis Series, and the American Association of School Personnel Administrators were examined.

These documents, along with input from the two recent publications described below, served as the basis of question selection for the ICIS interview system. The first publication, which served as a primary backbone of the ICIS interview instrument, titled Teachers of the Future, was derived from the work of a national commission of school personnel officers representing large and small districts; urban, suburban, and rural districts; and ethnically and culturally diverse districts. Individuals serving on this commission were members of the American Association of School Personnel Administrators (AASPA), an organization whose members are responsible for the selection and recommendation for employment of more than 80% of the nation's teachers. After an extensive two-year review of the existing literature and practitioner advice, the commission identified nine areas of knowledge and eleven areas of skills needed by all teachers (Table 1). This document has been described as an international job description for the educator of the future and has since become an important foundation for uses beyond educator pre-service programs. Examples of these uses involve the human resource processes of hiring, induction, assessment, and professional development.

The second major document that served as the foundation for selection of interview questions was the Praxis III: Classroom Performance Assessments. This set of instruments and procedures was developed to use with beginning teachers for a variety of evaluation purposes, including licensure and professional development. It builds on a base of knowledge of general principles, particular students and their backgrounds, content and its organization, and other specific knowledge and skills necessary for effective teaching. The Praxis III is an extension of the Praxis I and Praxis II, which have served as the basis for licensure in most states and are widely regarded for their technical sophistication and soundness. The research and development activities that led to the creation of these assessments were carried out by Educational Testing Service in collaboration with a group of practicing teachers over a ten-year period and under the direction of a National Advisory Committee. Since one important aspect of the development of the Praxis Series was the need for the assessments to meet the highest standards of technical and legal defensibility for high-stakes decision-making, the technical quality, procedures, and the documentation process that accompanied the development of all three instruments were carefully specified. In addition, adherence to the development protocols was carefully followed. Extensive reports have been written about the technical qualities of these documents.6

Nineteen assessment criteria, organized into four interrelated domains, form the content core of Praxis III. Evidence for each criterion is expected to appear across a wide range of teaching contexts, and many iterations of field-testing were carried out to ensure that this was, in fact, the case. Table 2 lists these 19 assessment criteria along with the questions from the interview instrument constructed to measure each dimension of the Praxis III.

6 For technical reports about the Praxis III see Dwyer & Stufflebeam, 1996, Teacher Evaluation. In D. Berliner & R. Calfee (eds.), Handbook of Educational Psychology (pp. 765-786). New York: Macmillan; Hood & Parker, 1991, Minorities, Teacher Testing, and Recent U.S. Supreme Court Holdings: A Regressive Step. Teachers College Record, 92, 603-618; and Dwyer, 1998, Psychometrics of Praxis III: Classroom Performance Assessments, Journal of Personnel Evaluation in Education, 12(2), 163-187.


The competencies upon which questions were developed had to be identified in both documents (Praxis III and Teachers of the Future). Some items (knowledge of computers, for example) did not meet that specification and were deleted from consideration. In addition, each suggested competency had to appear in the majority of publications from the national organizations referenced above.

General scoring rubrics for each of the major clusters of questions (themes) used in the ICIS instrument (Knowledge of Content, Knowledge of Teaching, Knowledge of Students, and Working with Others) were then constructed; they appear in Tables 3-10. The scoring rubrics for each question were based on these generalized descriptions of desired/undesired responses for each of the instrument's theme areas. These rubrics were derived from three sources. First, both the Teachers of the Future and Praxis III documents suggest scoring guidelines for each of the domains they identify. The Teachers of the Future document provides these scoring guidelines in the form of suggested responses to "look for," while the Praxis III provides a five-point scoring rubric for each domain. Second, scoring rubrics for questions similar to those in the ICIS instrument, taken from an ASCD publication titled Enhancing Professional Practice: A Framework for Teaching, were consulted. This document is widely used for teacher induction and staff development programs and was constructed based on extensions of the competencies identified in the Praxis III projects. In this document, the author Charlotte Danielson provides descriptions of teacher behavior for each of the Praxis III domains under four headings: unsatisfactory, basic, proficient, and distinguished. Comparing these suggested rubrics to those developed for the ICIS interview instrument helped ground the scoring in a second document. Third, the construction of the scoring rubrics was aided immensely by consulting the process-product research as summarized in the Florida Domain publications, Effective Schooling Practices: A Research Synthesis 1995 Update, and the publications of Tom Good and Jere Brophy.7

Construct Validity

Construct validity is defined as the extent to which the instrument measures the theoretical concept or trait. For example, in this particular case, one of the concepts the instrument purports to measure is a candidate's "Knowledge of the Content." Thus, high scores on the ICIS "Knowledge of the Content" theme should correlate highly with the candidate's GPA in the subject field, standardized tests measuring knowledge in the content field such as the Praxis II, or actual work experience in the field. Construct validity is demonstrated by accumulating patterns of expected correlations. Reason dictates that some variables should be positively related to the ICIS instrument (or various themes on the instrument) and other variables should show little relationship. Thus, we might expect a measure of anxiety to correlate positively with a questionnaire of perceived threats but have no relationship with measures of intelligence or educational level. Approaches to construct validity are generally referenced as convergent and/or discriminant validity. That is, the set of data collected should show a cluster of high correlations among factors where one would theoretically expect relationships and little or no correlation among scores on unrelated measures.
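For readers who want to run this kind of check themselves, the short sketch below (Python with SciPy; the data are invented solely for illustration and are not from any ICIS study) contrasts a convergent correlation, which should be high, with a discriminant correlation, which should be much weaker.

from scipy.stats import pearsonr

# Hypothetical data for five candidates: ICIS Knowledge of the Content scores,
# GPA in the content field (expected to converge), and an unrelated variable
# (expected to show little relationship).
icis_content = [2.8, 1.6, 2.2, 3.0, 1.9]
content_gpa  = [3.7, 2.6, 3.1, 3.9, 2.8]
unrelated    = [30, 25, 10, 20, 15]

r_convergent, _ = pearsonr(icis_content, content_gpa)   # high positive expected
r_discriminant, _ = pearsonr(icis_content, unrelated)   # much weaker expected
print(round(r_convergent, 2), round(r_discriminant, 2))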

To date, four studies of construct validity of the ICIS instrument are in progress, and two studies, one by Tim Allshouse8 and the other by Leslie Evans,9 have been completed. Allshouse compared the scores on the Knowledge of Content sub-scale between experienced teachers with great and with little knowledge of the content field. Allshouse defined teachers with great content knowledge as those individuals who had majors in the teaching field. In contrast, teachers defined in his study as possessing little knowledge were individuals who had very few hours of college credit in the field. Comparison of the average scores on the Knowledge of Content sub-scale revealed significant differences (p < 0.0001). Teachers who majored in their teaching field had vastly superior scores. Indeed, of the 41 teachers interviewed, the instrument was able to correctly predict group membership (high or low content knowledge) 100% of the time.

7 Thomas Good and Jere Brophy, (1995). Contemporary Educational Psychology, Longman Publishing, White Plains, NY; Bruce Biddle, Thomas Good, and Jere Brophy, (1997). International Handbook of Teachers and Teaching, Kluwer Academic Publishers, Norwell, MA; Mary McCaslin and Thomas Good, (2000). Listening in Classrooms, Diane Publishing Company, Collingdale, PA; Thomas Good and Jere Brophy, (2002). Looking in Classrooms, Allyn and Bacon, Boston.

8 Tim Allshouse, (2003). Construct Validity of the Knowledge of Content Scale from the AASPA Interactive Computer Interview Instrument, Unpublished dissertation, University of Kansas, Lawrence, KS.

9 Leslie Evans, (2003). Construct Validity of the Working with Others Scale from the AASPA Interactive Computer Interview Instrument, Unpublished dissertation, University of Kansas, Lawrence, KS.

Evans blindly interviewed a total of 40 elementary, middle, and high school teachers using the ICIS. Their principals had previously classified these teachers as low or high in terms of their ability to work with others, using a provided rubric (Table 3). Comparison of the average scores on the Working With Others sub-scale revealed significant differences (p < 0.001). Teachers identified by their principals as skilled at working with their peers and community members scored much higher. The correct classification rate was 80% in this study.

Criterion Related Validity

Criterion-related validity refers to the relationship between scores and a criterion. A criterion is an established index or measure of the trait or behavior that the interview instrument claims to measure. For example, an instrument measuring fishing ability should exhibit higher scores for individuals who have won the Bass Master's Tournament than for weekend fishing novices. Criterion-related validity has two subdivisions, depending on when the criterion data are collected: concurrent validity and predictive validity.

Concurrent Validity

Concurrent validity is established when the instrument's scores and the criterion data are collected at about the same time, i.e., concurrently. For example, concurrent validity could be established if teachers' scores on the ICIS instrument were highly correlated with their principals' evaluations of their instructional skills in the classroom. Fortunately, several studies that employed questions and formats similar to those of the ICIS instrument have been completed and lend credence to the concurrent validity of the instrument. In the first supportive study, Cowan identified 10 "effective" and 10 "less effective" teachers. Each teacher was interviewed using an instrument composed of 12 job-related questions (similar, and in some cases identical, to those found on the ICIS instrument). Twenty-seven principals, using scoring rubrics similar to those found in the ICIS instrument, rated the 20 teachers' responses from the transcripts. Examination of the resultant data revealed an 86% correct classification rate for the "effective" teachers and an 83% correct classification rate for the "less effective" teachers.10 Emley employed a branching interview format with structured questions and scoring rubrics. He compared the scores obtained from this instrument with principal evaluations of the teachers. Results indicated the interview instrument was able to distinguish between strong and weak teachers with over 95% accuracy.11 Shirk used a computer interview program consisting of 56 questions to see if he could separate effective from ineffective teachers (N=48).12 Again, the results supported the feasibility of using a computer with defined questions and scoring rubrics to identify effective teachers. Shirk was able to correctly classify 100% of the teachers in the study (effective or ineffective) using the results from the computer-based interviews. The best predictors in the Shirk study were the teacher's academic ability, general knowledge of pedagogy, relationships with students, students' need to be challenged, and the importance of enthusiasm for effective teaching. All five of these variables are included to a prominent degree in the ICIS instrument.

Lastly, Ebmeier and Ng13 examined the concurrent validity of the ICIS interview instrument after it had been slightly modified by the inclusion of a scale specifically targeted to urban teaching. By comparing the interview scores of 30 teachers with varying effectiveness ratings provided by administrators in one urban district, significant correlations were found. Regression analysis indicated a significant amount of variance in teachers' effectiveness ratings could be predicted from their scores on the interview instrument.

10 Patrick Cowan, (1999). A Comparison of the Predictive Power of Competency-Based and Personality-Based Structured Interviews in Identifying Successful Teachers, Dissertation Abstracts. ISBN: 0-599-86322-6.

11 Emley, K., and Ebmeier, H. (1997). The Predictive Validity of Branched and Structured Employee Interviews. Journal of Personnel Evaluation in Education, 11(1), 39-56.

12 Larry Shirk, (1997). Predictive Validity of a Computerized Interview Process (Teacher Selection), Dissertation Abstracts AAG9811580.

Predictive Validity

In predictive validity studies, the instrument is administered at one date and used to predict performance at a later date. There have been no quality studies of this type reported in the literature involving teacher interview instruments. This is due to three factors: restricted-range problems, covariance problems, and the time commitment associated with multi-year studies. First, since most school districts use the interview as a selection device, only teachers who obtained high scores on the screening instrument are usually employed. This presents problems for any research design because the full range of scores is not available for analysis.

Second, since scoring high on the interview instrument is correlated with employment, studies that attempt to evaluate the quality of the candidates employed by examining their scores on the interview instrument are really looking at data that are co-dependent. In addition, many studies depend on principal ratings of these teachers, provided by the same individuals who likely conducted the initial interview. Lastly, because predictive studies require several years to complete, they do not fit the real-world requirements of timely publishing and dissertation completion. As a result, little is currently known from well-conceived and executed studies about the predictive nature of teacher interview systems. A reasonable study would be to identify 200-400 teacher candidates from a normal distribution of education majors, interview these individuals, and then follow their careers for 2-5 years using a number of outcome measures (satisfaction, performance, principal ratings, achievement test results, etc.) as the dependent variables. To date this effort has not been undertaken. Such a study is planned, however, using the ICIS interview instrument beginning in 2003.

Reliability

Reliability of a given instrument needs to be considered from two perspectives. The first is the extent to which the instrument can provide consistent measurements on repeated occasions. The term reliability is not an overall assessment of the goodness of a test, nor does it address the problem of whether the test measures what it claims to measure (a validity concern). Reliability answers a single question: if I administer this interview instrument again to the same applicant, is the score likely to move up or down by a large amount or to remain relatively stable? The second consideration is the reliability of the interviewer across time periods and among different interviewers. Differing scores of the same candidate's responses from different interviewers spell trouble for future decision making.

Reliability of Instrument

Although there are a number of procedures for estimating the internal reliability of a given instrument, one of the most common is Coefficient Alpha. Reliabilities greater than 0.80 are considered acceptable for use in a selection instrument, with scores above 0.90 considered excellent. Table 14 below indicates the reliability of the four scales in the ICIS instrument along with the estimated Standard Error of Measurement.

Table 14 Coefficient Alpha Reliabilities of Various Sub-Scales

Sub-Scale Coefficient Alpha Reliability Standard Error of Measurement

Working with Others 0.90 0.08

Knowledge of Content 0.97 0.06

Knowledge of Teaching(1)    0.93    0.09

Knowledge of Students    0.90    0.11

(1) Estimated from preliminary data and subject to revision. The reliability estimates are derived from the complete scale (all questions). The short version estimates are around 0.70, the normal length around 0.80, and the long version around 0.90.

13 Ng, J. and Ebmeier, H. (2006). The Development and Field Test of an Employment Selection Instrument for Urban Teachers, Journal of Personnel Evaluation in Education, Fall.
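The reliability figures in Table 14 can be estimated from raw item-level data with Coefficient Alpha, and the Standard Error of Measurement then follows from the scale standard deviation and the reliability. The Python sketch below is a generic illustration of those two formulas, not the procedure actually used to produce Table 14.

import numpy as np

def coefficient_alpha(item_scores):
    """Cronbach's Coefficient Alpha for a (candidates x items) matrix of ratings."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def standard_error_of_measurement(scale_scores, reliability):
    """SEM = scale standard deviation * sqrt(1 - reliability)."""
    return np.std(scale_scores, ddof=1) * np.sqrt(1 - reliability)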

Reliability of Interviewer

The usefulness of the interview, whatever system is used, is only as good as the training and preparation of the interviewers. If several interviewers observe the same interview session but score it differently, the resultant scores have little reliability and thus almost no meaning. Before a district's administrators can have any confidence in the soundness of their screening interview process, they must be convinced that the gathered information is reliable and independent of the person who conducted the interview. In other words, who conducted the teacher interview should not matter; the scores obtained should be the same.

Before the questions and scoring rubrics on the instrument were finalized, a team of five administrators viewed videotapes of 160 vignettes of teachers responding to the questions on the ICIS interview instrument. The intent was to see if the administrators could consistently select the same rubric level after viewing the tape segments. Results indicated the questions and assigned rubrics were clear and functional. Overall agreement across the five administrators was 93%. Obviously, the more training involved, the higher the inter-rater reliability. Experience to date indicates that the typical administrator will achieve about 80% accuracy without any training.

Bias Analysis

Interview instruments are all subject to bias caused in part by raters' entering dispositions about a candidate's age, experience, race, and gender, in addition to the potential bias inherent in the questions and scoring methods themselves. An example of the former would be female elementary principals favoring male candidates or male secondary principals giving higher ratings to male secondary candidates. Questions more easily answered by female candidates than by male candidates would be an example of the latter type of bias. To estimate the magnitude of various types of potential bias, several studies have been conducted using the questions from the ICIS instrument.

Rater Gender Bias

To estimate gender bias associated with the rater, a series of video clips portraying good, average, and poor responses was shown to 23 graduate students in educational administration. Students were asked to rate the response of each actor according to a provided rubric taken from the ICIS instrument. The film clips were selected randomly from a bank of 60 possible questions and responses; thus each student received a mixture of good, average, and poor responses from both male and female actors. Table 15 below presents the results of this experiment. A score of zero indicates correct identification of the intended quality level of the response. As can be observed from the table, both male and female raters (graduate students) were relatively accurate in their identification of the intended quality level of the actor's response. In general, the average graduate student was off by only 2.5 percent for each video clip and only 5.3 percent (0.11 actual points) across all the clips rated. Overall, the graduate students achieved over 80% accuracy on the video clips. This is quite good since the graduate students received no training prior to engaging in this exercise. It indicates that the typical school administrator can achieve a good accuracy rating with minimal or no training. Table 15 also indicates little difference in rating scores across male and female actors; thus there does not appear to be any gender bias associated with the ICIS instrument. Female graduate students rated the clips more accurately than did the male students, but not at a statistically significant level, as can be observed from Table 16. There were no interaction effects between graduate student gender and actor gender. Overall, the results seem to indicate that the ICIS instrument is free of gender bias and requires little training to use accurately.


Table 15 Mean Scores and Frequency Counts Across Actors and Raters

                 Male Rater            Female Rater          Total
Male Actor       Mean=0.1100, N=109    Mean=0.0075, N=129    Mean=0.0560, N=238
Female Actor     Mean=0.1100, N=100    Mean=0.0027, N=146    Mean=0.0610, N=246
Total            Mean=0.1100, N=209    Mean=0.0182, N=275    Mean=0.0579(1), N=484

(1) Average error rate was 2.5 percent for each video clip evaluated. Average rater total error was 0.11 points from the designated correct answer, or 5.3 percent on a 1-3 scoring scale.

Table 16 Tests of Between Subject Effects

Source            Sum of Squares   df    Mean Square   F       Significance
Corrected Model   1.029(2)         3     0.343         1.087   0.354
Intercept         1.929            1     1.929         6.117   0.014
Actor Gender      0.011            1     0.013         0.036   0.850
Rater Gender      1.013            1     1.013         3.212   0.074
Actor * Rater     0.015            1     0.011         0.037   0.848
Error             151.352          480   0.315
Total             154.000          484

(2) R Squared = 0.007 (Adjusted R Squared = 0.001)
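An analysis like Table 16 can be reproduced with a standard two-way ANOVA on the rating errors. The sketch below (Python with pandas and statsmodels) shows the general form of such an analysis; the few data rows are invented placeholders, not the study data.

import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# One row per rated video clip: the rating error (distance from the intended
# quality level), the actor's gender, and the rater's gender. Placeholder data only.
df = pd.DataFrame({
    "error":        [0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0],
    "actor_gender": ["M", "M", "F", "F", "M", "F", "M", "F"],
    "rater_gender": ["M", "F", "M", "F", "M", "F", "F", "M"],
})

model = ols("error ~ C(actor_gender) * C(rater_gender)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)  # main effects and the interaction term
print(anova_table)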

Experience and Age Bias at the Elementary Level

To examine possible age and experience bias inherent in the questions and scoring rubrics, a sample of 50 practicing elementary teachers from one suburban district was randomly selected and interviewed by a single individual14 using the standard length of the ICIS instrument. Five groups were initially formed based on teaching experience, and then 10 teachers were randomly selected and interviewed from each subgroup. A scatterplot and Pearson correlations were calculated and appear below.

Figure 1 Scatterplot of Interviewee Years of Experience and Corresponding Total ICIS Score

14 Additional information can be found in Gary Stevenson's dissertation, Age and Experience Effects on the ICIS, University of Kansas, 2005.


[Figure 1 scatterplot: x-axis Years of Experience (0 to 30); y-axis ICIS Scores (0 to 3).]

Table 17 Pearson Correlations Coefficients for Years of Experience and ICIS Scores

ICIS cluster Pearson Correlation Coefficient P-Value

Working with Others -0.01 0.95

Knowledge of Content 0.02 0.89

Knowledge of Teaching -0.10 0.48

Knowledge of Students -0.20 0.16

Total ICIS Score -0.12 0.42
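The correlation coefficients reported in Table 17 (and in Table 18 below) can be reproduced on local data with standard statistical software. The following sketch is illustrative only; the experience and score values are invented placeholders, not the Stevenson study data.

```python
# Illustrative only: a Pearson correlation between years of experience and total ICIS
# score, of the kind reported in Table 17. The two lists are made-up example values.
from scipy.stats import pearsonr

years_experience = [1, 3, 7, 12, 18, 25, 30]
total_icis_score = [2.4, 2.6, 2.3, 2.5, 2.2, 2.4, 2.3]

r, p_value = pearsonr(years_experience, total_icis_score)
print(f"Pearson r = {r:.2f}, p = {p_value:.2f}")
```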

Figure 2 Scatterplot of Interviewee Age and Corresponding Total ICIS Score

[Figure 2 scatterplot: x-axis Age of Interviewees (0 to 80); y-axis ICIS Scores (0 to 3).]


Table 18 Pearson Correlation Coefficients for Age of Candidate and ICIS Score

ICIS cluster Pearson Correlation Coefficient P-Value

Working with Others -0.18 0.21

Knowledge of Content -0.08 0.58

Knowledge of Teaching -0.31 0.03

Knowledge of Students -0.38 0.01

Total ICIS Score -0.34 0.01

Additionally, regression analyses were conducted using both experience and age as predictors of the total ICIS scores earned by the interviewees. This model produced an R value of 0.465, and the R-square value was 0.22. The regression equation was (Total score) = 2.727 - 0.019 (Age) + 0.019 (Yrs Teaching). Once again, while the model is statistically significant, such a low R-square value (the proportion of score variability the model explains) makes the practical usefulness of these results questionable.
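To see what the reported equation implies in practice, the short sketch below simply applies it to two hypothetical candidates. The candidates' ages and years of experience are invented for illustration; only the coefficients come from the regression reported above.

```python
# Applies the regression equation reported above:
#   Total score = 2.727 - 0.019 * (Age) + 0.019 * (Years Teaching)
# The two example candidates are hypothetical; only the coefficients come from the study.
def predicted_total_icis(age: float, years_teaching: float) -> float:
    """Predicted total ICIS score (1-3 scale) from the age/experience regression."""
    return 2.727 - 0.019 * age + 0.019 * years_teaching

print(predicted_total_icis(age=25, years_teaching=0))    # roughly 2.25
print(predicted_total_icis(age=55, years_teaching=30))   # roughly 2.25
```

Because the age and experience coefficients are equal in size and opposite in sign, a candidate whose age and experience increase together receives essentially the same predicted score; differences appear only when age rises without corresponding experience, which is consistent with the modest age effects noted above.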


Table 19 Regression Model Summary of Age and Experience

Model 1: R = .47, R Square = .22, Adjusted R Square = .183, Std. Error of the Estimate = .23
Change Statistics: R Square Change = .22, F Change = 6.48, df1 = 2, df2 = 47, Sig. F Change = .003

Table 19 (continued) Regression Model Coefficients of Age and Experience

Model        B      Std. Error   Beta    t       Sig.
Constant     2.73   .15                  18.15   .00
Experience   .02    .008         .57     2.42    .02
Age          -.02   .005         -.82    -3.49   .001

Note: B and Std. Error are the unstandardized coefficients; Beta is the standardized coefficient.

Based on the results of this set of analyses, it appears that age and experience matter, but probably not sufficiently to warrant adjusting the interview scores based on these factors. Clearly, little experience bias exists on any of the scales. Some age bias exists on the Knowledge of Teaching, Knowledge of Students, and Total scales, where older teachers tend to score less well. Therefore, care should be taken not to over-rely on these ICIS scores when comparing candidates of greatly differing ages.

Gender Bias in Practice at the Elementary Level

Based on the Stevenson data described above, tests were run to determine whether there was a significant difference between the mean scores earned on the ICIS interview by males and females. Initially, the mean scores for the female and male interviewees were separated and examined. Levene's Test for Equality of Variances was performed to determine whether there was sufficient evidence to conclude that the variances of the two groups were unequal. The result of this test showed a p-value of 0.56, indicating insufficient evidence to conclude that the variances are unequal.

Next, a test of hypothesis was conducted to determine whether the mean score for women differed from the mean score for men. This hypothesis was assessed with a t-test using a null hypothesis that the mean scores were equal versus the alternative hypothesis that the mean scores were different. Given the results of Levene's test, it was assumed that the two populations had equal variances, and a pooled two-sample t-test was used. The t value produced was 1.54, which has a p-value of 0.13. Since this p-value is greater than the alpha of 0.05, it can be concluded that the mean scores for men and women do not differ significantly.
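Districts wishing to run the same check on their own interview records could follow the same two-step procedure: Levene's test for equality of variances, then a pooled two-sample t-test. The sketch below illustrates the steps with invented score lists; it is not the Stevenson data set.

```python
# Illustrative only: Levene's test followed by a pooled (equal-variance) two-sample
# t-test, as described above. The score lists are invented placeholders.
from scipy.stats import levene, ttest_ind

female_scores = [2.5, 2.3, 2.7, 2.4, 2.6, 2.5]
male_scores   = [2.2, 2.4, 2.3, 2.5, 2.1, 2.4]

lev_stat, lev_p = levene(female_scores, male_scores)
if lev_p > 0.05:
    # Insufficient evidence of unequal variances, so the pooled t-test is appropriate.
    t_stat, t_p = ttest_ind(female_scores, male_scores, equal_var=True)
    print(f"t = {t_stat:.2f}, p = {t_p:.2f}")
else:
    # Otherwise Welch's (unequal-variance) t-test would be the safer choice.
    t_stat, t_p = ttest_ind(female_scores, male_scores, equal_var=False)
    print(f"Welch t = {t_stat:.2f}, p = {t_p:.2f}")
```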


Level of Education Bias at the Elementary Level

Based on the Stevenson data described above, the ICIS scores were separated in order to compare the scores earned by interviewees with only undergraduate degrees to the scores earned by interviewees with graduate degrees. The results of Levene's Test for Equality of Variances indicated there was not enough evidence to conclude that the variances of the two groups were unequal.

Once again, a test of hypothesis was conducted to determine whether the mean score for those with only undergraduate degrees differed from the mean score for those with graduate degrees. This hypothesis was assessed with a t-test using a null hypothesis that the mean scores were equal versus the alternative hypothesis that the mean scores were different. Based on the results of Levene's test, it was assumed that the two populations had equal variances, and once again a pooled two-sample t-test was used. The value of t in this analysis was -1.17, which has a p-value of 0.25. Since this p-value is greater than the alpha of 0.05, it can be concluded that the mean scores for teachers with graduate degrees and teachers with only undergraduate degrees do not differ significantly.

How the Training Program Works

While the studies described above indicate the potential soundness of the questions and rubrics, each district will need to establish the same level of consistency across the individuals it selects as interviewers. The only way to develop consistency in scoring across interviewers is to practice with the same training material until mastery and/or consistency is achieved. That is exactly the purpose of the ICIS computer-based training system (the ICIS Interviewer Training System). This training program provides over 160 video clips of teachers responding in very positive to very negative ways to the interview questions found on the ICIS Interview System (described above). The learner is asked to classify these responses, and the results are compared to the correct answers. The whole purpose of the system is to build a high level of mastery. If incorrect responses are given, the computer recycles the trainee back through additional example video clips and mini quizzes.

The training can be completed either individually, through group meetings, or through some combination of the two. If completed individually, the interviewer would be given a copy of the training program on DVD to take back to his or her office for individual tutoring. The individual would use his or her own computer to work through the differing training modules until mastery is obtained and he or she can successfully pass the exit exam (passing scores are defined by the individual school district). The individual can teach himself or herself at his or her own pace. Importantly, this process can be completed at any time. For example, the system could be used to prepare a new principal to use the district's system or to refresh the administrative staff's skills prior to the spring recruitment period.

If used in a group setting, the training material on the DVD could be displayed using a projection device. The prospective interviewers could view the video clips together and discuss what they think are correct responses and why. These sessions could operate much like cooperative learning lessons in the schools. After this joint training, each administrator could return to his or her office to complete the training process and pass the mastery test, or an exit proficiency exam can be administered to the entire training group at the end of the training session. (A unique proficiency exam is generated randomly from the 160 available video clips; thus, the opportunity to memorize the answers is non-existent.)

Advantages

1. All the training can be completed internally without the need for expensive outside training consultants.

2. Demonstrated mastery is required to complete the training program. A special code is generated when mastery is achieved thereby assuring the HR department that the individual has the specified level of skills.


3. The training program can be completed at any time either in a group setting or individually.

4. The video clips are the best possible simulation of real interviews. The scripted responses from the actors in the video clips demonstrate a wide variety of responses to the interview questions. The training program really challenges the learner to carefully discriminate among possible answers.

5. Working through the training program often is an excellent way to teach administrators about effective and ineffective classroom practices. The rubrics have been designed to reflect what is known about effective instruction and are supported by a substantial body of research in the fields of educational psychology and human learning.

6. There are multiple modules within the Training CD to facilitate many different ways of learning the material. Learners can take practice tests that include feedback about the correctness of the responses, view positive and negative examples, repeatedly view video clips illustrating responses to specific questions, practice using all the video clips and/or view just a random selection of video segments.

Suggestions for Training

The primary purpose of training is to increase inter-rater reliability across all users within the school district. There are many ways to achieve this objective; each district should experiment until it finds one or more effective ways to accomplish this goal. Consistency is the goal--not necessarily getting the “right” answer. The suggestions listed below are training steps that have been effective in the past.

Installation

Installation of both the ICIS interview program and the associated training program is relatively simple. Insert the respective disk into the appropriate disk drive. Open the “My Computer” icon on the desktop and then the DVD/CD disk. Copy or drag the entire program file to whatever location you wish on your hard drive (typically the desktop or “C” drive). If you wish, you can create a shortcut icon and place it on your desktop for quick program access. To start the program, simply double click the brown “Double Click to Start” icon.
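For districts that prefer to script the copy step rather than drag and drop, the minimal sketch below performs the same folder copy. The drive letter (D:) and destination folder (C:\ICIS) are assumptions for illustration only; substitute the locations used on your own machines.

```python
# A minimal scripted equivalent of the drag-and-drop copy described above.
# The source and destination paths are illustrative assumptions, not requirements.
import shutil

shutil.copytree(r"D:\Interview", r"C:\ICIS\Interview")
```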

If your computer does not have a DVD drive, you can still use the ICIS interview program without modification; however, to install the training module you will need to follow one of the four procedures below. One, you can simply call AASPA and they will mail you two CDs with the training program spread across both disks, along with instructions on how to load the program from both disks. Two, you can load the program on a computer that has a DVD reader, burn two CDs with all the files from the “Training” folder, transfer the files to your computer via the CDs, and reassemble the program by putting all files in the same folder. Three, you can load the training program on a computer with a DVD reader and then use a 2GB flash drive to transfer the program to your computer via a USB port. Lastly, you can have your system administrator install the training program on the school district’s server and download the program to your computer through the internet/intranet.

Using the Interview System

1. Make sufficient copies of the ICIS Interview disk for all administrators in the training session. Your “user” license allows the duplication of disks for internal use in the district. Please inform your administrators of this “internal use only” policy. It would be a copyright infringement to make duplicates for individuals outside your district, and doing so would damage the security of the ICIS system. If administrators outside your district are curious about how the ICIS system works, have them contact AASPA (www.aaspa.org) for a free copy of the demonstration program.

2. If the ICIS Interview program is not already installed, have the administrators in the training session copy the ICIS Interview file from the CD/DVD onto the hard drive of a portable laptop computer they brought to the training session (or onto a computer in a lab). Simply follow the instructions on the CD, which are repeated below.
   1. Insert the CD/DVD into the appropriate drive
   2. Copy the "interview" folder to the hard drive
   3. Open the folder on the hard drive
   4. Double click "start" to initiate the program

3. Have the administrators open up the program and look at the first screen. Note that to advance the program a district password is necessary. This password is supplied with the initial material and is designed to be a protective measure should a computer be lost. It also prevents unauthorized access.

4. Have the administrators enter some of the background information as requested on the next screen. Note that once you hit the “enter” key you cannot go back and edit previously typed material. The information entered on this screen is written to the hard drive and is included in the printout.

5. The next screen asks the administrator if he or she wishes to use questions designed for novice or experienced teachers. Basically the questions are identical except for verb tense. Questions for experienced teachers ask candidates how they have addressed a particular situation. Questions for novice teachers ask candidates how they would handle a particular situation.

6. The next screen asks you to select the length of the interview (short, normal or long form). The short form is intended for quick screening interviews you might encounter during teacher career fairs. The long form might be used for final screening where time is of less concern but accurate scores are essential. For training purposes, the short form is easiest to work with because it takes less time.

7. Have the administrators form pairs and practice interviewing each other using the ICIS interview instrument. Have them consider the placement of the computer during the interview and how to enter the scores. Please note that the program allows you to enter the numerical ratings with your left hand, using the tab key (on the left side of the keyboard), so the administrator can maintain maximal eye contact while still entering the scores. Be careful not to position the computer between you and the interviewee; you don't want the computer to block communication.

8. Please point out that if the question is inappropriate, the administrator can have the computer generate an alternative question by entering a "4".

9. When all the administrators have had some success using the ICIS interview program, see if they can find the file written to their hard drive with the information captured during these practice sessions. Have them practice re-formatting the text to the font "Times" 8 pt. to make the report more readable.

10. To finish this first session, you might describe the process you wish the administrators to follow to transmit the data to the HRM department or other record keeping office.

Using the Training Module

Load the training module onto the hard drive or desktop using the instructions above. If you intend to use the program for large-group training purposes, be sure to run through the entire program to make sure all the sections work properly. If you intend to have administrators individually work through the program, go over the installation process slowly. If the training is completed in a large-group setting by one trainer, the following guidelines are helpful.

1. Be sure you set up the projection equipment so that everyone can easily read the words and see the video segments. Be sure the sound is on and sufficiently loud for all to hear.

2. Provide written copies of the overall rubrics (from the Technical Manual enclosed on one of the disks) for each of the theme areas for the participants to reference during the session.

3. Break up the training into 3-4 sessions of 1-2 hours duration. Each session could cover one theme area. Don’t try to do too much at one time, since retention will be limited if the training is too long.

4. Encourage the administrators to carefully read the rubrics provided for each question. Don’t just have them respond using their own frame of reference.

5. Please note that new film clips will not appear until the right answer has been entered.


6. To repeat individual video clips, click the "back" button then the "forward" button. The system works much like a tape recorder where you rewind and then play forward.

7. The program provides suggested answers to each of the video segments. These answers were derived from extensive discussions with a five-member scoring panel of administrators. Your administrative staff might score the video clips differently. That is acceptable as long as you have consensus within your staff about how that type of answer should be scored. Consistency is more important than agreeing with the answers provided by the computer. Don’t get into long discussions about the “correct” answer. In some cases, the computer will accept more than one answer.

8. There are several ways the training sessions could be conducted. Use your own best judgment about how to proceed. One possibility is to spend several sessions on the various theme areas then administer the short quiz. For those administrators who encounter difficulty scoring satisfactorily, you could have them attend subsequent sessions until their skills improve.

9. With the exception noted below, the computer program will not advance until the correct answer has been entered. Every guess counts as an attempt.

10. Within the "Large Group" training module you can present the videos on any of the quizzes without revealing the answers by entering “0”. This will advance the program to the next video. A scoring key will be written to the hard drive that you can use to check individual papers. This checking can be completed at a later date or during the training session.

11. If the training is done individually by the administrator at his or her own computer, then you might encourage the administrators to do the following:

- Focus on one theme area at a time until mastery is obtained. (The district will probably need to provide some guidelines about what is expected; an overall percentage correct of 80% or better is desirable.)
- Carefully read the overall guidelines for each theme area (provided via the computer program) before attempting the section.
- Space out the study sessions over three or four time periods.
- Carefully read the rubrics provided for each question. Don't just respond using your own frame of reference.
- Remember that the computer program will not advance until the correct answer has been entered. Every guess counts as an attempt.
- Attempt the quiz after sufficient practice has been completed. (The district will need to establish an acceptable pass rate.)
- Repeat the quiz as many times as necessary until a desirable score is obtained, then call on the computer to produce a file, to be printed at a later time, indicating proficiency has been attained.

Cautions

Despite its wide acceptance, the employee interview has definite limitations. Researchers over the last fifty years have cautioned employers not to overemphasize the interview's importance. Estimates indicate that even the best interview process generally can account for only about fifty percent of the variance in predicting future employee quality. Prior on-the-job performance is by far the best predictor, followed far behind by performance in simulations, structured interviews, tests, academic performance, unstructured interviews, and recommendations.

These cautions are mentioned here simply to warn the user of the limitations of this or any other interview instrument. Setting specific cut-off scores or making final judgments about the employability of a candidate should not be made based on the numerical scores generated by this instrument alone. Although every effort has been made to craft a reliable, valid, and user-friendly instrument, the final selection process should consist of a number of data gathering activities including background information, interviews, reference checks, simulations, test results, and qualitative feedback from potential colleagues.


Table 20 Solutions to Common Problems

Problem: What if I have a Mac?
Solution: There is a special version of the ICIS interview program for Mac users. Call AASPA and have them send you this modified version. There is currently no Mac version of the training DVD. We do have a videotape version of all the training segments if that might help.

Problem: I get an error message that says I don't have the "XTRA" file available.
Solution: Reinstall the program. All the files must reside in the same folder. The programs need all the driver files and the XTRA file folder to function properly.

Problem: I get error messages from my computer, or it freezes.
Solution: You probably have a program installed on your computer that is incompatible with the ICIS program. This is very rare, but we have run into this situation once. The best solution is to find another computer to use. The ICIS programs will function correctly with very early versions of Windows (Version 95) and only a reasonable amount of memory.

Problem: My Mac version asks me about the "Classic" operating system.
Solution: The Mac has had a number of different operating systems over the years. To make the Mac version functional on most Macs, the ICIS interview program was designed for use on one of the early operating systems. If you have a newer Mac, you might need to install the older "Classic" system, which can generally be found on the disks that came with your Mac. If you have difficulty, take your Mac to someone in your district with computer "smarts" about Macs.


Table 1 Question Alignment to Teachers of the Future Document

Critical Knowledge Needed by Teachers

1. Know the subject(s) they teach and how they are related to other subjects KTP1, KTI1, KC1, KC2, KC3, KC5, KC6, KC7, KC8, KC9, KC10, KC11, KC12

2. Know how to teach the subject(s) to students KA8, KTD1, KTD6, KTA1

3. Know how to assess student progress on a regular basis KTD8, KTA4, KTA5, KTA6

4. Know how to plan lessons in a logical sequence KTP2, KTP4, KTP6, KTD2, KTD7, KS1

5. Know how to reflect on their teaching and devise ways of improving it on an ongoing basis

KTA2, KTA7, KTA8

6. Know how to collaborate with other educators to create the most complete educational environment possible for students

WO1 (also covered in 3S below)

7. Know how to use technology available to us today, at an intermediate level minimally

8. Know and appreciate various cultures, and the larger global society and how to establish rapport with a diverse population of students and parents

WO2, KTD5

9. Know how and where to get needed information and how to educate students to seek and evaluate information

KC4

Critical Skills Needed by Teachers

1. Ability to recognize and respond to individual differences in students KTD3, KTI3, KTI4, KTI5, KTI6, KS2, KS4, KS6, KS10

2. Ability to implement a variety of teaching methods that result in high student achievement

KTP5, KTP7, KTC8, KS5, KS11, KS12

3. Ability to work cooperatively with parents, colleagues, support staff and supervisors

WO4, WO5, WO6, WO11, WO12

4. Ability to display genuine love of teaching students (enthusiasm) KTP7, KTC1, KTC4, KTC5, KTC6, KTI7, KTI9, KTI10

5. Ability to implement full inclusion techniques for special education students

KS7, KS8 (also number 1K above)

6. Ability to differentiate instruction for variety of developmental stages and ability levels

KTA3, KS3, KS9

7. Ability to write, speak and present well

8. Ability to develop critical thinking skills with students KTP3, KTD4, KTI8

9. Ability and willingness to relate to parents and other community members, individual and corporate, in a positive and helpful fashion

WO7, KTD4, KTI8

10. Ability to know and utilize technology in the teaching and learning process
11. Ability to implement conflict-resolution strategies for both adults and students WO3, KTC2, KTC9, KTC3, KTC7, KTC2

WO - Working with others
KTP - Knowledge of teaching-planning
KTD - Knowledge of teaching-delivering instruction
KTI - Knowledge of teaching-interactions with students
KTA - Knowledge of teaching-assessment
KTC - Knowledge of teaching-climate
KC - Knowledge of the content field
KS - Knowledge of students


Table 2 Question Alignment to Praxis III Document

Educational Testing Service, Princeton, New Jersey

Domain A: Organizing Content Knowledge for Student Learning

1. Becoming familiar with relevant aspects of students' background knowledge and experiences

WO2, KTC7, KC4, KS3, KS4, KS5, KS6, KS10

2. Articulating clear learning goals for the lesson that are appropriate to the students

KTP4, KC1, KC2, KC3, KC5, KC6, KS8, KC7, KC9, KC10, KC11

3. Demonstrating an understanding of the connection between the content that was learned previously, the current content, and the content that remains to be learned in the future

KC8, KC12, KS1, KS2

4. Creating or selecting teaching methods, learning activities, and instructional materials or other resources that are appropriate to the students and that are aligned with the goals of the lesson

KTP1, KTP2, KTP3, KTP5, KTP6, KTD1,KTD2, KTD3, KS7, KS9, KS11

5. Creating or selecting evaluation strategies that are appropriate with the goals of the lesson

KTA4, KTA5, KTA6

Domain B: Creating an Environment for Student Learning

1. Creating a climate that promotes fairness KTC3, KTC4
2. Establishing and maintaining rapport with students KTD5, KTC2, KTC5
3. Communicating challenging learning expectations KTC7, KTC6
4. Establishing and maintaining consistent standards of classroom behavior KTC1, KTC9, KTC10, WO3
5. Making the physical environment as safe and conducive to learning as possible (covered in Domain B3)

Domain C: Teaching for Student Learning

1. Making learning goals and instructional procedures clear to students KTC8 (also in questions A1, A2, A3, and A4)

2. Making content comprehensible to students KTD6 (also in questions A1, A2, A3, and A4)

3. Encouraging students to extend their thinking KTD4, KTI1, KTI2, KTI4, KTI5, KTI6, KTI8

4. Monitoring students' understanding of content through a variety of means, providing feedback to students to assist learning, and adjusting learning activities as the situation demands

KTD8, KTI3, KTA1, KTA2, KTA3, KS12

5. Using instructional time effectively KTD7, KTC8

Domain D: Teacher Professionalism

1. Reflecting on the extent to which the learning goals were met KTA7, KTA8, WO7
2. Demonstrating a sense of efficacy KTI7, KTI9, KTI10
3. Building professional relationships with colleagues to share teaching insights and to coordinate learning activities for students WO1, WO9, WO10, WO11, WO12

4. Communicating with parents or guardians about student learning WO4, WO5, WO6, WO8

WO - Working with others
KTP - Knowledge of teaching-planning
KTD - Knowledge of teaching-delivering instruction
KTI - Knowledge of teaching-interactions with students
KTA - Knowledge of teaching-assessment
KTC - Knowledge of teaching-climate
KC - Knowledge of the content field
KS - Knowledge of students


Table 3 Levels of Development--Working with Others

Level Description Examples
Level 1 Egocentric orientation--concerned more about self

than others. Others are valued for what they can provide.

Does not believe in the "social capital" principle where the construction of an interactive web of relationships of parents, children, teachers, and the school community is important.

Teachers view themselves as relatively independent from their colleagues. They rarely seek or accept assistance.

Cooperation with others is viewed as necessary but to be avoided if possible.

Establishes self-contained classroom with little interaction with others

Rarely if ever invites community members to participate in the instructional process.

Parents are viewed as outsiders with little to contribute

Communication with parents is one-way Asks colleagues for help or ideas only as

necessary and often view this assistance as coming from a personal weakness

Cooperation with others on committees is viewed as a necessary evil

Sharing of materials with other teachers is rare Parents are viewed as enforcers of the teacher's

rules but not active assistants in the learning process

Level 2 Focuses on own classroom but sees the importance of school coordination and interactions with others for the "good of the school"

Works cooperatively with other teachersBelieves in the value of coordination for specific

purposes such as curriculum sequencing, discipline policies, building school spirit, etc.

Views parent help as important but within the context of the teacher's instructional program

Believes community members can be a good source of lesson enhancement but must coordinate with teacher's plans

Works on and with curriculum committees to establish outcomes, curricular sequences, and assessment measures

Works with others to establish consistent discipline routines within the school

Routinely employs parents as "helpers" Communicates with parents about the

importance of students completing homework or other assigned activities outside of school

Communicates student progress information with parents

Listens to parents during established conference times but makes only minor modifications to the planned lessons as a result of this parent information

Level 3 Altruistic motivation is the driving force for these teachers. Concerned with the larger good

Great respect for "social capital" ideaEnjoys contact and interactions with othersViews interactions with others as essential for own

personal developmentDerives energy and sense of well-being from

interactions with othersViews own participation in community activities as

essential for group welfare.

Likely to ask the question, "How will this affect the students or school?"

Clearly interested in building a sense of school community and cohesiveness. Volunteers for committees to bring the school and community together

Actively seeks to work with others and thrives in such an environment. Enjoys mentoring

Believes the collective power of the staff is more potent than the sum of the school's individual members

Views parents as equal partners in the education of children.

Views the integration of the school and community as essential for relevance, cooperation, building of values, and growth.

Actively involved in community activities, especially those focused on improving the larger community or generating positive assistance for the school


Table 4 Levels of Development--Knowledge of Content

Level Description Examples
Level 1 Minimal knowledge

Lacking basic college coursework in much of the field

Teacher barely stays ahead of students Extensive reliance on one text or teacher's guide Lack of creation of extension material Lets students teach each other with little teacher guidance or

thought Teacher does not know if textbook has errors or other

perspectives exist Teacher often has gaps in knowledge base or inaccurate

knowledge/understanding Teacher may make content factual errors Student probing questions are often deflected or postponed Teacher's product questions far exceed process questions Tests tend to be factual not conceptual or predictive in

orientation Depends on singular text to decide what to cover Teacher will actually skip activities suggested in the text or

curriculum guide if perceived to call for student conceptual understanding (memory/ recitation often the standard)

Level 2 Adequate knowledge base typical of a 36 hour college major in the subject field

Solid grounding in the field with preparation covering all content which he/she teaches

Rarely makes errors of fact, expressions, or logic in content field

Typically relies on external guides, curriculum, text to decide what content to select

Does not question standard field organization or usual curricular content/format

Can pull information from multiple sources/texts and will often decide to cover some parts of one text but not other parts based on intuitive understanding of student needs

Testing includes factual information and also some conceptual probing of the student

Level 3 Expert knowledge typical of a major with more than 50 hours in the field.

Additional coursework in collateral fields.

Experience in a professional job in the field. (Teaching the subject K-12 would not count as field experience.)

Evidence of subject knowledge currency.

Connects subject to students' external world Links subject to other content areas Designs or organizes own content knowledge or organization

not found in existing texts Possesses information not in general knowledge or written

sources Can easily extend students' questions to other areas (facilitate

transfer) Committed to the subject field and continuous active learning Continues to expand content for students based on recent

findings (content current and timely) Understands multiple organizations within the field Knows subject fields not typically covered in K-12

curriculum Clear idea what students should know from field independent

of text/curriculum based on logic or experience Content organized and presented based on how students learn

and may be independent of standards or typical publications Possesses understanding or viewpoints not discussed in typical

K-12 textbooks/ curriculum guides


Table 5 Levels of Development--Knowledge of Instruction (Delivery of Instruction)

Difficulty Description of Teacher Skill Examples
Level 1 Coherence of an instructional delivery

plan is typically lackingTeacher dwells on simple one-step

teacher behaviors that are relatively independent of other behaviors. No connections made between differing parts of the lesson

Coherent models of teaching are generally lacking

The teacher discusses the lesson as separate parts with little consideration given to prior or subsequent learning

Activities generally do not reinforce lesson development and often appear to be selected randomly or without thought

The teacher is aware of some general teaching strategies (lecture, group work, etc.) but has little idea of specific steps involved in each model or how the steps should be executed

The selection of activities, examples, metaphors, etc. generally does not match the lesson objectives and might even serve to confuse students.

Level 2 The teacher considers multi-part segments of the lesson, how they fit together, and why the sequence is important

Behaviors by the teacher are a result of preplanning and smooth execution.

These behaviors are generally independent of the students' behavior.

Several different models of teaching may be considered as possible, but decisions are made prior to much diagnosis.

Mechanical implementation of a particular model of teaching from one of the families suggested by Joyce and Weil (behavioral, social, information processing, personal)

Careful planning of a music lesson devoting specific time periods of practice to selections within the pieces and focusing on certain techniques students need to master within those sections

Teaching a literature unit followed by assigned student reaction papers and grading of those papers via a specific rubric

Level 3 Teacher considers multi-part behaviors that are selected and executed based on ongoing analysis of classroom events.

The teacher generally makes future directional decisions based on current classroom events.

Teacher is prepared to use different models of teaching to achieve the same outcome.

Selection of the model is dependent on the desired outcomes and expected reactions and background of students

New instructional designs might be devised on the spot.

Teacher makes an intentional shift from a direct presentation to a review when it becomes apparent that "guide" students do not understand important concepts or processes. Subsequent analysis on the teacher's part (by additional probes of student understanding) sets a direction for future teaching behaviors.

The teacher initiates a class discussion and makes decision about the direction of the lesson from the students' reactions.

The teacher makes active decisions concerning what important concepts to summarize from the discussion and where to direct the class next.

Artful implementation of various models of teaching is common where the teacher knows the characteristics of each model, the necessary elements to follow, and when to abandon the model.

Each part of the lesson is carefully integrated and intentional


Table 6 Levels of Development--Knowledge of Instruction (Planning)

Difficulty Description of Teacher Skill Examples
Level 1 Teacher planning behaviors focus on his

or her role in the class. The primary focus is on what the teacher

intends to do within the class setting relatively independent of considerations about the students' learning goals.

Little consideration is devoted to the linkages among goals, activities, and assessment.

Planning activities are very dependent on textbook suggestions or similar outside guidance.

Teacher decides to review the previous lesson for about 5 minutes then move into a mini-lesson about adverbs. She then plans to assign 5 short sentences asking the students to identify 4 adverbs and provide a reason why for their answers.

Teacher decides to do a demonstration illustrating the differences between hydrogen and oxygen gas. She plans to ignite the two gasses and also demonstrate the difference in mass.

Level 2 At this level of teacher planning, the teacher begins to think about what he or she wants the students to be able to do at defined points in the lesson (not simply at the end).

Often the teacher will use student outcome objectives as a guiding principle to help with content selection, materials management, testing, and so forth. Often this skill is labeled "Task Analysis".

The content and methods of the academic discipline often dictate the nature of the teacher's lesson at this level.

The teacher might think to herself--now let me think, I want the students to be able to eventually add 2 digit numbers involving carrying. To do so they first will need to be able to ……., after that the students will have to ……, then finally they will need to be able to …… . At the end of the first ten minutes, the students should have mastered …… . To achieve these objectives I think I could use these materials (she goes on to explain) and organize them like this. Having the students interact with the materials in this way (she goes on to explain) should result in this type of learning.

Level 3 Teachers at this level begin to incorporate branching designs into the planning of their lessons, so that they could easily vary the content and method based on classroom feedback.

They have several possibilities they could pursue, but will wait to make final decisions until they obtain additional diagnostic feedback from the class or individual students.

Activities, outcomes, assessment, procedural strategies, etc. are clearly conceived but flexible and can be quickly organized and re-organized based on ongoing assessment (dynamic) of current student feedback.

A chemistry teacher thinks to himself--although it is important for the students to understand how the Periodic Table is organized, it is more important they understand the process scientists use to attempt to make sense of seemingly unrelated bits of information. To give the students a sense of this discovery process, I will give each group of students 50 cards containing information about 50 different fictitious atoms. It will be their job to organize the 50 cards into some form of order. There are several possibilities the students could come up with (arranging by size, color, state of matter, melting point, etc. or some combination) so I will have to wait until I see their logic before the next part of the lesson can be designed. For example, if they put the cards in order of the number of electrons in the outer shell, then we can talk about families of elements. I have about 10 different ordering concepts in mind but will discuss them in order of their discovery by students.


Table 7 Levels of Development--Knowledge of Instruction (Climate Development)

Level Description Examples
Level 1 General lack of student attentiveness to

academic tasks.Caustic environment exists with little

respect for others. Student safety may be an issue.

Lack of classroom rules, rule practice, rule compliance, and rule monitoring

Inconsistent and unpredictable discipline practices Teacher has difficulty maintaining student's attention--

frequently off-task. Confusion is common within the classroom. Withitness lacking and desists are often off target or

directed to entire class Students hesitate to ask questions or express opinions Lesson momentum problems are common (pacing, start-

stop actions, fragmentation, etc.).
Level 2 Students are attentive to teacher

directed instruction but not necessarily when working in unsupervised groups.

Students ask lesson-based questions but are still guarded about expressing opinions not related directly the content of the lesson.

Rules are developed by the teacher, taught to the students, and monitored.

Withitness is present with the teacher actively anticipating most potential behavioral problems.

Desists are correctly structured and directed appropriately.

Students feel physically and psychologically safe in class. On-task student behavior is common when teacher is in

the room directing activities. Behavioral problems are infrequent, but if they occur they

are swiftly addressed. Teacher open to some diversity of opinion as long as

previously discussed in class
Level 3 Students self-regulate behavior

commensurate with the learning goals.

Discipline problems are rare. Students feel comfortable, cohesive,

secure, interested, and value learning.

Students will share feelings and aspirations with the teacher and class members.

Rules are jointly developed and enforced by the teacher and students.

Students feel free to express minority opinions--even very radical ones.

Withitness is extreme. Teacher knows exactly what is going on everywhere and can anticipate almost all events.

Pace is lively and directed toward learning objectives. Teacher praise is appropriated, directed as needed, and

functions to support classroom interactions and learning. The teacher can leave the room for brief periods with

little degradation in learning. Enthusiasm is common among the students--a certain

"electricity" can be felt within the classroom.


Table 8 Levels of Development--Knowledge of Instruction (Assessment)

Level Description Examples
Level 1 Assessments viewed as a means of

student control. Assessments are infrequent, rarely

measure important concepts, poorly designed, with results rarely affecting instruction.

No attempt made to prepare students for the assessments.

Singular dimensions are typical of assessments.

Student feedback is delayed and non-specific

Teacher threatens to give students a test if they do not behave

Assessments are delayed for long periods of time Assessment timing or coverage does not

correspond with the lessons Student practice exercises do not correspond with

the format of the assessment Feedback is non-specific and does little to help the

student understand how to correct learning Assessment administration is poorly planned and

executed--too little or too much time, inadequate materials, poorly designed questions, general student confusion about what to do

Teacher often misestimates student ability to perform on the assessment

Level 2 Assessments are viewed as a means of grading and to some extent providing feedback to the teacher about instruction.

Assessments may tap several skill areas but are primarily paper instruments.

Students are prepared for the content of the test but sometimes are surprised by the form.

Student feedback is prompt and reasonable but may be limited to correct and incorrect.

Teacher typically administers multiple-choice, essay, or short answer tests

Tests are graded in the conventional way indicating errors and correct answers

Teacher usually custom tailors test to course objectives

Assessments and practice activities correspond Assessments flow from the lessons and homework Teacher makes some adjustments in the lesson

based on the formal assessment results Assessments serve as the basis for grades or other

summary marks Feedback is limited to corrections and general

information about class descriptive statistics
Level 3 Assessments viewed as a means of

diagnosing individual student process and product understanding.

Students are carefully prepared to take the assessments.

Multiple dimensions of student understanding and performance are measured.

Assessments are well designed and scored.

Student feedback is rapid, detailed, and addresses student strengths and weaknesses

Assessments and the lessons correspond Assessments are multi-dimensional and measure

various aspects of the lesson--cognitive achievement, skill development, etc.

Assessments measure various achievement levels--analysis, synthesis, knowledge acquisition, etc.

Students practice assessment via simulations Feedback is detailed, prompt, and individualized Corrective teaching follows errors Clear scoring algorithms are present and known to

the students Assessments are heavily used as a basis for

planning of next instructional sequence


Table 9 Levels of Development--Knowledge of Instruction (Interactions)

Difficulty Description Example
Level 1 Information and interactions are often confusing for

students. They have difficulty knowing what is expected.

Teacher discourse is often unconnected and rambling from one point to another.

Vocabulary, illustrations, metaphor selection is unfamiliar to students.

Students have difficulty sorting out important material from background information--retention and marker techniques are rarely employed.

Non-verbal communication may conflict with the verbal requests of the teacher or difficult for student to interpret.

Students are hesitant to engage in discussions or answer questions.

Correctness of answers are the focus of student-teacher interactions.

Teacher seems unwilling to address student individual or group problems.

Threatening or difficult to answer questions are ignored.

Teacher asks a question of a specific student. The student just sits there without making any response at all because he or she either doesn't understand the question or is not aware that he or she is expected to say anything.

The teacher asks a student to identify the three causes of the Vietnam war. The student responds that there were large student protests, the draft was unpopular, and LBJ elected not to run for a second term. While these responses are related to the time period, the student was not able to correctly sort through all the teacher provided detail to respond appropriately.

The teacher asks a question of John. John responds with an incorrect answer. The teacher says, "That is not correct" and then asks another student.

Level 2 Teachers at this level engage in typical interchanges with students. Some questions are answered correctly and other incorrectly which usually prompts the teacher to rephrase or return with another question.

Vocabulary is familiar to students but may not be that currently used by students.

Many students are engaged in discussions but a large portion remain silent or only respond if asked.

The teacher is open to discuss controversial topics but only within the context of the lesson.

Non-verbal communication is appropriate and generally clear to students.

Teacher asks the students as a whole, "What is 4 + 4?" She then specifically directs the question to Jill. Jill responds incorrectly, and the teacher repeats the same question without comment but this time directs the question to John.

The teacher is holding a general discussion about the Vietnam War and its initial effect on the US economy. Half the class seems engaged, but many of the boys in the back seem to be staring out the window and rarely respond unless asked.

Level 3 Teacher presents information in a way that increases the chances students will comprehend

Teacher thematically connects statements and links student responses to prior material

Teacher uses a vocabulary familiar to students and rephrases when necessary

Teacher questions are understandable to students and rephrased when needed for additional clarity

Teacher uses discourse marker techniques to indicate what is important in the subject matter including marking expressions, repetition, and numeration of major points

Teacher employs non-verbal behavior as a way of signaling students

Negative student responses are dignified and redirected, ultimately searching for an opportunity (redirecting, amplifying, restating, and refocusing)

The teacher asks questions using information familiar to the students’ background.

The teacher asks John, "What are the colors of Germany's flag?" John responds, "Red, white, and blue." The teacher then says, "I think you are thinking of the USA flag John, why don't you take a few moments to look that one up and I'll ask you again in a few minutes."

The teachers says, "Now class, yesterday you will remember that we talked about the impact of LBJ’s decision not to run again on the war effort, today we will extend that discussion to the home front. At the end of the discussion you should be able to identify the three top changes that decision had on the Democratic party."


Table 10 Levels of Development--Knowledge of Students

Level Description Examples
Level 1 Minimal teacher knowledge of

educational psychologyTeacher lacking basic exposure to

students of this age or background Teacher views all students as similar

Unfamiliar with student expressions, attire, interests, music, etc.

Views students in blocks with little differentiation Has difficulty anticipating entering students' knowledge base Often incorrectly diagnoses levels of student mastery or

understanding Often selects activities which have little student interest Teacher rewards often do not match motivational interests of

students Teacher is often at a loss to explain specific student behavior

in or out of class Teacher lacks common vocabulary such as stimulus-response

theory, behavioral modification, etc.
Level 2 Academic knowledge, student

teaching experience, and non-school related teaching exposure to students such as summer camps or church school.

Teacher knows some personal background information about most students.

Generally familiar with at least student surface-level interests but this knowledge may be related to a specific context such as summer recreation programs

Begins to integrate class background information into lesson design

Teacher knows principles of educational psychology and understands how they can be applied in the classroom but may not have much experience doing so.

Teacher realizes the diversity that might be present in the classroom and begins to make adjustments

Begins to see individual students in the class and modify instruction accordingly

Level 3 Academic knowledge, teaching experiences in the same context, community experience, and out-of-class contact in students' environment

Views students as unique individuals

Knows virtually all students well and also many parents/community members

Understands student inside jokes and expressions and can easily interact with students

Almost always correctly anticipates student reactions to teaching situations

Lesson design intentionally considers how individual students will react and learn.

Can correctly diagnose level of student understanding of both simple and conceptually complex material

Understands how home background will influence student reactions in class

Can describe background characteristics of individual students and how they influence learning in the classroom

Can anticipate reaction patterns of individual students in the classroom to the same lesson

Knows how students will respond to each other given a specific classroom lesson or activity