ASSESSING PERCEPTIONS
Using Education for the Future Questionnaires
By Victoria L. Bernhardt

Where did the questionnaires come from, what do they tell us, and are they valid and reliable?

Not to understand another person’s way of thinking does not make that person confused.
Michael Quinn Patton


In 1991, while working intensively with a group of schools that were looking for ways to improve their results, Education for the Future created questionnaires to help the schools understand their learning environments from the perspective of students, staff, administrators, and parents. The schools thought they were doing a good job, but they did not have any real input from their customers and staff. The casual conversations in the school halls and teachers’ lounge, at extracurricular events, parent conferences, and the grocery store provided narrow observations; and the schools wanted to hear from a wider range of those who lived in the community and those who were directly impacted by the services that the schools offered. Education for the Future designed, administered, and analyzed the results of the questionnaires. The questionnaire results helped each of the schools improve its operations and increase student achievement.

Today, Education for the Future continues to use updated versions of these questionnaires—across the United States as well as in other countries. Education for the Future has added related questionnaires to its collection, currently offering more than a dozen different questionnaires to assist schools and districts with continuous improvement.

This paper describes how these questionnaires were developed and how they are currently being used. Definitions of perceptions, validity, and reliability are presented first.

Perceptions

The definitions of perception and its synonyms provide almost enough information to understand why it is important to know the perceptions of our students, graduates, teachers, administrators, and parents.

The word perception leads us to such words as “observation” and “opinion,” with definitions that include—

◆ a view, judgment, or appraisal formed in the mind about a particular matter

◆ a belief stronger than impression and less strong than positive knowledge

◆ a generally held view

◆ a formal expression of judgment or advice

◆ a judgment one holds as true

Synonyms offered by the Merriam-Webster Dictionary include opinion, view, belief, conviction, persuasion, and sentiment:

◆ Opinion implies a conclusion thought out yet open to dispute.

◆ View suggests a subjective opinion.

◆ Belief implies often deliberate acceptance and intellectual assent.

◆ Conviction applies to a firmly and seriously held belief.

◆ Persuasion suggests a belief grounded on assurance (as by evidence) of its truth.

◆ Sentiment suggests a settled opinion reflective of one’s feelings.

All of us have perceptions of the way the world operates. We act upon these perceptions every day as if they are reality. Basically, we do not act differently from what we value, believe, or perceive. In schools, we want to know what is valued, believed, or perceived. In other words, we want to know what has to be in place in order for students to learn—from the student perspective—and what is possible from the teacher and administrative perspective.

Assessing Perceptions

Common approaches to understanding perceptions in schools include the use of questionnaires, focus groups, and interviews. While each of these approaches provides good information, questionnaires may be the best way to assess perceptions because they can be completed anonymously and readministered to assess changes in individuals’ experiences and thinking over time.

A questionnaire is a system for collecting information to describe, compare, and explain knowledge, attitudes, perceptions, or behavior. Good questionnaires have the following features:

◆ Relay a strong purpose for participants to complete the questionnaire

◆ Are concise and to the point

◆ Contain items that everyone can understand in the same way

◆ Include all participants because we want all those answering the questionnaire to feel their opinions are valued, plus we want the results to be used later

◆ Start with more general items and lead to more specific items

◆ Have response options that make sense for the question

◆ Use analyses appropriate to the items and their response options

◆ Provide reports that truly present the results clearly and accurately

Questionnaires must be based upon an underlying assumption that the respondents will give truthful answers. To this end, items or questions must be asked that are—

◆ valid—ask the right questions

◆ reliable—will result in the same answers if given more than once

◆ understandable—respondents know what you are asking

◆ quick to complete—brain-compatible, designed well, and concise

◆ able to get the first response from the respondent—administration and set-up

◆ justifiable—based on a solid foundation

Validity

Validity is about asking the right questions to justify what you get in the end. If the content of a questionnaire matches a situation that is being studied, then the questionnaire has content validity.

The content validity of Education for the Future questionnaires was ensured during the process of questionnaire development. Items were drafted based on the literature about effective schools and included issues important to students and teachers interviewed by Education for the Future staff before, during, and after questionnaire administration. The questionnaires were revised after years of input from all parties and are regularly monitored for updating. The reasons we use the items and questions we do are shown in the tables in Figures 1 through 3, discussed later in this paper.

Reliability

Reliability is a measure of an assessment instrument, such as a questionnaire, indicating that if we administer the same instrument over time, we will get the same results.

Education for the Future wanted its questionnaires to be reliable. However, we also wanted the questionnaires to be able to show change, if there was change in a learning organization.

Our questionnaires were administered in the original Education for the Future schools in October and April three years in a row. We found mostly the same results for students and parents in October and April within each year. We could see the questionnaire results change from April to October when there were changes implemented in school processes. These findings made us think that whatever perceptions students and parents have at the beginning of a school year are the same perceptions they will have at the end of the year, unless there has been some systemic change.

Staff questionnaire results change when changes are made in the system, such as the creation of a vision or implementation of a new plan. If something different is implemented, or relationships change, teachers’ responses on related items change. We find that student responses will change if teachers’ responses change. If teachers’ responses do not change, student responses do not change. Our current reliability coefficients are .93 for the elementary student questionnaire, .97 for the secondary student questionnaire, .86 for the staff questionnaire, and .90 for the parent questionnaire.
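Coefficients in this range are commonly internal-consistency (Cronbach’s alpha) values, though other reliability statistics are possible. As a purely illustrative sketch, and not the actual procedure used for these questionnaires, such a coefficient could be computed from a respondents-by-items table of 1-to-5 ratings like this (the data and function name are hypothetical):

```python
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """Internal-consistency reliability for a respondents-by-items matrix.

    `responses` has one row per respondent and one column per item,
    each cell holding a 1-5 rating (no missing values).
    """
    n_items = responses.shape[1]
    item_variances = responses.var(axis=0, ddof=1)      # variance of each item
    total_variance = responses.sum(axis=1).var(ddof=1)  # variance of total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: five respondents rating four items on the 5-point scale.
ratings = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```

A value near 1.0 indicates that items move together across respondents. Stability over time, which the October and April readministrations described above speak to, would instead be estimated by correlating the two administrations of the same instrument.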

Where the Items Came From

The items used in the Education for the Future questionnaires were created from the research about student learning and what students, teachers, and parents tell us have to be in place in order for students to learn. For example, William Glasser (The Quality School, 1990) believes that students have to feel safe, like they belong, have freedom, fun, and choices in their learning in order to learn. Students tell us that the one thing that has to be in place in order for them to learn is that their teacher(s) cares about them.

Figures 1 through 3 describe why the different items are used in our student, staff, and parent questionnaires, respectively.

Note: We purposefully do not use exactly the same items or questions in student, staff, and parent questionnaires. In our experience with questionnaire administration, we have found that the items or questions for each of these groups need to be different. For example, we think it is valid and important to ask students if they feel like they belong at school. It is not valid to ask parents if they think the students feel like they belong at the school—that information would be second-hand. It is valid and important to ask parents if they feel welcome at the school. It would not be appropriate to ask students if they think their parents feel welcome at the school. How does anyone know how someone else feels? We want to ask questions of the source.


FIGURE 1
Education for the Future Student Questionnaire


FIGURE 2
Education for the Future Staff Questionnaire


FIGURE 3
Education for the Future Parent Questionnaire


Disaggregations

We believe that, collectively, these questionnaire items are powerful. We also know that the items become even more powerful when we disaggregate the responses. The disaggregations that we use for each questionnaire grouping follow (a brief computational sketch appears after the lists below):

Students

◆ Gender

◆ Ethnicity

◆ Grade

◆ Extracurricular participation (high school)

◆ Grade level when first enrolled in the school (high school)

◆ Plans after graduation (high school)

◆ School within schools identification

Staff

◆ Gender (if there are appropriate numbers in each subgroup)

◆ Ethnicity (if there are appropriate numbers in each subgroup)

◆ Job Classification

◆ Grades and subjects taught

◆ Number of years of teaching

◆ Optional: Teaching Teams, Professional Learning Communities, etc.

Parents

◆ Number of children in this school

◆ Number of children in the household

◆ Children’s grades

◆ Native language

◆ Ethnic background

◆ Who responded (Mom, Dad, Grandparent, Guardian)

◆ Graduate of this school (high school)
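To make the disaggregation idea concrete, here is a minimal sketch of one way disaggregated item averages might be computed. The field names, items, and values are hypothetical and are not drawn from the actual Education for the Future instruments; the sketch simply groups responses by demographic fields and averages each item on the 5-point scale.

```python
import pandas as pd

# Hypothetical student responses: one row per respondent, with
# demographic fields plus 1-5 ratings on a few illustrative items.
responses = pd.DataFrame({
    "gender":   ["F", "M", "F", "M", "F", "M"],
    "grade":    [3, 3, 4, 4, 5, 5],
    "i_belong": [5, 4, 4, 3, 5, 4],   # "I feel like I belong at this school"
    "t_cares":  [5, 5, 4, 4, 5, 3],   # "My teacher cares about me"
    "is_fun":   [4, 3, 4, 2, 5, 3],   # "I have fun learning"
})

item_columns = ["i_belong", "t_cares", "is_fun"]

overall_means = responses[item_columns].mean()                       # all respondents
means_by_gender = responses.groupby("gender")[item_columns].mean()   # disaggregated
means_by_grade = responses.groupby("grade")[item_columns].mean()

print(overall_means.round(2))
print(means_by_gender.round(2))
print(means_by_grade.round(2))
```

In practice, a subgroup would be reported only when it contains enough respondents to protect anonymity, as the staff gender and ethnicity disaggregations above already note.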

The Scale

Education for the Future staff worked hard to create items that participants could respond to quickly, with results that could be displayed in a meaningful way and easily interpreted. We wanted staffs to be able to see the item results relative to each other. To do this, we piloted many different scales, including 99, 10, 7, 6, 5, 4, and 3-point scales. We ultimately, and easily, chose a 5-point scale. Any scale that had more than 5 points upset the respondents—it was too difficult to respond to such intricate distinctions. Respondents gave us less information and did not complete the questionnaire when they did not like the response options. The even-numbered scales did not allow us to average the responses, and averaging provides the easiest understanding of the relationship of the responses to each other. The even-numbered scales did not allow respondents to give a response that indicated half the time “yes” and half the time “no,” or “just do not have an opinion at this time.” The three-point scale did not discriminate enough.

Most staffs are used to results being displayed by the percentage or number of responses in agreement or disagreement with each item. Staffs struggle to find the most important, the least important, and what to do with the information. By using averages and displaying those averages together on a line graph, staffs see the degree of agreement and disagreement across items and can look at the relationship of the item content to other item content.
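As an illustration of that kind of display, the following sketch plots hypothetical item averages for two groups on a single line graph, similar in spirit to Figure 4; the items and values are invented for the example and are not actual questionnaire results.

```python
import matplotlib.pyplot as plt

# Hypothetical average responses (1 = strongly disagree ... 5 = strongly agree).
items = ["I belong", "Teacher cares", "Learning is fun", "I feel safe", "Work is challenging"]
female_means = [4.6, 4.7, 4.1, 4.4, 3.8]
male_means = [4.3, 4.5, 3.7, 4.2, 3.9]

fig, ax = plt.subplots(figsize=(8, 3.5))
ax.plot(items, female_means, marker="o", label="Female")
ax.plot(items, male_means, marker="s", label="Male")
ax.set_ylim(1, 5)                     # show the full range of the 5-point scale
ax.set_ylabel("Average response")
ax.set_title("Average item responses by gender (hypothetical data)")
ax.legend()
plt.tight_layout()
plt.show()
```

Plotting every item on one graph, rather than reporting items one at a time, is what lets a staff see high and low items in relation to one another, as described above.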

Our 5-point response options are most often: strongly disagree, disagree, neutral (neither agree nor disagree), agree, and strongly agree. Sometimes we use “effective” response options in place of “agree,” when appropriate for the items. These options are placed from lowest to highest, or left to right, in brain-compatible order. All items are written in a positive manner so the results do not need to be inverted to understand the most positive and the most negative responses. We totally disagree with statisticians who say you need to ask questions in two different ways to make sure your respondents are telling the truth. We have found this method frustrating to respondents; plus, writing questions in a negative fashion leads to double negatives and is usually not brain-compatible. Not using brain-compatible methods will elicit results that make you think some of the respondents must have inverted the scale. However, one cannot arbitrarily invert the scale for them. The analyst must accept their responses. Typically, one could end up getting unreliable data, or have to throw out such questions.

Changing Perceptions

Is it possible to change perceptions? Absolutely. How do we get perceptions to change? The most effective approach to changing perceptions is through behavior changes. This means if some teachers do not believe in an approach being proposed for implementation in the classroom, one way to change the teachers’ minds is to increase their understanding of the approach and give them an opportunity to experience it. Awareness and experience can lead to basic shifts, first in opinions, and then attitudes and beliefs. This is why many schools have parent nights when there is a change in a math or technology curriculum. Giving parents an opportunity to experience the approach helps them understand the different perspective, which could make them more supportive of the program. This is also why excellent Professional Development programs have coaching and demonstration components along with their content training.

Another way to change perceptions is through cognitive dissonance, an inconsistency between two or more thoughts, opinions, or ideas. Cognitive dissonance is the discomfort one feels when holding two ideas that are inconsistent. Cognitive dissonance creates perception changes when people experience a conflict between what they believe and what they, or trusted sources, experience.

In order to change the way business is done, schools establish guiding principles, which include the purpose and mission of the school. These principles grow out of the core values and beliefs of the individuals who make up the school community. Sometimes school communities adopt guiding principles that they want and hope to believe in, as opposed to those that they do believe in. The idea is that those who try out behaviors that are consistent with these principles will see positive impact, leading to change in their internal thinking and beliefs in those principles. This is okay. Changed attitudes represent change at the deepest level of an organization’s culture.

Too often schools think of their guiding principles as being sacred and static. They might be sacred, but they should never be static. Even if a school keeps its guiding principles intact, their meanings evolve as people reflect and talk about them and as the principles are applied to guide decisions and actions.

An example of behavior changes preceding perception changes follows:

Blossom Middle School teachers were given a questionnaire about their values and beliefs about technology—how they believed technology would increase student learning, and in what ways e-mail, the Internet, and videoconferencing used in instructional units would impact student learning. Additionally, the students were given a questionnaire asking them their impressions of the impact of technology on their learning.

For two years, the results were almost the same: Nothing was happening with respect to the implementation of technology or perceptions about technology in the classroom. In the meantime, teachers became involved in their own professional learning with demonstration and coaching components; administration placed typical staff meeting items on e-mail, requiring teachers to begin implementing technology for personal use. With these strategies, teachers began implementing technology in their classrooms—resulting in major behavior changes.

During the following year, it became clear from the questionnaire results that the classrooms were different because teachers were using technology—first for their own benefit, and then with and for students. When teachers’ actions changed in the classroom with the use of technology, their ideas and attitudes changed about the impact technology could have with respect to increasing student learning. It was also easy to see in the student questionnaire responses that student perceptions of the impact technology could have on their learning also changed—after the teachers’ behaviors and attitudes changed.

Again, if we want perceptions to change—and we usually do as we implement new concepts and innovations—we need to change behaviors. As the example above illustrates, to change student perceptions, teacher perceptions must change, which requires teacher behavior to change.

Just a quick note about changing teacher behaviors: When we survey teachers about making desired changes in their classrooms, very close to 100% of the teachers who are not implementing the vision, or teaching to the standards, will say it is because they do not know what it would look like if they were implementing these concepts in their classrooms. This has huge implications for how change is supported. Powerful professional learning designs need to be incorporated to support and ensure that behaviors change.

What We Have Learned About the Education for the Future Questionnaires

People ask us all the time what they should expect to see in the results—what are typical results for different student age groups, parents, and staff members. They also ask what the results say about their learning organization. While we prefer respondents to bring their own meaning to the results, below are some of the typical findings.

Across All Respondents

What respondents, especially school staffs, like most about our questionnaires are the displays of results and the fact that the items are meaningful (Figure 4). By showing the average responses of each item with all other items on the same line graph, staffs can understand the relationships of the items to each other, the relationship of the high responses to the lower responses, and how the highs could provide leverage for bringing up the low item responses. This display has additional advantages: Staffs are more likely to look across all of the items before creating a plan, as opposed to picking the lowest item and then creating a plan.

The line graphs provide clear displays of disaggregated responses by different groups of respondents, e.g., males/females. The disaggregated responses have helped many staffs find issues within different student groups they did not know existed. The disaggregated responses have also shown differences in how teachers perceive their work, by the number of years of teaching, and the grades or subjects they teach.

When questionnaire items are short and understandable to all respondents in the same way, when the items progress from general to more specific, respondents are able to tell us very quickly if they agree or disagree with each of the items. We have found that it takes passion for respondents to “strongly disagree” or “strongly agree.” When first looking across items, we search for those items about which students, staff, and parents are most passionate. These are leverage for improving the low scores. When we look across the low scores, we can see relationships of these items to each other. The plan for improvement might include one major piece that will improve all items.

FIGURE 4
Our Elementary School Total Student Responses by Gender

No matter what questionnaire you choose to use, you must always follow up on the information to understand what respondents are saying—never assume.

Students

For the most part, the younger the students, the more in agreement they are with all the items. The older the students, the less in agreement they are. Certainly there is a developmental aspect in play here; however, there are young students who are not in strong agreement with the items, and there are high school students who are in strong agreement with most of the items. We think the item responses truly reflect the learning environment. If students are treated well and they like the way they are learning, their responses will be mostly in agreement and higher. If students do not have fun learning, and they do not like the way they are learning, most of their responses will be low. Students who do not feel that the teacher, or teachers, care about them or treat them fairly have overall low responses.

Staff

Teacher questionnaire responses show the degree to which staff work together to create a continuum of learning for all students, if there is a clear and shared vision, and if the administrators are adding value to school processes, from the perspective of staff. We have found that teacher morale is very closely related to how much staffs work together and are led by strong administrators.

Teacher questionnaire responses tell us what is being implemented and what is possible with respect to school improvement.

Parents

Parents basically report back to us what their children tell them about school over the dinner table. Most often the degree of parent agreement with items matches student agreement/disagreement. However, if parents do not feel welcome at the school, or feel that they are not sure how to help their children learn, they will not be involved in their children’s learning.

Summary

Education for the Future created a valid and reliable set of questionnaires for students, staff, and parents in 1991 that continue to be used extensively today. The questionnaires have been updated over time based on feedback provided by Education for the Future customers. These questionnaires have shown the impact of school change on students, staff, and parents. In addition, the questionnaires have provided valid and reliable information for schools to know what needs to change in order to get different results. All of the people involved in schools, as well as those who are not directly involved, have perceptions about how well schools are doing. We can use questionnaires to discover the perceptions that people have so that we can improve the negative perceptions and build on the positive ones. Perceptual data are valuable and useful to ensure positive changes in schools and in entire districts.


About Education for the Future

The mission of the Education for the Future Initiative is to support and build the capacity of schools to provide an education that will prepare students to be anything they want to be in the future. Education for the Future strives to assist schools to become better equipped to educationally prepare students to achieve their goals.

Education for the Future is a not-for-profit initiative, located on the California State University, Chico campus, that focuses on working with schools, districts, State Departments of Education, and other educational service centers and agencies on systemic change and comprehensive data analyses that lead to increased student learning. In addition to workshops and consulting, Education for the Future provides questionnaire support services to schools and districts who are participating in school reform efforts and not yet able to facilitate parts of the entire questionnaire process.

Victoria L. Bernhardt
Executive Director

Education for the Future Initiative

400 West First Street

Chico, CA 95929-0230

Tel: (530) 898-4482 ~ Fax: (530) 898-4484

E-mail: [email protected]
Web: http://eff.csuchico.edu