
8/8/2019 Prospectus Version 2.8

    Amanda Wilson

    October 25, 2010

    EdS Thesis Prospectus

    How Students Perceive Their Learning

    An exploration of student reflections on assessment

    Statement of the Problem

    There has been a shift in higher education from teaching objectives to student learning

outcomes. Much of this drive has been precipitated by a shift in focus among regional accreditors

of schools from how many resources a school has to its educational effectiveness. The focus on

student learning outcomes facilitates a culture of continual improvement of the educational

institution as a whole, one that strives to engage the entire academic community. Authentic

    assessment measures are becoming more common as supplements to more traditional assessment

    measures to provide valuable data needed to inform this culture of change and improvement. The

outcomes assessment movement in foreign language classes has embraced these authentic

    assessment measures, mostly in the form of portfolio-style projects for students. Unfortunately,

    there is less research on how students are receiving these various forms of assessment and what

    they perceive as the benefits or drawbacks of each. The purpose of this study is to explore the

    reflections of students in a beginning level Spanish class towards various forms of traditional and

    authentic assessment tools.

    Significance of the Problem

    An Action Research Study Based in Classroom Research

    The focus of my study will be at the classroom level, specifically focused on students

    from my beginning Spanish 1 classes. This will be an action research study that will build out of


    Classroom Research, as developed through the work of Dr. Patricia Cross. Cross defines

Classroom Research as "ongoing and cumulative intellectual inquiry by classroom teachers into

the nature of teaching and learning in their own classrooms" (1996, p. 2). In her text on action

research, Hendricks states that "the purpose of action research is for practitioners to investigate

and improve their practice" (2009, p. 3). These two perspectives, on research as investigation by

teachers into their own practice for the purpose of improved student learning, allow me to approach

    the issue of student perspectives on assessment in a very localized but very powerful way. By

    seeking out the opinions of my own students on the specific assessment tools I have used in their

    classes, we will have a concrete context in which we can explore their insights on how they learn

    and the way assessment affects their learning.

Cross lists the characteristics of Classroom Research as follows: "Learner-Centered,

Teacher-Directed, Collaborative, Context-Specific, Scholarly, Practical and Relevant, and

Continual" (1996, p. 2). My study will focus on my students' perspectives of learning. It will of

    course be facilitated and directed by me, as their former teacher. It is collaborative in the sense

    that my students will help direct my research, conclusions, and future behaviors based on their

    feedback. This study is context-specific because it focuses on how these assessments are applied

    in, not just a foreign language class, but specifically a beginning Spanish class. The project is

    scholarly because it builds off ideas and insights researched and presented in the literature review

    below. My study is practical and relevant because I will use the data I gather to form conclusions

that will inform my future practice. Finally, my study is continual in the sense that it will serve as

    a foundation for future work on student reflections and working with different types of

    assessment, which will undoubtedly play a part in my doctoral research pursuits. Cross

comments that "a Classroom Research project is not a one-shot effort that is completed,

published, and assumed to contribute one more brick to building the wall of truth" (1996, p. 12).

    This study does not aim to predict how all students will view and appreciate various

    assessment techniques. Nor should the conclusions drawn be assumed to show any one truth.

    This will be an exploration of a select few students taking a specific course taught a certain way

by me at one given time. The results of this study would most assuredly be different if it were

    duplicated. The purpose of this study is to explore the reflections of specific students in a

    specific beginning level Spanish class towards various forms of traditional and authentic

    assessment tools. I hope that by gathering and reflecting on their various opinions, I can make an

    informed decision about how to make the tools I use more effective for the learning experiences

    of future students.

    The process is one of trial and error and any study will give specific information that may

be difficult to generalize to other populations. Cross says that "Classroom Research is based on

the premise that generalizations across classrooms are, at best, tentative hypotheses, to be tested

within the specific context of a given classroom" (1996, p. 12). This study aims to create another

    piece of the puzzle. By collecting student reflections on assessment, it aims to give a tiny insight

    into how some students perceive these tools. The hope is that those insights, while very context-

    specific, might inspire me, and hopefully others, to continue asking the questions that must be

    asked in the classroom: How do students learn? How can teachers facilitate student learning?

    What tools can educators employ to create the most effective learning environment possible?

At the very foundation of what it is to be an educator must be the hope, desire, and

aspiration to help make sure it just keeps getting better.

    Fundamental Assumptions

    I hold a few fundamental assumptions going into this study. The first assumption that has


    always rung true for me is that students are experts at being students. I teach university-level

beginning Spanish 1 and my students are primarily freshmen and sophomores, the very youngest

    being 17 years old. Said another way, every student in my class has been going to school for at

    least 12 years. While few have any knowledge of pedagogy, they have some awareness of how

    they learn. I believe that by asking students to reflect on the way they learn and what assessment

    tools have benefitted their learning experience in my beginning Spanish 1 courses, I can learn to

    see my classroom from their perspective and improve the way I approach and plan future classes.

Students are not teachers and may not always see what a teacher does or understand why they

    do it, but by gathering data from these learning experts, I can better understand how to facilitate

    their learning.

    The second assumption that I hold going into this study is that good assessment is not

    simply a measure of student accomplishment but a method to engage and promote learning. I

    chose to focus on assessment tools in this study because I worry that the goals of my course may

not align well with the ways I assess learners. I see this as a major worry across various institutions

of learning and from various colleagues and students. It is common to hear anecdotes about

diplomas not being worth the paper they're printed on or about classes that are taken just to get the

grade with no care for or focus on learning the material the class purports to cover. These are

pessimistic views, and I am not crying foul on the whole educational institution by any means, but

    there is a grain of truth in every story. I chose to pursue this study because I want to find the

    places where my assessment practices are aligning with my course goals and continue to pursue

    those avenues while discovering weaknesses in my practices that I can improve upon. I believe

    that by focusing on studying my current assessment methods, I can determine where students are

    simply going through the motions of assessment to demonstrate accomplishment and where they


    are really being engaged by these tools. That is not to say that I believe accomplishment and

    engagement are mutually exclusive, simply that I want to make sure I am working to promote

    learning. If student learning becomes more effective through the use of the tools, I hope that

    demonstration of accomplishment and levels of engagement will also benefit.

    It Just Keeps Getting Better

By conducting this action research study, founded in Cross's theories on Classroom

    Research, and focused on my specific students and assessment tools, I believe I will gain

    valuable insight into student perspectives on various forms of traditional and authentic

    assessment tools and lay a foundation for me to continue improving my teaching practice. The

    most important objective is to learn to be the most effective facilitator of student learning that I

can be through continual reflection and improvement.

    Literature Review

    A Focus Shift: Student Learning Outcomes

    A culture change is occurring in higher education that shifts the primary focus of

    educators from teaching the subject matter of specific disciplines to a perspective of student

    learning (Allen 2004, p. 1). As departmental, organizational, and institutional cultures undergo

    change, and as the focus of that change is less on teaching and more on learning, a commitment

    to sustainable outcomes assessment becomes essential (Hernon, et al. 2006, p. 1). According to

Allen, this type of assessment occurs when "empirical data on student learning is used to refine

programs and improve student learning" (2004, p. 2).

At the classroom level, Palmer encourages viewing student assessment as "a strategic

tool for enhancing teaching and learning" (2004, p. 194). He goes on, a few pages later, to add

that "Continuous assessment starting early in the semester has the benefit of quickly identifying

those students falling behind and perhaps at risk of dropping out, so remedial action can be

taken" (p. 198).

Allen comments that "while classroom assessment examines learning in the day-to-day

classroom, program assessment systematically examines student attainment in the entire

curriculum" (2004, p. 1). In their 2001 work, Ratcliff, et al., point out that the continuous

    improvement cycle should begin with clear departmental goals that identify what a student can

expect to gain as a result of studying a particular field or discipline (p. 25). Hernon, et al. add

that "programs and institutions need to develop a strong and sustainable commitment to

assessment as a process and as a means to improve learning based on explicit student learning

outcomes" (2006, p. 11). Ratcliff, et al. further point out that "While a college's or university's

general goals for student achievement can be measured at the university level, the accreditation

self-study must address student academic achievement in the discipline. Therefore, departments

and programs must contribute to the accreditation self-study by assessing their students' learning

at the department level" (2001, p. 32).

    Classroom, program (or department), and institutional assessment support a foundation

    for accreditation. As this culture shift continues to push the focus towards student learning,

accreditation standards continue to link student outcomes assessment to the continued

accreditation of programs and schools (Ratcliff, et al. 2001, p. 13).

    Regional Accreditors on Assessment

Allen tells us on page 18 of her 2004 text that accrediting organizations "generally

focus on two major issues: capacity and effectiveness." She goes on to explain that capacity is

the bean counting of the process. It is when tallies are taken of the resources any institution has

to support its students, such as libraries, technology, physical space, and student support services.


    The focus, however, has really turned more towards a long-term commitment to improving

student learning (Hernon, et al. 2006, p. 1). Allen continues on to say that accrediting

organizations "expect campuses to document their impact on student learning" (2004, p. 18) and

that "when accrediting bodies require assessment, campuses pay attention" (2004, p. 2). She

cautions, however, that "assessment . . . should be implemented because it promotes student

learning, not because an external agency requires it" (Allen 2004, p. 2).

    Closing the Loop

On pages 163 and 164, Allen imparts some friendly suggestions, one of which is to

"close the loop," stating that "good assessment has impact" (2004).

    assessment is that it will drive change towards the continual improvement of the quality of the

educational system. As Ratcliff, et al. put it, "Assessment and accreditation are both premised on

the importance of quality assurance" (2001, p. 17).

    A Community of Assessors: Collaboration is Key

To move toward these lofty goals, faculty, institutional research offices, and "everyone in

the educational enterprise, has [the] responsibility for maintaining and improving the quality of

services and programs" (Ratcliff, et al. 2001, p. 17). Assessment of student learning outcomes

"includes all members of the [educational] community as they strive to contribute to and enhance

the educational enterprise" as a whole (Ratcliff, et al. 2001, p. 17).

    Authentic Assessment as Means to Focus on Student Learning Outcomes

    Brown, et al. expound the functions of assessment on page 47 of their 1999 work as six

    points:

1. Capturing student time and attention.

2. Generating appropriate student learning activity.

3. Providing timely feedback which students pay attention to.

4. Helping students to internalize the discipline's standards and notions of quality.

5. Marking: generating marks or grades which distinguish between students or which enable pass/fail decisions to be made.

6. Quality assurance: providing evidence for others outside the course (such as external examiners) to enable them to judge the appropriateness of standards on the course.

    With these purposes in mind, assessment methods can be examined to determine their validity.

    On pages 62 and 63, the authors discuss traditional unseen written exams and how they function

as assessments: "In particular, this assessment format seems to be at odds with the most

important factors underpinning successful learning . . . there is cause for concern that traditional

unseen written exams do not really measure the learning outcomes which are the intended

purposes of higher education" (Brown, et al. 1999). Palmer echoes these concerns on page 194 of

his 2004 paper on authenticity in assessment, stating that "traditional forms of assessment can

encourage surface learning rather than deep learning." Banta goes a bit further, with her colorful

    simile to discourage purchasing more traditional assessment measures for the purposes of

improving student learning: "Just as weighing a pig will not make it fatter, spending millions to

test college students is not likely to help them learn more" (2007, p. 2).

Watson points out "a need for more authentic, learner-friendly methods to encourage

[student] engagement" (2008, p. 1), which seems to align with Brown, et al.'s first and second

functions of assessment listed above (1999, p. 47). Watson goes on to point out, a bit further on,

that "the assessment of authentic performance . . . has the potential to address a number of

contemporary criticisms of assessment" (2008, p. 1), such as those quoted above from Banta,

Palmer, and Brown, et al. Banta also notes that "authentic and valid assessment approaches must

be developed and promoted as viable alternatives to scores on single-sitting, snapshot measures

of learning that do not capture the difficult and demanding intellectual skills that are the true aim

of a college education" (2009, p. 3). She continues in the same paper on page 4 that the point "is

knowledge creation, not knowledge reproduction" (Banta 2009, p. 4).

Brown, et al., bring it together nicely when they state that "ultimately, assessment should

be for students . . . [as] a formative part of their learning experience" and that "students who

develop their test-taking skills the best tend to succeed in assessment whether or not they are

the most qualified in their field" (1999, p. 58). In other words, their first two functions of

assessment, "capturing student time and attention" and "generating appropriate student learning

activity," along with the fourth and sixth, "helping students to internalize the discipline's

standards and notions of quality" and providing for quality assurance, are just as important as

the fifth, "marking," which tends to get all the attention but most often seems more prone to be

partially invalid in the case of many traditional assessment measures (Brown, et al. 1999, p. 47).

Authentic assessment, in contrast, brings the focus of assessment to student learning outcomes

because "the onus is on lecturers to be able to demonstrate that assessment is measuring well

what it is intended to measure, thereby forcing an increase in the validity of the assessments"

(Brown, et al. 1999, p. 59).

It is important to note that the point is not to throw away traditional assessment

measures. They can still serve some of the functions of assessment well. As Ratcliff, et al. state

on page 28 of their 2001 text, "formative and summative assessment methodologies provide the

department or program with evidence of their students' learning." While traditional summative

assessments can, and should, support the functions of assessment processes, their results cannot

stand alone to inform the process of continual improvement of learning (Ratcliff, et al. 2001, p.

  • 8/8/2019 Prospectus Version 2.8

    10/25

    28). To really get at that sixth purpose of quality assurance, a balance is needed (Brown, et al.

    1999, p. 47).

    Outcomes Assessment in Foreign Languages

"Trends over the last couple of decades in foreign language methodologies have espoused

communicative goals of instruction . . . yet, examinations in foreign language courses typically

are pen and paper exercises that single out discrete points of grammar or vocabulary" (Higgs

1987, p. 1). Higgs raised the warning almost twenty-five years ago that if foreign language

educators really want to set communication as a goal for their students, then "assessment

procedures must test for communicative function" (1987, p. 1). While these pen and paper exams

are still commonplace, language classes have seen an influx of authentic assessment measures

    (Sullivan 2006, p. 590).

    Most of these assessments have come in the form of portfolio-style projects. Banta

    commented in her 2007 article on assessment that portfolio assessments would be the most

    authentic because students develop the content themselves (p. 4). Studies of English as a Foreign

    Language (EFL) learners have found that portfolio-style assessments have contributed to student

    learning, especially when combined with other assessment measures, and that portfolio

    assessments help students take ownership of their learning (Barootchi, et al. 2002 & Caner

    2010). Additionally, one study found that some EFL students in writing courses preferred the

    portfolio assessments over more traditional assessments (Caner 2010, p. 1).

    There are many styles of portfolio assessments depending upon the specific needs of the

    assessment, but the seemingly most popular version in language learning is the self-assessment

    portfolio. The European Language Portfolio (ELP) was the model for the American adaptations:

LinguaFolio and the Global Language Portfolio (Cummings, et al. 2009, p. 1). These portfolios

"present a learner-empowering alternative to traditional assessment" (Cummings, et al. 2009, p.

1). Moeller points toward these self-assessment models as more valid forms of assessment than

more traditional assessment models (Moeller, 2010, "Self-assessment in the foreign language

classroom"). She describes LinguaFolio, in particular, as "a portfolio that focuses on student self-

assessment, goal setting and collection of evidence of language achievement" (Moeller, 2010,

"LinguaFolio"). The students set their language goals and, based on the evidence of their own

    work that they collect, determine when their goals are met (Fasciano 2010, slide 9). Moeller

    points out that if language educators are using LinguaFolio effectively, it will necessitate moving

    away from teacher-centered methodologies and toward learning-centered outcomes because it is

    by its very definition a learner-centered self-assessment tool that facilitates the processes of goal

setting and self-reflection and establishes intrinsic motivation in students (Moeller, 2010,

"LinguaFolio"). Brown, et al. also mention that a major advantage of these types of assessments is

    that they promote intrinsic motivation through personal involvement because the student is

    taking charge of their learning (1999, p. 75).

    Student Perception on Assessments

"At present, students often feel that they are excluded from the assessment culture, and

that they have to use trial and error to make successive approximations towards the performances

that are being sought in assessed work" (Brown, et al. 1999, p. 58). Because of the reflections

    gathered from their students, Brown, et al. encourage innovation in assessment (1999, p. 81).

They also stated that, to some students, conventional forms of assessment "appear to have no

relevance to anything outside the university" and are all about judging them, sometimes on a

somewhat arbitrary basis, rather than involving them in genuine learning (Brown, et al. 1999, p.

81). Instead they found that "students appreciate assessment tasks which help them to develop

knowledge, skills and abilities which they can take with them and use in other contexts such as in

their subsequent careers," and they encourage "assessment which incorporates elements of

choice" because "it can give students a greater sense of ownership and personal involvement in

the work and avoid the demotivating perception that they are simply going through routine tasks"

(Brown, et al. 1999, p. 81).

    Research Questions

    As previously stated, the purpose of this action research study is to explore the reflections

    of students in a beginning level Spanish class towards various forms of traditional and authentic

    assessment tools. To this end, I plan to focus my research on the overarching question: What are

    the perceptions of undergraduate students related to traditional and authentic assessments used in

    an introductory Spanish course? To support this overarching question, I will explore the

    following five sets of sub-questions:

1. What do students think are the benefits or limitations of each type of assessment on their learning? Do students think these assessments reflect their learning?

2. How do students feel that these assessments can enhance or detract from their learning experience?

3. What factors do students feel affect the impact of each type of assessment on their learning?

4. What preferences do students express toward each type of assessment? What are their reasons for these preferences?

5. What recommendations do students have for enhancing the perceived effectiveness of each type of assessment?

    Figure 1.1 below demonstrates which data collection methods, described in the methodology


    section of this work, should address each research question listed above.

Figure 1.1

Research Questions and Related Data Collection Methods Matrix

Overarching question: What are the perceptions of undergraduate students related to traditional and authentic assessments used in an introductory Spanish course?
Methods: Interview Questions; Survey Statements

Sub-Q 1: What do students think are the benefits or limitations of each type of assessment on their learning? Do students think these assessments reflect their learning?
Methods: Interview Questions; Survey Statements

Sub-Q 2: How do students feel that these assessments can enhance or detract from their learning experience?
Methods: Survey Statements

Sub-Q 3: What factors do students feel affect the impact of each type of assessment on their learning?
Methods: Interview Questions

Sub-Q 4: What preferences do students express toward each type of assessment? What are their reasons for these preferences?
Methods: Interview Questions; Survey Statements

Sub-Q 5: What recommendations do students have for enhancing the perceived effectiveness of each type of assessment?
Methods: Survey Statements

    Methodology

    Context/Setting

    My research will be conducted by gathering data from former students of my fall 2010

beginning Spanish college courses. I teach at a public, state-funded institution and, although we

are moving towards being a more research-focused institution, the current focus is more aligned

    with teaching. I am fortunate to have a great deal of freedom and control in my classroom. While

    it is true that my general curriculum and my textbook are mandated by the tenured faculty of my

    department, I am free to choose whatever path I believe will best help my students achieve the

    course goals. That is to say that while I am not free to choose what I teach, I am free to

    determine how to facilitate student learning. While I also receive feedback from peers once a

    year, I feel no other demands from any supervisors on my teaching methods. This allows me to

    constantly experiment with ways to improve how I teach my classes. I am currently in my sixth

    semester teaching these courses and I can say with certainty that no two semesters have held very


    much in common outside of my general teaching philosophy. I am constantly trying to improve

    my methods based on what I have perceived as being effective. This hands-off situation created

    by the administration allows me to be fluid in my methods. While the freedom to teach the way I

    feel is best has many advantages, it also carries heavy responsibility; I have to rely on my

    perceptions of my students' learning with little feedback from anyone else.

    The classes I teach are capped at 28 students. This is a moderate number of students for a

    beginning foreign language class. While it would be ideal to have a smaller number because it

    would allow for more individualized attention, there are advantages to this class size as well.

    With this many students it is easier to employ group learning strategies, allowing students to

facilitate their own and each other's learning processes. These large classes make it easy to use

    traditional assessment measures because they are easy to administer and assess, even in so large

    a group. Authentic assessments, like portfolio-style assessments, are more challenging to

    administer and evaluate because they take longer to facilitate, collect, and assess but can provide

richer, more detailed feedback to students. In this particular semester, the one I am currently

teaching and the one on which my students' reflections will be based once I begin my research in

January, I chose to employ various assessment measures, both traditional and authentic, to provide these

    students with a concrete context on which to base their reflections.

    Participants

    For this project, there will be two primary participant groups. First, I will invite the 79

    students taking one of the three sections of beginning Spanish 1 in fall 2010 to participate in a

broad attitude survey (see Appendix B). These students run the gamut in class rank, from

freshmen to seniors, and in age, the youngest being 17 and the oldest being over thirty, as

    well as in educational experiences and majors. I hope to see at least 40% of these students


    respond to the survey to ensure a valid sample.

    The second group of students I plan to solicit for this study will ideally be a group of

    nine. I would like to ask three students from each of the three sections to participate in individual

    interviews. I will choose which students to ask to participate in this study based on their

    performance levels in class. Ideally, I will find one high-, one mid-, and one low-performing

    student to ensure a broader range of perspectives on the assessment measures. I will use a semi-

    structured interview guide (see Appendix A) and record the interviews digitally.

    Research Plan

    I will complete my research by two methods: individual interview and a broad attitude

    survey. First, I will employ a semi-structured interview method to ask the nine students,

    described above, to explore their reflections on various forms of traditional and authentic

assessment tools used in their previous semester of beginning level Spanish. I will use the semi-

    structured interview guide (see Appendix A) to solicit their opinions on these assessments. I will

    record the interviews digitally, after having each student sign an informed consent form (see

    Appendix C). When I begin analyzing their reflections, I will create anonymity for my students

    by giving each student a pseudonym, known only to me, before categorizing each one as either

    high-, mid-, or low-performing.

    Additionally, I will email an invitation to all 79 students taking one of the three sections

    of beginning Spanish 1 in fall 2010 to participate in the broad attitude survey (see Appendix B)

which will be housed online. This survey will ask these former students to comment on their

engagement and motivation levels and on the perceived effectiveness, for their learning

experiences, of the various assessment tools used throughout the course. I hope to have at least

32 of the students surveyed respond, which would be approximately 40% and enough for a valid


    sample. I plan to send out an invitation to these students in early January, after grades have

    posted for the semester, asking them to complete the survey by February 1, 2011. On January 31,

2011, I plan to send another email reminder asking students to complete the survey. If

participation is still under 50%, I will send a final email request during the second week of

February 2011. This survey will be completely anonymous because it will not ask for any

identifying information (see Appendix B).
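The response-rate targets above rest on simple arithmetic: 32 responses out of 79 invitees is roughly 40%, and reminders continue while participation sits below 50%. As a minimal illustrative sketch (not part of the study instruments; the helper names are my own), the logic can be expressed as:

```python
# Illustrative sketch of the response-rate arithmetic described above.
# The figures (79 invitees, a 40% target, reminders while under 50%)
# come from this research plan; the function names are assumptions.

INVITED = 79
TARGET_RATE = 0.40         # ~32 responses: minimum hoped-for valid sample
REMINDER_THRESHOLD = 0.50  # keep sending reminders while under this

def response_rate(responses: int, invited: int = INVITED) -> float:
    """Fraction of invited students who have responded."""
    return responses / invited

def needs_reminder(responses: int) -> bool:
    """True while participation is still below the reminder threshold."""
    return response_rate(responses) < REMINDER_THRESHOLD

# 32 of 79 responses is about 40.5%: above the target for a valid
# sample, but still below 50%, so a further reminder would go out.
assert response_rate(32) >= TARGET_RATE
assert needs_reminder(32)
```

Because the survey is anonymous, only the aggregate count of responses can drive this check, which is why every reminder must go to all invitees.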

    Plan for Evaluation of Data

The data collected from the individual interviews and the online survey will be coded and

evaluated against the research questions they are designed to answer. Figures 1.2 and 1.3

show how each data collection tool aligns with the research questions being posed.


Figure 1.2: Interview to Research Questions Alignment Matrix

The following questions are a general guide for the interviews. The final project may contain slightly different data depending on the responses of individual participants and the open-ended nature of the interview.

1. Tell me a little bit about your first experience with Spanish. How old were you? What happened? Why did you decide to take Spanish? What are your goals in regard to Spanish? What do you want to do with the language?

Aligned research question: (Warm-up)

2. Tell me about how you learn. What situations and tools help you learn? What tools and strategies do you seek out and employ?

3. Tell me a bit about your experience in beginning Spanish 1 this past fall of 2010.

4. There were four chapter tests that included sections on grammar, vocabulary, listening, writing, and speaking. Tell me about your experience taking these tests. What did you think the point of taking the tests was? Do you think they measured your ability to understand and use Spanish? Did you feel you could better understand and use Spanish as a result of preparing for and completing them?

5. The final exam was similar in structure to the chapter tests except that it was cumulative, covering all five chapters. Tell me about your experience taking this exam. What did you think the point of taking the exam was? Do you think it measured your ability to understand and use Spanish? Did you feel you could better understand and use Spanish as a result of preparing for and completing it?

6. You were required to create a culture blog this semester that asked you to reflect on cultural artifacts of your choosing and on how they related to you personally and to what you were learning in the class. Tell me about your experience in creating this blog. What did you think the point of creating the blog was? Do you think it measured your ability to understand and use Spanish? Did you feel you could better understand and use Spanish as a result of researching and completing it?

7. Throughout the semester you were required to keep up with the eLinguaFolio self-assessment project that asked you to reflect on your own learning and provide samples that demonstrated your best efforts. Tell me about your experience working on this project. What did you think the point of working on the eLinguaFolio project was? Do you think it measured your ability to understand and use Spanish? Did you feel you could better understand and use Spanish as a result of working on it?

8. Considering these four ways in which you were tested during the semester (give the student four index cards, each with the name of one of the above assessments, to help them concentrate on each one individually and in relation to the others as they talk about them), tell me how you feel they compared to each other. Do you have a favorite?

Aligned research questions for questions 2-8:

Overarching question: What are the perceptions of undergraduate students related to traditional and authentic assessments used in an introductory Spanish course?

Sub-Q 1: What do students think are the benefits or limitations of each type of assessment on their learning? Do students think these assessments reflect their learning?

Sub-Q 3: What factors do students feel affect the impact of each type of assessment on their learning?

Sub-Q 4: What preferences do students express toward each type of assessment? What are their reasons for these preferences?

9. How did completing these assignments affect the way you learned in the course?

Aligned research question: Sub-Q 2: How do students feel that these assessments can enhance or detract from their learning experience?

10. What suggestions for improvement would you make about any or all of these assignments? Any other comments on testing in this course?

Aligned research question: Sub-Q 5: What recommendations do students have for enhancing their perceived effectiveness of each type of assessment?


Figure 1.3: Survey to Research Questions Alignment Matrix

All survey statements (except statement 12) will be answered on a scale from strongly disagree to strongly agree.

Motivation

1. I felt motivated to study and learn to prepare for the four chapter tests.
3. I felt motivated to study and learn to prepare for the final exam.
5. I felt motivated to work on the culture blog/portfolio.
7. I felt motivated to work on the eLinguaFolio project.

Aligned research question: Sub-Q 4: What preferences do students express toward each type of assessment?

Effectiveness

2. I felt like the four chapter tests helped to demonstrate what I learned.
4. I felt like the final exam helped to demonstrate what I learned.
6. I felt like the culture blog/portfolio helped to demonstrate what I learned.
8. I felt like the eLinguaFolio project helped to demonstrate what I learned.

Aligned research question: Sub-Q 4: What preferences do students express toward each type of assessment?

Fairness

9. Overall, I felt like the individual grades I received in this course were a fair assessment of my learning.
10. Overall, I felt like my final grade in this course was a fair assessment of my learning.
11. Overall, I felt like I understood the point behind the tests and projects in this course.

Aligned research question: Sub-Q 1, part 2: Do students think these assessments reflect their learning?

Additional comments

12. Please leave any additional comments here. Recommendations for improvement are welcome and appreciated.

Aligned research question: Overarching question: What are the perceptions of undergraduate students related to traditional and authentic assessments used in an introductory Spanish course?

    Time Line

Figure 2.1: Summary of Time Line

Date         Due
Jan 10       Initial email of survey
Jan 17       Email solicitation of interviewees
Jan 17-20    To UWC with chapters 4, 5, & 6
Jan 24       Chapters 4, 5, & 6 due to committee


On Monday, January 10, 2011, I will email the initial invitation to the 79 students I am

asking to participate in the survey, asking them to complete it by Tuesday, February 1, 2011.

Since this is the first day of classes for spring semester and I know students will be very busy, I

plan to wait one week before sending email solicitations, on Monday, January 17, 2011, to my

nine ideal interview candidates to start setting up interview slots. I will ask students to pick a

time slot for their interview. These time slots will fall between Tuesday, January 18, 2011, and

Friday, February 18, 2011. If any of the nine ideal interview candidates have not responded by

Friday, January 21, 2011, I will email as many alternate candidates as necessary to fill those

spaces.

The week of January 17-20, 2011, I will make three appointments at the University

Writing Center on three separate days; at each, a consultant will work through one of the

following prospective chapters with me: Chapter 4 (Methodology), Chapter 5 (Validity), and

Chapter 6 (Ethics). I will make corrections after each session and send chapters four, five, and

six to the committee by Monday, January 24, 2011. The week of January 24-28, 2011, I plan to

make an appointment with the University Writing Center to revise my bibliography and any

appendices that are complete at that point.

Figure 2.1 (continued)

Jan 24-28    To UWC with bibliography & appendices
Jan 30       First survey reminder via email
Feb 1        First due date for survey
Feb 4-8      To UWC with chapter 2
Feb 7        Additional reminder of survey via email
Feb 12       Chapter 2 due to committee
Feb 18       Interviews will be complete
Feb 27       Transcriptions and data analysis due
Feb 28-Mar 4 To UWC with chapters 3 & 7
Mar 7        Chapters 3 & 7 due to committee
Mar 7-11     To UWC with chapters 1 & 8
Mar 14       Chapters 1 & 8 due to committee
Mar 15-20    Final trips to UWC and final revisions
Mar 21       Full final draft due to committee
Apr 4-7      Defense
Apr 7-17     Final revision
Apr 18       Final thesis submitted to graduate school
Apr 20       Graduate school's deadline for final thesis

On Sunday, January 30, 2011, I will send an email reminder to the survey participants to

remind them to complete the survey by Tuesday, February 1, 2011, if they have not already done

so. This reminder will have to go out to all participants because the survey is anonymous and I

will have no way to identify which participants have already completed it. If on Monday,

February 7, 2011, the response rate is less than 50%, I will send out one more email reminder

asking students to please complete the survey. I believe this will be more than sufficient to reach

the 40% response rate I am hoping for.

On February 4, 7, or 8, 2011, I plan to take Chapter 2 (Literature Review) to the

University Writing Center for a consultation. Ideally, I would like to make the appointment on

the 4th in case there is not sufficient time to get through the entire chapter in one session. I will

send chapter two to the committee by Saturday, February 12, 2011.

All interviews should be complete by Friday, February 18, 2011. Therefore, I plan to

complete the transcription and coding of all data, including that of the survey, by Sunday,

February 27, 2011. The week of February 28-March 4, 2011, I plan to make at least two

appointments at the University Writing Center to review Chapter 3 (Research Details) and

Chapter 7 (Data Representation). I will submit chapters three and seven to the committee by

Monday, March 7, 2011.

The week of March 7-11, 2011, I plan to make at least two, and most likely three,

appointments with the University Writing Center to work on Chapter 1 (Introduction) and

Chapter 8 (Conclusion). I plan to submit chapters one and eight to the committee by Monday,

March 14, 2011.

During the week of March 15-20, 2011, I will make any necessary final visits to the

University Writing Center and finalize any revisions the committee has previously advised me

about through the course of the semester. By Monday, March 21, 2011, I will submit a complete


    final draft to the committee for review.

I would like to schedule the defense of my thesis for the week of April 4-7, 2011. This

will allow me one and a half to two weeks to make any necessary adjustments before I submit

my final thesis to the graduate school on Monday, April 18, 2011, two days before the graduate

school's required receipt date of Wednesday, April 20, 2011, to allow for any additional issues

that may arise.

    Validity

I chose to focus on four types of qualitative validity, described by Hendricks (2009) in

her text on action research, to validate this study: democratic, outcome, process, and catalytic. I

chose democratic validity as the first way to validate my study because I selected particular

students to voice their opinions. As described above, I have chosen to interview students from

across the performance spectrum: high-, mid-, and low-performing. Since my research questions

involve exploring how the full range of my students perceive the traditional and authentic

assessments used in the course, hearing from students at both ends of the spectrum and in the

middle gives me a clearer picture of the needs, opinions, and perspectives of the full range of

students in my class.

    The second measure I chose is outcome validity because it speaks to how I will use the results

    for continued planning, ongoing reflection, and deepening my personal understanding of the

    topics I am exploring. Through this research I have learned which of my methods are working

well and which need improvement. I have also encountered new ideas, while conducting the

interviews and observing my colleague, that will help shape how I move forward in my practice. I

    have taken what I have learned, reflected on what it means to me, and begun to brainstorm ways

to apply my understanding to the way I approach my classroom. I also plan to keep soliciting

feedback from my students in future research, continuing the cycle and working toward ongoing

improvement of my practice. I hope that by taking their opinions into consideration, I will keep

improving the way I teach and thereby help my students engage and participate in class

and generally get more out of their learning experience with me. There is still much more to learn

and try as I continue on my path to become the best teacher I can be.

Next, I chose process validity because I need to ensure I have looked deeply at the

problem so that I can understand the ways context and processes have shaped my results and

how this information carries me forward. To ensure I have looked deeply and critically at these

issues, I have relied on two main methods: asking open-ended questions during the interviews

that left plenty of room for the students to tell me what they really think, and writing out my

reflections on the interviews, the broad attitude survey, and the observation in a reflective

journal so that I remain conscious of the conclusions I have drawn. All of these notes and

reflections can be seen in their totality in Appendix C of this report.

Finally, I chose catalytic validity because it makes me aware of the ways my processes

and outcomes will change my practices. This is the most important part of my study: if the

results do not change the way I approach my classroom, I will need to revise my research and try

again. While I believe that I am a proficient instructor, my primary goal is to improve. Any

insight I can gain from this study will help reshape my perspective and improve my practice. The

primary insights I have gained at this point include five personal goals for improving my teaching.


    Ethics and Subjectivity

I am very invested in my research topic because I am my research topic. I knew the first step for

me would be to distance myself a bit and to view the focus of my study as my teaching methods

rather than myself. By putting a little intellectual separation between myself and my methods, I

was able to be honest with myself about improving the way I do my job and also to encourage

my students to be as honest as possible with me in their reflections.

    I tried to be as clear, honest, and open-minded as I could while I asked these students to

    dissect my performance as their instructor. I endeavored to make it as explicit as possible that I

was not interested in having my ego stroked. While I did want and need to know what worked

well for them, it was also imperative that students be honest about the aspects of the class that

were not particularly helpful. I believe this attempt was successful because I did get feedback on aspects

    of my methods and performance that could use work, like my tendency to be too variable in my

    daily teaching methods and my inconsistent use of the target language in class.

Beyond assuring them that I was open to criticism, I had to genuinely be open. It was

essential that I be neither overtly nor subtly defensive in my responses or my body language. The

last thing I wanted to do was to shut them down or squelch their opinions. This was another reason

for including the broad attitude questionnaire. While I know who had access to the survey, I set

the responses to anonymous to allow for more freedom and honesty. I was open to receiving some

negative or even potentially hurtful feedback in exchange for the chance of obtaining some

useful feedback to help me be a more effective teacher. Fortunately, I did not receive any hurtful

feedback. The students who participated in this research seemed to genuinely want to help me.

Their comments were not all positive, but they were highly constructive, especially given the

relative youth of the research pool.
