

ZDM (2018) 50:273–285 https://doi.org/10.1007/s11858-018-0911-y

ORIGINAL ARTICLE

Impact of professional development involving modelling on teachers and their teaching

Katja Maass 1 · Katrin Engeln 2

1 International Centre for STEM Education at University of Education Freiburg, Kunzenweg 21, 79117 Freiburg, Germany
2 IPN-Leibniz Institute for Science and Mathematics Education, Olshausenstraße 62, 24118 Kiel, Germany

Correspondence: Katja Maass, [email protected]; [email protected]

Accepted: 9 January 2018 / Published online: 16 January 2018 © FIZ Karlsruhe 2018

Abstract

This paper presents an international research study of long-term professional development courses on modelling. It addresses the question of scaling-up professional development. So far, there has been much research on small-scale professional development courses, but we know very little about what it means to scale up such a course and to reach out to large numbers of teachers. Therefore, our study researches the impact of a scaled-up professional development course on teachers and their teaching, as perceived by the teachers themselves and their students. The course was designed on an international level for use in 12 countries. The results show that such a course can indeed lead to desired outcomes concerning the teachers and their teaching, and the research therefore adds to our understanding of scaling-up.

Keywords Mathematical modelling · Inquiry-based learning · Continuous professional development · Scaling-up professional development · International study · Teachers’ and students’ perceptions on teaching

1 Introduction

Research and experience reveal that innovative teaching approaches promoted by mathematics education researchers are far removed from the day-to-day practices of teachers in many countries (Akker et al. 2006; Boaler 2008; Krainer 2011). This is true in particular for modelling and inquiry-based learning (IBL) (Hudson et al. 2002; Weiss et al. 2003; OECD 2014).

When aiming at implementing innovative teaching approaches, we face several challenges. First, it is in itself a challenge for teachers to change their role from being an instructor who transmits knowledge to being a facilitator of learning processes (Swan 2005, 2007). This is by no means easy and requires changes in teachers' knowledge, classroom practices and beliefs (Clarke and Hollingsworth 2002; Zehetmeier and Krainer 2011).

Second, from the teachers' perspective there are several impediments to the implementation of innovative teaching approaches such as modelling and IBL, which need to be overcome: contextual factors such as classroom norms (e.g., students are not used to modelling), the teachers' community (e.g., limited collaboration with colleagues), school development and leadership (e.g., limited support by the head of school), the national educational and political system (e.g., non-supportive assessment), or a lack of resources (Engeln et al. 2013; Maass 2011; Skott 2013).

Consequently, any continuous professional development (CPD) activity aiming to encourage change needs to consider these challenges and fulfil certain quality criteria (see Guskey 2000; Loucks-Horsley et al. 2009) in order to encourage and enable teachers to implement the teaching approaches.

However, if we want to ensure widespread implementation of modelling, our CPD courses need to reach out to large numbers of teachers. This further complicates the issue (e.g., Maass and Artigue 2013; Roesken-Winter et al. 2015a) as, unfortunately, little is known about how to scale up professional development (Jackson et al. 2015; Roesken-Winter et al. 2015b). For example, what impact can CPD scaled up with the so-called "Cascade Model" have? In this Cascade Model, a professional development provider educates further course leaders in the first step (of the cascade). These course leaders can be, for example, teachers from school, teacher educators from university or representatives from school authorities. In the second step (of the cascade) these course leaders run their own CPD courses. By means of this procedure many teachers can be reached (Maass and Artigue 2013). One of the major concerns with this model is the question of the forms of knowledge and practice that can actually be "handed down" the cascade (OECD 1998). So far, we know little about this issue, because small-scale research on professional development predominates, with most teacher education research conducted by teacher educators studying the teachers with whom they are working (Adler and Jaworski 2009; for results of these studies see Sect. 2.2). Obviously, if a carefully designed small-scale CPD course run by the researchers who designed it leads to the desired impact on teachers, this does not necessarily mean that a CPD course scaled up through the Cascade Model leads to a similar impact, as the course leaders might not use the course materials as intended or might not see the aims of the course clearly.

The research study presented here aims to close this knowledge gap on scaling-up. It presents an international research study involving CPD courses on modelling (in connection with IBL, see Sect. 2.1). It researches the impact of these CPD courses on teachers and their teaching, as perceived by the teachers themselves and their students. The CPD courses have been implemented across Europe in different cultural settings. The study therefore shows to what extent the desired impact on teachers can be reached through scaling-up with the "Cascade Model" on an international level.

Through this design, the research study will additionally provide answers to questions concerning further knowledge gaps. Firstly, whilst there are several studies on the effects of modelling on students (e.g., Alfieri et al. 2011; Maass 2007; Mischo and Maass 2013; Schukajlow et al. 2015) as well as studies on the effects of related CPD courses on teachers' knowledge (e.g., Besser et al. 2015), we lack studies on the effects of modelling-focused CPD courses on teachers' teaching.

Secondly, we lack international studies on the effectiveness of CPD courses (Roesken-Winter et al. 2015a). We know that international exchange can be fruitful and that collaborative design of materials can lead to high-quality materials, as different partners bring in different expertise. However, we do not know whether a course designed on an international level actually leads to the desired impact in different countries, as they all have different national contexts, for example regarding the curriculum, teacher education, or the conditions at school.

The research study presented here was carried out within the project Primas (Promoting inquiry in mathematics and science education across Europe). Primas was one of the first projects funded within the programme "Science in Society" of the Seventh Framework Programme of the European Union. This programme was supposed to fund research projects to promote the implementation of IBL in day-to-day mathematics and science classes. It was set up as a consequence of a report initiated by the EU and written by a group of educational experts, the so-called Rocard Report (Rocard et al. 2007), as this report recommends IBL as opposed to teacher-centred transmission-based teaching.

Between 2010 and 2013, 14 universities from 12 countries (CH, CY, DE, DK, ES, HU, MT, NL, NO, RU, SK, UK) worked together in the project PRIMAS to promote the implementation of modelling and IBL (see Sect. 2.1) in mathematics and science classes. The partners were chosen according to their profiles (e.g., expertise in modelling, material development, research) and so as to ensure a broad European coverage.

In this paper we present the quantitative evaluation of mathematics teachers and their teaching as perceived by themselves and their students. The study attempts to answer the following research questions:

1. Is there a significant change regarding the implementation of modelling in mathematics teaching from the perspective of the teacher after participation in our scaled-up CPD course using the Cascade Model?

2. To what extent do students perceive a change in the teachers’ classroom practices?

3. To what extent are students’ and teachers’ perceptions of classroom practices in agreement?

The answers to these questions give insight into the impact the CPD course had on teachers and their teaching according to teacher and student perceptions before and after a CPD intervention. These answers are important indicators that shed light on the issues involved (Baumert et al. 2004; see Methods).

2 Theoretical background

2.1 Mathematical modelling

In the literature on mathematical modelling, a variety of different definitions can be found (Kaiser and Sriraman 2006), as well as different assumptions about the modelling cycle (Maass 2004). We define mathematical modelling as the solving of a realistic problem by carrying out a so-called modelling process (Niss et al. 2007). Based on Blum and Leiss (2005), we conceptualize the following steps of the modelling process: (1) understanding the instruction and the real situation (situation model); (2) making assumptions and simplifying the situation model (real model); (3) mathematizing the real model (construction of a mathematical model); (4) working within the mathematical model (mathematical solution); (5) interpreting the solution; (6) validating the interpreted solution. The modelling process may be illustrated by the following scheme (Fig. 1).

Fig. 1 An idealized scheme of the modelling process (according to Blum and Leiss 2005). The scheme shows the cycle: real situation → model of situation (understanding) → real model (simplifying) → mathematical model (mathematizing) → mathematical solution (using mathematical tools) → interpreted solution (interpreting) → back to the real situation (validating), with the left-hand side labelled REALITY and the right-hand side MATHEMATICS.

Modelling tasks are often required to be authentic (cf. Kaiser et al. 2011). Following on from a definition by Niss (1992), this means they are problems from a certain professional discipline which are only slightly simplified and which experts working in this discipline recognize as problems they might meet in their daily work (Niss 1992). In this respect, day-to-day life can also be considered a discipline and the people living it as experts (Maass 2004). In a similar direction, Palm (2007) defines the authenticity of a problem in relation to whether the problem, taken from a situation in the real world, has already occurred or might happen.

One main reason for the requirement that authentic problems be used is that students should experience the power of mathematical modelling for understanding and solving real questions meaningful to many people, in order to learn about the relevance of mathematics (Kaiser and Schwarz 2010). Palm (2007), on the basis of an empirical study, emphasises very clearly the positive impact of authentic problems on students' achievement. The description of mathematical modelling above focuses on the realistic context, the modelling cycle and the individual steps that have to be carried out when going through the cycle, but it does not clarify what a related lesson should look like. However, the literature on modelling generally highlights an active role of students and teachers (cf. Blum 2011; Maass 2007; Mischo and Maass 2013), similar to that found in inquiry-based learning (IBL).

By IBL, we refer to a student-centred learning paradigm in which students are invited to observe phenomena and create their own questions; select mathematical approaches; create representations to clarify relationships; seek explanations; interpret and evaluate solutions; and communicate their solutions (Dorier and Maass 2014).

On the teachers' part, pedagogies shift away from a 'transmission' orientation, in which teacher explanations, illustrative examples and exercises dominate, towards a more collaborative orientation. The teacher's role includes making constructive use of students' prior knowledge; challenging students through probing questions; managing small-group and whole-class discussions; encouraging the discussion of alternative viewpoints; and helping students to make connections between their ideas (Swan 2005, 2007).

Definitions of IBL, however, differ in the degree of autonomy given to students in the selection of problems and in the inquiry process itself (Artigue and Blomhøj 2013). In our approach to inquiry-based learning, we refer to a socio-cultural approach in which learning needs to happen in interactive social classroom settings (Radford 2010). Here, the teacher takes an active role. Research evidence appears to show that teachers who take on active roles in IBL are more effective than those who take passive roles and let students discover on their own (Askew et al. 1997; Swan 2006).

The idea of connecting inquiry to real-life problems goes back to the American educator John Dewey (1859–1952), to whom the importance of inquiry in education is generally attributed. Dewey's perspective on IBL implies teaching closely linked to students' lives and interests (Artigue and Blomhøj 2013).

Summing up, we can say that modelling in teaching practice means working in an inquiry-based way in realistic contexts. For an in-depth discussion of theoretical differences between IBL and mathematical modelling we refer to Artigue and Blomhøj (2013).

Due to the different understandings of modelling and IBL, as discussed above, and due to the different perspectives on these approaches of the partners involved in Primas, a pragmatic definition of our mathematics teaching approach combining modelling and IBL was elaborated within the project, focusing on the following three core aspects (cf. OECD 2016, p. 69):

Firstly, our teaching approach is characterized by the relevance the subject of inquiry has to students and their real lives. Therefore, learning environments should be authentic.

Secondly, students are invited to observe phenomena and create their own questions; select appropriate tools; carry out experiments; seek explanations; and interpret and evaluate solutions, etc.; thus they are supposed to do inquiry and conduct investigations.

Thirdly, the interaction between students and teachers changes in class and turns from a more teacher-centred way of teaching to a more student-centred way.

In the following, when we refer to modelling we mean the mathematics teaching approach defined above, combining modelling and IBL. The following example illustrates our understanding of this teaching approach. In one session of our CPD course, teachers worked on the following modelling problem: The pictures (Figs. 2, 3) show the construction of a house in Honduras, which is now a centre for a secondary education programme that is designed to equip and motivate young people to help their communities and to reduce poverty. To build the house, they first collect plastic bottles, fill them with sand and then build the houses with them.

Figs. 2 and 3 Building a house out of plastic bottles in Honduras

In a modelling lesson following our inquiry-based teaching approach, the teacher could first ask students to develop, in group work, their own questions in relation to the given situation. Questions would then be collected on the blackboard and structured according to whether or not they can be answered with mathematics. Afterwards, students could decide on one question to answer, e.g., the question of how many plastic bottles are needed to build such a house. Students would then work in groups to investigate the problem and seek solutions. This means they would first have to look for simplifications to set up an appropriate real model, for example assuming that there are four walls, no windows, and that all walls have the same size. Depending on how familiar the class is with modelling, assumptions could first be discussed in class or the students could continue directly in groups to estimate the number of bottles per row, the number of rows, and so on. After calculating the number of bottles needed, thus finding the mathematical solution and interpreting it, they would communicate their solutions in class. Then, either in group or in class discussion, students could reflect on the solution and validate it, i.e., see how plausible it is. In the next phase they could refine the model, for example by including the windows in the model.
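To make the arithmetic of this example concrete, the following sketch estimates the number of bottles. All dimensions (wall size, bottle diameter, window area) are assumptions invented for illustration and do not come from the lesson materials:

```python
# Rough estimate of how many sand-filled bottles a small house needs.
# All dimensions below are illustrative assumptions, not data from the article.

WALL_LENGTH_M = 5.0       # assumed length of each wall
WALL_HEIGHT_M = 2.5       # assumed wall height
BOTTLE_DIAMETER_M = 0.10  # assumed bottle diameter; bottles lie with their ends facing outwards

def bottles_for_house(n_walls: int = 4, window_area_m2: float = 0.0) -> int:
    """First real model: n identical rectangular walls, bottles stacked in a simple grid."""
    bottle_cross_section = BOTTLE_DIAMETER_M ** 2              # wall area occupied by one bottle end
    wall_area = WALL_LENGTH_M * WALL_HEIGHT_M - window_area_m2
    return round(n_walls * wall_area / bottle_cross_section)

print(bottles_for_house())                         # simplest model: four walls, no windows -> 5000
print(bottles_for_house(window_area_m2=2 * 1.0))   # refined model: two 1 m^2 windows per wall -> 4200
```

Each refinement of the real model (adding windows, a door, or differing wall sizes) simply changes the parameters of such an estimate, mirroring the validation-and-refinement loop described above.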

2.2 Continuous professional development of teachers

The term continuous teacher professional development (CPD) relates to growth in teachers' content knowledge (knowledge about the subject), pedagogical content knowledge (knowledge about how to teach the subject), and pedagogical knowledge (Shulman 1986). CPD includes teachers' classroom practice (Clarke and Hollingsworth 2002), as well as their beliefs, motivation, and competence in self-reflection (Baumert and Kunter 2013). Following on from the interconnected model (Clarke and Hollingsworth 2002), teachers' professional growth can start through changes in teachers' knowledge, beliefs, and attitudes; through professional experimentation; through salient outcomes of experimentation; or through an external stimulus.

In the last decades, much research has been carried out on CPD. Following on from the results of these studies, it is now commonly accepted that CPD courses should:


• take into account all facets of teachers' competences and practices (Barzel and Selter 2015);

• take teachers' needs into account (Guskey 2000) and discuss challenges teachers face (Maass 2011);

• combine phases of learning off-job in seminars and phases of learning on-job at school (Lipowsky and Rzejak 2012);

• be long-term (Tirosh and Graeber 2003);

• stimulate cooperation between teachers so as to support teachers in the learning-on-job phases (McLaughlin and Talbert 2006);

• be relevant to teaching practice, meaning that courses should allow for teachers' active participation during the seminar phases (Clarke 1994), encourage teachers to analyse cases from lessons, videos, or students' documents (Barzel and Selter 2015), and use various instructional formats, such as group work and think-pair-share (Barzel and Selter 2015); and

• foster teachers' reflection on their beliefs about mathematics and science teaching and on their teaching experiences (Putnam and Borko 2000; Tirosh and Graeber 2003).

The Primas CPD course implemented these criteria in practice. The overall focus of the Primas CPD course was to enable teachers to implement modelling in their daily classroom practices, and it therefore started off from teachers' needs. It offered support for all components of professional competence (beliefs, motivation and self-reflection), relevance for day-to-day teaching and active participation of teachers. The course ran within a timeframe of two years in each country. It consisted of alternating meetings and lessons in which teachers analysed teaching practices and then experimented with the implementation of modelling. During the meetings, teachers had the opportunity to exchange and reflect on their teaching practices and experiences. They were explicitly encouraged to analyse new teaching practices, implement them and reflect on the growth of new practices and beliefs (of both teachers and learners). These phases of analysis, implementation and reflection were repeated over the long term.

The English partner, an expert in material design, designed the PD materials in cooperation with the project partners. The materials were designed according to seven modules (see Table 1), each of them tackling issues teachers have to face when doing modelling in class.

Each module consists of guidelines for teacher educators, handouts for teachers, video examples of lessons (originally videotaped in England) and example lesson plans. The Primas modules are published on the Primas website (http://www.primas-project.eu).

In order to illustrate what happened in the sessions, we describe module 1, "Student-led inquiry and modelling", more closely. As this was the introduction to modelling, teachers were first asked to explore a phenomenon: we gave them three cups of different shapes and asked them to investigate their rolling behaviour, to ask themselves related questions and to try to answer them. After this group work they were asked to reflect on the different steps of their procedure, which led to the introduction of the modelling cycle. In the second activity, they were asked to pose questions regarding situations given in photographs and then to exchange the questions they had found. Through this activity they became familiar with an important step of modelling, namely posing questions. In the next activity, they watched and analysed a videotaped lesson about the "Plastic bottles in Honduras" problem. Finally they were asked to design their own lesson on this problem cooperatively, so as to implement it in their day-to-day teaching before the next seminar and to reflect on it.

Table 1 Modules used in the Primas CPD course

Name | Content
1 Student-led Inquiry and Modelling | We presented teachers with phenomena and then invited them to pose and pursue their own questions. They thus experienced what it was like to think like a mathematician or scientist
2 Tackling Unstructured Problems | We compared structured and unstructured versions of problems and considered the demands and challenges unstructured problems present in the classroom
3 Learning Concepts through Modelling | We considered how modelling might be combined with the teaching of content
4 Asking Questions that Promote Reasoning | We reflected on characteristics of questioning that encourage students to reflect, reason, and provide extended, thoughtful answers
5 Students Working Collaboratively | We reflected on the characteristics of student–student discussion that benefit learning and on the teacher's role in managing discussion
6 Building on What Students already Know | We considered the different ways teachers might use formative assessment to make effective use of students' prior knowledge and to support students
7 Self and Peer Assessment | We dealt with how to encourage students to take more responsibility for their own learning and to assess and improve each other's work


2.3 Teachers’ and students’ perspectives on teaching

Students and teachers can have different perspectives on the same lesson or teaching. Students can be regarded as experts on different ways of teaching (Clausen 2002; De Jong and Westerhof 2001), as they are exposed to a variety of teachers in different subjects over an extended period of time. This has been documented by several studies, in which students' perspectives have typically been aggregated to class means (Baumert et al. 2004; Baumert and Kunter 2006; Marsh et al. 2005). The perceptions of students seem to be particularly valid for the description of daily routines of teaching and social features of teaching, whilst they have limited validity for the (intended) instructional approach of the lesson (Baumert et al. 2004).

As opposed to students, teachers can be considered as experts on various instructional approaches, methods and lesson features due to their education and teaching experience (Baumert and Kunter 2006). This is particularly true for their own educational intentions, whilst they lack the possibility of comparing their teaching activities to those of other teachers (Baumert et al. 2004).

Consequently, students' and teachers' perceptions of teaching can refer to different aspects. Whilst both perspectives provide valuable insight into teaching, the perceptions do not necessarily coincide, because their focus might be on different aspects or their perspectives on the same aspect can differ. In this respect, overlaps in perceptions are an important indicator of the extent to which an instructional approach that a teacher intends to follow is visible in daily teaching routines for students (Baumert et al. 2004).

3 Methods

3.1 Implementation of the scaled-up CPD

In all 12 Primas countries the scaled-up CPD courses on modelling were implemented. In order to ensure quality across countries, we discussed the overall CPD principles and their implementation in the PD course at the biannual project meetings. In each country, project partners selected and educated further course leaders for running the CPD courses. These course leaders were either teachers from school, pre-service educators from higher education, or persons from institutions with a main responsibility in running CPD. In workshops, these course leaders experienced the modules designed for our CPD courses themselves and reflected on their use, on a meta-level. Then they all used the modules for setting up their respective courses.

In 2012 and 2013, about 100 teachers in each country took part in the CPD courses run by the course leaders. The courses were long-term and allowed for repetitive cycles of analysis, implementation, and reflection. Naturally, in order for Primas to succeed, the 12 different national contexts involved had to be taken into account. Therefore, we carried out an analysis of the context in each country in which Primas was implemented (for details see Dorier and García 2013; Engeln et al. 2013). Subsequently, the national implementations varied to some extent (mainly as regards the organisational framework), whilst falling under the umbrella of, and subscribing to, the major pedagogical principles and content (using the seven modules) as described above (see Maass and Doorman 2013). In particular, the actual timeframe had to be adapted to national contexts and thus varied across countries. In some countries the course ran for the duration of a year, and in other countries over several weeks.

All teachers participating in our CPD courses were expected to implement modelling in their lessons, following the idea of analysing, implementing, and reflecting as promoted in our CPD courses. Thus, between two CPD seminars they ran their usual lessons on the one hand and lessons on modelling on the other.

In each of the 12 participating countries a pre-post study was conducted to gain insight into the effects of the Primas intervention. The answers to the first two of our research questions indicate to what extent teachers actually implemented modelling.

3.2 Methodological approach to gain insight into classroom teaching

Collecting data to measure changes in teachers' classroom teaching is challenging. Teachers' answers to questionnaires may give information on their intentions but not necessarily on the changes actually made. Classroom observations give better insights, but are very time-consuming, so that only a few cases can be studied (Perrin-Glorian et al. 2008). One method that has been used, for example in PISA, to gain insight into classroom teaching is the combination of teachers' and students' questionnaires (Baumert et al. 2004).

As we have seen in the theoretical section, teachers' and students' perspectives on teaching can be different; however, overlaps may indicate that a teaching approach intended by a teacher is visible to students. Naturally, lower scores of students on specific types of activities may have different reasons, for example that students do not notice them, do not pay the same amount of attention, or have a different threshold for denoting some activity as being implemented. However, in this study we focus on overlaps in the perspectives and not on differences. Where differences in the perspectives occur, we discuss possible reasons in Sect. 5. Thus, in this study these overlaps are an important indicator for the impact of the CPD on participants' teaching. Therefore, we here present and compare results from the teachers' and the students' questionnaires.

3.3 Design of the teachers’ and students’ questionnaires

Both student and teacher questionnaires include parallel scales about the teaching practice in their mathematics classes. The three scales respectively refer to each of our three core elements of mathematical modelling (see Sect. 2.1): investigative teaching (inv), student-centredness (stc) and authentic connections to students' life (aut).

For our data collection, we adapted three student scales of the OECD PISA study (OECD 2009, pp. 333–336), namely student investigation, interaction, and focus on models or applications, for the following reasons: First, the PISA scales conformed to our understanding of teaching modelling in an IBL way (cf. Sect. 2.1 and OECD 2016, pp. 69–72). Second, the PISA scales were tested on an international level and therefore ensured valid and reliable testing in Primas. We basically took the student items of these scales as they were; for teachers we changed the wording so as to make the items applicable to teachers. For example, for teachers we used "In my lessons I explain the relevance of this subject to our daily lives." instead of the student item "The teacher clearly explains the relevance of this subject concepts to our lives" (ST34Q15). Table 2 shows the items used in the teacher questionnaire. The adapted PISA items were discussed within the consortium and were piloted in the baseline study of the project (Engeln et al. 2013; Euler 2011), which showed their usability. For each item, respondents were expected to rate the frequency of occurrence with respect to their own classroom practice on a Likert scale ("never or hardly ever", "in some lessons", "in most lessons", "in almost all lessons").

Table 2 Items on teaching and learning practice, teachers' questionnaire

Name | Item | PISA 2006
inv1 | In my lessons the students design their own experiments/investigations | ST34Q08
inv2 | In my lessons the students do experiments/investigations to test out their own ideas | ST34Q16
inv3 | In my lessons the students have the chance to choose their own experiments/investigations | ST34Q11
stc1 | In my lessons the students are given opportunities to explain their ideas | ST34Q01
stc2 | In my lessons the students have discussions about the topics | ST34Q13
stc3 | The students are involved in class debate or discussion | ST34Q09
aut1 | In my lessons I use this subject to help the students understand the world outside school | ST34Q12
aut2 | In my lessons I show the students how this subject is relevant to society | ST34Q17
aut3 | In my lessons I explain the relevance of this subject to our daily lives | ST34Q15

A code was generated to match pre- and post-questionnaires as well as teacher and student questionnaires. Both students and teachers had to give the first two letters of their mother's first name as well as their own birthday. Furthermore, the students were asked to write down the code of their teachers to make it possible to match teachers with their students (research question 3).
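A small sketch of how such pseudonymous codes could be generated and used to join questionnaires; the exact code format (here assumed to be two letters plus the birthday as DDMM) and all field names and data are illustrative assumptions, not taken from the paper:

```python
import pandas as pd

def matching_code(mother_first_name: str, birthday_ddmm: str) -> str:
    """Pseudonymous code: first two letters of the mother's first name plus the respondent's birthday."""
    return mother_first_name[:2].upper() + birthday_ddmm

# Invented pre/post teacher records.
pre = pd.DataFrame({"code": [matching_code("Anna", "0305"), matching_code("Maria", "2111")],
                    "inv_pre": [1.7, 2.0]})
post = pd.DataFrame({"code": [matching_code("Anna", "0305"), matching_code("Maria", "2111")],
                     "inv_post": [2.0, 2.3]})

# Only respondents who supplied a usable code in both waves survive the merge.
paired = pre.merge(post, on="code", how="inner")
print(paired)
```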

Cronbach's alphas for the three teacher scales investigative (inv), student-centred (stc) and authentic connections to students' life (aut) are 0.80, 0.68 and 0.62, respectively. Correspondingly, Cronbach's alphas for the three student scales are 0.76, 0.60 and 0.78.
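For readers who want to reproduce this kind of scale-reliability check on their own data, a minimal sketch of Cronbach's alpha in Python; the column names follow Table 2, but the data frame contents are invented for illustration:

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a scale whose columns are the single item scores (1-4 Likert)."""
    items = items.dropna()
    k = items.shape[1]                               # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the scale sum score
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Invented teacher answers to the three 'aut' items from Table 2.
aut_items = pd.DataFrame({
    "aut1": [2, 3, 2, 4, 3, 2, 1, 3],
    "aut2": [2, 3, 3, 4, 3, 2, 2, 3],
    "aut3": [3, 3, 2, 4, 2, 2, 1, 4],
})
print(round(cronbach_alpha(aut_items), 2))
```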

3.4 Data collection

We collected the teachers' pre- and post-questionnaires at the beginning of their first PD session and at the end of their last PD session, respectively. We handed the student pre- and post-questionnaires over to the teachers and asked them to administer them in the next lesson and in a lesson after their last PD session, respectively, and to send them to us. Due to the large scale of the study, it was not possible for us to collect the questionnaires ourselves. We discuss related limitations at the end of the paper.

3.5 Sample sizes

Our first research question specifically refers to secondary level mathematics teachers across Europe. We wanted to find out whether our CPD courses had an effect on these teachers. Therefore, we analysed the data from these teachers that were collected within the PRIMAS project. The sample size for this analysis is N = 326. 34% of the sample is male, 51% refers to lower secondary teaching. The sample comprises teachers from the 12 consortium countries.

In accordance with our second research question, we focused on secondary level students who filled in the questionnaire with respect to their mathematics class. For the analysis presented here, we took student data into account only if we had matching pre- and post-questionnaires. Furthermore, we required that more than 10 students belong to the same teacher to make class aggregation reasonable. Therefore, the student sample consists of 3505 lower and upper secondary students (49% male) aged 9–23 (M = 14, SD = 2). All students were taught by mathematics teachers who participated in a Primas CPD course. Altogether, these 3505 students can be allocated to 136 teachers. The average number of students per teacher is 26, ranging from 11 to 91. Therefore, the teaching practice of 136 teachers is rated by their students.

Research question 3 aims at comparing teachers' and students' perceptions about teaching. Therefore, we focused on teachers who filled in the pre-questionnaire and also had at least 11 students filling in the student pre-questionnaire. The comparison between teachers' and students' perceptions is possible either with the pre- or with the post-questionnaire, because this research question does not aim at evaluating the impact of the PD course as such (in contrast to research questions 1 and 2), but at generally comparing teachers' and students' perspectives at a certain stage of teaching. The answer to this question will therefore show in which of our three scales (investigative, student-centred, authentic) the perspectives of teachers and students differ. We chose the pre-questionnaires due to the larger sample size. Thus, the sample size for comparing teachers' and students' perceptions of teaching practice is N = 147.

The sample sizes for the three research questions differ for a variety of reasons. First, not all teachers had their students fill in the questionnaires; for this reason the teacher sample is bigger than the student sample. In some cases the student questionnaire was filled in, but during the time of the course the learning group changed and it was not possible for the teacher to have the same class complete the post-questionnaire. Finally, sometimes teachers and students did not fill in a usable matching code, thereby causing drop-outs regarding all three questions. The sample size for question 3 is bigger than for question 2 as we used only the pre-questionnaires for question 3.

3.6 Analysis strategies

To compare teachers' perceptions of their own teaching practice before and after the CPD courses (research question 1) we conducted a paired-sample t-test. Such a paired t-test calculates the difference within each pre- and post-data pair. It determines the mean of these changes and shows whether this mean of differences is statistically significant. The t-test is a robust test assuming normal distribution of the differences. Therefore this paired-sample t-test directly answers our research question.
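A minimal sketch of this step, assuming scipy is available; the arrays hold invented pre/post scale scores (e.g., the mean of the three inv items per teacher), not project data:

```python
import numpy as np
from scipy import stats

# Invented pre/post scores of the 'investigative' scale for the same eight teachers.
pre  = np.array([1.33, 1.67, 2.00, 1.33, 2.33, 1.67, 1.00, 2.00])
post = np.array([1.67, 2.00, 2.00, 1.67, 2.33, 2.00, 1.33, 2.33])

# Paired-sample t-test on the within-teacher differences (post - pre).
t, p = stats.ttest_rel(post, pre)
print(f"t = {t:.2f}, p = {p:.4f}, mean change = {np.mean(post - pre):.2f}")
```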

To compare students' perceptions of the teaching practice before and after their teachers' CPD courses, we aggregated the student data per class and conducted a paired-sample t-test to test for differences between the pre- and the post-testing. We did so because preliminary analyses demonstrated that the student ratings of the teaching practice (investigative, student-centred, authentic) showed significant between-class variation, ranging from 0.36 to 0.56, and that the reliability of the class mean was high (> 0.94). Therefore, class aggregation is reasonable (Lüdtke et al. 2006).
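The aggregation itself is a per-class mean; the two quantities reported above (the share of between-class variance and the reliability of the class mean) can be estimated, for example, from a one-way ANOVA decomposition as sketched below. The exact estimator used in the study (following Lüdtke et al. 2006) is not spelled out in the paper, so treat this as an illustration with invented data:

```python
import pandas as pd

# Invented student ratings of one scale, nested in classes.
students = pd.DataFrame({
    "class_id": ["a", "a", "a", "b", "b", "b", "c", "c", "c"],
    "inv":      [1.7, 2.0, 1.7, 2.7, 2.3, 3.0, 1.3, 1.7, 1.3],
})

# Class aggregation as used for the pre/post comparison of student perceptions.
class_means = students.groupby("class_id")["inv"].mean()

def between_class_variation(df: pd.DataFrame, group: str, value: str):
    """ICC(1) (share of between-class variance) and ICC(2) (reliability of the class mean),
    estimated from between- and within-class mean squares."""
    g = df.groupby(group)[value]
    k = g.size().mean()                                   # average class size
    ms_between = (g.size() * (g.mean() - df[value].mean()) ** 2).sum() / (g.ngroups - 1)
    ms_within = ((df[value] - g.transform("mean")) ** 2).sum() / (len(df) - g.ngroups)
    icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc2 = (ms_between - ms_within) / ms_between
    return icc1, icc2

print(class_means)
print(between_class_variation(students, "class_id", "inv"))
```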

To compare students' and teachers' perspectives, we first looked at the correlation between students' and teachers' perceptions to consider how strongly the two variables are associated with one another. Given this prerequisite, we then conducted a paired-samples t-test. For these analyses we again used aggregated student data.

For all three analyses, the t value, the corresponding p value of the paired-samples t-test and the effect size d are reported. The p value is the probability that the observed difference between two groups is due to chance. The p value depends on the sample size; with a sufficiently large sample, the p value will almost always fall below a defined threshold. We therefore also report effect sizes, which express the absolute difference in average standard deviation units. Unlike significance tests, effect size is independent of sample size and gives insight into the substantive significance of an effect. Cohen (1992) classified effect sizes as small (d = 0.2), medium (d = 0.5), and large (d > 0.8).
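Read literally, the description of d above corresponds to the following formula for a pre/post comparison (this is our reading; the authors do not spell out the exact variant they used):

$$ d \;=\; \frac{\left|\,\bar{x}_{\mathrm{post}} - \bar{x}_{\mathrm{pre}}\,\right|}{\tfrac{1}{2}\left(s_{\mathrm{pre}} + s_{\mathrm{post}}\right)} $$

For the investigative scale in Table 3, for example, this gives |1.74 − 1.60| / ((0.56 + 0.60)/2) ≈ 0.24, close to the reported d = 0.23.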

4 Results

4.1 Teachers’ perspective

Our research question 1 was: Is there a significant change regarding the implementation of modelling in mathematics teaching from the perspective of the teacher after participation in our scaled-up CPD?

As displayed in Table 3, there are statistically significant differences (at the 0.001 significance level) between pretest and posttest scores for all three scales of teaching practice from the perspective of the teacher. Thus, from the participating teachers' point of view, investigative, student-centred and authentic teaching practice increased.

The effect sizes for this analysis are d = 0.23 for investigative, d = 0.21 for student-centred and d = 0.12 for authentic. These can be considered small effects according to Cohen's (1992) convention (d = 0.20), which is a reasonable effect size considering the complex construct of modelling and, in relation to this construct, the relatively short time for its implementation (Coe 2002; Valentine and Cooper 2003).

In sum, this means that from the teachers' perspectives their way of teaching changed towards using more investigation, more authentic contexts and being more student-centred.


4.2 Students’ perspective

Our research question 2 was: To what extent do students perceive a change in the teachers' classroom practices? Table 4 shows a statistically significant difference (at the 0.05 significance level) from pre-test to post-test scores for the variable authentic, but not for the variables investigative and student-centred.

The effect size for this analysis (d = 0.12) was found to fall below Cohen’s (1992) convention for a small effect (d = 0.20).

These results mean that students perceive a decrease in authenticity from the beginning of the PD course to the end of the PD course. We discuss this result in Sect. 5. Regarding the variables investigative and student-centredness, the results indicate that students seem to notice an increase in related activities; however, the differences are not significant.

4.3 Comparison of students’ and teachers’ perspective

Our third research question was: To what extent are students' and teachers' perceptions of classroom practices in agreement? The results of this analysis are displayed in Table 5.

The results show a relevant correlation between students' and teachers' perceptions, indicating that teachers and students have a comparable understanding of the teaching practices in question. Nevertheless, the analysis also shows that teachers' perception of the frequency of authentic teaching practice is significantly higher than students'. The effect size for this analysis (d = 0.24) is small according to Cohen's (1992) convention.

Table 3 Teachers' perception: descriptive statistics and t-test results for teaching practice: investigative, student-centred, authentic

Teaching practice | Pre-test M (SD) | Post-test M (SD) | N | 95% CI for mean difference | r (p) | T (p) | df | d
Investigative | 1.60 (0.56) | 1.74 (0.60) | 323 | −0.203, −0.091 | 0.62*** (< 0.001) | 5.20*** (< 0.001) | 322 | 0.23
Student-centred | 2.78 (0.60) | 2.90 (0.57) | 326 | −0.172, −0.072 | 0.69*** (< 0.001) | 4.81*** (< 0.001) | 325 | 0.21
Authentic | 2.55 (0.67) | 2.63 (0.74) | 325 | −0.138, −0.031 | 0.76*** (< 0.001) | 3.09* (0.002) | 324 | 0.12

*p < .05, ***p < .001. Response categories: 1: never or hardly ever, 2: in some lessons, 3: in most lessons, 4: in almost all lessons

Table 4 Students' perspective: descriptive statistics and t-test results for teaching practice: investigative, student-centred, authentic (aggregated student data)

Teaching practice | Pre-test M (SD) | Post-test M (SD) | N | 95% CI for mean difference | r (p) | T (p) | df | d
Investigative | 1.76 (0.58) | 1.78 (0.51) | 136 | −0.068, 0.036 | 0.85*** (< 0.001) | −0.61 (0.542) | 135 | 0.03
Student-centred | 2.69 (0.41) | 2.71 (0.36) | 136 | −0.063, 0.019 | 0.81*** (< 0.001) | −1.08 (0.282) | 135 | 0.06
Authentic | 2.43 (0.52) | 2.37 (0.58) | 136 | 0.013, 0.113 | 0.86*** (< 0.001) | 2.47* (0.015) | 135 | 0.12

*p < .05, ***p < .001. Response categories: 1: never or hardly ever, 2: in some lessons, 3: in most lessons, 4: in almost all lessons

Table 5 Descriptive statistics and t-test results for teaching practice: investigative, student-centred, authentic (aggregated student data), comparing teachers' and students' perceptions

Teaching practice | Students M (SD) | Teachers M (SD) | N | 95% CI for mean difference | r (p) | T (p) | df | d
Investigative | 1.65 (0.43) | 1.59 (0.55) | 147 | −0.144, 0.027 | 0.455*** (< 0.001) | −1.36 (0.176) | 146 | 0.12
Student-centred | 2.70 (0.43) | 2.74 (0.58) | 147 | −0.043, 0.137 | 0.432*** (< 0.001) | 1.03 (0.306) | 146 | 0.09
Authentic | 2.43 (0.52) | 2.58 (0.72) | 146 | 0.043, 0.249 | 0.526*** (< 0.001) | 2.80* (0.006) | 145 | 0.24

*p < .05, ***p < .001. Response categories: 1: never or hardly ever, 2: in some lessons, 3: in most lessons, 4: in almost all lessons


5 Discussion

5.1 Summary and discussion of results

In our project we developed a CPD course to encourage and enable teachers to implement mathematical modelling in their day-to-day teaching. The course followed relevant CPD quality criteria as discussed in research papers (see Sect. 2.2) and was deliberately designed for international use in different cultural settings. Naturally, in all 12 Primas countries the national contexts were different, but there were also many commonalities (e.g., modelling and IBL are included in all curricula, but in most countries not in assessment; see Dorier and García 2013), which supports the idea of designing an international course.

All Primas countries implemented the CPD courses on a large scale following the Cascade Model. In all courses, the same design principles (see Sect. 2.2) and the seven pre-designed modules were used. National adaptations mainly concerned organisational aspects or the use of additional modules, but did not dilute the major principles of the CPD concept (see Sect. 2.2).

We saw that from the teachers' perspective their way of teaching changed towards using more investigation and more authentic contexts, and that it became more student-centred. This may indicate that teachers indeed have the intention to change their way of teaching in the intended direction after the course.

Regarding investigation and student-centredness, the perception of students seems to point in the same direction. These overlaps in teachers' and students' perceptions indicate that these aspects were actually implemented by teachers in their lessons to a certain degree and were perceived by students at the end of the CPD.

Therefore, our research study contributes to the theoretical discussion on scaling-up CPD, as it shows that a CPD course on modelling, which has been scaled up using the Cascade Model, can have an impact on the participating teachers and, more importantly, on their teaching. Apparently, important knowledge and practices can be "handed down" the cascade (see Sect. 1; OECD 1998). Our study also contributes to theory-building, as we show that a course designed at an international level can have the desired impact in different national settings.

Our study also shows that the quality criteria for CPD, which have been established in small-scale studies (see Sect. 2.2), seem to be relevant for CPD scaled up with the Cascade Model as well, as our CPD course was designed following these criteria.

Additionally, on a more detailed level, we saw that regarding the two scales investigation and student-centredness the perception of students seems to point in the same direction, although the difference in students' perception is not significant (research questions 1 and 2). We also saw that teachers' and students' perceptions of these concepts seem to be similar (research question 3). As we have said, this indicates that the changes intended by the teacher were actually perceived by the students, but maybe not as clearly as intended. Reasons for this might be that the teachers first need to learn how to implement modelling in class, and their initial attempts to do so are not yet fully developed. Additionally, their implementation efforts might reflect their inexperience at the beginning, and thus related efforts to change might not be visible immediately to students. Also, students might need some time to become adapted to a new way of teaching and to change their beliefs about the nature of effective teaching (Maass 2004), so as to perceive the intended changes. With this result, our study contributes to the understanding of professional development regarding the timeline needed to implement change. As research has shown, CPD courses need to be long-term (Tirosh and Graeber 2003), but so far it has remained unclear what long-term means: Is 3 days long-term, or 3 weeks? Apparently, our CPD courses, although long-term within a timeframe of between several weeks and a year, were not long enough to change teaching in such a way that it led to a significant change in students' perception, although the difference was heading in the right direction.

Finally, our results also show that there is a significant difference between students' and teachers' perceptions regarding authenticity. As opposed to teachers, students perceive a decrease in authenticity from the beginning of the PD course to the end. Our results for research question 3 also show a significant difference in the perception of authenticity. Authenticity is a major characteristic of modelling tasks (see the theoretical background) and therefore this result deserves further attention. Apparently, teachers and students have different perceptions regarding authenticity. This result is in line with a small-scale qualitative study (Maass 2004), which indicated that students' understanding of what is relevant to their lives can be very different. Many students consider as relevant to their lives only those things which they need now in their personal daily lives as students, and they do not take into account the relevance for their future or for others. For example, one student found the context of a mobile phone not relevant to real life because this student did not have one (Maass 2004). Different perspectives on authenticity are also highlighted by Vos (2011), who suggests: "…authenticity is a social construct that needs to be agreed upon in different communities (educators, students)" (p. 718).

Our study now provides large-scale evidence that the understanding of authenticity seems to be very different between teachers and students, and therefore our results contribute to the understanding of modelling and its teaching. Apparently, the selection of authentic tasks for classroom teaching, and thus for CPD, needs particular attention.

5.2 Limitations and potential of the study

This study of course also has limitations, which need to be taken into account when using the results.

First of all, by combining teachers' and students' perceptions of the teachers' teaching, we gained valuable insights into the teaching. Nevertheless, we have no insight into what really happened in the classroom.

Second, our results do not provide in-depth knowledge concerning which elements of the PD design actually supported the change in teachers and students. It would be the task of another research study to look at this question in more detail.

Third, the implementation in various countries on the one hand shows that the PD design can be used effectively in various cultural settings. On the other hand, the national adaptations regarding the organisational framework do not allow more precise statements regarding how long such a CPD course should be in order to show significant effects on both students and teachers.

Fourth, considering the size of the study, we had no other possibility than having the teachers collect the students' questionnaires. This may have influenced the results, as the teacher may know the birth dates of the students and students may have anticipated this.

Fifth, there was a substantial drop-out for some of the tests. We outlined the most important reasons above in the sample section. However, this drop-out rate may still have influenced the results, as, for example, teachers who did not implement modelling might be among those who did not have their students complete the questionnaires. In relation to this limitation and the one above, in another study it might be worth seeking methods whereby the researchers could administer the students' questionnaires themselves.

The potential of this study is that it provides starting points for further research regarding the effects of scaled-up CPD. It would make sense to carry out a similar large-scale study in one country, including not only teachers' and students' questionnaires but also classroom observations. Interviews with teachers regarding the design of the CPD course have the potential to lead to detailed results regarding which design elements may be associated with which effects.

It would also be important to research the effects of a more detailed and in-depth consideration of the concept of authenticity in the CPD course, as well as the effects of even longer-lasting CPD courses.

Acknowledgements The project PRIMAS has received funding from the European Union Seventh Framework Programme (FP7/2007–2013) under grant agreement no. 244380. This paper reflects only the authors' views and the European Union is not liable for any use that may be made of the information contained herein.

References

Adler, J., & Jaworski, B. (2009). Public writing in the field of math-ematics teacher education. In R. Even & D. L. Ball (Eds.), The professional education and development of teachers of mathemat-ics–the 15th ICMI study (pp. 249–254). New York: Springer.

Akker, J. v. d, Gravemeijer, K., McKenney, S., & Nieveen, N. (2006). Introducing educational design research. In J. V. D. Akker, K. Gravemeijer, S. McKenney & N. Nieveen (Eds.), Educational design research (Vol. 1, pp. 3–7). Oxford: Routledge Chapman & Hall.

Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1–18.

Artigue, M., & Blomhøj, M. (2013). Conceptualizing inquiry-based education in mathematics. ZDM Mathematics Education, 45(6), 797–810.

Askew, M., Brown, M., Rhodes, V., Johnson, D., & Wiliam, D. (1997). Effective teachers of numeracy. London: Kings College.

Barzel, B., & Selter, C. (2015). Die DZLM-Gestaltungsprinzipien für Fortbildungen. Journal für Mathematik-Didaktik, 36(2), 259–284.

Baumert, J., & Kunter, M. (2006). Stichwort: Professionelle Kompe-tenz von Lehrkräften. Zeitschrift für Erziehungswissenschaft, 9(4), 469–520.

Baumert, J., & Kunter, M. (2013). The COACTIV model of teachers’ professional competence. In M. Kunter, J. Baumert, W. Blum, U. Klusmann, S. Krauss & M. Neubrand (Eds.), Cognitive activa-tion in the mathematics classroom and professional competence of teachers: Results from the COACTIV project (Vol. 8). Berlin: Springer: Mathematics Teacher Education.

Baumert, J., Kunter, M., Brunner, M., Krauss, S., Blum, W., & Neubrand, M. (2004). Mathematikunterricht aus Sicht der PISA–Schülerinnen und Schüler und ihrer Lehrkräfte. In P.-K. Deutschland (Ed.), PISA 2003–Der Bildungsstand der Jugendli-chen in Deutschland–Ergebnisse des zweiten internationalen Ver-gleichs (pp. 314–354). Münster: Waxmann.

Besser, M., Leiss, D., & Klieme, E. (2015). Wirkung von Lehrerfortbil-dungen auf Expertise von Lehrkräften zu formativem Assessment im kompetenzorientierten Mathematikunterricht. Zeitschrift für Entwicklungspsychologie und Pädagogische Psychologie, 47(2), 110–122.

Blum, W. (2011). Can modelling be taught and learnt? Some answers from empirical research. In G. Kaiser, W. Blum, R. B. Ferri & G. Stillman (Eds.), Trends in teaching and learning of mathematical modelling: ICTMA14 (pp. 15–30). New York: Springer Science & Business Media.

Blum, W., & Leiss, D. (2005). Modellieren im Unterricht mit der "Tanken"-Aufgabe. mathematik lehren, 128, 18–21.

Boaler, J. (2008). Bridging the gap between research and practice: International examples of success. In M. Menghini, F. Furinghetti, L. Giacardi & F. Arzarello (Eds.), The first century of the International Commission on Mathematical Instruction (1908–2008): Reflecting and shaping the world of mathematics education. Roma: Istituto della Enciclopedia Italiana fondata da Giovanni Treccani.

Clarke, D. (1994). Ten key principles from research for the professional development of mathematics teachers. In D. B. Aichele & A. F. Coxford (Eds.), Professional development for teachers of mathematics (pp. 37–48). Reston: NCTM.

Clarke, D., & Hollingsworth, H. (2002). Elaborating a model of teacher professional growth. Teaching and Teacher Education, 18(8), 947–967.

Clausen, M. (2002). Unterrichtsqualität: Eine Frage der Perspektive? [Quality of instruction: A matter of perspective?]. Münster: Waxmann.

Coe, R. (2002). It's the effect size, stupid: What effect size is and why it is important. Paper presented at the annual conference of the British Educational Research Association, University of Exeter, 12–14 September 2002.

De Jong, R., & Westerhof, K. J. (2001). The quality of student ratings of teacher behaviour. Learning Environments Research, 4(1), 51–85.

Dorier, J.-L., & García, F. J. (2013). Challenges and opportunities for the implementation of inquiry-based learning in day-to-day teaching. ZDM Mathematics Education, 45(6), 837–849.

Dorier, J.-L., & Maass, K. (2014). Inquiry-based mathematics educa-tion. In S. Lerman (Ed.), Encyclopedia of Mathematics Education (pp. 300–304). Dordrecht: Springer.

Engeln, K., Euler, M., & Maass, K. (2013). Inquiry-based learning in mathematics and science: A comparative baseline study of teachers' beliefs and practices across 12 European countries. ZDM Mathematics Education, 45(6), 823–836.

Euler, M. (2011). WP9: Report about the survey on inquiry-based learning and teaching in the European partner countries. PRIMAS: Promoting inquiry-based learning in mathematics and science education across Europe.

Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks: Corwin Press.

Hudson, S. B., McMahon, K. C., & Overstreet, C. M. (2002). The 2000 national survey of science and mathematics education: Compendium of tables. Chapel Hill: Horizon Research.

Jackson, K., Cobb, P., Wilson, J., Webster, M., Dunlap, C., & Appelgate, M. (2015). Investigating the development of mathematics leaders' capacity to support teachers' learning on a large scale. ZDM Mathematics Education, 47(1), 93–104.

Kaiser, G., & Schwarz, B. (2010). Authentic modelling problems in mathematics education—Examples and experiences. Journal für Mathematik-Didaktik, 31(1), 51–76.

Kaiser, G., Schwarz, B., & Buchholz, N. (2011). Authentic modelling problems in mathematics education. In G. Kaiser, W. Blum, R. B. Ferri & G. Stillman (Eds.), Trends in teaching and learning of mathematical modelling: ICTMA14 (pp. 591–602). New York: Springer Science & Business Media.

Kaiser, G., & Sriraman, B. (2006). A global survey of international perspectives on modelling in mathematics education. ZDM Mathematics Education, 38(3), 302–310.

Krainer, K. (2011). Teachers as stakeholders in mathematics education research. In B. Ubuz (Ed.), Proceedings of the 35th conference of the International Group for the Psychology of Mathematics Education (Vol. 1, pp. 47–62). Ankara: Middle East Technical University.

Lipowsky, F., & Rzejak, D. (2012). Lehrerinnen und Lehrer als Lerner–Wann gelingt der Rollentausch? Merkmale und Wirkungen wirksamer Lehrerfortbildungen. Schulpädagogik heute, 3(5), 1–17.

Loucks-Horsley, S., Stiles, K. E., Mundry, S., Love, N., & Hewson, P. W. (2009). Designing professional development for teachers of science and mathematics. London: Corwin Press.

Lüdtke, O., Trautwein, U., Kunter, M., & Baumert, J. (2006). Reliability and agreement of student ratings of the classroom environment: A reanalysis of TIMSS data. Learning Environments Research, 9(3), 215–230.

Maass, K. (2004). Mathematisches Modellieren im Unterricht. Hildesheim: Franzbecker.

Maass, K. (2007). Modelling in class: What do we want students to learn. In C. Haines, P. Galbraith, W. Blum & S. Khan (Eds.), Mathematical modelling: Education, engineering and economics–ICTMA 12 (pp. 63–78). Chichester: Horwood.

Maass, K. (2011). How can teachers’ beliefs affect their professional development? ZDM Mathematics Education, 43(4), 573–586.

Maass, K., & Artigue, M. (2013). Implementation of inquiry-based learning in day-to-day teaching: a synthesis. ZDM Mathematics Education, 45(6), 779–795.

Maass, K., & Doorman, M. (2013). A model for a widespread implementation of inquiry-based learning. ZDM Mathematics Education, 45(6), 887–899.

Marsh, H. W., Trautwein, U., Lüdtke, O., Köller, O., & Baumert, J. (2005). Academic self-concept, interest, grades and standardized test scores: reciprocal effects models of causal ordering. Child Development, 76(2), 397–416.

McLaughlin, M. W., & Talbert, J. E. (2006). Building school-based teacher learning communities: Professional strategies to improve student achievement (Vol. 45). New York: Teachers College Press.

Mischo, C., & Maass, K. (2013). The effect of teacher beliefs on student competence in mathematical modeling–An intervention study. Journal of Education and Training Studies, 1(1), 19–38.

Niss, M. (1992). Applications and modelling in school mathematics–Directions for future development. Roskilde: IMFUFA, Roskilde Universitetscenter.

Niss, M., Blum, W., & Galbraith, P. L. (2007). Introduction. In W. Blum, P. L. Galbraith, H.-W. Henn & M. Niss (Eds.), Modelling and applications in mathematics education. The 14th ICMI Study (pp. 3–32). New York: Springer.

OECD (1998). Staying ahead: In-service training and teacher professional development. Paris: OECD Publishing.

OECD (2009). Technical report—PISA 2006. Paris: OECD Publishing.

OECD (2014). TALIS 2013 results: An international perspective on teaching and learning. Paris: OECD Publishing.

OECD (2016). PISA 2015 results (Volume II): Policies and practices for successful schools. Paris: OECD Publishing.

Palm, T. (2007). Features and impact of the authenticity of applied mathematical school tasks. In W. Blum, P. L. Galbraith, H.-W. Henn & M. Niss (Eds.), Modelling and applications in mathematics education. The 14th ICMI Study (pp. 201–208). New York: Springer.

Perrin-Glorian, M.-J., Deblois, L., & Robert, A. (2008). Individual practicing mathematics teachers: Studies on their professional growth. In K. Krainer & T. Wood (Eds.), Participation in mathematics teacher education. Individuals, teams, communities and networks (Vol. 3, pp. 35–39). Rotterdam: Sense Publishers.

Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Radford, L. (2010). The anthropological turn in mathematics education and its implication on the meaning of mathematical activity and classroom practice. Acta Didactica Universitatis Comenianae Mathematics, 10, 103–120.

Rocard, M., Csermely, P., Jorde, D., Lenzen, D., Walberg-Henriksson, H., & Hemmo, V. (2007). Rocard report: “Science education now: A new pedagogy for the future of Europe”. EU 22845, European Commission.

Roesken-Winter, B., Hoyles, C., & Blömeke, S. (2015a). Evidence-based CPD: Scaling up sustainable interventions. ZDM Mathematics Education, 47(1), 1–12.

Roesken-Winter, B., Schüler, S., Stahnke, R., & Blömeke, S. (2015b). Effective CPD on a large scale: examining the development of multipliers. ZDM Mathematics Education, 47(1), 13–25.

Schukajlow, S., Krug, A., & Rakoczy, K. (2015). Effects of prompting multiple solutions for modelling problems on students' performance. Educational Studies in Mathematics, 89(3), 393–417.

Shulman, L. S. (1986). Paradigms and research programs in the study of teaching: A contemporary perspective. In M. C. Wittrock (Ed.), Handbook of research on teaching (pp. 3–36). New York: Macmillan.

Skott, J. (2013). Understanding the role of the teacher in emerging classroom practices: Searching for patterns of participation. ZDM Mathematics Education, 45(4), 547–559.

Swan, M. (2005). Improving learning in mathematics: Challenges and strategies. Sheffield: Teaching and Learning Division, Department for Education and Skills Standards Unit.

Swan, M. (2006). Collaborative learning in mathematics: A challenge to our beliefs and practices. London: National Institute for Advanced and Continuing Education (NIACE) for the National Research and Development Centre for Adult Literacy and Numeracy (NRDC).

Swan, M. (2007). The impact of task-based professional development on teachers' practices and beliefs: A design research study. Journal of Mathematics Teacher Education, 10(4–6), 217–237.

Tirosh, D., & Graeber, A. O. (2003). Challenging and changing mathematics teaching classroom practices. In A. Bishop, M. A. Clements, C. Keitel, J. Kilpatrick & F. Leung (Eds.), Second international handbook of mathematics education (pp. 643–687). Dordrecht: Kluwer Academic Publishers.

Valentine, J. C., & Cooper, H. (2003). Effect size substantive interpretation guidelines: Issues in the interpretation of effect sizes. Washington, DC: What Works Clearinghouse.

Vos, P. (2011). What is “authentic” in the teaching and learning of mathematical modelling? In G. Kaiser, W. Blum, R. B. Ferri & G. Stillman (Eds.), Trends in teaching and learning of mathematical modelling (pp. 713–722). Dordrecht: Springer.

Weiss, I. R., Pasley, J. D., Smith, P. S., Banilower, E. R., & Heck, D. J. (2003). Looking inside the classroom. Chapel Hill: Horizon Research Inc.

Zehetmeier, S., & Krainer, K. (2011). Ways of promoting the sustainability of mathematics teachers' professional development. ZDM Mathematics Education, 43(6–7), 875–887.