

MENTORING WITH A PURPOSE
MAKING "GROWING SUCCESS" SUSTAINABLE

Practicum by: Jeff Boulton ([email protected])

Mentored by: Tara Connor, Vice Principal, Iroquois Ridge High School ([email protected])

Associate Mentor for job-shadowing: Michael Gallant, Vice Principal, White Oaks Secondary School ([email protected])

PQP Instructor: Cam Fraser ([email protected])


Table of Contents

Introduction
Background
Methodology
    The challenge
    The process
Results
    Teacher Results
    Student Results
Discussion
    Relationship between challenges and successes to literature
    Limitations and Future Work
    Relationship to educational leadership and professional standards
Conclusion
References
Appendices
    Appendix A: Teacher Survey Responses
    Appendix B: Student Response Survey
    Appendix C: Samples of Student Self-Reflections
    Appendix D: Letters of Praise and Appreciation
    Appendix E: Practicum Log
    Appendix F: Practicum Success Criteria


INTRODUCTION  

The extent to which regular, timely feedback through assessment is understood to improve student learning is not currently reflected in teacher practice. In fact, few classrooms offer students any opportunity to provide feedback or to self-assess, and fewer still provide formalized processes of self-reflection. Given the initiatives recently mandated by the Ministry of Education in Ontario, and given the value of these processes, the purpose of this practicum was to devise a way for teachers to have students self-assess and self-reflect, and then to collect and collate this data into a report for easy analysis. Given the value of reflection and feedback, the adoption and implementation of any such device would arguably have an almost entirely positive effect on student achievement. Moreover, the design and implementation of such a device would itself be an excellent demonstration of educational leadership. After some experimentation, the software chosen was a Google Docs spreadsheet, a completely web-based collaborative tool. The resulting file is an easy-to-use, accessible, and cost-effective way to analyze student data in order to respond to individual student needs. It has been received with mostly positive feedback, and this practicum has thus achieved its stated objectives.

The summary of this endeavor begins with a description of the impetus for the initiative and a review of the existing literature supporting the need for such a tool: namely, the benefits of feedback, self-assessment, and reflection on student achievement. It also discusses current teacher practices and notes impediments to effecting real and sustained change in education. The process undertaken to complete this practicum is then outlined, followed by a discussion of the results of the implementation, including anecdotal and scaled responses from a teacher survey and a student survey. Results and their limitations are summarized and discussed, with reference made to extensive appendices. The paper concludes with a brief discussion of connections made to the leadership framework through the process.

BACKGROUND

It was quickly becoming evident that Growing Success, the latest policy of the Ontario Ministry of Education (Ontario Ministry of Education, 2010), was going to disrupt the status quo for many educators. Perhaps most daunting was the new (or renewed) focus on learning skills. As with most initiatives involving change, there is usually a period of adjustment, confusion, and displaced roles that causes a great deal of anxiety among staff (Fullan, 2007); this would be no exception. Moreover, it was easy to foresee that with attention on learning skills increasing, along with the requirement to gather sufficient evidence of achievement and growth in these skills, the result could easily be an increased workload for teachers, particularly if schools and boards approached the task in an erratic and disorganized fashion, as many feared they would.


While most teachers concerned themselves with restated intentions surrounding late penalties for assignments, minimum marks, and the permissibility of other traditional punitive measures, it was quickly determined that the best direction for this practicum was to deal with the greater issue. Growing Success makes recommendations and assertions based on extensive research (some cited below), the implications of which are nothing less than a fundamental shift in the teaching and learning process. Though firmly grounded in science, the implementation of its core principles could easily increase workloads if mishandled, particularly in the collection of evidence to measure and incorporate learning skills and self-assessment. The aim of this practicum was to prevent this from happening. The goal seemed clear: accomplish the task in a way that minimized or even reduced teacher workload, while incorporating some of the central themes of Growing Success, with its enhanced focus on feedback, assessment, and student self-assessment.

This is an endeavor worthy of the time devoted to it. Assessment and self-assessment are hardly the result of political fashion. It is now understood that formative assessment, regular feedback, and in particular self-assessment and self-reflection have very strong influences on student achievement (Andrade & Valtcheva, 2009; Black & Wiliam, 1998; Bierer, Dannefer, Taylor, Hall, & Hull, 2008; Brown & Hirschfeld, 2007; Butler, Pyzdrowski, Goodykoontz, & Walker, 2008; Chen, Heritage, & Lee, 2005; Heritage, 2009; Kohn, 2009), as well as on student learning habits, motivation (Biggs, 1998), and self-confidence (Nguyen, Hsieh, & Allan, 2006). Indeed, the more a student believes that assessment builds individual accountability, while simultaneously accepting that this in turn is good for students, the better student outcomes are (Brown & Hirschfeld, 2007). As if measurable benefit were not enough, advances in the cognitive sciences have confirmed not only the benefits of reflection on learning (CISCO, 2008), but also that virtually all adolescent learners are capable of engaging in it (Heritage, 2009). This confirms that self-assessment needs to be a major component of every secondary teacher's skill set; yet as recently as ten years ago, self-assessment was rare in most classrooms (Biggs, 1998). According to the outcomes of this practicum, it would still appear to be so, which only further strengthens the case for developing this tool.

Not only is it a moral imperative to care for students and to ensure each reaches his or her fullest potential (Ontario College of Teachers, 2010; Lingard, Hayes, Mills, & Christie, 2003); there is also a legal, albeit vague, obligation in the Education Act to promote student achievement and deliver effective and appropriate education at the supervisory level (Government of Ontario, 1990). In light of this and the extensive body of literature on the benefits of feedback and reflection, an educator could hardly be considered professional if he or she failed to provide time for such activities, nor an administrator who did not encourage or develop staff to do so. However, as previously stated, this is not occurring.

The need for a tool that can easily administer, collect, and collate reflections seems clear; the only question that remained was what form it should take. It was understood that though feedback is valuable in many forms (Debuse, Lawley, & Shibl, 2008), it needs to be ongoing, regular, sufficiently specific, and timely enough that students can bridge gaps in their learning and make productive changes in their learning habits (Orrell, 2006; Weaver, 2006; Bierer, Dannefer, Taylor, Hall, & Hull, 2008; Butler, Pyzdrowski, Goodykoontz, & Walker, 2008; Mok, Lung, Cheng, Cheung, & Ng, 2006). Thus, this was the challenge: to devise a way for teachers to easily collect and collate this data so that they can respond to individual student needs without a high demand on their time. Only modern information technology can make this happen to the extent necessary to realize a significant effect.

It was understood well in advance that this would be the easy part of the process. Ever since Black and Wiliam (1998) first published their significant work on assessment, there appears to have been only moderate progress in its adoption and productive use in most classrooms. Teachers' use of feedback does not match their stated beliefs, nor is it usually timely or specific enough to yield any benefit (Weaver, 2006; Orrell, 2006). Most teachers continue using traditional testing methods despite their detrimental effect on developing independent lifelong learning skills (Harlen & Crick, 2003; Boud & Falchikov, 2006). This should be a major concern, since we know that positive student attitudes often result in improved achievement, and that course characteristics such as the type of assessment can affect those attitudes (Nguyen, Hsieh, & Allan, 2006; Harlen & Crick, 2003). Change appears to be a slow, if not nearly stagnant, process in education. Given that what is known to constitute productive practice with positive results on student achievement is largely absent from most classrooms in developed countries (State of Queensland, Department of Education, 2001; Lingard, Hayes, Mills, & Christie, 2003; Boud & Falchikov, 2006), it stands to reason that change is not forthcoming.

No practicum initiative that introduces change can be successful if it does not address the change process (Fullan, 2007); moreover, the initiative in question had to be technological in nature, complicating matters further. The literature is clear that educators have been slow to adopt technology into curriculum delivery, even when they use it frequently for administrative tasks in their role as teacher, or simply to make what they have always done more efficient (Kotrlik & Redmann, 2005; Palak & Walls, 2009; Tucker, 2009). Nor is this merely a funding or access issue (Scardamalia, 2001). In fact, some of the largest technology initiatives have failed miserably due to poor implementation and a lack of effective teacher training and continued support (Dale, Robertson, & Shortis, 2004). If this process were to be successful in the long run, it would have to address the two largest obstacles to that success: getting teachers to use the software, and having teachers use the results to adjust practice or respond to student needs.


METHODOLOGY

THE CHALLENGE

The challenge, then, was to devise a process that would manage all of the realities that existed at the time: a process recognizing that the redefined focus in Growing Success had the potential to increase workloads if mishandled; that the literature strongly confirmed the benefit not only of regular and timely feedback but also of self-reflection and self-assessment by students; and that for an educational leader to tolerate the absence of such activities in schools was tantamount to professional malpractice. Moreover, through this process the practicum would offer as a remedy a tool that required changes in practice, or at the very least the adoption of a technology tool some would be uncomfortable with.

Given the time constraints of this endeavor, it was determined early on that little could be done to assist teachers with interpreting results and with adjusting practice or responding to student need, though the results would suggest this may not be necessary (Appendix A, questions 15-17), contrary to research (Lachat & Smith, 2005). Instead, attention would be directed to the design of the tool prior to the second semester. Beginning in the second semester, the tool would be disseminated, followed by training and follow-up support as required. An evaluation would take place towards the end of the semester to determine successes and issues needing to be addressed.

THE PROCESS

Building the file

The initial design proved more difficult than anticipated. As this author is experienced in spreadsheet design, it was felt that a Google Docs survey (which compiles the results automatically in spreadsheet form) could be used to gather student responses, after which the data could be exported to an Excel spreadsheet and organized into a report format for each student. The design incorporated the feedback of one other teacher in a minor capacity, who was helpful in finding a number of issues in the early stages. Though the technology was sound, the process proved untenable when first attempted during a trial run at a Professional Development session (Appendix E, Dec. 6-7). The audience was too large to instruct clearly, and the steps required were too much for the less technically savvy educators to take in. The decision was made to limit the initial trial run to those more willing and able to embrace the initiative, both technically and pedagogically. It was also decided that, to simplify the learning for staff, the entire file would be designed to operate within Google Docs so that a single, web-based tool would result. The problem was that no one knew whether this was even possible, and a great deal of time went into determining how to make it happen.
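The collation step the file performs (regrouping one-row-per-submission survey data into a per-student report) can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the logic only: the column names and sample data are invented, and the actual file accomplishes this with spreadsheet formulas rather than a script.

```python
# Hypothetical sketch: regroup survey submissions (one row per response)
# into a chronological report per student. Field names are invented.
from collections import defaultdict

def collate_reflections(rows):
    """Group survey rows by student, sorted by submission date."""
    reports = defaultdict(list)
    for row in rows:
        reports[row["student"]].append(
            (row["date"], row["went_well"], row["to_improve"])
        )
    # Sorting each student's entries by date makes patterns over time visible.
    return {student: sorted(entries) for student, entries in reports.items()}

rows = [
    {"student": "A. Lee", "date": "2011-03-21",
     "went_well": "group work", "to_improve": "time management"},
    {"student": "A. Lee", "date": "2011-02-14",
     "went_well": "quiz prep", "to_improve": "start earlier"},
    {"student": "B. Cruz", "date": "2011-02-14",
     "went_well": "homework routine", "to_improve": "ask more questions"},
]
reports = collate_reflections(rows)
```

Each student's report then reads as a timeline of his or her own reflections, which is what makes recurring habits easy to spot.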


Assembling a team of the willing

Though attitudes do not change until one has actually tried something new (Fullan, 2007), the original approach was not the solution. It became clear that it would be necessary to build upon success in order for all staff to buy in and consider using the software. This necessitated assembling a small team of staff who were already more confident with technology and willing to experiment, and who also saw the potential benefits both of the software and of the process of collecting student reflections. It was hoped that these perceptions would make a difference in participants' willingness to try to incorporate the tool into their practice, as the literature suggests (Kotrlik & Redmann, 2005).

Distribution and training

This team was voluntary and met only once, for the initial training and distribution of the file. It consisted of a much more manageable group of eight teachers. The session began with an overall explanation, followed by distribution of the file. The remainder of the workshop involved one-on-one support with using the file, as well as with Google Docs itself. Some staff were intrigued enough that they quickly moved on from using the survey (which they grasped quite quickly) to building their own surveys. The file was subsequently distributed at three separate workshops during the semester to teachers outside the author's school and board, where it was also well received (Appendix E, March 7, April 8, May 6).

Implementation and monitoring

The timing of distribution was significant: it preceded the start of the second semester. Staff were left to introduce and implement the survey in their respective classes, and to contact the author for support if they required assistance. The same instructions were offered to staff at out-of-board locations, though they did not receive the file at the start of their respective semesters.

Feedback and evaluation

A feedback survey was designed to monitor progress and collect anecdotal comments after one semester. In the spirit of this practicum, it was designed using Google Docs and administered both to staff and to the author's students, who were also involved in the trial run and initial use of the software. Both surveys were administered in the last two full weeks of school in June (Appendix A and Appendix B), much later than originally planned, owing to time constraints as well as the sense that the file had not yet been widely used by all members of the team. The decision was thus made to delay the surveys until semester's end, at which point the survey was designed, administered, and analyzed.


RESULTS

TEACHER RESULTS

Overall, the response to the efforts undertaken during this practicum was overwhelmingly positive (Appendix D). Although the sample is hardly representative (indeed, the survey is anything but scientific), the results indicate what most would call a qualified success, and they are consistent with the author's expectations. By semester's end, thirty-four teachers across Ontario had willingly received the Google Docs file during four separate workshops. The semester-end teacher survey was distributed by email to each of these individuals, and twelve responded (Appendix A). Of these twelve, three reported having used the file during the semester, though the rate of adoption may be higher, as at least one teacher who did not respond to the survey is known to have used the file.

The one-variable data is still very telling, suggesting that the methodology of selecting willing and technically able colleagues to build the initial test team was sound. The fact that the additional twenty-six teachers in the province who received the file also self-selected for the workshops increases the likelihood that they shared similar technical confidence and pedagogical views with the original eight team members. These two facts likely help explain the responses of the nine teachers who did not use the file. Most significantly, only two experienced any technical hesitation; the nine laid blame almost exclusively on time constraints, as expected. Some were simply waiting until the beginning of a semester to incorporate the survey into regular learning, given that they were not part of the original team and received the file partway through the semester. One did respond that follow-up several weeks later would have been helpful, confirming the findings of Scardamalia (2001) and Dale, Robertson, and Shortis (2004). Though help was offered if needed, regrettably neither the logistical constraints of the author's job nor those of this practicum enabled effective support (such as additional workshops) for the twenty-six teachers spread across Ontario.
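For readers unfamiliar with the term, the "one-variable" summaries used here are simple frequency counts of a single survey question across respondents. A minimal sketch, with invented field names and data that do not reproduce the actual survey:

```python
# Hypothetical one-variable summary: tally a single survey question.
# Field names and sample responses are invented for illustration.
from collections import Counter

responses = [
    {"used_file": "no",  "reason": "time constraints"},
    {"used_file": "no",  "reason": "time constraints"},
    {"used_file": "yes", "reason": ""},
    {"used_file": "no",  "reason": "technical hesitation"},
]

# How many respondents used the file at all?
usage = Counter(r["used_file"] for r in responses)

# Among non-users, what reasons were given?
reasons = Counter(r["reason"] for r in responses if r["used_file"] == "no")
```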

Significantly, all who used the file agreed that the process was worthwhile. One unexpected result was that all teachers who reported using the file also viewed the results, and all felt more than able to adjust teaching or instruction to accommodate student feedback, suggesting that student responses can and would be used to affect learning outcomes, somewhat contradicting the findings of Lachat and Smith (2005) and Ljungdahl and Prescott (2009).

As well, the one-variable results suggest that teachers who attempted to use the tool felt that students were not averse to reflecting or self-assessing through this medium. What is more, most students do in fact complete it, and, as though to confirm the literature, they are able to provide valuable feedback (Heritage, 2009). One teacher writes:

"Several students who I would not have expected to respond gave very thoughtful input. I found that this helped me understand my students better and this information was useful when speaking with parents during interviews" (Appendix A).

The results seem to have been insightful and surprising for this teacher, and they hold the promise of benefits for student achievement and apparently even for the parent community. Notably, this same teacher did not prepare students for self-assessing and self-reflecting by discussing the purpose and value of reflection on learning at any time before students completed the Google Docs survey. In fact, this was the only responding teacher who assigned the reflection and self-assessment activity exclusively for homework. Notwithstanding this, it would seem this teacher obtained tangible results anyway. If this is true with minimal effort, imagine the potential had this educator made reflection a centerpiece of the learning in his or her classroom.
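Patterns like this one surface when two survey fields are examined together. A hedged sketch of such a cross-tabulation, with invented fields and data (e.g. whether a teacher discussed the purpose of reflection beforehand, against whether meaningful class discussion followed):

```python
# Illustrative cross-tabulation of two survey fields. The data is invented
# and does not reproduce the actual practicum results.
from collections import Counter

responses = [
    {"discussed_purpose": "yes", "meaningful_discussion": "yes"},
    {"discussed_purpose": "yes", "meaningful_discussion": "yes"},
    {"discussed_purpose": "no",  "meaningful_discussion": "no"},
]

crosstab = Counter(
    (r["discussed_purpose"], r["meaningful_discussion"]) for r in responses
)
# crosstab[("yes", "yes")] counts teachers who both prepared the class
# beforehand and saw meaningful discussion afterwards.
```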

When the data was examined more closely (two-variable statistics), there were further results of interest. It became clear that the teacher just mentioned was also the only teacher who did not experience meaningful class discussion as a result of conducting the survey. All the rest, including this author, discussed the purpose and value of reflection on learning prior to administering the survey, and had at least some meaningful discussion at some point afterwards. That such preparation breeds greater student acceptance, and acceptance better results, also confirms what is found in the literature (Brown & Hirschfeld, 2007). Thus, even in the one case lacking discussion and preparation, significant results were achieved. Interestingly, age may also be a factor: the older the grade level, the more agreeable to the process the teacher reported students as being.

STUDENT RESULTS

Results from the survey of the author's students were perhaps more compelling than those from the teachers (Appendix B). There was an overwhelmingly positive response, confirming virtually all the literature on the subject and thoroughly validating the author's efforts and this practicum's purpose. The results are the more compelling because they represent a voluntary response rate of over 50% of all the author's students this semester; unfortunately, the fact that the respondents were exclusively grade twelve students in academic courses diminishes the impact.

Just the same, they are worthy of note. Responses indicate that 68% of students already reflected on their own learning long before the Google Docs self-reflection file was introduced into the class. Not surprisingly, then, 92% of respondents were agreeable to conducting the survey, and 80% agreed or strongly agreed that they reflected in a meaningful way, so that responses would yield useful results. In fact, three students commented anecdotally that they felt it would be advantageous to do more reflections, and some stated that results should be compiled and sent to students (though this was in fact done). All survey-specific comments were positive; at worst they encouraged that the process be even more integrated into the learning environment.


It is easy to understand their reasoning. 56% of students agreed that by engaging in a more formal process of reflection as a result of participating in the survey, they discovered things about themselves that they did not expect, and a further 68% felt that they could use the compiled results of multiple reflections to discover something about how they learn. Overall, 84% found the process worthwhile. Sadly, the strongest statement may be the final question: 52% disagreed or strongly disagreed that they have the opportunity to reflect or provide feedback in most of their classes, corroborating the literature.
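Percentages such as these are typically derived from raw Likert-scale responses. A minimal sketch of the arithmetic, assuming the conventional 1-5 coding (1 = strongly disagree, 5 = strongly agree) and invented sample data:

```python
# How "agreed or strongly agreed" percentages are computed from raw
# Likert codes. The sample ratings below are invented for illustration.

def pct_agree(ratings):
    """Percentage of respondents who agreed or strongly agreed (4 or 5)."""
    return round(100 * sum(1 for r in ratings if r >= 4) / len(ratings))

def pct_disagree(ratings):
    """Percentage who disagreed or strongly disagreed (1 or 2)."""
    return round(100 * sum(1 for r in ratings if r <= 2) / len(ratings))

worthwhile = [5, 4, 4, 5, 3, 4, 2, 5, 4, 4]  # hypothetical responses
print(pct_agree(worthwhile))  # prints 80
```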

DISCUSSION

RELATIONSHIP BETWEEN CHALLENGES AND SUCCESSES TO LITERATURE

These practicum results confirm what was expected, and what the literature reviewed to this point substantiates: most teachers are reluctant to change their practice, and even those who are capable and willing will find change difficult without the proper supports in place. The relatively poor adoption rate is evidence of this. All were initially keen and very receptive to the idea, and particularly to the fact that they were receiving the file for free; however, only three of twelve respondents claimed to have used it. Responses from teachers outside this author's school were positive, but again stressed the difficulty of finding the time to integrate the tool into the curriculum, citing the honest intent to do so at the start of the new school year. One respondent wrote:

"…I learned a lot from Jeff's session and would like to incorporate more of his elements in my class surveys, especially interested in the mapping, graphing sections. Unfortunately time is always running short and being able to grasp a new concept, implement it with some consistency is a challenge. The session I am referring to was short 1hr as part of PD day in April we had enough time for a brief introduction and to 'play with' the survey, however any follow up time was independent."

Even the limited team of eight at the author's school likely required more ongoing support than they claimed. Not all attempted to use the survey file, and one did so without discussing its intent or purpose with the class.

What is compelling is that, in spite of the challenges, the results experienced by the few teachers who did try the file were surprising and intriguing enough that they still saw value in the endeavor and intend to try it again. This is the result that was expected, both instinctively by the author and in the existing research.

The value of this file is in the data it collects. Consider the comment from one of the author's students below:


“There  was  a  big  project  due  but  due  to  my  time  management  skills,  I  struggled  to  finish  on  time.  I  left  it  to  the  last  minute,  which  affected  how  well  I  did  the  project”  (Appendix  C,  student  two).    

Such a comment indicates quite clearly not only that students are capable of honest reflection, but that by compiling such comments they and their teachers can see patterns of behavior that may not otherwise be obvious. Any of these reflections, and certainly all of them combined, provide valuable targets for the teacher's attention and intervention. The result should be improved learning skills and improved outcomes, and the research supports this finding. What is unfortunate is that the teachers who will not try to implement such a file, or even the practice of student self-assessment and reflection, will greatly outnumber those who will, which is disappointing considering the strength of the student results from the process.

LIMITATIONS AND FUTURE WORK

In  essence  this  was  a  trial  run  that  provided  tremendous  leadership  and  learning  opportunities  for  the  author.  However,  admittedly,  in  the  author’s  class  there  was  limited  official  response  to  student  feedback,  resulting  in  minimal  impact  save  what  intervention  was  conducted  by  the  students  themselves.  Most  of  the  effort  was  put  into  designing  and  testing  the  Google  Docs  file  for  its  initial  uses.  It  is  not  known  to  what  extent  other  teachers  responded  or  modified  instruction.  

As well, the team was under-supported during the semester, meeting only once at the beginning of the process and receiving a single reminder to conduct the survey sometime before midterms. No opportunity was given until the end to solicit feedback or to dialogue about best practice and success stories. Such opportunities could have been beneficial, and the mere act of continued support is essential for sustained change (Fullan, 2007; Dale, Robertson, & Shortis, 2004).

Though the results mirror the body of research literature, it must be made clear that the feedback surveys consisted of small, non-representative samples. The students responding were all in the author's class, and all were senior academic students from a school in a high socio-economic neighborhood. Furthermore, there is no formalized process for following up on responses once student reflections are collected. Reflection is only valuable if it results in learning and improved performance, but without a more formalized, or at least systematic, response process to student feedback, bad habits and concerns will go unaddressed.
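A systematic response process need not be elaborate. As a minimal sketch, assuming reflections exported from the Google spreadsheet into rows with hypothetical "student", "reflection", and "plan" columns (the column names and keyword lists below are illustrative, not taken from the actual survey file), a short script could flag students whose comments mention common difficulties or whose improvement plans are too vague to act on:

```python
from collections import defaultdict

# Illustrative keyword lists; a real team would refine these together.
DIFFICULTY_THEMES = {
    "time management": ["last minute", "procrastinat", "ran out of time"],
    "study habits": ["didn't study", "did not study", "cram"],
}
VAGUE_PLANS = ["study more", "try harder", "work harder"]

def flag_reflections(rows):
    """rows: list of dicts with 'student', 'reflection', and 'plan' keys
    (hypothetical export columns). Returns a dict mapping each flagged
    student to the concerns detected in their reflection."""
    flags = defaultdict(list)
    for row in rows:
        text = row["reflection"].lower()
        # Tag any recurring difficulty theme mentioned in the reflection.
        for theme, cues in DIFFICULTY_THEMES.items():
            if any(cue in text for cue in cues):
                flags[row["student"]].append(theme)
        # A plan consisting only of a stock phrase signals no real strategy.
        if row["plan"].strip().lower() in VAGUE_PLANS:
            flags[row["student"]].append("no concrete strategy")
    return dict(flags)

rows = [
    {"student": "S2", "reflection": "I left it to the last minute again.",
     "plan": "study more"},
    {"student": "S7", "reflection": "Group work went well this unit.",
     "plan": "keep reviewing notes nightly"},
]
print(flag_reflections(rows))
# -> {'S2': ['time management', 'no concrete strategy']}
```

The output of such a pass is essentially a conferencing list: the students a teacher should sit down with first, and the themes to raise when they do.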

In the future the process of analysis and teacher response should be streamlined, with strategies developed in teams for dealing with some of the most common problems. For instance, when presented with the question "What can you do to improve?", students usually respond with "study more" as their plan for improvement. This is almost certainly a sign that they do not have a response or a strategy for dealing with course difficulties. As well, missing from the practicum was the critical step of teacher-student conferencing. However, the purpose of this initiative was to provide a device through which data could be collected on student self-assessments of learning skills and on their reflections. Though beyond the scope of this practicum, it is important to mention that the next step is to conference with each student self-reporting difficulties in order to provide strategies for dealing with their concerns.

RELATIONSHIP TO EDUCATIONAL LEADERSHIP AND PROFESSIONAL STANDARDS

This project was in line with the standards of the teaching profession in that a teacher who successfully implemented the use of the Google Docs file would be demonstrating all five professional standards (Ontario College of Teachers, 2010). To illustrate: the voluntary use of such a file, which exists solely to improve student learning, reveals both an educator's commitment to students and a commitment to ongoing professional learning. Acknowledging the benefit of reflection and meta-cognition demonstrates a commitment to professional knowledge. Learning to use the file and sharing results is participation in a learning community built around its implementation and effective use. Lastly, using student survey results to respond to student needs, for the purpose of improving each individual student's achievement, indicates a commitment to professional practice and the application of professional knowledge. Thus, any educator who takes the lead role in devising and implementing such a device has clearly demonstrated instructional leadership, as the work results in teachers advancing their practice in all aspects of the standards.

In many ways this was an experimental practicum, the outcomes of which were far from assured at the outset. There was little expertise in the building, other than the author's, to draw on in constructing such a file, and no known exemplars existed anywhere to provide inspiration. In this sense, there was little instructional leadership of others. However, in all other respects this practicum afforded the author a tremendous opportunity to develop critical leadership skills (Table 1).

   

Setting Directions, Building Relationships, and Developing the Organization

The purpose and goal of the initiative had to be communicated to both staff and administration in order to gain approval to run a workshop. At the outset it also required the design of an educational tool that did not yet exist, one that could be used by the average educator to fill a void that the author perceived, and that research confirmed, desperately needed to be filled. That required a leap of faith on the part of any leader.

The fact that the initiative was embraced by colleagues and supervisors, and that eight teachers volunteered from this author's school, is a statement of his reputation for competence, as well as of the stated purpose and utility of the project. That utility, that is, the potential of the file, and how educators from across Ontario came to see it, demonstrates how that value was created in teachers' minds by the author (see Appendix D). Creating that vision and understanding is an essential skill for educational leaders if they are to utilize available technologies and resources to effect real and lasting change. Taking on the task required staff to be willing to work under his guidance. The author's acts of showing, inspiring, and revealing the use and potential of the file to educators not only in his own school, but across the province, indicate an ability to build new relationships and develop the organization (in this case, the initiative). It stands as further proof of setting ambitious goals.

Setting Directions
- Skills: Think strategically and build and communicate a coherent vision
- Knowledge: Ways to build, communicate and implement a shared vision; new technologies, their use and impact
- Attitudes: Commitment to setting goals that are not only ambitious and challenging, but also realistic and achievable

Building Relationships
- Skills: Develop, empower and sustain individuals and teams
- Knowledge: The significance of interpersonal relationships; strategies to promote individual and team development
- Attitudes: Commitment to effective working relationships; commitment to effective teamwork

Developing the Organization
- Skills: Collaborate and network with others inside and outside the school
- Knowledge: Models of effective partnership

Leading the Instructional Program
- Skills: Initiate and support an inquiry-based approach to improvement in teaching and learning; support student character development strategies
- Knowledge: Strategies for improving achievement; use of new and emerging technologies to support teaching and learning
- Attitudes: Commitment to raising standards for all students; belief in meeting the needs of all students in diverse ways

Table 1: Selected excerpts from the Ontario Leadership Framework (The Institute for Education Leadership, 2008)

Leading the Instructional Program

Perhaps the practicum's most relevant connection to the leadership framework is the opportunity it provided to lead the instructional program, even if initially for only a few willing teachers. The entire process was one of inquiry and experimentation, resulting in changed practices, or at least changed perceptions, for several teachers. To accomplish this end, the latest technologies were employed at minimal cost, first by the author learning them (leadership by example) and then by the author teaching other staff how to use them. Clearly, the entire purpose of the initiative was to introduce new technologies and strategies to improve teaching and learning, but more importantly, that aim was rooted in the underlying belief that teachers can and should respond to individual student needs. That, after all, is precisely what the file facilitates. In the end, through instructional leadership, at least three teachers were convinced that the change initiated through this practicum was simple, easy, and yielded tangible benefits for students. That kind of success tends to spread.


CONCLUSION  

This author has always maintained that there are only ever two questions a learner wants answered, and that every lesson must answer them before concluding to be considered successful in the mind of the learner: "Can I do this?" and "Why do I want to? What is in it for me?" In this case, the learners are teachers. Years of research have spoken to educators with a single message when it comes to feedback, reflection, and self-assessment: they lead to improved results. Yet few have taken the steps necessary to integrate these activities into their classroom practice. With no directive from administration, and limited time to initiate such change, this is not surprising; teachers feel no pressing need. However, they should. The low sense of urgency is objectionable given the high benefit to, and desire on the part of, students. Their survey feedback indicates a strong sense of benefit from the experience, as well as a desire to do more reflection, not less. Moreover, the content of student self-reflections indicates a substantial need for intervention in the instructional program.

As expected, the lack of time and the lack of an impetus delayed and stifled innovation in an educational setting. Nevertheless, the few teachers who did engage in this activity, as well as the students, provided this author with a valuable learning experience, and gained for themselves a worthwhile experience and a future learning tool. The resulting Google Docs student self-assessment file is an easy-to-use, accessible, cost-effective way to collect and collate student data in order to respond to individual student needs, and thus this practicum has succeeded in its stated objectives (Appendix F).

   

 


REFERENCES

Andrade, H., & Valtcheva, A. (2009). Promoting learning and achievement through self-assessment. Theory Into Practice, 48(1), 12-19. DOI: 10.1080/00405840802577544.

Bierer, S. B., Dannefer, E. F., Taylor, C., Hall, P., & Hull, A. (2008). Methods to assess students' acquisition, application and integration of basic science knowledge in an innovative competency-based curriculum. Medical Teacher, 30, e171-e177. DOI: 10.1080/01421590802139740.

Biggs, J. (1998). Assessment and classroom learning: A role for summative assessment? Assessment in Education: Principles, Policy & Practice, 5(1), 103-110.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7-73.

Boud, D., & Falchikov, N. (2006). Aligning assessment with long-term learning. Assessment and Evaluation in Higher Education, 31(4), 399-413. DOI: 10.1080/02602930600679050.

Brown, G. T., & Hirschfeld, G. H. (2007). Students' conceptions of assessment and mathematics: Self-regulation raises achievement. Australian Journal of Educational & Developmental Psychology, 7, 63-74.

Butler, M., Pyzdrowski, L., Goodykoontz, A., & Walker, V. (2008). The effects of feedback on online quizzes. The International Journal for Technology in Mathematics Education, 15(4), 131-136.

Chen, E., Heritage, M., & Lee, J. (2005). Identifying and monitoring students' learning needs with technology. Journal of Education for Students Placed at Risk, 10(3), 309-332.

CISCO. (2008). Multimodal learning through media: What the research says. Retrieved November 22, 2010, from CISCO: http://www.cisco.com

Dale, R., Robertson, S., & Shortis, T. (2004). You can't not go with the technological flow, can you? Constructing 'ICT' and 'teaching and learning'. Journal of Computer Assisted Learning, 20(6), 456-470. DOI: 10.1111/j.1365-2729.2004.00103.x.

Debuse, J. C., Lawley, M., & Shibl, R. (2008). Educators' perceptions of automated feedback systems. Australasian Journal of Educational Technology, 24(4), 374-386.

Fullan, M. (2007). The new meaning of educational change (4th ed.). New York: Teachers College Press, Columbia University.

Gallegos, I., & Flores, A. Student-made games to learn mathematics. PRIMUS: Problems, Resources & Issues in Mathematics Undergraduate Studies, 20(2), 405-417. DOI: 10.1080/10511970802353644.

Government of Ontario. (1990). Education Act. Retrieved August 4, 2010, from Service Ontario: http://www.e-laws.gov.on.ca/html/statutes/english/elaws_statutes_90e02_e.htm

Harlen, W., & Crick, R. D. (2003). Testing and motivation for learning. Assessment in Education, 10(2), 169-207. DOI: 10.1080/0969594032000121270.

Heritage, M. (2009). Using self-assessment to chart students' paths. Middle School Journal, 40(5), 27-30.

Kohn, L. Y. (2009). The need for assessment literate teachers. Southeastern Teacher Education Journal, 2(4), 33-42.

Kotrlik, J. W., & Redmann, D. H. (2005). Extent of technology integration in instruction by adult basic education teachers. Adult Education Quarterly, 55(3), 200-219. DOI: 10.1177/0741713605274630.

Lachat, M. A., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333-349.

Lingard, B., Hayes, D., Mills, M., & Christie, P. (2003). Leading theory. In Leading learning. New York: Open University Press, McGraw-Hill Education.

Ljungdahl, L., & Prescott, A. (2009). Teachers' use of diagnostic testing to enhance students' literacy and numeracy learning. The International Journal of Learning, 16(2), 461-475.

Magolda, P. M., & Platt, G. J. (2009). Untangling Web 2.0's influences on student learning. About Campus, 14(3), 10-16. DOI: 10.1002/abc.290.

Melendez, B. S., & Williams, T. (2007). Mathematical Idol. PRIMUS, 17(3), 268-283. DOI: 10.1080/10511970701385325.

Mok, M. M., Lung, C., Cheng, D., Cheung, R., & Ng, M. (2006). Self-assessment in higher education: Experience in using a metacognitive approach in five case studies. Assessment & Evaluation in Higher Education, 31(4), 415-433. DOI: 10.1080/02602930600679100.

Nguyen, D. M., Hsieh, Y.-C., & Allan, D. G. (2006). The impact of web-based assessment and practice on students' mathematics learning attitudes. Journal of Computers in Mathematics and Science Teaching, 25(3), 251-279.

Nicol, D. (2009). Assessment for learner self-regulation: Enhancing achievement in the first year using learning technologies. Assessment and Evaluation in Higher Education, 34(3), 335-352. DOI: 10.1080/02602930802255139.

Ontario College of Teachers. (2010). Foundations of professional practice. Retrieved February 6, 2011, from Ontario College of Teachers: http://www.oct.ca/standards/foundations.aspx?lang=en-CA

Ontario Ministry of Education. (2010). Growing Success: Assessment, evaluation, and reporting in Ontario schools. Toronto: Queen's Printer for Ontario.

Orrell, J. (2006). Feedback on learning achievement: Rhetoric and reality. Teaching in Higher Education, 11(4), 441-456. DOI: 10.1080/13562510600874235.

Palak, D., & Walls, R. T. (2009). Teachers' beliefs and technology practices: A mixed-methods approach. Journal of Research on Technology in Education, 41(4), 417-441.

Scardamalia, M. (2001). Big change questions: "Will educational institutions, within their present structures, be able to adapt sufficiently to meet the needs of the information age?" Journal of Educational Change, 2(2), 171-176.

Sezer, R. (2010). Pulling out all the stops. Education, 130(3), 416-423.

State of Queensland, Department of Education. (2001). The Queensland school reform longitudinal study. Retrieved from http://education.qld.gov.au/public_media/reports/curriculum-framework/qsrls/index.html

The Institute for Education Leadership. (2008, August). Putting Ontario's leadership framework into action: A guide for school & system leaders. Retrieved January 18, 2010, from The Institute for Education Leadership: www.education-leadership-ontario.ca

Tucker, B. (2009). Beyond the bubble: Technology and the future of student assessment. Education Sector Reports.

Weaver, M. R. (2006). Do students value feedback? Student perceptions of tutors' written responses. Assessment and Evaluation in Higher Education, 31(3), 379-394. DOI: 10.1080/02602930500353061.

 

                 


APPENDICES

APPENDIX A: TEACHER SURVEY RESPONSES

In the spirit of this initiative, teacher feedback on the learning skills survey tool was solicited through a web-based Google Docs survey conducted from June 7, 2011 to June 14, 2011. All 34 of those with whom the survey was shared were contacted by email and requested to complete the survey. Only 12 responded, three of whom had used the file with at least one of their classes. Both the tabulated results and the survey are linked below.

Teacher Survey

https://spreadsheets0.google.com/spreadsheet/viewform?hl=en_US&hl=en_US&formkey=dEtrRmNPVjBvbHQ1VVFEdXRkQVZRenc6MQ#gid=0

Teacher Results

1. I was unable to acquire the google docs file in order to conduct the survey

2. I was unable to get the file to work properly

3. I lost my password/login information

4. I forgot about trying to use it

5. I was simply too short of time to do it or fit it in

6. I did not have access to technology to conduct it

7. I did not feel comfortable with the technology or program

8. The technology I had access to in order to implement the survey was inadequate

9. The survey worked as designed without any complications

10. Students were agreeable to conducting the survey

11. I prepped my students by discussing the value of reflection and learning skills before assigning the survey.
12. Conducting the survey resulted in a meaningful class discussion
13. I received responses from the vast majority of my students
14. For students who did not complete the survey, I did my best to encourage them to do so
15. I reviewed the written responses of my students after each survey
16. I used their responses to respond to student needs and/or adjust my instruction
17. I am uncertain how I could use their responses to adjust instruction or meet student needs
18. I used the bubble motion chart to interpret results for all my students as a whole
19. Overall I found students' self-evaluations of their learning skills to be insightful for at least some students
20. Overall I found collecting student self-evaluations of learning skills to be a worthwhile process.

Responses were tallied on a five-point scale: Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree.

[Table: Reasons for Not Implementing the Survey — response counts for items 1-8.]

[Table: Results of Attempted Implementation — response counts for items 9-20.]


Anecdotal Teacher Responses

If the previous options did not capture the reason for not trying the survey, please explain your reason(s) below.
- "I teach at an alternative school this year and only have a couple of students who show up regularly. I fully plan on using this approach at my new school in September. I think it would work beautifully in a traditional school setting."
- "I just had a busy semester, and wanted to implement it at the beginning as a part of my course expectations. When this didn't happen, I decided to use it first semester next year and implement it in both semesters."
- "I have worked with and created a few different learning skills surveys on my own prior to this session. I learned a lot from Jeff's session and would like to incorporate more of his elements in my class surveys, especially interested in the mapping, graphing sections. Unfortunately time is always running short and being able to grasp a new concept, implement it with some consistency is a challenge. The session I am referring to was short 1hr as part of PD day in April we had enough time for a brief introduction and to 'play with' the survey, however any follow up time was independent. It is an important to evaluate learning skills and critical to putting the ministries 'Growing Success' document into practice."
- "I was also uncomfortable with using my own Gmail account and reluctant to set up yet another email account. I was also reluctant to ask students for their e-mail addresses and to thus open the gateway for student-teacher e-mail communication."

What would have made the difference (so that you tried the survey)?
- "Starting the survey at the beginning of a course as a part of class expectations."
- "This survey is very easy to set up and use, and a great tool...the only reason I didn't use it was because I got very busy and forgot at the beginning of the semester."
- "Having more time to get together a few weeks later."
- "I just need time to play with it and implement it"
- "If I / the students can use our CHATT account instead."
- "If the bugs had been ironed out before I tried to use it...now, looking back at the procedure, it seems convoluted and confusing since there were extra steps I had to do and many retries before I got my practice one to work. I'm not confident that I would know where to begin now and would want better step-by-step instructions that work."
- "I am sad that I didn't use the survey, because I was very impressed with it. I am still planning on trying to use it in September."
- "I think I got overwhelmed with the year and kept meaning to implement it, but it never happened. I'm embarrassed that I didn't try it!"


Suggestions or obstacles to be overcome (in making the learning skills survey more useful):
- "Grade 9s either don't really understand it or see it as something worthwhile. Got few responses from grade 9 classes and the responses were not as insightful."
- "Obstacles for me were waiting to release the survey until after they'd gotten their unit evaluation back. In many cases, due to students writing tests very late, this put too much distance between the unit they were reflecting on. I was concerned that it happened too late. Lost meaning/relevance + I think they forget about how they worked during that period of time."
- "I implemented the survey at the mid-semester. They were sent by email on a Friday without prepping the class at all. I received 18 responses out of a total of 72 students. I found the responses were better on average than my traditional in class survey, although I suspect the students that responded electronically would also respond effectively in a written self-evaluation. To improve this process, I intend to use this next year but will prepare my students more effectively, perhaps by booking a computer room for the first self-evaluation. This will get them familiar with it. Students that did not do the self-evaluation did a pen and paper self-evaluation of their learning skills at the midterm. They were also asked several questions about their strengths, weaknesses and strategy to improve."

Success stories if any (from using the learning skills survey)?
- "Multiple students actually realized some reasons why they weren't being as successful as they wanted."
- "Specific goal setting - students showed evidence that they'd been working towards targeted improvement in an area they chose."
- "Several students who I would not expected to respond gave very thoughtful input. I found that this helped me understand my students better and this information was useful when speaking with parents during interviews."


APPENDIX B: STUDENT RESPONSE SURVEY

What follows are the responses from 28 of Mr. Boulton's 47 students, taken from a survey conducted from June 15 to 16, 2011, at the end of the second semester. The survey consisted of two parts. The first consisted of questions about the Google Docs learning skills survey, similar to those asked of teachers. The second part solicited feedback about new teaching strategies introduced by Mr. Boulton.

Student Survey

The questions given to students appear below, but the original version can be viewed at the link below:

https://spreadsheets.google.com/spreadsheet/viewform?hl=en_US&formkey=dEgxUllHeHJ1MTRpaUdMSFFtaEdsbUE6MA#gid=0

Student Results

1. I normally reflect on my results and efforts on my own
2. I was agreeable to conducting the survey
3. My teacher prepped me by discussing the value of reflection and learning skills before assigning the survey.
4. Conducting the survey resulted in a meaningful class discussion
5. I attempted to reflect in a manner that would provide valuable information to my teacher and myself
6. I just filled in the blanks to make my teacher happy
7. For me, writing out reflections highlighted things about myself I did not expect or normally don't spend time thinking about
8. I believe teachers could use this device to adjust instruction or meet individual student needs
9. I believe I could use the compiled results of these reflections to learn something about how I learn or improve my achievement
10. I would like to have access to the compiled results of these reflections
11. Overall I find reflecting to be a worthwhile process, or at least has the potential to be a worthwhile process
12. Generally I have the opportunity to reflect or provide feedback in most of my classes

Responses were tallied on a five-point scale: Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree.

[Table: Student Feedback on Survey Usefulness — response counts for items 1-12.]


Anecdotal Student Responses

Suggestions or obstacles to be overcome (to make the learning skills survey more useful):
- "Being able to see the results, and the class opinions/results as well."
- "Better Questions"
- "If we got an email of what we wrote on the survey after a month after the survey was written it would help. You could also send it with your opinion on if you believe we accomplished our goals or not."
- "More completion of the survey. We did it a few times at the beginning of the semester but we didn't do it once after that (or I was away when the class did it). Also, the teacher should take the initiative to ask the students what they put down in the survey after they complete it in order for other students to see what goals their classmates have."
- "The one thing that could have been better this year would be the access to class notes. It would help the students to become more organized."
- "Teacher feedback on the student answers on the reflection."
- "making our own notes in class was hard for some classes. and without notes, it was hard to understand some of the concepts learned for the unit."
- "Possibly more reflections and consistent reflection on the surveys."
- "Send out the results more often so that we know what we talked about in past reflections. The survey really helped me in terms of me going through what I'm doing well and what not. I think their results will better help people when the surveys are conducted more often."

Success stories if any (from using the learning skills survey)?
- "I was able to reflect about my own study habits and techniques, something that i do not normally do. This helped me change my way of studying."
- "I currently have a bad mark, so no success stories from this guy"
- "It helped me put everything in perspective. It allowed to look at what my mark was, what I needed to do in order to get where I wanted to go and what I needed to stop doing in order to get there in time for the end of the semester."
- "It made me realize what I was doing right and wrong; it almost served as a midterm report card, except there were multiple of them, which really made me work harder in order to reach my goals in the courses."
- "I have a better grasp of the economics subject as a whole and it will definitely benefit me in post-secondary education. I think that the general 'laissez-faire' teaching is not meant to create success, but to teach lessons to prevent future failure. In this way I believe it was a success story, but in order to verify that you'd have to track students into university."
- "Doing an assignment at the end of each unit turned out to be a success for me. it was a way for me to do my own research outside of class to understand the unit in depth. it was sometimes hard to understand the materials covered in class, but doing our own choice of an assignment had us do our own research and understand better. I am very happy with my success in this course and hope that I can continue to do well in the field of business"
- "My problem was always the same. I've known I procrastinate forever, but I think it's more I need strategies to avoid doing it, or to understand why I do it, before I can overcome it. Just reflecting without guidance doesn't do much for me. Still, you're onto something here. :)"


APPENDIX C: SAMPLES OF STUDENT SELF-REFLECTIONS

The following are samples of student reflections from Mr. Boulton's classes, as well as the original survey form that students see when completing these surveys.

Survey form as seen by the student:
https://spreadsheets.google.com/viewform?hl=en&formkey=dG1LTHdZR3BDQlJhdG1YVWhnQTNJX0E6MA#gid=18

Student One
[Reflection reproduced as an image in the original document.]

Student Two
[Reflection reproduced as an image in the original document.]

Student Three
[Reflection reproduced as an image in the original document.]

Student Four
[Reflection reproduced as an image in the original document.]

Student Five
[Reflection reproduced as an image in the original document.]


APPENDIX D: LETTERS OF PRAISE AND APPRECIATION

The following are letters and emails sent in appreciation of the author's efforts during the school year. They relate to initiatives connected with this practicum, or to the dissemination of the approaches and pedagogy surrounding his research and the practicum at professional development days, keynote addresses, mentorship sessions, and conferences.

December 3rd, 2010

Attn: To whom it may concern
Re: Mentorship by Jeff Boulton

On November 28th and November 29th of this year, Mr. Jeff Boulton agreed to meet with me in order to support our school's new business initiative. Mr. Boulton provided me with a number of insightful resources and ideas. He explained to me the G-7 Simulator program, which he created and which will fit perfectly with the Macro Economics 30 and AP Macro Economics courses that we currently offer. This program allows students to clearly understand the complexity of coordinating a country's economy in today's globalized world. Concepts such as GDP, CPI, unemployment rates, and fiscal and monetary policies are just a few that are integrated into this interactive activity. A student exemplar was also provided, which illustrates his authentic learning assessment model geared for 21st century learning. This activity draws out essential learning and has the students apply it in a real-world scenario. Mr. Boulton also provided a number of his assessments, recommended a number of video resources, and provided me with case method readings. I had been having difficulty locating appropriate high-school-level case method readings, and Mr. Boulton has now provided me with a number of good samples to choose from.

Mr. Boulton also took time out of his busy schedule to share with me a number of technology ideas. Microsoft Excel activities, Google Docs, and the social bookmarking site Delicious were just a few of the ideas he shared. From this collaboration, I have gained a lot of insight into how I would like to build our business program here in Calgary and to model it after the Iroquois Ridge Business Department program. Without Mr. Boulton's guidance and support, I would not have been able to begin to implement a number of these great ideas. During my visit to Oakville, I also had the opportunity to meet and talk with a number of the other teachers in the Iroquois Ridge business department. The entire business faculty was gracious enough to share their materials and to welcome me to their school. I would like to extend a big thank-you to both Jeff Boulton and the Iroquois Ridge High School Business Department for all their guidance and support.

Sincerely,

Scott Bennett
Director of Business Studies, West Island College, Calgary

[Additional letters of appreciation are reproduced as scanned images in the original document.]

APPENDIX E: PRACTICUM LOG

It is worth mentioning that this practicum involved far more than the Google Docs survey file. As the author's exposure to and understanding of the literature has grown over the last several years, his frustration with education in its current form (particularly in mathematics) resulted in a long-overdue transformation in practice in the second semester. Thus, the practicum involved the Google Docs survey, the mentorship of a Vice Principal, and changes to classroom practice that attempted to incorporate as much as possible of what he had read and believed for some time. Significant time was also spent disseminating these ideas to others through workshops, conferences, mentorship, and keynote addresses.

Total Hours 100.5

Job Shadowing Total: 9.00

Day Hours Task

Oct. 6, 2010 1.00 After school. Gallant and I did a walk-around, and he began showing me the on-call system and how he copes with its ridiculousness. Discussed some of his first suspensions, causes and penalties.

Oct. 20, 2010 2.50 Morning. Talked about administrative P.D. and its lack of effectiveness. Did walk-arounds and on-calls; spoke to a few kids re: thefts, attendance, and entrance to the Gary Allan program (score). Examined video for thefts. Talked about strategies for dealing with staff late for supervisions (just do the on-call and see if they show up).

Nov. 18 3.00 Morning. Busy day at WOSS. Several suspensions and numerous interviews. One regarded a theft, the other an assault in the essentials area. Teachers reported, then the first student, then the second. For the theft: the first student, then a second who was aggressive; the father came in and took control of his child. Use of video surveillance.

15-Jun 2.50 A slower morning. We looked at the exam schedule and discussed how it could be designed better in Google Docs. Tracked a truant student. Discussed his impressions of the job one year in.

Practicum Log Total: 91.50

Day Hours Task

Sep. 29, 2010 1.00 Using Google Docs, created a form to collect information on student learning skills and reflections on their achievement.

Oct. 1, 2010 2.00 Designed an Excel spreadsheet to pull out info from the Google Docs survey and present it in a collated, organized report for teachers, students, or parents. Can be emailed, printed, or just shown on a monitor.
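The collation step described in this entry, pulling form responses together into a per-student report, can be sketched roughly as follows. The original tool was an Excel workbook; this analogous Python sketch is not the author's implementation, and the column names and sample rows are invented for illustration:

```python
import csv
import io
from collections import defaultdict

# Hypothetical survey export: each row is one student's self-reflection.
# Field names are illustrative, not the actual form's questions.
SAMPLE = """Timestamp,Name,Goal,Obstacle
2010-10-01,Alex,Raise my mark to 80,Procrastination
2010-10-15,Alex,Maintain 80,Distractions
2010-10-01,Blake,Pass the course,Missed notes
"""

def collate(csv_text):
    """Group survey responses by student, preserving submission order."""
    reports = defaultdict(list)
    for row in csv.DictReader(io.StringIO(csv_text)):
        reports[row["Name"]].append(row)
    return reports

def render(reports, name):
    """Produce a plain-text report for one student."""
    lines = [f"Learning skills report: {name}"]
    for i, row in enumerate(reports[name], 1):
        lines.append(f"  Reflection {i} ({row['Timestamp']}): "
                     f"goal={row['Goal']}; obstacle={row['Obstacle']}")
    return "\n".join(lines)

reports = collate(SAMPLE)
print(render(reports, "Alex"))
```

The same grouping logic applies whether the output is emailed, printed, or shown on a monitor; only the `render` step changes.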

Oct. 5, 2010 0.25 Met with Erin Leahy to show her how to use the Google Docs survey form.

Oct. 6, 2010 0.50 Discussed practicum ideas, nature of job, his enjoyment so far, and mentorship for practicum with Mike Gallant.

Oct. 8, 2010 0.75 Delivered a P.D. day session on Google Docs surveys for learning skills

Oct. 14, 2010 0.25 Requested permission to spend a couple prep periods at WOSS with Gallant

Oct. 19, 2010 0.25 Spoke with Deb regarding practicum ideas and job shadowing.

Oct. 19, 2010 0.25 Spoke with Tara re: practicum, plans, ideas. We did some brainstorming and discussed job of administrator

Oct. 22, 2010 1.00 P.D. at the Institute for Chartered Accountants. Delivered a session to teachers from across Ontario on the topic of using technology to improve assessment, streamline workload, and inform practice. Demonstrated the learning skills report generator and online testing to rave reviews.

02-Nov 4.00 Revised and finalized spreadsheet to make it more user friendly in hopes of sharing with many teachers. New version includes instructions.

12-Nov 0.75 Worked on practicum proposal. Laid down initial concepts of areas for mentorship and skills and knowledge sets to build in grade twelve students.

12-Nov 0.50 Researched good study strategies online from various universities.

16-Nov 0.25 Had a discussion with Tara Connor regarding practicum ideas that would be practical and feasible given timelines, deadlines, goals, and school needs. Settled on staff mentorship surrounding the Growing Success document.


16-Nov 0.25 Answered questions from Chris Chihrin (TDSB) regarding using my software to print class sets of summatives.

17-Nov 0.50 Correspondence with Scott Bennett about mentoring authentic learning tasks re: Damian Cooper's book. Correspondence with Jordan Hoffman concerning delivering P.D. at his school around technology to assess for learning and Growing Success. Correspondence with Damian Cooper about career options in mentorship and P.D. such as he is doing.

17-Nov 0.25 Revising files for Tarantino to use to generate assignments/summatives.

17-Nov 0.25 Phone call with Scott Bennett re: school visit, simulation use, and the business program start-up at West Island College in Calgary, Alberta.

17-Nov 1.00 Worked on the simulation for teachers, specifically Scott Bennett, to make it easier to use.

18-Nov 1.00 Conferencing with Monique, Tara, and Joanne.

18-Nov 1.00 Perfecting the spreadsheet and learning skills survey.

18-Nov 0.50 Dialogue with the Math Head regarding best practices with technology.

22-Nov 1.00 Worked with Kirkup. He had questions regarding formula use and spreadsheet design. Then we experimented successfully with a web-based document for building learning skills reports. Version will need to be refined, but will streamline the process.

29-Nov 0.25 Discussions with Dean Barnes and Allan Kirkup re: arranging students to test the learning skills survey system and being prepared to complete the online file in time for the December 6th staff meeting and instructing staff on how to use it.

Nov 29-30 3.00 Met with Scott Bennett from Calgary regarding authentic learning tasks and assessment. We went over Google Docs applications such as data collection, collaboration, and assignment submission, plus Excel applications, incorporating essential business skills in curriculum delivery, and authentic assessment tasks. Shared resources and pedagogical approaches as he sets up a business program in his school for the first time.

01-Dec 4.00 Email dialogue with Dean re: giving students on the learning team a chance to try the survey before the staff meeting. Spent the bulk of the time on a data investigation to quantify the results of a shift in pedagogy in 2006, when I introduced online testing to BAF 3M. Showed dramatic effects. Will integrate histograms into any presentation or seminar in the future.

02-Dec 2.00 Met with Mike Gallant to dialogue about the board improvement plan, the role of data and research, the learning skills program I developed, and how it might be used in an interview for a board-level position.

03-Dec 1.50 Met with Damian Cooper and discussed effective assessment and pedagogy. Explored why my authentic tasks, like the G7 simulation he illustrated in his book, are so valid, and probed possibilities of co-authoring his next book together. Also discussed teaching as a profession from a global perspective: strengths, weaknesses, and challenges.

03-Dec 1.50 Worked with Allan Kirkup and Kevin McConvey on the spreadsheet file to compile student learning skills reflections, and discussed delivery of the lesson to staff at the session on Dec. 6th. Sent a copy of the Excel-based file to Tara Connor and Doug Bothwell.

04-Dec 1.50 Worked on a statistical analysis of all sections of grade 11 accounting I've taught, to examine how pedagogical practice in conjunction with technology drastically improved student achievement. The results are stunning.

06-Dec 0.50 Met with Kevin and Allan at lunch to go over presentation to staff on report use

06-Dec 1.00 Delivered staff session on how to obtain the Google Docs survey and then copy and paste survey results into the spreadsheet that produces reports.

07-Dec 1.00 Techie Tuesday at lunch. Allan, Kevin, and I. Follow-up for staff from Monday's staff session. No one showed up except Jennifer Burke. Good lessons there.

09-Dec 2.50 With Kirkup's help, finally managed to perfect the survey file so it can be 100% web-based, with no downloading required for staff, which had seemed to be problematic at the staff session on Monday, Dec. 6.

10-Dec 1.00 Preparing research for conferences on A&E and response and results in BAF 3M

13-Dec 0.25 Scott Bennett called at home for some clarification. Answered questions and provided guidance.

16-Dec 4.50 Prepared raw data for a statistical analysis of longitudinal data from my teaching career

24-Dec 3.00 Built models to analyze learning skills reflections correlated with student achievement throughout the semester.
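As a rough illustration of the kind of analysis this entry describes, correlating learning skills reflections with achievement, a Pearson correlation can be computed directly. The paired values below are invented for the sketch and are not the author's data:

```python
from statistics import mean

# Hypothetical paired data: each tuple is one student's average
# self-assessed learning-skills rating (1-5) and final course mark (%).
pairs = [(3.2, 68), (4.1, 81), (2.5, 55), (4.6, 88), (3.8, 74), (2.9, 61)]

def pearson_r(pairs):
    """Pearson correlation coefficient between the two paired columns."""
    xs, ys = zip(*pairs)
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(f"r = {pearson_r(pairs):.3f}")
```

A coefficient near 1 would suggest self-assessed learning skills track achievement closely in the sample; it says nothing about causation, which is why the author's models presumably tracked the relationship throughout the semester rather than at a single point.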

26-Dec 2.00 Added a self-constructing motion chart to help teachers analyse their data as a whole.

03-Jan 1.00 Downloaded complete markbook data sets for analysis.

18-Jan 2.00 Corrected errors in the file; set up my website to distribute the file to teachers more easily.

19-Jan 2.00 Prepared "homework" for staff prior to Jan 31 P.D. day. Topic: the 21st century learner. Consisted of a web page with media and readings to set the purpose and significance.

19-Jan 2.00 Wrote instructions and coded a Flash lesson to go along with the learning skills self-assessment file to be distributed and explained on upcoming P.D. days.

01-Feb 2.00 Delivered a hands-on P.D. session centred on Google Docs applications for collaboration, student self-evaluation, and improved assessment feedback.

02-Feb 2.00 Read academic research on the use of assessment to enhance student achievement.

03-Feb 0.50 Met with mentor to discuss refining and narrowing the focus of the proposal.


05-Feb 3.00 Worked on interpreting and further developing the prototype for the online automated collection of student interests and information for tailoring lessons, which will form the basis of contextualized learning that is relevant to students' own interests.

06-Mar 4.00 Prepared a presentation on the 21st Century Learner theme for TanenbaumCHAT, a private school in Toronto. It incorporated learner-centred approaches, assessment and evaluation methods, and the importance of timely feedback, and illustrated the use of Google Docs to collect student self-assessments and learning skills.

07-Mar 5.00 Delivered the presentation and workshop at TanenbaumCHAT.

06-Apr 2.00 Prepared speech for P.D. address to teachers on April 8.

08-Apr 3.00 Delivered part of the keynote address to all math and science teachers in the Halton board. Discussed Growing Success and the 21st century learner: what it is really about, and approaches they could take. The goal was to provoke thought and engage them in the change process. It was very well received by both staff and administrators. The afternoon was for business teachers; it involved a workshop to discuss, explore, and distribute the Google Docs learning skills survey.

01-May 4.00 Prepared workshop for OBEA Conference on Growing Success and learning skills.

05-May 1.25 Presented a workshop on 50 things to do in accounting which included automated web-based assessment. Incorporated the results of the statistical analysis of my teaching career to date to justify the use of such assessment tools.

06-May 2.50 Presented two workshops at the OBEA Conference. One on how to use the learning skills student self-assessment, the second on an Economic game simulation

04-Jun 2.00 Designed a survey to collect teacher responses regarding their experiences with using the Google Docs survey

10-Jun 3.00 Met with Christina C to explain resources, teaching philosophy, and pedagogy.

11-Jun 1.00 Analysed the results of the Google Docs survey.

15-Jun 0.25 Concluding meeting with mentor to discuss results and the write-up of the summary and reflection.

16-Jun 2.00 Worked on feedback survey for students to complete regarding new teaching methodologies and the use and value of the Google Docs learning skills self-reflection survey.


APPENDIX F: PRACTICUM SUCCESS CRITERIA

The following is the author's evaluation against his own proposed success criteria. Where criteria were not fully met, explanations are provided in parentheses.

Job shadowing provides this candidate with:

½√ The maximum hours permitted of job shadowing (nine hours were completed, but the marginal benefit of additional job shadowing rendered it unnecessary),
√ Frank and practical advice for my career path,
√ Personal anecdotes of initial challenges in the role,
√ The opportunity to witness a variety of administrative tasks during different times of the day, week, and semester, and
√ A concept of mistakes made by the mentor in his first year, and lessons learned from that experience.

The practicum should result in:

√ This candidate having led a team that accomplishes the goals set out in this paper's section describing the practicum, as measured by a survey and anecdotal evidence,
√ The development of a tool for teachers' use that they report as effective, useful, and enabling them to respond to student needs,
√ The mentorship of teachers in the development of this tool in such a way that they learn new skills,
½√ The inspiration, mentorship, and training of staff to use this software in such a way that it is quickly adopted and implemented successfully. This will be measured initially by scaled responses from the team, but eventually by the number of staff using the software. This can be measured through Google Docs itself, since the file must be shared with each staff member who uses it (more had access than was anticipated, but seemingly little integration, though high intent),
√ Some success stories of usage from within the school, but also from across Canada, will be compiled, and
√ An opportunity to practice skills and competencies from the leadership framework will present itself throughout the process.