Strategic Planning, Implementation, Monitoring and Evaluation (SPIME) for Educational Managers and...

WELCOME & MABUHAY ! ! ! Mr. Virgilio G. Gundayao, MBA/MPA Exec. Dir., Graft-Free Philippines, a national project of Philippine Jaycee Senate 2004 Exec. Director, Junior Chamber International (JCI) Phils. Immediate Past Exec. Director, JC Leaders International Ex-AMO, CSC Mamamayan Muna, Hindi Mamaya Na! Program

TRANSCRIPT

  1. 1. WELCOME & MABUHAY ! ! ! Mr. Virgilio G. Gundayao, MBA/MPA Exec. Dir., Graft-Free Philippines, a national project of Philippine Jaycee Senate 2004 Exec. Director, Junior Chamber International (JCI) Phils. Immediate Past Exec. Director, JC Leaders International Ex-AMO, CSC Mamamayan Muna, Hindi Mamaya Na! Program
  2. 2. We specially want to give credit to the TEAM EFFORT of the class: for the group dynamics; for the coordination in creating an atmosphere or ambiance conducive to the art of learning by doing; for exemplifying the essence of ANDRAGOGY, the adult/alternative learning system, alongside pedagogy; and for attempting to transcend the various dimensions of the educative process.
  3. 3. I. PROLOGUE "Plan your work and work your plan." - Vince Lombardi The function of Implementation, Monitoring, and Evaluation (IME) in Strategic Planning is a pivotal management and leadership component paving the way toward organizational success. The SPIME complements and completes the SWOT analysis, environmental scanning, and action plan to produce an ideal yet workable road map, blueprint, and milestone in the attainment of organizational goals, values, and objectives. In this oral and written report, it is apparent that the SPIME is intertwined as a practical approach toward exploring the entire gamut of a CorPlan within the framework of the organizational vision, mission, goals, objectives, and values, in this case of an educational institution.
  4. 4. II. KEY WORDS "Strategic planning is worthless unless there is first a strategic vision." - John Naisbitt Strategic (tactical, militant, impact, guided/directional) Planning (preparation, preparedness, forecasting, envisioning) Implementation (execution, manning, re-tooling) Monitoring (checking, overseeing, comparing, coordinating, supervising) Evaluation (assessment, measurement, scoring, grading, classifying, analyzing, synthesizing)
  5. 5. THE STRATEGIC PLANNING PROCESS: Develop vision, mission, and values; Perform external audit; Perform internal audit; Establish long-term objectives; Generate, evaluate, and select strategies; Implement strategies (management issues); Implement strategies (marketing, finance, HR, operations, etc.); Measure and evaluate performance; Feedback. The phases: Strategy Formulation, Strategy Implementation, Strategy Evaluation.
  6. 6. Project Cycle Phases: Planning, Design, Appraisal, Implementation, Monitoring, and Evaluation.
  7. 7. Dynamic Planning Model: Scan, Plan, Implement, Monitor, Review, recurring each year. Five recurring steps: Scanning to identify trends; Planning to develop an agreed strategic direction; Implementing by allocating resources and developing structures and procedures; Monitoring progress regularly against stated goals; Reviewing at the end of each set period.
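The five recurring steps of the dynamic planning model can be sketched as a simple yearly loop. This is a minimal illustration only; the function name and structure are ours, not from the slides.

```python
# The model's five steps, in the order the slide presents them.
PLANNING_STEPS = ["Scan", "Plan", "Implement", "Monitor", "Review"]

def planning_cycle(years):
    """Yield (year, step) pairs: the five steps recur once per year,
    matching the slide's 'one year' recurring cycle."""
    for year in range(1, years + 1):
        for step in PLANNING_STEPS:
            yield year, step

# A two-year schedule: the cycle restarts with Scanning each year.
schedule = list(planning_cycle(2))
print(schedule[0])  # -> (1, 'Scan')
print(schedule[5])  # -> (2, 'Scan')
```

The point of the sketch is simply that the model is cyclical, not linear: Review feeds back into the next year's Scan.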
  8. 8. STRATEGIC PLANNING PHASES: Vision, Mission, Goals, Objectives, SWOT & PESTE*, Strategies, Action Plans. (*Political, Economic, Social and Technological Environment)
  9. 9. The tasks of Strategic Planning, in summary: Defining the business and developing a vision; SWOT and gap analysis; Setting objectives; Crafting a strategic action plan; Implementing and executing the strategy; Evaluating performance; Reviewing, adjusting, and correcting.
  10. 10. Various levels of indicators: Input indicators, Output indicators, Process indicators, Outcome indicators. The three E's: Economy, Efficiency, Effectiveness.
  11. 11. III. IMPLEMENTATION & MONITORING OF A STRATEGIC PLAN "Failing to prepare is preparing to fail." - John Wooden (See self-designed SPIME Table) Action Plan Execution; Schedule (Time Frame); Checklist; Diagram/Flowchart; Management Function; Meetings (FGD, RTD, staff, committee, commission, board); Inspection Report
  12. 12. Planning for Implementation: Set priorities. Identify resources: time, expertise, materials, funds, space, and other resources. Establish clear leadership: vision and focus. Provide professional development: training that is job-embedded, sustained, and aligned. Create lines of communication for input and feedback. Monitor and evaluate the implementation and impact of strategies.
  13. 13. Implementation - a learning process. "Everything that can go wrong, will go wrong." - Murphy's Law How will we respond? How can we learn from mistakes?
  14. 14. Matt H. Evans, [email protected] Make sure everything is linked and connected for a tight end-to-end model for driving strategic execution. OBJECTIVE: Improve Employee Satisfaction. MEASURE/TARGET: Employee Satisfaction Survey Rating; target 90% favorable overall; actual 45%; a 45-percentage-point satisfaction gap. INITIATIVE: Employee Productivity Improvement Program. ACTION PLAN: Identify issues per a company-wide survey. Sanity Check . . . Down to Specifics
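The objective-to-measure-to-target linkage on this slide can be sketched in code. This is an illustrative sketch only; the `Measure` class and its field names are our assumption, not part of the original material.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    """One scorecard measure tied to an objective, with a target
    and an actual value, as in the slide's sanity-check example."""
    name: str
    target: float  # e.g. 90 (% favorable)
    actual: float  # e.g. 45

    @property
    def gap(self) -> float:
        """Percentage-point shortfall of actual against target."""
        return self.target - self.actual

# The slide's example: a 90% target vs. a 45% actual survey rating.
satisfaction = Measure("Employee Satisfaction Survey Rating", 90, 45)
print(satisfaction.gap)  # -> 45
```

The gap is what justifies the initiative: a 45-point shortfall is the evidence that the Employee Productivity Improvement Program is needed.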
  15. 15. Thinking Chain. A treasure hunt into learning and school improvement begins with a need to know something: a theory, a question, or a hypothesis that has come to us through some natural flow of logic. With what part of the curriculum do our students struggle most? On which critical expectations are our students weakest? Which sub-group of students is most in need of improvement? Which instructional strategies lead to the most rapid rate of growth for different student populations? Data: disaggregated sub-groups. Data: aspects of ____; skills. Data: EQAO* (Literacy & Numeracy); SEF (Self/School Evaluation Form); district/school assessments. *Education Quality & Accountability Office (Ontario, Canada). Evidence at each step. Adapted from The Handbook for SMART School Teams (2002), p. 63.
  16. 16. . Effective implementation requires the following: Focus A schedule Collaboration Data (sign-in sheets, teacher surveys, evaluation forms, debriefing notes, lesson plans, classroom observations, student samples, and achievement data) Ongoing monitoring Periodic evaluation
  17. 17. IV. ISSUES IN STRATEGIC PROGRAM EVALUATION "Failing to prepare is preparing to fail." - John Wooden (See self-designed SPIME Table) Budget constraints; Data; Time; Reliability; Validity; Sensitivity; Internal evaluator(s); External evaluator(s); Complacency; Stiff competition; De-motivation (and other attitudinal/behavioral manifestations)
  18. 18. . Designing Good Evaluations Better to have an approximate answer to the right question, than an exact answer to the wrong question. Paraphrased from statistician John W. Tukey
  19. 19. . Designing Good Evaluations Better to be approximately correct than precisely wrong. Paraphrased from Bertrand Russell
  20. 20. Methodology in Grading and Evaluating Your Organization's Strategic Planning Performance. Each of the ten strategic planning and implementation tasks has been cast as a question focusing on whether or not an organization's existing processes ensure its execution. For each question, you are asked to respond with a number from 1 to 5 indicating the frequency with which your organization completes each of the required strategic planning/implementation tasks, as follows: Score 1 - Very Rarely or Never; Score 2 - Occasionally; Score 3 - About Half of the Time; Score 4 - The Majority of the Time; Score 5 - All of the Time.
  21. 21. Evaluation Questions. Using the above scoring methodology, you can now assess your overall strategic planning effectiveness by responding to the following questions: 1. Does your organization have an established strategic planning cycle linked to the fiscal year-end and budgeting process? 1 2 3 4 5 2. Does your organization undertake strategic planning in a manner that is clearly linked into the broader corporate planning process? 1 2 3 4 5 3. Does your organization operate on the basis of a professionally run planning process, supported by an external facilitator and/or a staff person dedicated to leading the planning exercise? 1 2 3 4 5 4. Does your organization undertake environmental scanning and/or opinion surveys of key audience segments to serve as strategic inputs to the strategic planning process? 1 2 3 4 5 5. Does your organization have a planning process that ensures the active involvement of functional/operational unit heads? 1 2 3 4 5
  22. 22. 6. Does your organization follow up with a formal operational planning process that translates the strategic plan into operational plan(s)? 1 2 3 4 5 7. Does your organization have an employee work-plan development process that clearly references goals set out in the strategic plan and ensures day-to-day implementation of the operational plan? 1 2 3 4 5 8. Does your organization have a quarterly reporting process whereby the Board of Directors receives updates on the organization's progress in meeting strategic goals set out in the strategic plan? 1 2 3 4 5 9. Does your organization have an employee compensation process whereby employees are evaluated and rewarded based upon achieving operational objectives in support of the strategic communication plan? 1 2 3 4 5 10. Does your organization undertake an ongoing program evaluation process whereby the impact of key strategies and tactics is rigorously assessed against defined objectives? 1 2 3 4 5
  23. 23. Calculating Your Organization's Strategic Planning Grade. For grading purposes, total your score across the ten questions. The total numeric score is then translated into a letter grade as follows: 42 or more - A; 37-41 - B; 31-36 - C; 25-30 - D; 24 or less - F. Reading Your Report Card: To help your organization interpret its grade, we provide some general observations that accompany each grade. Such comments, by necessity, deal in generalities. However, they do offer the basis for an assessment of an organization's effectiveness in using strategic planning as a salient management tool.
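The scoring and grading rules on the preceding slides reduce to a short function. A minimal sketch: `planning_grade` is an illustrative name of ours, not from the source, but the bands match the slide exactly.

```python
def planning_grade(scores):
    """Translate ten 1-5 frequency scores into a letter grade.

    Bands follow the slide: 42 or more -> A, 37-41 -> B,
    31-36 -> C, 25-30 -> D, 24 or less -> F.
    """
    if len(scores) != 10 or not all(1 <= s <= 5 for s in scores):
        raise ValueError("expected ten scores, each from 1 to 5")
    total = sum(scores)
    for floor, grade in ((42, "A"), (37, "B"), (31, "C"), (25, "D")):
        if total >= floor:
            return total, grade
    return total, "F"

# An organization answering "The Majority of the Time" (4) to every
# question totals 40, which falls in the B band.
print(planning_grade([4] * 10))  # -> (40, 'B')
```

Note that the floor of the D band (25) equals the all-3s midpoint minus five: even an organization scoring "About Half of the Time" throughout (total 30) lands only a D.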
  24. 24. Grade Comments. A: Your organization represents a best practice in its approach to strategic planning and implementation. Within your organization, strategic planning is a powerful management tool for setting priorities, defining strategies, and determining performance benchmarks. B: Your organization is committed to a regular, formalized strategic planning process that helps to set strategic priorities. C: Your organization has undertaken some strategic planning in an effort to set strategic direction. However, the lack of a disciplined process in each of the key task areas probably means that the results of the planning process do not offer the degree of priority setting they are otherwise capable of providing the organization. D: Your organization displays a very limited commitment to strategic planning. When it is carried out, it is ad hoc and is seldom translated into workable action plans that gain organization-wide commitment. F (Failed): Your organization fails to undertake even the basic elements of strategic planning. While senior management may occasionally convene a planning session, the virtual absence of follow-through renders the resulting plan useless.
  25. 25. V. CRITERIA FOR EVALUATING AN EDUCATIONAL STRATEGIC PLAN. Program Evaluation Criteria (adapted from materials provided by the United States Department of Education: Essential components of successful curricula are challenging, comprehensive, and high-quality academic programs that are accessible to all students. The selection of these programs is one of the most significant decisions educators must make. The program evaluation criteria listed below may serve as a guide to help facilitate this process.) A. Quality of Program. Criterion 1. The program's learning goals are challenging, clear, and appropriate for the intended student population. Indicator a.) explicit and clearly stated. Indicator b.) consistent with research on teaching and learning or with identified successful practices. Indicator c.) foster the development of skills, knowledge, and understandings. Indicator d.) include important concepts within the subject area. Indicator e.) can be met with appropriate hard work and persistence.
  26. 26. Criterion 2. The program's content is aligned with its learning goals, and is accurate and appropriate for the intended student population. Indicator a.) aligned with its learning goals. Indicator b.) emphasizes depth of understanding, rather than breadth of coverage. Indicator c.) reflects the nature of the field and the thinking required in the field. Indicator d.) makes connections within the subject area and between disciplines. Indicator e.) culturally and ethnically sensitive, free of bias, and reflects diverse participation & diverse student
  27. 27. Criterion 3. The program's instructional design is appropriate, engaging, and motivating for the intended student population. Indicator a.) provides students with a relevant rationale for learning this material. Indicator b.) attends to students' prior knowledge and commonly held conceptions. Indicator c.) fosters the use and application of skills, knowledge, and understandings. Indicator d.) engaging and promotes learning. Indicator e.) promotes student collaboration, discourse, and reflection. Indicator f.) promotes multiple and effective approaches to learning. Indicator g.) provides for diverse interests.
  28. 28. Criterion 4. The program's system of assessment is appropriate and designed to inform student learning and to guide teachers' instructional decisions. Indicator a.) an integral part of instruction. Indicator b.) consistent with the content, goals, and instructional design of the program. Indicator c.) encourages multiple approaches and makes use of diverse forms and methods of assessment. Indicator d.) probes students' abilities to demonstrate depth, flexibility, & application of learning. Indicator e.) provides information on students' progress and learning needs. Indicator f.) helps teachers select or modify activities to meet learning needs.
  29. 29. B. Usefulness to Others Criterion 5. The program can be successfully implemented, adopted, or adapted in multiple educational settings. Indicator a.) provides clear instructions and sufficient training materials to ensure use by those not in the original program. Indicator b.) is likely to successfully transfer to other settings. Indicator c.) specifies the conditions and resources needed for implementation. Indicator d.) program's costs (time and money) can be justified by the benefits.
  30. 30. C. Educational Significance Criterion 6. The program's learning goals reflect the vision promoted in national standards. Indicator a.) consistent with national standards. Indicator b.) The program's pedagogy and assessment are aligned with national standards. Indicator c.) The program promotes equity and equal access to knowledge, as reflected in national standards.
  31. 31. Criterion 7. The program addresses important individual and societal needs. Indicator a.) is of sufficient scope and importance to make a significant difference in student learning. Indicator b.) contributes to increases in teachers' knowledge of effective teaching and learning. Indicator c.) is designed to improve learning for a wide spectrum of students OR serves to meet the special learning needs of under-served students OR serves to meet the special learning needs of students whose interests and talents go beyond core programs of study.
  32. 32. Matt H. Evans, [email protected] Measurement Template - Down to Specifics. (Insert organization name) (Insert division name) (Insert department name) Risk Frame area objective supports (Insert objective owner) (Insert measurement owner) (Insert reporting contact info). Objective Description - description of the objective's purpose, in sufficient detail for personnel not familiar with the objective to understand its intent; typically two or three paragraphs long; appears in the pop-up window when you mouse over the objective in the Balanced Scorecard system. References - source documentation for the objective and objective description. Comments - additional information about the objective not covered in the blocks above, such as recommendations for further revision, impacts on other organizations' objectives, or recommendations for coordination/alignment with other objectives. Measure Name - the name exactly as you want it to appear in the Balanced Scorecard, including the measure number (e.g., Percent Employees Satisfied). Measure Description - description of the measure, including its intent, data source, and the organization responsible for providing measure data; appears in the pop-up window when you mouse over the measure in the Balanced Scorecard. Measure Formula - formula used to calculate the measure value (if any). Data Source - the source of the data: manual, data spreadsheet, or database name, plus a contact familiar with the data. Measure Weight - the relative weight of the measure based on its impact on the overall objective; the weights for all measures under an objective must total 100. Measure Reporter - person responsible for providing measure data; include name, organization, and email. Target Maximum - maximum expected value for the measure. Effective Date - date the target first becomes effective. Frequency - how often target data will be reported. Units - units of measure. Target - point where the measure goes from green to amber. Target Minimum - point where the measure goes from amber to red; the target minimum and target cannot be the same value. Scorecard Perspective Name
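The template's traffic-light thresholds (green/amber/red) and 100-point measure weights can be sketched as follows. The function names are illustrative assumptions of ours; the threshold and weight rules come from the template above.

```python
def measure_status(value, target, target_min):
    """Traffic-light status per the template: at or above target is
    green, between target minimum and target is amber, below is red.
    The template requires target and target minimum to differ."""
    if target == target_min:
        raise ValueError("target and target minimum cannot be equal")
    if value >= target:
        return "green"
    if value >= target_min:
        return "amber"
    return "red"

def objective_score(measures):
    """Weighted roll-up of measures into one objective score.
    measures: list of (value, weight) pairs; per the template, the
    weights for all measures under an objective must total 100."""
    if sum(w for _, w in measures) != 100:
        raise ValueError("measure weights must add to 100")
    return sum(v * w for v, w in measures) / 100

# A 45% satisfaction rating against a 90% target (assumed amber
# floor of 60) shows red; a weighted roll-up of two measures:
print(measure_status(45, 90, 60))             # -> red
print(objective_score([(45, 60), (80, 40)]))  # -> 59.0
```

The amber floor of 60 in the example is an arbitrary illustration; in practice each measure owner sets Target and Target Minimum per the template's fields.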
  33. 33. Simplified Criteria for Evaluating an Educational Plan: Specific, Measurable, Attainable, Reliable, Time-bound. The 3 E's. Quality, Quantity, Timeliness.
  34. 34. SPIME Preparedness Table (self-designed). [Matrix summarizing SPIME preparedness: rows keyed by Who, What, Why, How, Where, When, and Which; columns covering SP resources and assessment/measurement before, during, and after (quantity, quality, timeliness); the management functions (Planning, Organizing, Directing, Coordinating, Controlling); the 3 E's (Efficient, Effective, Economical); tools and techniques such as SWOT, PESTE, MBO, MBE, MBR, MBWA, TQM, FGD, RTD, zero-based budgeting, and checklists; the resources of Men, Money, Machines, Materials, Method, Moments, and Markets; and G3 (Goodness, Genuineness, Greatness).]
  35. 35. What do we evaluate? Evaluation is concerned with results, focusing on: Effectiveness - achievement of results; Relevance - the programme continues to meet needs; Sustainability - results are sustained after withdrawal of external support; Unanticipated results - significant effects of performance; Causality - factors affecting performance; Validity of design - logical and coherent; Efficiency - results vs. costs; Alternative strategies - other possible ways of addressing the problem.
  36. 36. Why do we evaluate? To improve the design and performance of an ongoing project/programme. To make judgments about the effectiveness of a project/programme. To generate knowledge about best practices and lessons learned.
  37. 37. Strategy: Creating shareholder value through effective strategy is as easy as A-B-C: (A) where you Are; (B) where you want to Be; (C) the Course to follow (how to get there). Monitor and evaluate (strategic control) via resource plans, financial plans, and infrastructure plans. Strategy Institute, after Bryson & Alston. Not the only model, but the easiest to understand!
  38. 38. . Summarize and Interpret Progress Monitoring Data G3
  39. 39. Monitoring and Evaluation. Monitoring and evaluation are complementary functions. Each provides a different type of performance information. Both are important for effective Results-Based Management (RBM).
  40. 40. Monitoring versus Evaluation. Monitoring: continuous; tracks progress; answers what activities were implemented and what results were achieved; self-assessment by project management; alerts managers to problems. Evaluation: periodic; in-depth analysis of actual vs. planned achievements; answers how and why results were achieved, and future impact; internal and/or external exercise; gives managers strategy and policy options.
  41. 41. Complementary Roles of Results-Based Monitoring and Evaluation. Monitoring: clarifies program objectives; links activities and their resources to objectives; translates objectives into performance indicators and sets targets; routinely collects data on these indicators and compares actual results with targets; reports progress to managers and alerts them to problems. Evaluation: analyzes why intended results were or were not achieved; assesses specific causal contributions of activities to results; examines the implementation process; explores unintended results; provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement.
  42. 42. The perfect marriage? SWOT, e-Scan, VMGOV (vision, mission, goals, objectives, values), Action Plan, SPIME. Easy as ABC: where Are you; where do you want to Be; the Course to follow. The four-functions model: Planning, Organising, Leading, Controlling. Henceforth you shall be known as an excellent Strategic Plan.
  43. 43. VI. EPILOGUE "We shall just have been run over by the New Man in the wagon of his Plan." - Boris Pasternak The SPIME, as presented, attempts to illustrate and elaborate the IME as a practical tool and device of managerial leadership. The SPIME, just like its counterparts (action planning, SWOT, environmental scanning, mission, vision, goals, values, and objectives), can be utilized in a project, program, or task force activity, whether on a short-, medium-, or long-range schedule. The SPIME is likewise an attempt to produce a preparedness table as an easily accessible quick guide: anytime, anywhere, pragmatically and practically.
  44. 44. VII. REFERENCES Franco, Ernesto A., et al., Project Management for Social and Economic Development, Anvil Publishing, 1997. http://www.unfpa.org/monitoring/toolkit/5communi.pdf http://planipolis.iiep.unesco.org/upload/Philippines/Philippines_EFA_MDA.pdf http://net.educause.edu/ir/library/pdf/eqm0213.pdf http://www.ride.ri.gov/instruction/curriculum/rhodeisland/resources/evaluation.htm http://www.leverus.com/associationresourcecenter/strategic.pdf http://www.konsult.leeds.ac.uk/public/level1/sec15/index.htm
  45. 45. Sam Walter Foss: "Bring me men to match my mountains; bring me men to match my plains: men with empires in their purpose and new eras in their brains." Once again, my WARMEST WELCOME TO ALL!