

Page 1

Drill and practice. And then some more. We all know that’s what it usually takes to learn tricky, complex job task performance. And more application and feedback back on the job to really master it. Info—Demo—Appo (Application) is the PACT Process for T&D design pattern of Instructional Activities in our MCD—Modular Curriculum Development/Acquisition methods. The I-D-As are always sandwiched in between the important Open and Close chunks of Lesson content. I-D-A...Tell ‘em, show ‘em, and then make them show you back that they got it...in an application of some sort. Or “Appo,” as it was abbreviated during an early project using the formal MCD methods and templates to bring a group of master performers and other subject-matter experts to consensus regarding the design of communications, education, and/or training. “Appo” was a Guyism, as my staff used to say. That Appo could be a practice exercise or a quiz/test. It could be paper and pencil, CBT, an e-learning test, or the completion of some desk procedure; or it could be a hands-on case study versus the sit-back-and-postulate kind. It could be a playing out of a complex case in a structured, collaborative team effort...and the important thing, regardless of the vehicle used for the “Appo,” is its performance relevance. We ensure performance relevance in the PACT Processes for T&D via use of the Performance Model. It’s all about the “O” - as in business process “output” - and the measures/standards and tasks/roles related to that output. And once we have that pinned down, we can begin to envision the design of the Appo. Each T&D Event and sub-Event “Lesson” starts with an Open in-

(Continued on page 2)

My Point Is

Get Real! Your Application Fidelity Is Critical!

Pursuing Performance Volume 4 Issue 4 Fall 2005

with EPPIC Inc.

A Quarterly-Seasonal Newsletter

Issue Content

• My Point — Get Real! Your Application Fidelity Is Critical!

— Guy W. Wallace, CPT

• lean-ISD Overview

— Guy W. Wallace, CPT

• Simulation Exercises and Action Learning

— Peter R. Hybert, CPT

• CRM Metrics That Really Work - part 2 of 2

— Mark Graham Brown

• Empowerment - part 2 of 2

— Guy W. Wallace, CPT

EPPIC Inc. Achieve Peak Performance to protect and improve the enterprise

Pursuing Performance—The EPPIC Newsletter Fall 2005 ©2005 EPPIC Inc. All rights reserved. www.eppic.biz

This and over 7 years of EPPIC and CADDI newsletters are available as single-page PDF files on the EPPIC web site under the Resources tab at: www.eppic.biz

[Figure: Lesson Map of Instructional Activities (Information, Demonstration, Application)]
TMC Stores - The Most Convenient Stores
Lesson 4: Selection Interviewing Basics. Est. Length: 85 Min.
Lesson Objectives: Upon completion of the lesson, the learner will be able to apply basic interviewing principles to conduct a job selection interview.
LESSON ACTIVITIES (with estimated minutes):
1. Lesson open (3)
2. Laws/regulations pertaining to selection interviews (15)
3. TMC policies relating to selection interviews (15)
4. Selection interviewing steps (10)
5. Interviewing do’s and don'ts (10)
6. Practice interview techniques - Simulation Exercise (30)
7. Lesson close (2)
©2002 EPPIC, Inc.
“Gopher” more at eppic.biz
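As a quick sanity check, the activity timings on the lesson map sum to its estimated length. Here is a minimal sketch; note that the pairing of minutes to specific activities is my reading of the figure, not stated explicitly in the source:

```python
# Activity durations in minutes, as read from the TMC Stores
# "Selection Interviewing Basics" lesson map. The assignment of
# each number to an activity is inferred from the figure.
activities = {
    "Lesson open": 3,
    "Laws/regulations pertaining to selection interviews": 15,
    "TMC policies relating to selection interviews": 15,
    "Selection interviewing steps": 10,
    "Interviewing do's and don'ts": 10,
    "Practice interview techniques (Simulation Exercise)": 30,
    "Lesson close": 2,
}

total_minutes = sum(activities.values())
print(total_minutes)  # 85, matching "Est. Length: 85 Min."
```

Whatever the exact per-activity split, the single largest block of time goes to the Appo (the Simulation Exercise), which is the point of the article.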


My Point Is

Get Real! Your Application Fidelity Is Critical!

cluding an advanced organizer we call “performance context,” which precedes learning objectives and an overview of the lesson map, among other administrative content. Then the deployment of Info-Demo-Appo begins as necessary. Note: not all Lessons have Appos or Demos. As always...it depends. My personal favorite type of Appo is the Simulation Exercise—SimEx. I’ve been building these since the late 1970s. One of my all-time favorites was for Product Managers, intended to teach them how to manage a product family through the life cycle using a cross-functional team. That project was overviewed in the Summer 2004 issue of Pursuing Performance. One of the benefits of using the Performance Model to guide SimEx design was the “performance gap” information captured on the Performance Model for each output. Those gaps were the monkey wrenches we designers threw into the learning process to build fidelity. We did get complaints about our SimExs, but not about their relevance. Most were wishes for a return to passive learning rather than more hard work. Another benefit of the Performance Model in MCD design, for any part of the Info-Demo-Appo pattern, was that most decisions were not as arbitrary as they are in many ISD methods. Without an anchor or reference point such as a Performance Model and the K/S Matrix, many design decisions are left to a battle of biases and personal preferences. And then page/screen-turning review sessions typically introduce all sorts of nonsense that has no relevance to, and no potential impact on, the performance situation. Because the Performance Model captures the view of performance as articulated by a group of Master Performers, it provides your first step toward instructional fidelity. You now know what the outputs are and the key measures for each. You know what tasks are performed and the roles involved, with their specific responsibility in that task performance.
And you’ll know the typical performance deficiencies and their probable causes. I hope you see that as meaningful, relevant data for impacting “Appos” as well as overall instructional fidelity. And I hope you find this issue of Pursuing Performance relevant for your needs! Cheers!
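The Open / Info-Demo-Appo / Close pattern described above can be sketched as a simple data structure. This is a minimal illustration only; the class and field names are mine, not part of the formal PACT/MCD templates:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class IDAUnit:
    """One Info-Demo-Appo cycle. Demo and Appo are optional:
    as the article notes, not all Lessons have Appos or Demos."""
    info: str                   # Tell 'em
    demo: Optional[str] = None  # Show 'em
    appo: Optional[str] = None  # Make them show you back

@dataclass
class Lesson:
    """A Lesson: an Open, one or more I-D-A units, then a Close."""
    title: str
    open_items: List[str]  # e.g., performance context, objectives, lesson map
    units: List[IDAUnit] = field(default_factory=list)
    close: str = "Lesson close"

# Hypothetical example loosely based on the TMC lesson map in this issue
lesson = Lesson(
    title="Selection Interviewing Basics",
    open_items=["Performance context", "Lesson objectives", "Lesson map overview"],
    units=[
        IDAUnit(
            info="Selection interviewing steps",
            demo="Facilitator models an interview",
            appo="Simulation Exercise: practice interview techniques",
        )
    ],
)
```

The structure makes the design rule visible: the Open and Close are always present, while the Demo and Appo slots of any given unit may be empty.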

(Continued from page 1)

Page 2 Volume 4 Issue 4


...the “performance gap” information captured on the Performance Model for each output... were the monkey wrenches we designers threw into the learning process to build fidelity.

EPPIC’s Guy W. Wallace, CPT

[Figure: PACT Lesson Map of Instructional Activities for the TMC Stores “Interviewing” lesson (85 Min.), repeated as thumbnail graphics. ©1999 CADDI, Inc. Reproduction without written authorization is strictly prohibited.]

Event Map ...of Lessons

PACT Event & Lesson Maps

Lesson Maps… of Instructional Activities

Also see Pete Hybert’s article on Simulation Exercises and Action Learning beginning on page 9.


Note: Formatted for a booklet style


Pete R. Hybert, CPT

The success of any project depends on all the primary stakeholders participating in the process.

www.prhconsulting.com

Pursuing Performance is a free quarterly newsletter, published seasonally, from EPPIC and Guy W. Wallace, on paper and on the EPPIC web site as a PDF file under the tabs Resources/Newsletters at www.eppic.biz

Pursuing Performance is for the leaders, managers, and individual contributors of those business functions, systems, and processes that ensure that the right human knowledge, skills, and attributes are available in a timely, efficient, and effective manner in support of the enterprise processes. For you we offer our insights and many examples of our integrated concepts, models, methods, tools, and techniques regarding our PACT Processes for T&D, our T&D Systems View, and our Enterprise Process Performance Improvement methods. Use! Enjoy!

Pursuing Performance—EPPIC’s Quarterly Newsletter

EPPIC Inc. Achieve Peak Performance to protect and improve the enterprise

EPPIC Inc.

20417 Harborgate Court #510 Cornelius NC 28031

office: 704-895-6364 mobile: 704-746-5126

email: [email protected] web: www.eppic.biz

EPPIC helps you orient your enterprise T&D / Learning / Knowledge Management Systems to the Performance Requirements of the Enterprise Processes.

Judy Hale, PhD, CPT

...how to evaluate the learning and performance improvement function presumed it operated as a business; however, in reality, it...exists at the whim or grace of the organization.

www.haleassociates.com


Page 4 Volume 4 Issue 4


The book, the method, the applications experience

lean-ISD Overview By Guy W. Wallace, CPT

Open & Introduction
I’ve been asked many times why I wrote the lean-ISD book with so much detail, giving away my ISD methodologies developed and tested over more than 20 years. After 25 years of being rewarded by kind, complimentary comments regarding my and my associates’ tendency for sharing, and not teasing/selling, in conference presentations and articles...it would be hard to write something that didn’t really help someone do something. I’ve spent too much time developing training that really helped people perform their job tasks and produce what Tom Gilbert called “worthy outputs” to stop now.

What is “lean”?
The concept of lean is borrowed from the landmark M.I.T. benchmarking study of the global automotive industry and the best methods of production: The Machine That Changed the World, by James P. Womack, Daniel T. Jones, and Daniel Roos. Lean production is a compilation of the best of both mass and craft production techniques: the best of both worlds. Production produces products, and T&D is just another product.

Why lean-ISD?
The ultimate goal of training & development (T&D) is improved performance by the learners as measured by business metrics. With T&D, it’s not really important that someone learned something, even to a very high proficiency, if what was learned had a minuscule impact on the business given its costs. The PACT Processes for T&D are a lean approach to ISD: instructional systems design and/or development. ISD can be lean.

What are the PACT Processes for T&D?
PACT is an acronym. EPPIC’s five proprietary components of the PACT Processes for T&D link together to create a powerful, lean-ISD methodology. The PACT Processes for T&D include CAD, MCD/IAD, and a common Analysis method, supported by common Project Planning/Management tools and templates.

(Continued on page 5)

In 1999 I published my second book: lean-ISD. Dr. Geary A. Rummler created the cover artwork. The book was my attempt to create and expand on a concept of a Rummler-methodology-based ISD approach envisioned at Motorola’s Training & Education Center—MTEC—the forerunner of Motorola University while I was employed there in 1981-2. In late 1982 the PACT Processes began with the first of my 74 CAD projects, followed immediately by the first post-CAD MCD effort.

PACT: Performance-based, Accelerated, Customer-/Stakeholder-driven Training & Development (SM)
©2002 EPPIC, Inc.



What is PACT’s Value Proposition?
PACT is a very quick, thorough, collaborative, and enterprise-process-performance-oriented set of ISD methodologies that are “deployment method/media” neutral, and that approach instructional design as a product manager/designer might...concerned with improving ROI by improving performance impact and/or reducing ISD life cycle costs through design strategies.

How is PACT like most other ISD methods?
The PACT Process of MCD leads to the development of instruction, whether for awareness-creating communications, knowledge- and enabling-knowledge/skill-creating education, or performance-capability-creating training.

How is this unlike most other ISD methods?
Very few ISD methodologies have a systems engineering component, such as CAD, that leads into the ADDIE-type new-product development approaches. PACT is a more businesslike approach. It is predictable in terms of cost, schedule, and quality of output. It is a more rational, smooth, fast product-development approach with customer/stakeholder involvement: a lean approach to ISD that creates high-quality, performance-based T&D in a reduced cycle time and at reduced costs, both first costs and overall life cycle costs.

CAD—Curriculum Architecture Design
CAD is the macro-level, performance-based, lean-ISD methodology at the systems engineering design level. A CAD project systematically designs the macro T&D product line, using a modular approach that increases performance impact and content shareability. A CAD

• Configures content to increase shareability across many potential target audiences

(Continued from page 4)

(Continued on page 6)

Page 5 Volume 4 Issue 4


PACT Analysis

Project Planning & Management

MCD Modular Curriculum Development

IAD Instructional Activity Development

CAD Curriculum Architecture Design

The PACT Processes for T&D

©2002 EPPIC Inc.

In PACT our design philosophies address “Designing for the ISD Product Life Cycle” by focusing on the life cycle costs and returns from these 7 areas:
1. Performance Impact
2. Modular Reuse Design Approach
3. Development
4. Administration
5. Inventory
6. Deployment
7. Maintenance


The book, the method, the applications experience

lean-ISD Overview By Guy W. Wallace, CPT

Page 6 Volume 4 Issue 4


• Reduces overall first costs for development • Reduces life cycle costs for deployment and maintenance

The front-end Analysis data forces a performance orientation into T&D Events and Modules. CAD projects identify all the T&D that could be and prioritizes the T&D that should be, for resourcing MCD project for the T&D that will be. MCD—Modular Curriculum Development/Acquisition MCD is our ISD model, much like the ADDIE model from the work of Robert

Gagne, Leslie Briggs, Robert Moran, and Robert Branson. MCD is the most traditional ISD process within the PACT Processes for T&D. It is the process that develops or acquires T&D one course/instructional event at a time (or in small bundles or large concurrent efforts). This lean-ISD methodology provides a structured, gated, in-control process for MCD, with method-ology that provides for the quick analysis, design, develop-ment, pilot-testing, and revision/release of the T&D Modules and T&D Events of the CAD. MCD uses a gated, predefined project management process that is designed to incorporate representatives from key customer and stakeholder groups into the overall T&D project’s activities and tasks for business decision-making. MCD projects can be conducted with or with-out a prior CAD effort. lean-ISD Overview Summary lean-ISD is not your father’s ISD. lean-ISD as practiced via the PACT Processes is performance based and quick. It uses data and the wisdom of your master performers and business lead-ers in an efficient, predictable process. It has been proven by this author alone in 74 CAD and 50+ MCD projects to-date. Many others, over 150 have been trained/certified in the PACT Processes (known as MC/MI at General Motors in the mid 1990s). Several have developed award winning instruction using these methods. The book, lean-ISD, cover the PACT Processes extensively. EP-PIC also has performance-oriented workshops to train and cer-tify PACT Practitioners in the key roles of: PACT Performance Analyst, PACT CAD Designer, PACT MCD Designer, PACT Lead Developer, and the PACT Project Planner/Manager.


(Continued from page 5)

The ADDIE Model of ISD: Analysis, Design, Development, Implementation, Evaluation


What reviewers have said about:

lean-ISD

The lean-ISD book is available at Amazon.com

for $50.00 plus s&h

lean-ISD describes in great detail the

PACT Processes for T&D

Recipient of

ISPI’s 2002 Award of Excellence

for Instructional Communication

Miki Lane, senior partner at MVM The Communications Group says,

“lean-ISD takes all of the theory, books, courses, and pseudo job aids that are currently on the market about Instructional Systems Design and blows them out of the water. Previous ‘systems’ approach books showed a lot of big boxes and diagrams, which were supposed to help the reader become proficient in the design process. Here is a book that actually includes all of the information that fell through the cracks of other ISD training materials and shows you the way to actually get from one step to another. Guy adds all of the caveats and tips he has learned in more than 20 years of ISD practice and sprinkles them as job aids and stories throughout the book. However, the most critical part of the book for me was that Guy included the project and people management elements of ISD in the book. Too often, ISD models and materials forget that we are working with real people in getting the work done. This book helps explain and illustrate best practices in ensuring success in ISD projects.”


For Your Tool/Technique Kit Simulation Exercises and Action Learning

By Peter R. Hybert, CPT

Page 8 Volume 4 Issue 4


To a participant, a group-paced simulation is simple…once they have gone through it. But explaining simulations beforehand to steering teams, clients, participants, and even facilitators is anything but simple. We are occasionally asked if “action learning” is the same as simulation. It is easy to see why people can get these approaches mixed up. An effective simulation shares a number of characteristics with the action learning methodology. Both clearly rely on participants doing things and then learning from that experience. But there are also some key distinctions, which need to be clarified early in a project to ensure that stakeholders’ expectations are met and that the project does what it is supposed to do.

What is Action Learning?
Action learning originated as a specific approach advocated in the late ’70s and early ’80s by such authors as Reg Revans, David Casey, Alan Lawlor, and others. It is based on Chris Argyris’ double-loop feedback model (described in Argyris’ article in the July-August 1994 issue of the Harvard Business Review). Simplified, the basic concept was as follows:

• Assemble a group of executives from various parts of the organization with a facilitator (often called a “set advisor”).
• This group would select a significant, complex problem to work on over an extended period of time (e.g., 1-2 yrs.).
• The group would meet at regular intervals to review and reflect on the approach for solving the problem and to discuss progress.
• The facilitator would debrief (and intervene in) each group meeting to point out communication issues, draw out the learning, and develop action plans.
• The work, meet, debrief, action-planning cycle would continue for the duration of the action learning process.

The action learning process, then, depends on the common experience of working on the problem with others in the same context. Over time, relation-

(Continued on page 9)

Pete Hybert has been in the training and performance improvement field since 1984 and has worked as a consultant since 1989. His national clients include many Fortune 500 firms, as well as small- and mid-sized organizations. Pete lives and works in the western Chicago suburbs. Guy and Pete were colleagues at SWI and business partners at CADDI.


ships and new ideas develop and evolve, so that progress on the problem is dependent on the learning that takes place.

You Say “To-may-to,” I Say “To-mah-to”
Like action learning, simulations rely on a skilled facilitator to coach the group, intervening when needed, and to conduct extensive debriefings both during and after the simulation to distill the “lessons learned” and identify how they would transfer back to the work environment. These lessons learned should address the process, the interaction, the output, and the interpretation/analysis of the results. Who is the customer? Who is expected to “salute and follow orders,” and how does this impact team relationships? Would this product idea work? What are the biggest risks? Was the strategy being discussed really the right way to go? What would you do differently if the product were a commodity? What information is missing or was a “guesstimate” in the product plan or business case, and how would you firm it up going forward? New insights can be reinforced in future rounds.

A major difference between action learning and simulations is that the real focus of action learning is on the work output. In other words, an action learning team may be working on developing a vision for an organization. In the process, they might go through significant learning about the customers, the market, the technology, employee opinions, and even their own mental paradigms. However, the goal is not a general competence in “vision development” but a specific vision for that organization for that situation. They would be attending to content more than process, concepts, or skills. Simulations, on the other hand, focus on process, so that participants can apply that process to new and unique situations they will encounter later. The simulation forms the basis for a shared experience, and learning comes from trying something, getting feedback, and then trying something again (usually something slightly different or more difficult).

A participant in a labor relations simulation will learn about content, such as union agreement terms, legal precedent, organizational roles, and, yes, even their own mental paradigms; but a simulated labor negotiation does not result in real changes in work rules. Another key difference between an effective simulation and action learning is that the simulation deals with more than one problem. (And, because it is a designed learning experience, the group does not select their own problem.) Any individual work task or problem only presents a subset of the total difficulties that a learner will encounter on the job. Training a pilot to land a plane

(Continued from page 8)

(Continued on page 10)

Page 9 Volume 4 Issue 4


A major difference between action learning and simulations is that the real focus of action learning is on the work output...However, the goal is not a general competence in “vision development” but a specific vision for that organization for that situation. They would be attending to content more than process, concepts, or skills.


For Your Tool/Technique Kit Simulation Exercises and Action Learning

in clear skies, fog, rain, snow, wind, etc. is better than training him/her to take off, fly, and land once, if your goal is to teach the process of piloting an airplane. One of the strengths of the simulation design is the ability to “zoom in” on the most critical and/or important parts of the job. And it can expose participants to a wide range of performance conditions.

Principles to Adopt and Adapt
David Casey cited several key principles or assumptions behind action learning that are paraphrased below:

Principle 1: People like to learn from watching experienced practitioners work.
Principle 2: We learn more when we are motivated to achieve something.
Principle 3: Learning about oneself is threatening and may be resisted unless there is a supportive environment.
Principle 4: People only learn when they do something.
Principle 5: Learning is deepest when it involves the whole person.
Principle 6: Only the learner knows what he or she has learned.

An effective simulation heavily leverages these principles. Simulations are based on the assumption that there is no such thing as passive learning. Regardless of who is learning (i.e., execs, managers, technical professionals, clerical staff, production personnel, etc.), new knowledge and skills need to be processed, tested, and used to be learned. Every simulation participant plays an active role—if participants play an observer role at one point, they should be participating more actively at another. We create multiple rounds so that participants get several attempts, which reduces the threat of doing the performance (e.g., conducting a team meeting) and receiving feedback. Participants learn from watching each other and from the “war stories” that come out during the debriefs.


“Guy Wallace has written an appropriate follow-up to his lean-ISD [book]. The breadth and depth of his latest book, T&D Systems View, is very impressive. He uses the analogy of a clockface to thoroughly explain his 12-system process. The procedure in the book allows you to assess any training and development operation from a systems perspective. It is easy to read and follow thanks to its consistent structure and format from chapter to chapter. An excellent overview of the process is included, along with helpful checklists.”

—James D. Russell Professor of Educational Technology, Purdue University

Visiting Professor of Instructional Systems, Florida State University

What others have said about:

T&D Systems View

“Guy Wallace has done it again! After demystifying the ISD process in his lean-ISDSM book, he tackles the corporate training and development system and puts it in a business-focused perspective. Whether you are in-house or serving as an external consultant, you will find Guy’s model an invaluable tool for enterprise training and development. This analytic and design process ensures that you dot all the i’s and cross all the t’s when moving your company or client to Learning by Design, not Learning by Chance. The elegant clockface model helps you develop a clear picture of any organization and clearly helps you map out how best to effectively manage all the elements of the enterprise. Once the elements are mapped out, the model, through enclosed assessment and prioritizing tools, helps determine where and when to put corporate assets to maximize corporate return on investment. This is a must-have book for any consultant or organization that is concerned about improving the performance of their organization through improving processes and competencies.”

—Miki Lane Senior Partner

MVM Communications

This book is available from Amazon.com and ISPI.org

for $25.00 plus s&h

...and a 10% discount for ISPI members at ISPI.org


If you’d like to be more “process oriented” in operating and improving your T&D function, then this book can provide you the guidance to create your own “systems view” of your T&D/Learning/Knowledge Management operations!

- Guy W. Wallace


The PACT Processes for T&D/Learning/Knowledge Management are data-driven, not opinion-driven. Data and decisions for the PACT Processes come from handpicked teams in a predictable process: predictable in terms of output content quality, and process cost and schedule. Since 1982 the PACT Processes for CAD and MCD/IAD have proven themselves in hundreds of projects with dozens and dozens of PACT Practitioners. Here is the “logic scheme” of the PACT data that is collected and/or created in both upfront CAD efforts and downstream MCD/IAD efforts. Central to the entire approach are the “performance views” captured in the Performance Models, which orient all other analysis and design data to that specific, targeted performance. A focus on performance capability is maintained throughout all PACT Processes.


Another in a series of EPPI Models

The PACT Processes for T&D “Data Logic” © 2002 EPPIC Inc.

PACT—Performance-based Accelerated Customer/Stakeholder-driven T&D EPPI — Enterprise Process Performance Improvement


Curriculum Architecture Design: systems engineering of the performance-based T&D Product Line. Performance Analysis: Output and Task Requirements and Enabling K/S.

Performance Models: PMs capture the “ideal performance requirements” regarding outputs and tasks of the target audiences, as well as a “gap analysis” of the current state, per a facilitated group of “Master Performers” and SMEs.

Knowledge/Skill Matrices: K/S Matrices capture the “enabling knowledge/skill requirements” of the target audience, as derived systematically from the Performance Model, per a facilitated group of “Master Performers” and SMEs.

Existing T&D Assessments: ETAs capture the assessment of all relevant, existing T&D in terms of how effectively it deploys the “enabling K/S” and develops “task performance capability.”

T&D Paths: T&D Paths present a suggested sequence of T&D, either as “lock-step” or as “flexible menus,” as appropriate to the target audiences’ real-world needs.

Event Specifications: Event Specs capture the first cut of “gap” T&D products needed to address the target audiences’ T&D needs (not covered adequately in existing T&D); they are used to prioritize all curriculum gaps for development.

Module Specifications: Module Specs are the first cut at the modular content of the T&D Events needed to address the target audiences’ T&D needs; they are also used to prioritize all curriculum gaps for development.

[Form template: Performance Model. For each Area of Performance, the model captures Key Outputs and their Measures, Key Tasks and Roles/Responsibilities (coded 1-4), Typical Performance Gaps, and Probable Gap Cause(s), coded as dE (deficiency of Environment), dK (deficiency of Knowledge), and dI (deficiency of Individual attribute/value).]

[Form template: Knowledge/Skill Matrix. Each K/S Item, grouped by Knowledge/Skill Category, is linked to one or more Areas of Performance (coded A-G), assigned a Depth of Coverage (A = Awareness, K = Knowledge, S = Skill), flagged Select or Train (S/T), and rated High/Medium/Low for Criticality, Volatility, and Difficulty.]

[Form template: Existing T&D Assessment. The two-sided form records the course number, title, provider, contact and phone, length (hours/pages), and depth/level (Awareness, Knowledge, or Skill); a Course Fit Assessment against related process(es), Area(s) of Performance, or Tasks and their enabling Knowledge/Skill Items; a disposition (Include as Is, Use as Source, or Do not use); special requirements for delivery (equipment, facility, instructor, license requirement); class size (< 10, 10-20, > 20, or other); current target audience; primary delivery method (GP group-paced classroom or lab; SP self-paced readings/exercises, web site pages, CBT, or videotape; 1-1 one-on-one; structured or unstructured OJT; other); schedule/frequency; a course description and notes; and any attached materials.]

[Form template: T&D Event Specification Sheet (Curriculum Architecture Design). For each Event (number and title), the sheet lists the Modules composing the Event (not sequenced), with each module’s number, title, estimated length, and availability; the Event’s availability status (Fully/Partially/Not Available); estimated length (± 25%, in hours, pages, or other); volatility (High/Medium/Low); depth of coverage (Awareness, Knowledge, or Skill); learning environment (group-paced, self-paced, or coached); predominant delivery strategy; implementation priority (High/Medium/Low); the T&D Paths (1-10) on which it appears; make/buy disposition (make; buy and use as is; buy and modify); and deployment platform(s): S-OJT with coach or certified coach, group-paced instructor-led classroom or lab, self-paced readings, videotape, audiotape, distance learning, CBT, performance aid, or other. CAD Design by CADDI, Naperville, IL.]

[Form template: T&D Module Specification Sheet (Curriculum Architecture Design). For each Module (number and title), the sheet captures a preliminary content listing (not all-inclusive); availability status (Fully/Partially/Not Available); estimated length (± 25%, in hours, pages, or other); volatility (High/Medium/Low); the T&D Paths (1-10) on which it appears; depth of coverage (Awareness, Knowledge, or Skill); learning environment (group-paced, self-paced, or coached); deployment platform(s); predominant delivery strategy; implementation priority (High/Medium/Low); make/buy disposition; and notes. CAD Design by CADDI, Naperville, IL.]


Modular Curriculum Development: New Product Development for performance-based T&D Products

[Diagram: The PACT Processes for T&D/Learning/Knowledge Management. The four CAD phases: Phase 1, Project Planning & Kick-off; Phase 2, Analysis; Phase 3, Design; Phase 4, Implementation Planning. The six MCD phases: Project Planning & Kick-off; Analysis; Design; Development/Acquisition; Pilot Test; and Revision & Release. ©2002 EPPIC, Inc.]

Change Management also becomes easier to predict and conduct when tied to the Performance Models, which could be maintained by the community of master performers. That community — which we often label a Curriculum Advisory Council, depending on their total scope of responsibility — could be in charge of the processes and/or the recruiting process, as well as any training. The community could be formalized and put in place to oversee and coordinate with the various Owner Organizations for one or many of the other Human Asset Management Systems (HAMS) and/or one or more of the Environmental Asset Management Systems (EAMS)...of our extensive, comprehensive EPPI models and methods.

►►►►►

Of Defs and Specs

Event Inventory: The Event Inventory stores all T&D Event Definitions (the ETA: the existing Enterprise T&D Product Line) and the gap T&D’s Event Specs.

Event Definitions: Event Definitions capture the final definition of the T&D modular product actually built, and identify the lessons within each.

Module Inventory: The Module Inventory stores the inventory of Module Specs (not built) and Module Definitions (available Modules) to better facilitate reuse.

Lesson Inventory: The Lesson Inventory stores the inventory of all Lessons and their Instructional Activities, to better facilitate reuse of the Lesson or of its component Instructional Activities, and to provide a central repository for administration and maintenance.

Lesson Maps: Lesson Maps specify the sequence of Instructional Activities (the objects) of each Lesson within a T&D Event, and identify their orientation (information, demonstration, or application) and their level (awareness, knowledge, or skill).

Activity Specifications: Activity Specs specify the substructure of content and flow for each Instructional Activity, and define for the developers the deliverables for any acquisition/development effort.

Updated T&D Path: The updated T&D Path presents a post-MCD update to the suggested sequence of T&D, based on additional, refined information learned during MCD design.

[Form template: Lesson Map of Activities (©2001 CADDI, Inc.; from the Verizon Output Formats and Methodology). For each Lesson (number, title, estimated length), the map states the Lesson Objectives (“Upon completion of the lesson, the learner will be able to...”) and the sequence of Lesson Activities, with each Activity flagged as Information, Demonstration, or Application.]

[Form template: Instructional Activity Specification (©2000 CADDI, Inc.; from a Consumer Sales Associate Curriculum Redesign Project). For each Lesson/Activity (number, title, learning objectives, estimated time, module number, description, estimated roll-up length), the spec captures the Activity Type (Info, Demo, or “Appo”), the sequence of activity steps, the evaluation method/location, support resources for development, and the deliverables due from the developer.]

[Form template: T&D Event Definition (form design ©1999 CADDI, Inc.; content ©2001 Verizon Communications). Captures the Event, its delivery, and Lesson information — lesson number, title, and length — and records related existing training.]

All PACT formats, outputs, methods, and logic ©2002 Guy W. Wallace and EPPIC, Inc. All rights reserved. www.eppic.biz


Effective group-paced simulations tend to have a “real-time” portion (such as a team meeting or an employee-supervisor meeting), where the participants have to act and respond with time pressures and “noise” (such as distractions, unclear next steps, messy or contradictory information, even nervousness), and an “off-line” portion (such as writing up the product plan or documenting the agreements reached in the meeting), where participants can think about where they are and where they should go next. The fact that there is no right answer forces participants to weigh pros and cons and have a ready defense for their conclusions. This is motivating and definitely involves the “whole person.” There is one significant difference, and that is around Principle #6: simulations are designed learning experiences. Certainly there will be incidental outcomes; however, the developer and facilitator should have defined learning outcomes in mind and ways of checking that every participant meets the criteria. It is true that each individual may focus (or need work) on different areas; however, by the end of the course the participants should have demonstrated the necessary characteristics for competent performance to an appropriate level.

Choosing the Right Approach

Actually, deciding whether you need action learning, a simulation, or even some other training strategy is fairly simple. As mentioned earlier, the first decision that needs to be made is whether the goal of the effort is an answer to a specific problem, as in “we need a new strategy,” or general competence, as in “we need to be better at developing market-driven strategies.” Action learning is intended to deliver an answer, using learning as a key part of getting to that answer. Simulations, on the other hand, do not accomplish work—they develop the capability to do work. Action learning is used with executives for a number of reasons, including the nature of their work and the fact that there are typically a small number of them. But that is not to say that action learning is only appropriate for executives. It is appropriate when the audience is small and the performance of the particular work activity is infrequent. For example, not everyone needs to create organizational value statements, and even those that do shouldn’t be doing it very often. Action learning is an effective way to approach this need. If the audience is large, then there probably is sufficient reason to invest in researching the needed competencies and building a consistent, effective way of delivering them. Besides, an action learning approach limits delivery to the intact work team. A simulation can be effective with open enrollment.

The Bottom Line

Simulations allow the training to be engineered to fit the needs of the audience, the business, and the performance. Specific job-related problems and difficulties can be built in to ensure trainees get exposed to what they will likely encounter in the workplace. Trained facilitators debrief the group to bring out the most important points. The audience can consist of members from various functions or intact work groups. And, simulations allow the instructional designer to “zoom in” on the parts of the job that require skill practice, without requiring learners to perform every step in the process. For the right application, simulations can provide some of the benefits of the action learning approach in situations where the goal is a competency to be performed in varying situations, where there is a large target audience, and when the performance being addressed is complex but can be modeled and/or structured into a process. There is a gray area between doing work and building competencies in which a hybrid approach might be effective. Simulations and action learning may be used together to effect new process development, reengineering, and even corporate transformation. Something to try sometime.

►►►►►


Pete Hybert has been in the training and performance improvement field since 1984 and has worked as a consultant since 1989. His clients include Fortune 500 firms, as well as small- and mid-sized organizations. He has consulted in a wide range of subject areas including engineering, manufacturing, marketing, supply chain, sales, operations, finance, IT, and others. Pete is the author of more than 30 articles and presentations. He has also served as the chairperson for ISPI’s Awards of Excellence Committee and as Chicago ISPI Chapter President. He is a Certified Performance Technologist (CPT) and holds a master’s degree in Instructional Design.

PRH Consulting focuses on “leveraging know-how to improve performance!” by analyzing what makes top performers effective and then designing systems to transfer that capability across the organization. We specialize in curriculum and content architecture, custom training and simulations, and performance-based qualification system design and development.

You can find out more at www.prhconsulting.com.


Part 2 of 2

CRM Metrics That Really Work By Mark Graham Brown This continues Mark’s article from our last issue...

Outcome Metrics

Examples of outcome metrics are: gross margin dollars, sales revenue, reduced marketing costs, and partnerships with important customers. If you asked any company why they are doing CRM, they would mention the things I have listed under “outcome metrics.” These are the only things that put dollars in anyone’s bank account. Because of the importance of these measures, and the fact that it is harder to fake the data, I would put the most weight on these metrics in the CRM Index or gauge. The problem with outcome and even some output metrics is that they are mostly lagging indicators, or measures of the past. A good CRM Index should include a mix of leading and lagging indicators in order to be a valid gauge. The leading indicators are the ones that track the inputs and the process or behavior measures. Tracking these metrics on a daily basis helps sales managers feel good that their people are out there working hard. When creating a CRM Index, I would begin by assigning a weight to each of the four types of measures as follows:

• Input metrics: 15% • Process metrics: 10% • Output metrics: 35% • Outcome metrics: 40%

The reason for such a heavy weight on the outputs and outcomes is that they are the real measures of value.
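As a sketch of the roll-up arithmetic, assuming each of the four measure categories has already been scored on a common 0-100 scale (the category scores below are hypothetical illustrations; only the 15/10/35/40 weights come from this article):

```python
# Sketch of a weighted CRM Index roll-up. The 15/10/35/40 weights are the
# article's suggested starting point; the 0-100 category scores below are
# hypothetical, not real data.

WEIGHTS_PCT = {
    "input": 15,
    "process": 10,
    "output": 35,
    "outcome": 40,
}

def crm_index(scores):
    """Combine per-category scores (0-100) into one weighted 0-100 index."""
    return sum(WEIGHTS_PCT[category] * score
               for category, score in scores.items()) / 100

# A hypothetical quarter: strong activity (leading) but weaker results (lagging).
example = {"input": 90, "process": 85, "output": 60, "outcome": 55}
print(crm_index(example))  # (15*90 + 10*85 + 35*60 + 40*55) / 100 = 65.0
```

Because the output and outcome categories carry 75% of the weight, busy-looking activity cannot mask weak results in the composite score.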

Input Metrics

Input measures are important because they are the most forward-looking of the four types of measures in the CRM Index. Input measures should be both quantitative and qualitative. Some examples of input metrics that might be counted are:


• Inquiries received
• Web site hits
• Calls made
• Letters sent
• Customer profiles created
• Attendees at seminars
• Business cards received
• Brochures sent
• Leads generated
• RFPs received

These are all measures of activities designed to generate new business or gain additional business from existing customers. It is not enough to just count things like those listed above. The input metrics need to have a qualitative component as well. I did a public workshop last year at a conference in Orlando, and the input metrics I tracked looked very strong. I had about 70 people attend the workshop from 25 different companies, and about 30 of the attendees gave me their business card and asked for mine after the session. I followed up several times with letters, phone calls, books, etc., and not one of those prospects generated a dollar in income for me. It’s important to measure the quality of your inputs as well as the quantity. A client in the police radio business came up with a qualitative measure of prospects called “Opportunity Strategic Value,” or OSV Score. Each prospect was given a score from 0-100%, depending upon factors such as:

• Location
• Size
• Name recognition (would impress other police departments if we had them as customers)
• Number of units needed
• Potential gross margin
• Political connections (e.g., do we have any friends working there)

Account managers were measured on the inputs they generated, but also on the quality of those inputs. This helped drive the right behavior by getting them to focus on bringing in important and profitable customers. Another client, in the facilities maintenance business, came up with a similar metric for both prospective and existing customers, and had similar success with using the measure to encourage salespeople to go after high-quality prospective customers. Points were also assessed for different types of inputs, ranging from 10 points for a business card or letter sent to 75 points for an RFP received.
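A minimal sketch of that kind of prospect-quality scoring. The six factors follow the article’s OSV example; the per-factor weights, the 0.0-1.0 ratings, and the factor names as identifiers are assumptions for illustration, not the client’s actual scheme:

```python
# Sketch of an "Opportunity Strategic Value" (OSV) style prospect score.
# The factor names mirror the police-radio example; the per-factor weights
# below are hypothetical and would be calibrated by each sales organization.

OSV_WEIGHTS = {
    "location": 0.15,
    "size": 0.20,
    "name_recognition": 0.15,
    "units_needed": 0.20,
    "potential_gross_margin": 0.20,
    "political_connections": 0.10,
}

def osv_score(ratings):
    """ratings maps each factor to 0.0-1.0; returns a 0-100% OSV score."""
    return 100 * sum(OSV_WEIGHTS[f] * ratings.get(f, 0.0) for f in OSV_WEIGHTS)

# Quality-weighted activity points (10 for a card/letter, 75 for an RFP,
# per the facilities-maintenance example in the article):
ACTIVITY_POINTS = {"business_card_or_letter": 10, "rfp_received": 75}

prospect = {"location": 1.0, "size": 0.5, "name_recognition": 1.0,
            "units_needed": 0.5, "potential_gross_margin": 0.75,
            "political_connections": 0.0}
print(round(osv_score(prospect)))  # -> 65
```

Scoring each prospect this way lets a manager weight raw activity counts by opportunity quality instead of treating every lead as equal.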


Process Metrics

Process measures in sales or account management are usually measures of behavior or activity. For example, a process metric might be the number of accounts called on per week, or the number of product demonstrations conducted, or the number of customers that attended your golf outing. Many organizations have very specific standards for following processes that are supposed to build relationships with customers. For example, visiting A-level customers once a week, B-level customers once a month, and so forth. There are also often standards for how to give a presentation or demonstration, right down to the exact words to say and the hand movements to properly demonstrate the product. Salespeople are measured on the degree to which their behavior matches the standards. Call centers that deal with customers also have process measures such as call waiting time, abandoned calls (hang-ups), average call length, and so on. All of these process measures and standards are designed to increase sales, improve customer satisfaction, and increase customer loyalty. The problem with most of these process measures is that they have not been proven to link to anything of value in most companies. In other words, there is no evidence suggesting that following a scripted demo, or visiting accounts once a week, results in increased business or even increased satisfaction from customers. Usually companies have some fairly good data on the link between input metrics and output metrics. For example, my friend who sells executive compensation plans sends out about 100 letters to get 15 good leads that he calls, to get 5 appointments, to do 3 proposals, to get one sale. With data like this it is fairly easy to set targets for input metrics, based on overall sales goals. I rarely see this kind of research backing up the validity of process measures. The process measures and standards are too frequently based on anecdotal evidence or superstition.
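The letters-to-sale ratios above make target-setting a simple back-calculation from the sales goal. A sketch, using the quoted 100 : 15 : 5 : 3 : 1 funnel (the goal of 12 sales is a hypothetical example):

```python
# Back into input-metric targets from a sales goal, using the quoted funnel:
# 100 letters -> 15 leads -> 5 appointments -> 3 proposals -> 1 sale.

FUNNEL_PER_SALE = {
    "letters": 100,
    "leads": 15,
    "appointments": 5,
    "proposals": 3,
}

def input_targets(sales_goal):
    """Scale each funnel stage by the number of sales needed."""
    return {stage: per_sale * sales_goal
            for stage, per_sale in FUNNEL_PER_SALE.items()}

# A hypothetical goal of 12 sales implies 1200 letters, 180 leads,
# 60 appointments, and 36 proposals.
print(input_targets(12))
```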
Sales managers feel good if all their reps are calling on the right number of accounts, conducting demonstrations, handing out business cards, and entering all these activities on a daily basis in the CRM database. Often I find that the people who get the best scores on the process or behavior measures are among the worst performers when you look at their achievements. Remember my friend Jim I talked about earlier? He ignores the behavior or process measures his company tracks. He does not dress appropriately, refrains from using his company’s lingo when describing their products/services, often cancels client dinners at the last minute or refuses to socialize with clients at all, and does not work well with other consultants and salespeople in the company. Yet, his performance on the outcome measures beats all of his peers. The key to having good process metrics in your CRM Index is to closely study the behavior of your superstar salespeople. Do not interview them – watch them. Most people who are good at what they do are unconsciously competent, and have a hard time articulating why they are successful. By watching the star performers and comparing their behavior to the average ones, you are likely to find subtle differences that account for their success. These are the subtle differences upon which your process metrics should be based. By deriving your process measures from a study of your star performers, you have at least a good hypothesis that following a particular process or sequence of behavior will result in improved performance. Test these hypotheses, however, before settling on them and making them the way you do business. Also, keep in mind that there is a lot of art to managing customer relationships, and the more you try to adopt a cookie-cutter approach, the less likely your salespeople are to be successful. Take an approach more like Nordstrom’s: hire people with brains and personalities, and trust them to make good judgments, rather than turning them into robots. Even though there is a lot of art to sales or relationship building, there should be some basic processes followed and measured. For example, there may be a process for interviewing a prospect and defining her needs. There may be another process for determining and overcoming objections, and another process for reviewing the strengths of your product/service versus the competition.
There should not be a script or defined process for having drinks or dinner with clients, or for other activities where the approach needs to be tailored to each situation and individual customer.

The reason the process metrics are given such a low weight in the CRM Index is that the behavior or process metrics are the measures with the least integrity. In other words, an organization can be getting a good score on all the behavior measures and still have unhappy customers and problems with the relationship they have with your firm. Input metrics are a little less risky because they involve tracking hard, objective things like leads or requests for quotations/proposals. If you had solid research backing your process metrics, you might want to increase the weight of these measures in the CRM Index to make them equal to the input metrics. In this case, the input and process metrics might each be worth 12.5% of the index.

Pursuing Performance—The EPPIC Newsletter Fall 2005 ©2005 EPPIC Inc. All rights reserved. www.eppic.biz

Output Metrics

An output is some sort of product or something that can be counted. An output is either a physical product, such as a proposal, report, or contract, or an accomplishment, such as a closed deal, a new client brought on board, an order, or a service performed successfully. For example, an output measure for a shipping firm might be packages or containers delivered, or on-time flights for an airline. Outputs tend to be things of value to either the organization or its customers. Customer contact employees like account managers, service delivery personnel, and customer service reps generate all sorts of outputs that might be counted as part of a CRM Index.

A number of my service industry clients have a metric called the Aggravation Index, wherein they track the number of things they do each day to aggravate a customer. Each aggravation is also weighted on a 1-10 scale, based on its severity. It turns out that as the aggravation index goes higher, more and more customers start leaving. In other words, there is a direct correlation between the aggravation index, which is an output metric, and customer loyalty, which is an outcome measure. The aggravations are not caused by the salesperson or account rep, but by the people involved in delivering the service to the customer. Some of the factors that get counted in an airline's aggravation index are:

►Late flights
►Cancelled flights
►Lost bags
►Middle seats
►Delays taking off
►Delays waiting for bags
►Damaged bags
►Long lines to get boarding passes
►Long hold time when calling to reserve a ticket
►Dropped calls
►Hard to use web site
►Gates without enough chairs
►Customer complaints

Each of these aggravations is counted on a daily basis and multiplied by the number of passengers impacted and the seriousness of the aggravation. A flight that is 30 minutes late would count a lot less than one that is cancelled when 200 passengers have to spend the night in O'Hare because every hotel room in Chicago is booked.
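As a rough illustration of the arithmetic described above, here is a hypothetical Python sketch of a daily Aggravation Index. It assumes the score is simply the sum, over the day's aggravations, of severity (on the 1-10 scale) times passengers impacted; the function name and data shape are invented for this example, not taken from any client's actual system.

```python
# Hypothetical sketch of the Aggravation Index described above:
# each day's score = sum of (severity 1-10 x passengers impacted).

def aggravation_index(events):
    """events: list of (severity, passengers_impacted) tuples for one day."""
    for severity, passengers in events:
        if not 1 <= severity <= 10:
            raise ValueError("severity must be on the 1-10 scale")
    return sum(severity * passengers for severity, passengers in events)

# A 30-minute late flight (low severity) counts far less than a
# cancellation stranding 200 passengers overnight (maximum severity):
day = [(2, 150),   # flight 30 minutes late, 150 passengers aboard
       (10, 200)]  # cancelled flight, 200 passengers stranded
print(aggravation_index(day))  # 2*150 + 10*200 = 2300
```

Tracking this total day over day gives the correlation with customer loyalty that the text describes.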


CRM Metrics That Really Work

Page 21: Newsletter EPPIC V4 I4--Fall 2005 final bCheers! (Continued from page 1) Volume 4 Issue 4 Page 2 ... the enterprise processes. For you we offer our insights and many examples of our

How your company performs, or how your products perform, has a major impact on the relationship with customers. If the copy machines in your law firm break down on almost a daily basis, the account rep is the one who takes the heat, and this sort of situation can ruin years spent trying to build a relationship with a customer. When tracking outputs, it is important to include those produced by the sales or customer contact people, but also to track outputs that will have a major impact on a customer's satisfaction with your company. Most of the things that cause a relationship between two companies to sour are not the fault of salespeople or account reps. Rather, the most important outputs relate to how well products and services perform. Outputs are given a fairly heavy weight in the CRM Index for two reasons:

• They can be objectively counted and measured
• They have a big impact on the relationship with customers

Outputs for the sales or account management function that might also be included in the CRM Index are:

• Proposals won
• New customers acquired
• Extensions or sales of additional items
• Referrals
• Commendation letters from customers
• Joint marketing efforts with customers
• Customer problems solved

There may be a long list of outputs that get counted and assigned a weight, depending on their importance to the organization's success. For example, a proposal for a $1 million sale might be worth less than a signed contract for $200k. When constructing this section of your CRM Index, think about assigning a weight to the various outputs based on how important they are to the relationship you have with customers. For this reason, I would count the outputs that relate to your product or service performing well as worth 25% of the total 35%, and the remaining 10% for the outputs produced by the salespeople or account reps. A big sale is an important output for the company and the salesperson, but it does not necessarily strengthen the relationship a customer has with your company.

Outcome Metrics

Outputs are products or accomplishments that can be easily counted or quantified. An output might have obvious value, such as an order, or hopefully lead to something of value, such as a proposal submitted. Outcomes, on the other hand, tend to be things that are harder to quantify and measure, but are often more important than the outputs. For example, a long-term trusting relationship with a profitable customer is an important outcome that is harder to measure than the number of responses to a direct mail marketing campaign.

Remember that the goal of a CRM process is to build a strong, positive, and lasting relationship with important customers that leads to greater profits. I have worked with a number of firms that have designed a Customer Relationship Index that is used as a measure of the type of relationship they have with each account. A facilities service firm in San Jose calls this metric the Customer Relationship Index. A textile firm in Orange County, CA has a similar metric they call their "Matrimonial Index."

The scale on both companies' metrics goes from a level 1 relationship, which indicates that the customer is married to another supplier but willing to have a friendly lunch or take an occasional phone call from your sales rep. A level 5 relationship indicates a happy customer that has been buying from your company for several years and has gotten to know the account rep and others in your company very well. A score of 10 on the relationship index is used to signify a true marriage or business partnership. A 10 might indicate a long-term contract, many personal friendships among the employees at your two firms, political connections (e.g., your CEO and the customer's CEO are personal friends), exit barriers that would make it difficult and expensive for the customer to find a new supplier, etc.
Both firms I mentioned have specific criteria for each number of the matrimonial index that are measures of the strength of the relationship. Another client combines an input measure that their CEO calls the "ugly stick" with the outcome metric called the relationship index. There are two dimensions that go into calculating the index:



►Attractiveness: 1-10 scale
►Relationship: 1-10 scale

A "perfect 10," or attractive, customer is one who has the following characteristics:

• Pays invoices in 30 days or less
• Generates healthy profit margins
• Buys large volumes
• Partners with suppliers
• Stable company that will be around a long time
• Ethical company that manages with integrity
• Customer name will impress other potential customers
• Easy to work with (e.g., low maintenance)

A level 1 or ugly customer would be just the opposite of a “10”. Customer accounts are given an “ugly stick” rating twice a year by those who work with the account because often their status changes with time. Some customers get uglier with time, and some get better looking.

For example, a successful major airline might have been rated a 9 in attractiveness several years ago, but have fallen to a level 2 now because they are close to filing bankruptcy. The 1-10 relationship index is also an important outcome measure. The company sets targets for certain types of customers and for the types of relationships they want to have with them. The relationship index should probably be one of the heavily weighted sub-measures in the outcome segment of your CRM Index. Other important outcomes should probably be included as well, such as:


Example Customer Attractiveness/Relationship Index

Attractiveness                      Relationship Level Target
Beautiful (9-10)                    9-10: happy marriage/partnership
Very attractive (7-8)               7-8: engaged; happy relationship
Attractive, but some flaws (5-6)    5-6: committed dating; mostly good relationship
Average, a number of flaws (3-4)    3-4: casual dating, no real trust yet; customer using other suppliers
Ugly, many flaws (1-2)              1-2: occasional lunch; customer married to another supplier
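To make the table concrete, here is a small hypothetical Python sketch of the pattern the table implies: the target relationship band simply mirrors the customer's attractiveness band, and the gap between target and actual flags accounts needing attention. The function names and the gap calculation are illustrative assumptions, not something specified in the article.

```python
# Hypothetical sketch of the attractiveness/relationship targets tabled above:
# the target relationship band mirrors the customer's attractiveness band.

def target_relationship(attractiveness):
    """Map a 1-10 attractiveness score to its target relationship band (lo, hi)."""
    if not 1 <= attractiveness <= 10:
        raise ValueError("attractiveness must be on the 1-10 scale")
    bands = [(1, 2), (3, 4), (5, 6), (7, 8), (9, 10)]
    for lo, hi in bands:
        if lo <= attractiveness <= hi:
            return (lo, hi)

def relationship_gap(attractiveness, actual_relationship):
    """How far the actual relationship falls below target (0 if on/above target)."""
    lo, _ = target_relationship(attractiveness)
    return max(0, lo - actual_relationship)

# A "beautiful" (9) account with only a casual-dating (3) relationship
# sits 6 points below the bottom of its 9-10 target band:
print(relationship_gap(9, 3))  # 6
```

Re-running such a check after each semiannual "ugly stick" re-rating would surface accounts whose attractiveness has outgrown the relationship.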


• $ in sales increases from existing customers
• $ in gross margin increases from existing customers
• $ in new sales attributable to referrals from existing customers
• Customer satisfaction levels
• Accounts where your company is the sole supplier
• Accounts where your company is rated as a preferred supplier

Conclusions

Like most other three-letter management programs that have delivered less than they promised, CRM has failed to result in stronger relationships with customers and increased profits in more organizations than it has succeeded in. Part of the reason for this failure is the faulty assumption that building relationships is as simple as having the right data on your customers and following a systematic process of behaviors to build and sustain relationships. If it were really that easy, there would not be so many books, training programs, and models on how to create good business relationships. Businesses have been doing CRM for thousands of years. Sales has always been about building relationships, and there have been many successful salespeople over the years who did not have a flowchart of the selling process or CRM software and a laptop computer.

Companies have invested millions of dollars in software, training, and meetings over the years to improve their ability to build relationships with customers. What many have not done is measure the effectiveness of their CRM efforts. A single measure of something as complex as customer relationships is unlikely to reflect your performance. What is needed is a Customer Relationship Index that summarizes results for four different aspects of performance:

• Inputs
• Processes or behaviors
• Outputs
• Outcomes

Such an index can provide executives with the data they need to monitor the progress of their company in creating and sustaining positive and profitable relationships with important customers.
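As one possible Python sketch of such a composite index: the article fixes outputs at 35% and, in one variant, inputs and processes at 12.5% each. The 40% weight for outcomes below is an inference to make the weights sum to 100%, not a figure given in the article, and the function and dictionary names are invented for the example.

```python
# Hedged sketch of a composite CRM Index over the four aspects of performance.
# Outputs = 35% per the article; inputs/processes = 12.5% each per one variant;
# outcomes = 40% is an assumption so the weights total 1.0.

WEIGHTS = {"inputs": 0.125, "processes": 0.125, "outputs": 0.35, "outcomes": 0.40}

def crm_index(scores):
    """scores: dict mapping each of the four aspects to a 0-100 score."""
    if set(scores) != set(WEIGHTS):
        raise ValueError("need a score for each of the four aspects")
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

print(crm_index({"inputs": 80, "processes": 70, "outputs": 60, "outcomes": 90}))
# 0.125*80 + 0.125*70 + 0.35*60 + 0.40*90 = 10 + 8.75 + 21 + 36 = 75.75
```

The point of the weighting is exactly the article's: heavily weight the outcomes and outputs that have integrity, and keep the low-integrity behavior measures from dominating the index.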


About the Author

Mark Graham Brown has been consulting with companies on measuring and improving performance for over 25 years. His clients include many of the Fortune 50, as well as a number of Federal and State government and military organizations. He is the author of two books on performance measurement: Keeping Score – Using the Right Metrics to Drive World-Class Performance (1996) and Winning Score – How to Design and Implement Organizational Scorecards (2000). His latest book, Get It, Set It, Move It, Prove It, was published in 2004. Mr. Brown has his own consulting practice in Manhattan Beach, California, and may be contacted via his web site: markgrahambrown.com.

►►►►►


The EPPIC Web Site

We have over 90 downloadable resources and references for you on the EPPIC web site:

• Articles
• Presentations
• Job Aids & Tools & Templates

www.eppic.biz

"Gopher" more at eppic.biz


Empowerment, Part 2 of 2
By Guy W. Wallace, CPT

In the conclusion of this two-part series, we finish up with some guidance on the process of empowerment and some "baby steps" you might consider for your organization...

The Process of Empowering

Employees want empowerment; in fact, once they understand the concept, they will demand empowerment. You won't have to force it upon them; they will gladly take it, not because they are power-hungry crazies, but because they want to contribute. Remember that they are capable people trying to do a good job. They want to be empowered because it makes them feel as though the company recognizes them as the valuable asset they know they are truly capable of being.

Empowerment must be understood by all who are to participate in the process. The concepts and precepts of empowerment at your enterprise will differ from others depending upon the specifics of the situation and the needs of the people for freedom or control. The needs of executives differ from those of the middle manager, supervisor, or individual contributor. Highly regulated businesses need more controls. Marketing may get more freedoms than the Finance departments.

At the crux of implementing empowerment is management's support of the concept. Recognize that there is a strong fear factor at work for those who must buy in to make empowerment a reality, namely middle management and supervision (or team leaders/coaches/etc.). The organization may be asking them to change ingrained patterns that have worked over the years. They remain responsible for the overall performance of their area/department but should share the decision-making with their employees. Some may not be too excited about participating in this "experiment" with their careers.

Executive management must ensure that reasonable risk-taking is never punished, even when the results are disastrous. Accepting and learning from failures is a stretch for everyone. Only a strong, confident management group can put into place a measurement and reward system that uses failure data in nonpunitive ways. Both failures and successes can offer valuable lessons.
A celebrated failure may significantly reduce the risks of recurrence.

Just as a baby grows into a child, so can empowerment grow from a concept to a well-implemented management style and part of the culture. The process is composed of the following "baby steps" (with apologies to Neil Simon and the movie "What About Bob?"). These steps are for those determining how to install a process for establishing empowerment.

Step 1: Look Around and Determine Where You Are and Where You Need to Go
Step 2: Roll Over (in the Right Direction)
Step 3: Crawl Slowly at First
Step 4: Walk (and Talk and Listen, and Walk the Talk and Listen)
Step 5: Run

Step 1: Look Around and Determine Where You Are and Where You Need to Go

Don't even roll over yet, baby. Determine where you are and orient yourself. Look for insights. Determine where it is you need to go with empowerment. Ask the following questions:

• What is the current situation?
• What is the current relationship between management and employees?
• How are most decisions made now?
• What is our current culture? What barriers can we see that will inhibit our success?
• Where is our Process Improvement effort now, and what will be implemented in the future? What's been working, and what hasn't?


Think about the other plans with which you need to integrate your empowerment efforts.

Step 2: Roll Over (in the Right Direction)

Now that you can more clearly see what you want to do, you need to position your organization to get there. Don't start moving until you have everything that you may need on this journey. Do you have the preliminary systems for training, the preliminary information systems, and the preliminary reinforcement systems in place? Have you decided upon a consistent message that will facilitate this effort? Do we have the required management buy-in, or do they have questions or unresolved concerns? Will management really participate, or will they only pay lip service to the effort?

Step 3: Crawl Slowly at First

Explore your surroundings as you begin to move out. Move out slowly. When crawling, midcourse corrections are much easier; there is less momentum to slow down, and efforts are easier to redirect.

Plan a few trials or pilot tests. Pick several different processes and/or functions for your tests. Teach the process participants how to define their stakeholders, how to identify their complex sets of requirements, and how to balance their product/service portfolios to best meet the needs of stakeholders in a manner consistent with the goals of the overall business. Experiment with different empowering techniques. Measure your progress not for speed, but for effect. Ask for feedback and suggestions and adapt your techniques accordingly. Share your successes, and celebrate and reward the effort behind your failures as well. Encourage all learning, even that which comes as a result of pain.

Determine which infrastructure needs changes, adaptations, replacements, or discontinuance. Systems such as policies and procedures, information, appraisal, and compensation may present barriers to the implementation of empowerment. Acknowledge those barriers and disseminate plans for tackling the issues and opportunities. Share the nonsensitive information where you can.


Step 4: Walk (and Talk and Listen, and Walk the Talk and Listen)

Once your testing (crawling) is completed, stand up and pick up speed, but don't run. Walk slowly, even though like a child you may be tempted to run too soon; patient parents caution against running too fast too soon. But as we all know, much of the learning occurs from the falls, bumps, and bruises.

Expand the implementation. Fix the support systems and processes that present barriers. Empower the owners of such systems. Teach them how to define their stakeholders and their complex sets of requirements, and how to balance their product/service portfolios to meet the needs of all stakeholders in a way that is still profitable for the business. Provide the guidance and support needed without disempowering teams.

Tout all of the efforts as you implement empowerment throughout the organization. Demonstrate support via all of your actions. But most importantly, listen and invite feedback. Listen carefully to ensure that you truly understand the message contained in the feedback, and adapt the system accordingly.

Step 5: Run

Once you find that you can walk without too much stumbling, it's time to pick up speed. This can be done only when you feel confident that you'll be able to pick yourself up if you run and fall. Try never to let the data from the information systems be used in a punitive manner. Instead, let that data guide you in reshaping behaviors. This is done by providing an appropriate balance of consequences.

Summary

Remember to take things slowly. Walk before you run; crawl before you walk; roll over before you crawl; lie there and observe before you make your first move. As we wrote before... It's an imperfect world with lots of variability. Mistakes are a reality. Increased mistakes are a reality inherent with risk-taking. Risk-taking is inherent with change, and change is the goal of continuous improvement. But...

If you can’t control or live with the risks, don’t empower.

►►►►►


What is an ISPI HPT ProComm?

Professional Communities (ProComms) were created as an outcome of the 2002-2004 Presidential Initiative Task Force on "Clarifying Human Performance Technology." These communities provide a new way for ISPI members to network with others who have similar interests, and serve as a catalyst for sharing and gaining knowledge from each area of expertise. Anyone can participate, actively or passively, in any or all of the ProComms. Over time, the ProComms will help shape ISPI's professional practices and influence our conferences, publications, awards, certification, institutes, and other areas in which our technical content is important. ProComms should foster an appreciation and capability for "inter-HPT collaboration" in achieving the goals of performance improvement at the individual, process, organization, and society levels.

What are the Seven ProComms?

Science & Research (SR): The intellectual pursuit and critical analysis of basic principles, conditions, mechanisms, functional relationships, and theories related to human performance.

Analysis, Evaluation, & Measurement (AEM): The process of assessment, decision, and action relevant to the maintenance and adaptation of a system.

Process Improvement (PI): Efforts involving the efficiency and/or effectiveness of the sequence of activities in a value chain that produces outcomes and results.

Organizational Design/Alignment (OD/A): The examination of the allocation of decision-making authority, business processes, values, business practices, and conduct of people's performance within an organization to ensure that actions are aligned to produce desired results.

Motivation, Incentives, & Feedback (MIF): The determination of the means by which the likelihood of performance can be increased, decreased, or sustained through modifications in performers' arousal, attention, and anxiety, or through adjustments to performers' desire and expectancy of success.

Instructional Systems (IS): The determination of when learning should occur and the best means by which to achieve learning through manipulation of display, response demand, and instructional management.

Management of Organizational Performance (MOP): The pursuit of organizational results by examining the whole system to determine the major sources of performance variance and to address them with appropriate organizational change processes and techniques.

It Is Still Early in Their Evolution

The specific Value Proposition for each ProComm will likely include products/services that are both common and unique relative to each other. Examples of common P/Ss are the Quarterly Newsletter and Discussion Groups on the Society's web site. HPT "content" overlap among ProComms at the intervention, tool, and technique levels was and is inevitable. These will need to be managed via a "shared ownership" by two or more ProComms. Overlap deliberately exists with two of the seven: Science & Research (SR) overlaps the other six ProComms, as this group focuses on the scientific underpinnings of all aspects of HPT; and Management of Organizational Performance (MOP) overlaps in that it brings all of the pieces of HPT into an integrated approach to new design and/or improvement.

For more information, and to participate, click the ProComm button at www.ispi.org


ISPI's 7 Human Performance Technology Professional Communities: a mechanism for more focused networking and learning...


Trust — doesn’t come easily. Experience — doesn’t come quickly.

Guy W. Wallace’s consulting clients since 1982 ... while at Svenson & Wallace Inc. (1982-1997) and CADDI Inc. (1997-2002) and EPPIC (2002 to today)…

28 of the current Fortune 1000, including 3 of the top 5, and 10 of the top 50, and 25 of the Fortune 500 (as of 4/05)

2000—Today Abbott Laboratories, AMS, Eli Lilly, Fireman’s Fund Insurance, General Motors, GTE, Johnson Controls, NASCO, NAVAIR, NAVSEA, Norfolk Naval Shipyard, Siemens Building Technologies, SYSCO Corporation, and Verizon.

1990—1999 Abbott Laboratories, ALCOA, ALCOA Labs, Alyeska Pipeline Services Company, American Management Systems, Amoco, AT&T Network Systems, Bandag, Bank of America, Baxter, Bellcore-Tech, British Petroleum-America, Burroughs, CCH, Data General, Detroit Ball Bearing, Digital Equipment Company, Discover Card, Dow Chemical, EDS, Eli Lilly, Ford, General Dynamics, General Motors, H&R Block, HP, Illinois Bell, Imperial Bondware, MCC Powers, NCR, Novacor, Occidental Petroleum Labs, Spartan Stores, Sphinx Pharmaceuticals, Square D Company, and Valuemetrics.

1982—1989 ALCOA, ALCOA Labs, Ameritech, Amoco, Arthur Andersen, AT&T Communications, AT&T Microelectronics, AT&T Network Systems, Baxter, Burroughs, Channel Gas Industries/Tenneco, Dow Chemical, Exxon, Ford, General Dynamics, HP, Illinois Bell, MCC Powers, Motorola, Multigraphics, NASA, Northern Telecom, Northern Trust Bank, and Westinghouse Defense Electronics.

Project Overviews for each project are available at www.eppic.biz


Guy W. Wallace has been in the T&D field since 1979 and a training and performance improvement consultant since 1982. His clients over the years have included over 40 Fortune 500 firms, plus NASA, NAVAIR, NAVSEA, and the non-US firms BP, Novacor, Opel, and Siemens.

He has analyzed and designed/developed training and development for almost every type of business function and process. He is the author of three books, several chapters, and more than 50 articles. He has presented more than 50 times at international conferences and local chapters of ISPI and ASTD, and at IEEE, Lakewood Conferences, ABA, the Conference on Nuclear Training and Education, and the Midwest Nuclear Training Association.

He has served on the ISPI Board of Directors as Treasurer on the Executive Committee (1999–2001), and later as President-Elect for 2002-2003 and President for 2003-2004.

Guy’s professional biography was listed in Marquis’ Who’s Who in America in 2001. He was designated a CPT — Certified Performance Technologist in 2002.

EPPIC’s Guy W. Wallace, CPT


to protect and improve the enterprise

EPPIC Inc.

[email protected] www.eppic.biz

We help you performance-orient the business systems that enable the human side of your enterprise process performance, including:

• performance-based Recruiting & Selection systems
• Training & Development systems
• Appraisal & Performance Management systems
• Compensation systems

If your current Master Performers can do it, why not everyone else?

EPPIC Inc.
Achieve Peak Performance

9/3/2005