1
You: The most important piece of the bulk power system
Human factors in supporting reliability and recovery from physical and cyber events
Mike Legatt, Ph.D., Principal Human Factors Engineer
Electric Reliability Council of Texas, Inc.
[email protected]
2
Introduction
This exercise is intended to prepare you for things that don’t seem “quite right”: how you can communicate and collaborate to identify events and reduce their impacts.
It is also intended to serve as a brief primer on maintaining human performance by tracking stress and accuracy, and by keeping cognitive biases in check.
3
Objectives
You will:
• Identify the systematic strengths and weaknesses of people, technology, and their interactions
• Recognize the role of your “gut feelings” in operations
• Identify when and how to share these feelings within and outside your organization, and guard against biases
4
Definitions
• Running estimate
• Common operational picture (COP)
• Cognitive bias
• Situation awareness
• Selective attention
• Ego depletion
• Hyperstress / hypostress
5
PATTERN RECOGNITION: The core human activity
6
Running Estimate Process
7
Attention, Memory and Mistakes
8
Selective Attention
9
2. In which state of stress are you most likely to make a mistake in an emergency?
a) Hypostress
b) Hyperstress
c) Hypostress before the emergency, then hyperstress when it happens
d) Being in the “zone of maximum adaptation”
10
Human Performance Under Stress
Stress and performance, from Hancock (2008)
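The Hancock (2008) curve is an inverted U: performance degrades under both hypostress and hyperstress and peaks in the “zone of maximum adaptation.” As a rough illustration only (a toy quadratic, not Hancock’s actual model), the shape can be sketched as:

```python
def performance(stress: float) -> float:
    """Toy inverted-U relating stress to performance.

    `stress` is normalized to [0, 1]: 0.0 = hypostress, 1.0 = hyperstress.
    This quadratic is illustrative only; Hancock's extended-U model is
    considerably more nuanced.
    """
    return 1.0 - 4.0 * (stress - 0.5) ** 2

# Performance is worst at both extremes and best at moderate stress
# (the "zone of maximum adaptation").
assert performance(0.5) > performance(0.1) > performance(0.0)
assert performance(0.5) > performance(0.9) > performance(1.0)
```

The practical point of the curve for operators: mistakes cluster at both ends, not just under overload.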
11
2. In which state of stress are you most likely to make a mistake in an emergency?
a) Hypostress
b) Hyperstress
c) Hypostress before the emergency, then hyperstress when it happens
d) Being in the “zone of maximum adaptation”
12
3. If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause?
a) Inattention
b) Misinterpretation of a rule
c) Inaccurate mental model
d) Organizational bias
13
How we make mistakes
From: NERC Cause Analysis Methods
14
3. If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause?
a) Inattention
b) Misinterpretation of a rule
c) Inaccurate mental model
d) Organizational bias
15
5. In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for?
a) Ego depletion
b) Semmelweis reflex
c) Outgroup homogeneity
d) Hindsight bias
16
Ego Depletion
• Self-control is a limited resource, and like a muscle, it tires out.
17
Situation Awareness
18
COGNITIVE BIASES
19
Cognitive Biases (a sampling)
• “Apparently, when you publish your social security number prominently on your website and billboards, people take it as an invitation to steal your identity.” – Zetter, K. “LifeLock CEO’s Identity Stolen 13 Times.” Wired.com, April 2010.
20
1. There’s an emergency, and you have an idea how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs?
a) Zero-risk bias
b) IKEA effect
c) Organizational bias
d) Confirmation bias
21
4. You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then?
a) Cognitive dissonance avoidance
b) Google effect
c) IKEA effect
d) Attentional bias
22
Cognitive Biases (a sampling)
• Anchoring – something you’ve seen before seems like the benchmark (e.g., the first time you paid for gas)
• Attentional bias – you’re more likely to see something if you’re thinking about it
• Cognitive dissonance – it is uncomfortable to hold two conflicting thoughts
• Confirmation bias – you pay attention to things that support your belief
23
Cognitive Biases (a sampling)
• Diffusion of responsibility – “someone else will take care of it”
• Google effect – easy to forget things that are easily available electronically
• Groupthink – people less likely to contradict ideas in a large group
• Hindsight bias – the past seems perfectly obvious
24
Cognitive Biases (a sampling)
• IKEA effect – things you’ve built seem more valuable to you than things others have built
• Illusion of transparency – you expect others to understand your thoughts and feelings more than they can
• Loss aversion – you’re more likely to try to avoid losing than to seek gains
Cognitive Biases (a sampling)
• Organizational bias – you’re likely to think ideas within your organization are better
• Outgroup homogeneity – you’re likely to think that people in another group all think the same
• Semmelweis reflex – rejecting new ideas that conflict with older, established ones
• Zero-risk bias – likely to choose worse overall solutions that seem less risky
26
1. There’s an emergency, and you have an idea how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs?
a) Zero-risk bias
b) IKEA effect
c) Organizational bias
d) Confirmation bias
27
4. You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then?
a) Cognitive dissonance avoidance
b) Google effect
c) IKEA effect
d) Attentional bias
28
5. In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for?
a) Ego depletion
b) Semmelweis reflex
c) Outgroup homogeneity
d) Hindsight bias
29
SCENARIOS
30
Group exercise: Scenario 1
• Substation X
• Camera malfunction
• Low oil level alarm on a transformer
• Dispatch troubleshooter
• Bullet holes in camera and transformer
• Random act of vandalism, ploy, or directed threat?
31
Group exercise: Scenario 2
• Substation Y
• Communications vaults for 2 providers damaged (AT&T and Level3)
• > 100 shots fired at transformers; oil leakage in several transformers (> 51k gallons spilled)
• Only energized transformers shot
• Attackers never entered the substation
• Initial assumption: vandalism?
• Dress rehearsal for future attacks?
• It happened: April 16, 2013, Metcalf Substation
32
Group exercise: Scenario 3
• Utility control room
• Telemetry doesn’t look quite right – not sure why
• Sees significant flow into a substation without a load, then it goes away
• RTU failure, manipulated data, cyberattack?
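The “doesn’t look quite right” judgment in Scenario 3 can be caricatured as a simple telemetry sanity rule: flag significant flow into a substation that reports essentially no load. The function name and thresholds below are hypothetical, for illustration only; a real EMS check would also weigh RTU health, state-estimator residuals, and neighboring measurements before suspecting manipulated data:

```python
def flag_suspect_telemetry(inflow_mw: float, load_mw: float,
                           inflow_threshold_mw: float = 5.0,
                           no_load_threshold_mw: float = 1.0) -> bool:
    """Hypothetical sanity rule: significant inflow with (near) zero load.

    Returns True when the pattern from Scenario 3 appears: notable power
    flowing into a substation whose reported load is essentially zero.
    Thresholds are illustrative placeholders, not operational values.
    """
    return inflow_mw > inflow_threshold_mw and abs(load_mw) < no_load_threshold_mw

# The scenario's pattern: large inflow, no load -> worth escalating.
assert flag_suspect_telemetry(inflow_mw=40.0, load_mw=0.2)
# Normal conditions should not trip the rule.
assert not flag_suspect_telemetry(inflow_mw=40.0, load_mw=38.0)
```

The design point matches the slide: a cheap automated flag does not diagnose RTU failure versus cyberattack; it only surfaces the anomaly so a human can share and investigate it.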
33
Group exercise: Scenario 4
• ISO control room
• News report of civil unrest in an area
• Call from utility: substation transformer
• Call from utility: telemetry issues
• Several other “below the line” calls
• With whom do you share this information?
34
Summary
• Value of communication and collaboration when “things are not quite right”
• Reporting structure for handling incidents
• Remember – your data may just be part of something larger
35
References
NERC CAP Annex D, Phase 0 (draft)
NERC CIPC Report to Texas RE MRC
NERC Cause Analysis Methods
Macmillan, N.A., & Creelman, C.D. (1991). Detection theory: A user’s guide. New York: Cambridge University Press.
Hancock, P.A., & Szalma, J.L. (Eds.). (2008). Performance under stress. Chichester, England: Ashgate.
36
Questions
??
37
1. There’s an emergency, and you have an idea how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs?
a) Zero-risk bias
b) IKEA effect
c) Organizational bias
d) Confirmation bias
38
2. In which state of stress are you most likely to make a mistake in an emergency?
a) Hypostress
b) Hyperstress
c) Hypostress before the emergency, then hyperstress when it happens
d) Being in the “zone of maximum adaptation”
39
3. If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause?
a) Inattention
b) Misinterpretation of a rule
c) Inaccurate mental model
d) Organizational bias
40
4. You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base case violation then?
a) Cognitive dissonance avoidance
b) Google effect
c) IKEA effect
d) Attentional bias
41
5. In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for?
a) Ego depletion
b) Semmelweis reflex
c) Outgroup homogeneity
d) Hindsight bias