automation with humans in mind: making complex systems predictable, reliable and humane
TRANSCRIPT
Hi, folks.
I do things to/with computers.
I build real-time systems.
I build fault-tolerant systems.
The whole survives.
I build critical systems.
Failure is catastrophic.
Complex Systems
• Non-linear feedback
• Coupled to external systems
• Difficult to model, understand
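As an illustrative aside (my example, not from the talk): non-linear feedback is why these systems resist modeling. The logistic map is a classic minimal sketch — one feedback equation, fully deterministic, yet two trajectories that start a millionth apart end up completely different.

```python
# Minimal sketch of non-linear feedback: the logistic map x -> r*x*(1-x).
# Deterministic, one line of dynamics, yet at r=4 a microscopic difference
# in starting conditions is amplified until the trajectories share nothing.

def trajectory(x0, steps, r=4.0):
    """Iterate the logistic map from x0 for the given number of steps."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(r * x * (1.0 - x))
    return xs

a = trajectory(0.200000, 50)
b = trajectory(0.200001, 50)  # perturbed by one part in a million

# The tiny perturbation is amplified enormously over the run.
print(max(abs(x - y) for x, y in zip(a, b)))
```

If a one-variable system behaves like this, a reactor or a spacecraft — coupled to weather, operators, politics — is strictly worse.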
AdRoll
Let’s talk about the future.
Let’s talk automation.
Let’s talk human cooperating with machine.
Apollo 13
A complex craft.
It wasn’t clear how to orient the system.
Rocket with a tin can and some humans on top?
More elaborate rocket plane?
A matter of significant debate.
A balance was struck.
Saturn V was a big, completely automatic rocket.
With a space plane on top.
A wee problem with the Service Module.
No fuel, no O2 and a dead boat.
What to do?
Improvise.
Mission Control puzzled out new power and consumables budgets.
Used the Lunar Module rocket for main propulsion.
Bridged incompatible systems with available materials.
Tools aid experts to overcome catastrophic failure.
Automation, done right, relieves tedium.
Automation, done right, reduces errors.
Automation, done right, liberates.
Let’s talk human versus machine.
Chernobyl
Graphite-moderated boiling water reactor.
Requires active cooling.
Worse, unstable at low power levels.
Even worse, very high positive void coefficient.
Worst of all, Soviet political dynamics.
During a test of a backup system, the reactor was driven into a failure-prone state.
Warning signs were ignored.
Boom
In the immediate aftermath vital equipment is not available.
It’s all locked in a safe.
The sole man with a key is dead, crushed under rubble.
There’s nothing to be done.
The reactor fails according to its nature.
Much is irradiated.
Many die. An entire region of Ukraine is abandoned.
Automation, done wrong, mechanizes humans.
Automation, done wrong, misdirects.
Automation, done wrong, entraps.
Every system carries the potential for its own destruction.
“Normal Accidents”
Failure is inevitable.
The design of any system must include failure as a first-class concern.
Otherwise, system failure happens in completely arbitrary ways.
How do you design for failure?
Cyborg it up a little.
Don’t do it alone.
Have resources you’re willing to sacrifice.
Accept failure. Learn from it.
Study the accidents of others.
Some things aren’t worth building.
Understand what you build.
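A sketch of what that checklist can look like in code (my example, not from the talk): a tiny circuit breaker that treats failure as a first-class concern. After repeated failures it stops calling the broken operation and serves a fallback you were willing to sacrifice, so a dead dependency degrades one feature instead of cascading.

```python
# Illustrative sketch: designing for failure with a minimal circuit breaker.
# The operation is expected to fail sometimes; the design accounts for it
# up front instead of letting failure propagate arbitrarily.

class CircuitBreaker:
    def __init__(self, max_failures=3):
        self.max_failures = max_failures
        self.failures = 0

    def call(self, operation, fallback):
        if self.failures >= self.max_failures:
            return fallback()      # circuit open: sacrifice the feature, fail fast
        try:
            result = operation()
            self.failures = 0      # success closes the circuit again
            return result
        except Exception:
            self.failures += 1     # count the failure, degrade gracefully
            return fallback()

def flaky():
    raise RuntimeError("dependency down")

breaker = CircuitBreaker(max_failures=3)
for _ in range(5):
    print(breaker.call(flaky, fallback=lambda: "degraded"))
```

After three failures the breaker stops invoking the dependency at all — the failure mode was chosen at design time, not discovered at 2 a.m.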
Thanks! <3
@bltroutwine