The Ludic Fallacy Applied to Automated Planning
DESCRIPTION
This is a short talk I gave to the Strathclyde Planning Group on deficiencies I can see in the way we think and reason about planning in non-deterministic environments. PPDDL, the accepted standard, is overly simplistic and can get us into hot water, because we focus on solving the PPDDL problem rather than the real-world problem it models. The breakout session that followed was very useful for generating ideas about different threads we could use to attack the weaknesses of PPDDL, and about work being done around the edges, which I hope to summarise at some point.

TRANSCRIPT
The Ludic Fallacy
SPG, 11th Feb 2011
• Ludic - Of or pertaining to games of chance
• Fallacy - An argument which seems to be correct but which contains at least one error.
Example
• Suppose you flip a coin. What is the chance it comes up heads?
• 50/50
• Suppose you flip the coin 100 times and the first 99 were tails. What is the chance of the final flip giving heads?
• Independent variables, still 50/50.
• ...or is it?
Origins
• Originally postulated by Nassim Nicholas Taleb in "The Black Swan".
• Broadly, the ability to describe the outcomes of events gives an impression of control. It does not give ACTUAL control of the events.
• A complex but inaccurate model is, above all else, inaccurate; its complexity does not make it any more right.
"Gambling With the Wrong Dice"
• Case Study based on Las Vegas casino.
• Extensive and sophisticated systems and models to account for potential cheating.
• Aim was to manage risk.
• But the vast majority of losses came from non-gambling activity: a disgruntled ex-employee, onstage accidents, failure to file correct paperwork, and a kidnap ransom.
Blinded By Probability
• Because we see numbers as solvable, we focus on solving them.
• We lose sight of the broader picture.
• The "game" becomes our main focus, rather than the world it represents.
Back to Coins
• We flip 99 times, all tails.
• 0.5^99 ≈ 1.6x10^-30
• Which is more likely: that this highly improbable event is happening, or that the assumptions we used to build the model don't hold true?
• Is the coin fair?
• What actually is the probability of getting heads next?
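As a sketch of that last question: even an absurdly small prior belief that the coin is rigged swamps the fair-coin hypothesis after 99 tails. The one-in-a-million prior below is an illustrative assumption, not a figure from the talk.

```python
# 99 tails in a row under a fair-coin model
p_data_given_fair = 0.5 ** 99

# Tiny assumed prior that the coin is double-tailed (always tails)
prior_biased = 1e-6
prior_fair = 1 - prior_biased

# Bayes: P(fair | data); P(99 tails | double-tailed) = 1
posterior_fair = (p_data_given_fair * prior_fair) / (
    p_data_given_fair * prior_fair + 1.0 * prior_biased
)

print(f"P(data | fair) = {p_data_given_fair:.3e}")
print(f"P(fair | data) = {posterior_fair:.3e}")
# The posterior on "fair" is vanishingly small: the sensible bet is
# that the model's fairness assumption is wrong, not that we witnessed
# a once-in-the-lifetime-of-the-universe event.
```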
Off-model Consequences
• When we have a model, we risk getting blinkered into thinking about the model instead of the world.
• But models are abstract representations.
• No PDDL model describes the effect of a meteorite hitting a robot, yet it is an (unlikely) possibility.
• Outcomes of actions, or events, cannot be fully enumerated. There exist "off-model consequences".
Coins Again
• We talk about coins having a head and a tail side, and a 50/50 chance of either.
• This isn't strictly true: there's a third possibility we don't model:
• Edge
• This is Taleb's "Black Swan", highly unlikely but theoretically possible events that are ignored.
• A true Black Swan must also be "high impact".
What Am I Driving At?
Probabilistic Planning
• PPDDL is a prime example of "doing it wrong"
• Extends PDDL by attaching probabilities to sets of effects: with probability P(X=i), effect I occurs; with probability P(X=j), effect J occurs; and so on.
• Is the world really so cut and dried? Or is this simply shoehorning probabilities into PDDL in the most obvious way possible?
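For concreteness, this is roughly what a probabilistic effect looks like in PPDDL. The action, predicates, and probabilities below are an illustrative sketch, not taken from the talk:

```lisp
(:action pick-up
  :parameters (?b - block)
  :precondition (and (clear ?b) (handempty))
  :effect (probabilistic
            0.9 (and (holding ?b) (not (handempty)) (not (clear ?b)))
            0.1 (and)))  ; remaining 10%: the gripper slips, nothing changes
```

The outcomes are a fixed, exhaustive enumeration with fixed probabilities — exactly the cut-and-dried picture being questioned above.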
Summary
• Models are typically incomplete.
• Models are frequently wrong.
• Probabilistic models make even more assumptions!
• We allow ourselves to be deceived by numbers into believing we can quantify the unquantifiable.
• As a result, we get bogged down solving a problem that isn't necessarily reflective of the real world.
So What Can We Do?
Introduce Noise
• Most basic approach is to add noise to probabilistic models.
• If the model has P(x) = 0.2, test generated plans at, say, P(x) = 0.2 ± 0.05.
• This allows a rudimentary "what happens if these values are not spot on?" check.
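A minimal sketch of that check, under assumed toy numbers: a plan that needs five consecutive successful pick-ups, evaluated by Monte Carlo at the nominal probability and at the perturbed values. The function name and plan structure are hypothetical.

```python
import random

def plan_success_prob(p_pickup, n_steps=5, trials=10_000, seed=0):
    """Monte Carlo estimate of a toy plan's success rate: the plan
    succeeds only if all n_steps pick-up actions succeed."""
    rng = random.Random(seed)
    successes = sum(
        all(rng.random() < p_pickup for _ in range(n_steps))
        for _ in range(trials)
    )
    return successes / trials

# Nominal model value...
nominal = plan_success_prob(0.90)
# ...and the same plan re-tested with the probability jittered by +-0.05
low, high = plan_success_prob(0.85), plan_success_prob(0.95)

print(f"success @ 0.90:      {nominal:.3f}")
print(f"success @ 0.85/0.95: {low:.3f} .. {high:.3f}")
# A plan whose value collapses at the low end was relying on the
# model's probabilities being exactly right.
```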
Epsilon-separation of states
• Similar concept to that used in temporal actions.
• In this case epsilon denotes a marginal probability of transitioning between any pair of states.
• Still not ideal, but at least captures the possibility of events changing the state in an undetermined way.
• Somewhat analogous to van der Waals forces.
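One way to realise epsilon-separation, as a sketch over an explicit state-transition matrix (the function name and the 3-state example are mine): mix a small uniform component into the model's transitions, so every pair of states gets a marginal transition probability.

```python
import numpy as np

def epsilon_separate(P, eps=1e-3):
    """Blend a small uniform transition probability into a row-stochastic
    matrix P, so every state can (marginally) reach every other state."""
    n = P.shape[0]
    return (1 - eps) * P + eps * np.ones((n, n)) / n

# Deterministic 3-state model: 0 -> 1 -> 2 -> 2
P = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0]])

Q = epsilon_separate(P)
assert np.allclose(Q.sum(axis=1), 1.0)  # rows still sum to 1
print(Q[0])  # state 0 now has a tiny chance of landing anywhere
```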
State Charts
• In the FSM family, state charts are frequently used to represent interruptible processes, e.g. embedded systems.
• One process interrupts another, acts, and then the first resumes from its previous state.
• Can we use this model to capture the consequences of unmodelled events?
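The interrupt-and-resume idea above (a "history state" in statechart terms) can be sketched minimally; the class and the plan steps are illustrative assumptions, not from the talk.

```python
class InterruptiblePlan:
    """Toy state machine that can be interrupted and later resumed
    from the state it was in: a statechart history state in miniature."""
    def __init__(self, states):
        self.states = states
        self.index = 0
        self.saved = None

    def step(self):
        if self.index < len(self.states) - 1:
            self.index += 1

    def interrupt(self):
        self.saved = self.index   # remember where execution was
        self.index = 0            # hand control to the interrupting process

    def resume(self):
        self.index = self.saved   # pick up exactly where we left off
        self.saved = None

m = InterruptiblePlan(["load", "drive", "unload", "done"])
m.step()       # now in "drive"
m.interrupt()  # an off-model event takes over
m.resume()     # back in "drive"; the plan continues
print(m.states[m.index])  # -> drive
```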
Abstract / Anonymous Actions
• In Prolog, _ represents the anonymous variable.
• Nothing analogous to this in PDDL.
• Would introducing this give us flexibility to patch plans when off-model events occur?
• Could abstract actions (perhaps based on DTG clusterings) be useful for this?
Brainstorm!