TRANSCRIPT
GISMO: Towards a Domain-Specific Language for Executable Modelling and Verification of Gestural Interaction
Romuald Deshayes and Tom Mens, University of Mons
GISMO - Screencast Video
This presentation contains a video illustrating the different aspects of GISMO, a domain-specific modeling language, and its underlying framework based on high-level Petri net models (called ICO models), for integrating executable models of gestural interaction with software applications relying on a graphical rendering engine.
Coming up next
Performing hand gestures to animate a virtual character through a domain-specific model
– Bottom left: Gestures performed by a person are captured in real time by the Kinect interaction controller
– Right: Kinect's raw data is sent to, and interpreted by, the domain-specific GISMO model
– Top left: The GISMO model sends domain-specific method calls to the graphical rendering engine to animate a virtual character
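As a rough illustration of the first stage of this pipeline, the sketch below reduces raw Kinect-style joint positions to a domain-specific event. All names here (the `hand_event` function, the event labels, the threshold) are invented for illustration and are not part of GISMO's actual API:

```python
import math

# Hypothetical sketch: the Kinect controller delivers 3-D joint positions,
# and the model reduces them to a domain-specific event name.
def hand_event(left_hand, right_hand, threshold=0.15):
    """Map two raw (x, y, z) hand positions to a domain-specific event."""
    dist = math.dist(left_hand, right_hand)
    return "hands_joined" if dist < threshold else "hands_apart"

print(hand_event((0.0, 1.0, 2.0), (0.05, 1.0, 2.0)))  # -> hands_joined
print(hand_event((0.0, 1.0, 2.0), (0.60, 1.0, 2.0)))  # -> hands_apart
```

The same pattern generalises to any gesture that can be phrased as a predicate over joint positions.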
How it works – Step 1
– On the right: A domain-specific GISMO model is created by the domain expert
– On the left: At the same time, an underlying high-level Petri net model (ICO model) is generated by the GMOD framework, based on model-to-model transformations
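To give a flavour of what such a model-to-model transformation produces, the toy sketch below (hypothetical function and data names, not the framework's real API) maps each hand state of the domain-specific model to a place and each gesture to a transition of a minimal Petri net:

```python
# Hypothetical sketch of the model-to-model transformation: each hand state
# in the domain-specific model becomes a place, each gesture a transition.
def generate_petri_net(states, gestures):
    """states: list of state names; gestures: list of (name, src, dst)."""
    places = {s: 0 for s in states}   # place -> token count
    places[states[0]] = 1             # one token in the initial state
    transitions = {name: (src, dst) for name, src, dst in gestures}
    return places, transitions

places, transitions = generate_petri_net(
    states=["open", "closed"],
    gestures=[("close_hand", "open", "closed"),
              ("open_hand", "closed", "open")])
print(places)                      # -> {'open': 1, 'closed': 0}
print(transitions["close_hand"])   # -> ('open', 'closed')
```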
How it works – Step 2
– On the right: A precondition is added to a Gesture object in the domain-specific GISMO model to specify the required duration and distance of the gesture.
– Once it is set, the underlying ICO model (on the left) is modified accordingly
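A duration-and-distance precondition of this kind can be sketched as a simple predicate; the function names and threshold values below are invented for illustration, not taken from GISMO:

```python
# Hypothetical sketch of a gesture precondition: the corresponding transition
# may only fire if the gesture lasted long enough and travelled far enough.
def make_precondition(min_duration_s, min_distance_m):
    def precondition(duration_s, distance_m):
        return duration_s >= min_duration_s and distance_m >= min_distance_m
    return precondition

swipe_ok = make_precondition(min_duration_s=0.5, min_distance_m=0.3)
print(swipe_ok(0.8, 0.4))   # -> True  (long and wide enough)
print(swipe_ok(0.2, 0.4))   # -> False (too short)
```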
How it works – Step 3
– The generated ICO model can be executed to interpret, at runtime, the gestures received from the Kinect controller.
– The example shown here illustrates the effect of closing and opening a hand, resulting in a token that travels from the open place to the closed place and back in the corresponding ICO model.
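The token game described here can be mimicked with a minimal one-token Petri net in plain Python. The real ICO models have much richer high-level Petri net semantics, so treat this only as a sketch:

```python
# Minimal one-token Petri net, sketched in plain Python.
class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)       # place -> token count
        self.transitions = transitions     # name -> (input place, output place)

    def fire(self, name):
        src, dst = self.transitions[name]
        if self.marking.get(src, 0) < 1:
            raise ValueError(f"transition {name} is not enabled")
        self.marking[src] -= 1
        self.marking[dst] = self.marking.get(dst, 0) + 1

net = PetriNet({"open": 1, "closed": 0},
               {"close_hand": ("open", "closed"),
                "open_hand": ("closed", "open")})
net.fire("close_hand")
print(net.marking)   # -> {'open': 0, 'closed': 1}
net.fire("open_hand")
print(net.marking)   # -> {'open': 1, 'closed': 0}
```

Closing the hand moves the token from open to closed; reopening it moves the token back, exactly the round trip shown in the video.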
How it works – Step 4
– The generated ICO model can be executed to interpret, at runtime, the gestures received from the Kinect controller.
– The example shown here illustrates the effect of opening or closing one or two hands, resulting in a token that travels between the places oneClosed, twoClosed and twoOpen in the corresponding ICO model.
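The two-hand case can be sketched the same way, with a single token moving between the places twoOpen, oneClosed and twoClosed. The event labels `close` and `open` below are invented for illustration:

```python
# Sketch of the two-hand net: a single token moves between the places
# twoOpen, oneClosed and twoClosed as hands close and open again.
transitions = {
    ("twoOpen",   "close"): "oneClosed",
    ("oneClosed", "close"): "twoClosed",
    ("twoClosed", "open"):  "oneClosed",
    ("oneClosed", "open"):  "twoOpen",
}

place = "twoOpen"   # initial marking: token in twoOpen
for event in ["close", "close", "open", "open"]:
    place = transitions[(place, event)]
print(place)   # -> twoOpen (both hands closed, then both reopened)
```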
Step 5 - Putting it all together
• Top left: Raw gestural hand input is received from the Kinect controller
• Right: A domain-specific GISMO model interprets these gestures in terms of domain-specific concepts (in this case, the states and gestures to control a virtual archer character)
• Middle: A low-level ICO model (based on Petri nets) is used to execute the model at runtime.
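The whole chain can be condensed into a few lines of hypothetical Python: raw hand events drive a token through a two-place net, and each firing triggers a call on the rendering engine. The event names, places and engine-call names (raise_bow, shoot_arrow, chosen to match the archer example) are all invented for illustration:

```python
# Hypothetical end-to-end sketch: event -> net transition -> rendering call.
net = {("open", "close_hand"): "closed",
       ("closed", "open_hand"): "open"}
actions = {"close_hand": "raise_bow",    # closing the hand raises the bow
           "open_hand": "shoot_arrow"}   # reopening it shoots the arrow

place, calls = "open", []
for event in ["close_hand", "open_hand"]:   # simulated Kinect gesture stream
    place = net[(place, event)]
    calls.append(actions[event])
print(place)   # -> open
print(calls)   # -> ['raise_bow', 'shoot_arrow']
```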
Thanks for watching!