
Page 1: Embodied Cognition with Project Intu

© 2016 IBM Corporation 1

Embodied Cognition with Project Intu

Grady Booch

Chief Architect, Watson/M

[email protected]

@grady_booch

Page 2: Embodied Cognition with Project Intu

Imagine unleashing Watson in the physical world. Give it eyes, ears, and touch, then let it act in that world with hands and feet and a face, not just as an action of force but also as an action of influence. This is embodied cognition: by placing the cognitive power of Watson in a robot, in an avatar, an object in your hand, or even in the walls of an operating room, conference room, or spacecraft, we take Watson's ability to understand and reason and draw it closer to the natural ways in which humans live and work. In so doing, we augment individual human senses and abilities, giving Watson the ability to see a patient's complete medical condition, feel the flow of a supply chain, or orchestrate the tasks in a day in the life of an individual.

Page 3: Embodied Cognition with Project Intu

Embodied Cognition Defined

§ An embodied cognition must:
– Be in and of the world
– Reason
– Learn
– Have identity

§ An embodied cognition is not:
– Simply a speech-to-text (STT) -> natural language classifier (NLC) -> action pipeline
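For contrast, the kind of pipeline the slide says embodied cognition is not can be sketched in a few lines of Python. The `transcribe` and `classify_intent` functions are hypothetical stubs, not Watson APIs; the point is what is missing: nothing persists between calls, so there is no world model, no learning, and no identity.

```python
# A deliberately naive STT -> NLC -> action pipeline: each utterance is
# handled in isolation, with no memory, reasoning, or identity.

def transcribe(audio: bytes) -> str:
    # Hypothetical speech-to-text stub; a real system would call an STT service.
    return audio.decode("utf-8")  # pretend the "audio" is already text

def classify_intent(text: str) -> str:
    # Hypothetical classifier: keyword matching stands in for a trained NLC model.
    if "elevator" in text.lower():
        return "give_directions"
    return "unknown"

ACTIONS = {
    "give_directions": lambda: "The elevator is to your left.",
    "unknown": lambda: "Sorry, I didn't understand.",
}

def pipeline(audio: bytes) -> str:
    # Nothing is retained between invocations: purely reactive behavior.
    return ACTIONS[classify_intent(transcribe(audio))]()
```

Everything an embodied cognition must do per the slide (be in and of the world, reason, learn, have identity) lives outside this function chain.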

Page 4: Embodied Cognition with Project Intu

An Agent May Be Embodied In A Robot…

Page 5: Embodied Cognition with Project Intu

An Avatar…

Page 6: Embodied Cognition with Project Intu

A Space…

Page 7: Embodied Cognition with Project Intu

Or An Object

Page 8: Embodied Cognition with Project Intu

Embodied Cognition Use Cases

– Concierge (Robot, Avatar, Space): "Where is the elevator?"
– Retail (Robot, Avatar, Space): "Do you wanna build a snowman?"
– Elder Care (Robot, Avatar, Space): "I've fallen and I can't get up!"
– Cobot (Robot): "Get me a screwdriver."
– Manufacturing (Robot): "Watch me do this."
– Transportation (Robot, Avatar, Space): "Open the pod bay doors, Watson."
– Boardroom (Avatar, Space): "Help me decide."
– Companion (Avatar, Device): "Let's play a game."

Page 9: Embodied Cognition with Project Intu

Points of Technical Confluence

– Sensory Fusion: "Who said that?"
– Theory of Mind: "I feel sad."
– Context: "I need the Phillips head screwdriver."
– Embodied Conversation: "Give me that one; no, I mean that one!"
– Learning: "It's good to see you again, Sandia!"
– Devices: "I've just picked up a fault in the AE35 unit."
– Knowledge Representation: "In what room was Alyssa working yesterday?"

Page 10: Embodied Cognition with Project Intu

Self: Basic Principles

§ Augment human capabilities

§ Learn, don’t program

§ Theory of mind and social intelligence

§ Self-understanding

§ Platform agnostic

§ Embodiment as a robot, an avatar, a device, or a space

§ Deployment as middleware with microservices in the cloud

Page 11: Embodied Cognition with Project Intu

Self: Significant Design Decisions

§ Self is a hybrid architecture, encompassing explicit symbolic computation in the center together with neural networks at the edges.

§ Inspired by Minsky’s Society of Mind, behavior takes place in the context of multiple concurrent agents that communicate opportunistically via blackboards and deterministically via peer to peer connections.

§ Inspired by Brooks’s subsumption architecture, behavior takes place in a hierarchy of cognition, from involuntary reflexes to voluntary skills to goals and planning.

§ We maintain a clear separation of concerns among perception, actuating, models, and behavior.

§ As much as possible, behavior is either taught or is learned, not programmed.

§ As much as possible – driven by this separation of concerns, the needs of packaging, and performance – all components are made manifest as RESTful microservices.

§ As much as possible, plans, skills, and reflexes are extensible.

§ Self is intentionally full of strange loops: components of Self are also parts of the models of itself.

§ Self is intentionally fractal: an instance of Self may have models of others, which themselves are other instances of Self.
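The opportunistic blackboard communication among agents described above can be sketched as a minimal publish/subscribe store. This is an illustrative sketch, not Self's actual interfaces; the `Blackboard` class and the agent names are invented for the example.

```python
from collections import defaultdict
from typing import Callable

class Blackboard:
    """Minimal blackboard: agents subscribe to topics and post opportunistically."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[object], None]) -> None:
        self._subscribers[topic].append(handler)

    def post(self, topic: str, thing: object) -> None:
        # Every subscriber to the topic sees the posted item; the poster
        # neither knows nor cares which agents are listening.
        for handler in self._subscribers[topic]:
            handler(thing)

heard = []
bb = Blackboard()
# A speech agent posts recognized text; two other agents react to it.
bb.subscribe("text", lambda t: heard.append(("emotion-agent", t)))
bb.subscribe("text", lambda t: heard.append(("goal-agent", t)))
bb.post("text", "I feel sad.")
```

Deterministic peer-to-peer communication, the other channel the slide mentions, would bypass the blackboard entirely: one agent holding a direct reference to another.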
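The subsumption-style hierarchy of cognition described above can likewise be sketched as layers consulted from most reflexive to most deliberative, where the first layer to produce an action suppresses the ones above it. The specific layer behaviors here are invented for illustration.

```python
def reflex(percept: dict):
    # Involuntary layer: immediate reactions, checked first.
    if percept.get("obstacle"):
        return "stop"
    return None  # defer to higher layers

def skill(percept: dict):
    # Voluntary layer: learned or taught skills.
    if percept.get("command") == "fetch":
        return "fetch screwdriver"
    return None

def plan(percept: dict):
    # Deliberative layer: goals and planning, consulted last.
    return "continue current plan"

LAYERS = [reflex, skill, plan]  # lowest layer subsumes those above it

def behave(percept: dict) -> str:
    for layer in LAYERS:
        action = layer(percept)
        if action is not None:
            return action
```

A reflex thus preempts a skill, and a skill preempts deliberate planning, mirroring the involuntary-to-voluntary-to-goal ordering in the design decision.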

Page 12: Embodied Cognition with Project Intu

Self Architecture

[Block diagram: Sensors & Perception, Actuators, Agents, Goals & Planning, Platform Management, Infrastructure, Meta Management, Voluntary Behavior & Skills, Involuntary Behavior & Reflexes, and Models of Others, the World, and Self.]

Page 13: Embodied Cognition with Project Intu

Project Intu: An Experimental Platform for Embodied Cognition

Intu is about behavior: about creating presence, and about giving the cognitive mind a body through which it can express itself with intelligent relevance.

Coordinates cognitive services seamlessly: a framework for tying multiple services together with different external sensors, actuators, and services (e.g., a speaker/microphone with STT/Conversation services).

Connects & manages various devices for scale: the ability to extend cognitive services to new devices, bringing reasoning and learning into the physical world with minimal coding (e.g., avatars and devices).

Expands access: the capability to connect disparate and trained knowledge sources together and to integrate third-party services for a robust experience (e.g., the Weather API or the Nexmo API).

Simplifies and enhances cognitive services: pre-trained capabilities, or "Behaviors," that work across various devices and operating systems to enrich and create an immersive experience (e.g., Emotion and Gestures).
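The first capability, coordinating sensors, actuators, and cognitive services, can be sketched as a simple sense-reason-act loop. This is a minimal Python illustration rather than the Intu SDK itself; `MicrophoneSensor`, `ConversationService`, and `SpeakerActuator` are hypothetical stand-ins for the pluggable components such a framework wires together.

```python
# Illustrative sketch (not the Intu SDK): a coordinator wires a sensor,
# a cognitive service, and an actuator into one behavior loop.

class MicrophoneSensor:
    # Hypothetical sensor: yields an utterance instead of raw audio.
    def read(self) -> str:
        return "where is the elevator"

class ConversationService:
    # Hypothetical stand-in for an STT/Conversation service pairing.
    def respond(self, text: str) -> str:
        if "elevator" in text:
            return "The elevator is down the hall to your left."
        return "I'm not sure."

class SpeakerActuator:
    # Records what it would speak, for inspection.
    def __init__(self):
        self.spoken = []
    def say(self, text: str) -> None:
        self.spoken.append(text)

def run_once(sensor, service, actuator) -> None:
    # One pass of the sense -> reason -> act loop the framework coordinates.
    actuator.say(service.respond(sensor.read()))

speaker = SpeakerActuator()
run_once(MicrophoneSensor(), ConversationService(), speaker)
```

Because the sensor, service, and actuator are pluggable, the same loop could drive an avatar's renderer instead of a speaker, which is the kind of device swapping the framework is meant to make cheap.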

ibm.biz/ProjectIntu

Embodiments: avatars, in-the-walls, devices, spaces, wearables, robotics.

Page 14: Embodied Cognition with Project Intu

Project Intu: An Experimental Platform for Embodied Cognition

ibm.biz/ProjectIntu

Page 15: Embodied Cognition with Project Intu

Project Intu: An Experimental Platform for Embodied Cognition

ibm.biz/ProjectIntu

Page 16: Embodied Cognition with Project Intu

Embodied Cognition Core Technology

Topic | Project Name | Lab
Goals & Planning | Planning | Yorktown
Goals & Planning | Embodied Action | Almaden
Goals & Planning | Multi-modal Contextual Interaction | Yorktown
Goals & Planning | Ethical Embodied Decision Support Systems | Yorktown
Models of the World, Others, and Self | Spatial Intelligence | Zurich, Yorktown
Models of the World, Others, and Self | People Intelligence | Haifa
Models of the World, Others, and Self | Theory of Mind | Yorktown
Learning | Teachable Agents | Yorktown
Sensors & Perception/Actuators | Physical Interaction | Tokyo
Sensors & Perception/Actuators | Edge Intelligence | Australia
Platform & Meta Management | Tooling | Yorktown

Solution Area | Project
Robots, Avatars, & Spaces | Watson Labs
Robots & Spaces | Watson IoT
Robots | Industrial Robots
Spaces | Elder Care
Spaces | M&A
Objects | Cognitive Objects

Page 17: Embodied Cognition with Project Intu

Imagine unleashing Watson in the physical world. Give it eyes, ears, and touch, then let it act in that world with hands and feet and a face, not just as an action of force but also as an action of influence. This is embodied cognition: by placing the cognitive power of Watson in a robot, in an avatar, an object in your hand, or even in the walls of an operating room, conference room, or spacecraft, we take Watson's ability to understand and reason and draw it closer to the natural ways in which humans live and work. In so doing, we augment individual human senses and abilities, giving Watson the ability to see a patient's complete medical condition, feel the flow of a supply chain, or orchestrate the tasks in a day in the life of an individual.