HW #4. Hierarchical Temporal Memory
Implement episodic memory using an HTM network. Skeleton codes are available from http://rit.kaist.ac.kr/home/EE788-2017
Using the code, you are required to store and predict task episodes.
After the learning process, confirm whether the HTM can predict the next event for each of the following input cues:
- The same events learned (consecutively applied to the HTM)
- A first event of an episode
- An event in the middle of an episode
- An incomplete event (either an action or an object is missing)
In encoding: discuss the advantage of using 3 bits for an object, instead of using one bit.
In spatial pooling: try various values of NUM_COL and NUM_ACTIVATED_COL, and find the required minimum NUM_COL. Discuss what happens if the SP process is skipped, i.e., Encoding → Temporal pooling → Decoding.
In temporal pooling: try NUM_CELL from 1 to 20 and discuss the role of cells.
In anomaly detection: plot the cell status when an anomaly occurs. Explain why all the cells within the activated columns are activated.
Compare the performance of HTM to that of EM (Deep) ART.
Using the skeleton codes, fill in the following function in the ‘Fill’ folder: anomalyDetection.py
Submit the report along with your codes as a zip file named HW4_yourname.zip
Due date: May 9, 2017. Send to: [email protected]
Nupic: a Python library developed by Numenta, Inc., implementing HTM.
Installation of Nupic in Ubuntu:
Install python:
  $ sudo apt-get install python-pip
  $ sudo apt-get install python-dev
Install nupic bindings:
  $ pip install https://s3-us-west-2.amazonaws.com/artifacts.numenta.org/numenta/nupic.core/releases/nupic.bindings/nupic.bindings-0.4.1-cp27-none-linux_x86_64.whl --user
Install nupic:
  $ pip install nupic --user
Export environment variable:
  $ sudo vi /etc/bash.bashrc
  Register your nupic path, e.g., export NUPIC=/home/RITL/nupic
  Save and quit.
  $ source /etc/bash.bashrc
Check your environment variable:
  $ echo $NUPIC
Verification of the Nupic installation
Install pytest and git for testing nupic:
  $ sudo apt-get install python-pytest
  $ sudo apt-get install git
Download the nupic example files from the server:
  $ git clone https://github.com/numenta/nupic.git
  $ cd nupic
Run the nupic installation tests:
  $ ./scripts/run_nupic_tests -u
Tests should be ‘PASSED’ or ‘SKIPPED’.
For more information on Nupic installation:
  https://www.youtube.com/watch?v=1fIpgXHXAZA
  https://github.com/numenta/nupic
PyCharm: a Python IDE for developers, developed by JetBrains.
Installation of PyCharm 5:
  Copy the attached file ‘pycharm-5.04.tar.gz’ to the desired installation location.
  Unpack ‘pycharm-5.04.tar.gz’ using the following command:
  $ tar xfz pycharm-5.04.tar.gz
  Remove ‘pycharm-5.04.tar.gz’ to save disk space (optional):
  $ rm pycharm-5.04.tar.gz
  Run pycharm.sh from the bin directory:
  $ cd pycharm-5.04-community/bin
  $ ./pycharm.sh
For more information on installing PyCharm: https://www.jetbrains.com/pycharm/
Matplotlib: a Python library for visualization.
Installation of matplotlib:
  $ sudo apt-get install python-matplotlib
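As a hedged sketch of how the cell-status plot requested in the assignment might be produced with matplotlib; the array `cellStates` and its random contents are placeholders (in the homework, the data would come from the temporal pooler's cell states):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # render to a file without a display
import matplotlib.pyplot as plt

# Placeholder cell-state matrix: NUM_CELL rows x NUM_COL columns,
# 1 = active cell, 0 = inactive. Real data would come from the HTM.
NUM_COL, NUM_CELL = 200, 20
rng = np.random.RandomState(0)
cellStates = (rng.rand(NUM_CELL, NUM_COL) > 0.95).astype(int)

plt.imshow(cellStates, aspect='auto', interpolation='nearest', cmap='Greys')
plt.xlabel('Column index')
plt.ylabel('Cell index within a column')
plt.title('Cell activation status')
plt.savefig('cell_status.png')
```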
Episodic memory for task intelligence
  Episode #1: Water the flowers
  Episode #2: Bring a book in a drawer
  Episode #3: Pour the contents of a bottle
  Episode #4: Sort the toys
  Episode #5: Toast a slice of bread
HTM procedure:
  1. Encoding
  2. Spatial Pooling
  3. Temporal Pooling
  4. Decoding (for classification)
1. Encoding
To change sensory data into a binary string understood by HTM. Make sure the semantics of the particular data type are captured.
Encoder types:
  Category: actions, objects, etc.
  Scalar: real numbers, indoor temperature change, etc.
  Coordinate: GPS log, location, etc.
  Date
  …
5 task episodes

Each event pairs an action with an object (some actions take no object):

Sort toys: Grasp RS, Move to Box_1, Release, Grasp RC, Move to Box_1, Release, Grasp RT, Move to Box_1, Release, Grasp GS, Move to Box_2, Release, Grasp GC, …
Water flowers: Walk to WP, Bend, Grasp WP, Raise, Straight, Walk to FP, Move to FP, Tilt, Stand, Move to Ground, Release
Bring an object: Walk to Drawer, Grasp Handle, Pull, Release, Grasp Red arch, Raise, Grasp Handle, Push, Release, Walk to Box_1, Move to Box_1, Release
Toast a bread: Grasp Bread, Move to Toaster, Put into Toaster, Move to Button, Press Button, Grasp Bread, Move to Dish, Release
Pour the contents: Grasp Vinegar, Move to Bowl, Tilt, Stand, Move to Table, Release, Grasp Mustard, Move to Bowl, Squeeze, Move to Table, Release, Grasp Sesame, Move to Bowl, Shake, Move to Table, Release

*RS = red square, RC = red cylinder, RT = red triangle, GS = green square, GC = green cylinder, WP = watering pot, FP = flower pot
*Some parts of the ‘sort toys’ scenario are skipped for lack of space on the slide.
68 encoded inputs (total number of events)
138 bits in total for each encoded input; each element is encoded by 3 bits (46 elements × 3 bits = 138 bits):
15 actions: Grasp, move to, release, tilt, stand, push, pull, put into, shake, squeeze, walk, bend, press, straight, raise.
28 objects: Red-square, red-cylinder, red-arch, red-triangle, green-square, green-cylinder, green-arch, green-triangle, blue-square, blue-cylinder, blue-arch, blue-triangle, watering-pot, dish, bread, handle, red-vinegar, mustard, sesame, box-1, box-2, box-3, flower-pot, toaster, button, drawer, bowl, table.
2 human detection states: Detected (appear), not detected (disappear).
1 end-of-sequence marker: Completed.
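The 3-bits-per-element layout above can be sketched as a small encoder. This is an illustrative stand-in, not the skeleton's actual input_generator; the value 99 marks an absent symbol, following the skeleton's convention:

```python
import numpy as np

# Element counts taken from the slide: 15 actions, 28 objects,
# 2 human-detection states, 1 end-of-sequence marker; 3 bits each.
NUM_ACTION, NUM_OBJECT, NUM_HUMAN, NUM_END = 15, 28, 2, 1
BITS = 3
INPUT_DIM = (NUM_ACTION + NUM_OBJECT + NUM_HUMAN + NUM_END) * BITS  # 138 bits

def encode_event(action=99, obj=99, human=99, end=99):
    """Set 3 consecutive bits for each symbol that is present (99 = absent)."""
    v = np.zeros(INPUT_DIM, dtype=int)
    offsets = [0, NUM_ACTION, NUM_ACTION + NUM_OBJECT,
               NUM_ACTION + NUM_OBJECT + NUM_HUMAN]  # offsets in element counts
    for offset, val in zip(offsets, (action, obj, human, end)):
        if val != 99:
            start = (offset + val) * BITS
            v[start:start + BITS] = 1
    return v

# Grasp (action 0) | Red square (object 0): bits 0-2 and 45-47 are set.
x = encode_event(action=0, obj=0)
```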
Examples of encoded inputs
The structure of an encoded input:
  Grasp | Red square : [111 000 000 … 000 | 111 000 000 … 000 | 000 000 | 000]
  Move | Box_1 : [000 111 000 … 000 | 000 000 111 … 000 | 000 000 | 000]
  Release | : [000 000 111 … 000 | 000 000 000 … 000 | 000 000 | 000]
  (fields, left to right: Actions | Objects | Human detection | End)
Code
In five_scenarios.py:

  encodedInput = five_scenarios_input.input_generator_total(INPUT_DIM=INPUT_DIM)  # total 68 encoded inputs

In five_scenarios_input.py:

  def input_generator_total(INPUT_DIM):
      encodedInput = numpy.zeros((68, INPUT_DIM), dtype='int')
      encodedInput[0] = input_generator(0, 0, 99, 99)    # grasp, red square, none, none
      encodedInput[1] = input_generator(0, 1, 99, 99)    # grasp, red cylinder, none, none
      … …
      encodedInput[67] = input_generator(99, 99, 99, 1)  # none, none, none, finish
      return encodedInput

  def input_generator(val_1, val_2, val_3, val_4):
      '''Generate HTM inputs (binary vectors) given four kinds of symbols.

      Parameters
      ----------
      val_1 : Action type (grasp, move, …, release), from 0 to 14
      val_2 : Object type (red square, red cylinder, …, table), from 0 to 27
      val_3 : Human type (disappear, appear), 0 and 1
      val_4 : End type (finish), 0

      Returns
      -------
      encoded binary vector
      '''
2. Spatial pooling
To change the input into a sparse distributed representation (SDR). The dimension of the SDR equals the column dimension of the HTM.
In this homework, the dimension of the SDR = 200 (NUM_COL); 10 columns (NUM_ACTIVATED_COL) are activated out of 200.

  Inputs (binary string) → Spatial pooler results (the indices of the activated columns):
  Grasp, Red square : [40, 92, 168, 172, 178, 181, 182, 183, 184, 187]
  Grasp, Red cylinder : [16, 20, 27, 68, 85, 114, 126, 174, 177, 178]
  Grasp, Red arch : [27, 55, 90, 100, 140, 177, 182, 186, 187, 193]
Code
In five_scenarios.py:

  # Initialize spatial pooler
  sp = SpatialPooler(inputDimensions=(INPUT_DIM,), columnDimensions=(NUM_COL,),
                     potentialRadius=120, numActiveColumnsPerInhArea=ACTIVE_BIT,
                     globalInhibition=True, synPermActiveInc=0.04, potentialPct=1.0)

  # Start learning the SP
  SP_output = numpy.zeros((len(encodedInput), NUM_COL), dtype="int")
  for i in range(SP_ITER):  # repeat 200 times
      for j in range(len(encodedInput)):
          sp.compute(encodedInput[j], learn=True, activeArray=SP_output[j])
3. Temporal pooling
The core part of HTM. It changes the SDR (the pattern of activation of the columns) into a pattern of activation of cells, and it also makes a prediction through a pattern of predicted cells. A pattern of predicted cells represents a prediction of the next-coming input.
In this homework, the number of cells per column = 20 (NUM_CELL).
Code
In five_scenarios.py:

  # Initialize temporal pooler
  tp = TP(numberOfCols=NUM_COL, cellsPerColumn=NUM_CELL, initialPerm=0.5, connectedPerm=0.5,
          minThreshold=5, newSynapseCount=10, permanenceInc=0.05, permanenceDec=0.01,
          activationThreshold=5, globalDecay=0.01, burnIn=1,
          checkSynapseConsistency=False, pamLength=10)

  # Start learning the TP
  for i in range(TP_ITER):  # learning for 40 times
      # Through temporal pooling, HTM learns the temporal sequence of spatial patterns (SDRs).
      # When a temporal sequence ends, use tp.reset() so that HTM knows that the sequence has ended.
      # When enableLearn=True, tp learns the temporal sequence between inputs.
      # When computeInfOutput=True, tp infers the next-coming input through a pattern of predicted cells.

      ##### Sorting toys #####
      tp.compute(SP_output[0], enableLearn=True, computeInfOutput=False)   # grasp, red square, …
      # … (intermediate events of the episode elided on the slide)
      tp.compute(SP_output[67], enableLearn=True, computeInfOutput=False)  # finish
      tp.reset()

      #### Taking the contents ####
      tp.compute(SP_output[16], enableLearn=True, computeInfOutput=False)  # grasp, red-vinegar, …
      # … (intermediate events of the episode elided on the slide)
      tp.compute(SP_output[67], enableLearn=True, computeInfOutput=False)  # finish
      tp.reset()

  # The learning process is finished. Decoding is not needed during learning.
4. Decoding
To decode the pattern of predicted columns, so that users can easily understand the meaning of the pattern.
In this homework, decoding is implemented by the following function:
  five_scenarios_input.find_pattern(Columns, NUM_COL, SP_output)
This function finds the meaning of Columns. It compares the current input with all learned spatial patterns (SP_output) by counting the number of overlapping bits between the current input and each learned spatial pattern. The one with the largest number of overlapping bits is selected.
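The overlap-counting idea can be sketched in a few lines of NumPy. The function and variable names below are illustrative only, not the skeleton's actual find_pattern:

```python
import numpy as np

def best_matching_pattern(predictedColumns, SP_output):
    """Return the index of the learned SDR with the most overlapping active bits."""
    overlaps = [int(np.sum(predictedColumns * learned)) for learned in SP_output]
    return int(np.argmax(overlaps))

# Tiny example: 3 learned SDRs over 10 columns.
SP_output = np.array([[1, 1, 0, 0, 0, 0, 0, 0, 0, 0],
                      [0, 0, 1, 1, 0, 0, 0, 0, 0, 0],
                      [0, 0, 0, 0, 1, 1, 0, 0, 0, 0]])
predicted = np.array([0, 0, 1, 1, 1, 0, 0, 0, 0, 0])
best = best_matching_pattern(predicted, SP_output)  # overlaps 0, 2, 1 -> index 1
```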
Code
In five_scenarios.py:

  for t in range(1000):
      # This is NOT the learning process; this is the decoding process.

      # Encoding: get an input from the user and transform it into a binary string.
      encodedInput_for_simulation = five_scenarios_input.input_generator_2(t, NUM_COL, INPUT_DIM)

      # Spatial pooling: take an encoded input and output the corresponding SDR (SP_output_for_simulation).
      sp.compute(encodedInput_for_simulation, learn=False, activeArray=SP_output_for_simulation)

      # Temporal pooling: take an SDR and output a pattern of predicted cells (predictedCells).
      tp.compute(SP_output_for_simulation, enableLearn=False, computeInfOutput=True)
      predictedCells = tp.getPredictedState()

      # Decoding: take the pattern of predicted cells and output the predicted
      # next-coming input (procedural[k+1], a character string).
      # Change the pattern of predicted cells into a pattern of predicted columns.
      predictedColumns = five_scenarios_input.Cells2Columns(Cells=predictedCells, NUM_COL=NUM_COL)
      # Compare the next-predicted SDR (predictedColumns) with all previously learned SDR patterns (SP_output).
      # The best-matching event becomes the output, as a character string.
      procedural.append(five_scenarios_input.find_pattern(Columns=predictedColumns,
                                                          NUM_COL=NUM_COL, SP_output=SP_output,
                                                          INPUT_DIM=INPUT_DIM,
                                                          encodedInput=encodedInput)[0])
5. Anomaly detection
HTM can easily detect anomalies by comparing the predicted columns at time t−1 with the activated columns at time t.
The anomaly score is the fraction of active columns that were not predicted. Anomaly detection is implemented using the following equation:

  anomalyScore = ( |A_t| − |P_{t−1} ∩ A_t| ) / |A_t|

  where P_{t−1} = the predicted columns at time t−1,
        A_t = the active columns at time t, and
        |A_t| = the number of active columns.
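The equation above translates almost directly into NumPy, assuming binary column vectors. This is a sketch for checking understanding only; the graded implementation belongs in Fill/anomalyDetection.py:

```python
import numpy as np

def anomaly_score(activeColumns, predictedColumns):
    """Fraction of currently active columns that were not predicted at t-1.

    Both arguments are binary vectors of length NUM_COL."""
    active = np.asarray(activeColumns)
    predicted = np.asarray(predictedColumns)
    numActive = int(active.sum())
    if numActive == 0:
        return 0.0
    overlap = int(np.sum(active * predicted))  # |P_{t-1} intersect A_t|
    return float(numActive - overlap) / float(numActive)

# Example: 4 active columns, 3 of them were predicted -> score 0.25
a = np.array([1, 1, 1, 1, 0, 0])
p = np.array([1, 1, 1, 0, 1, 0])
s = anomaly_score(a, p)  # 0.25
```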
Code
In five_scenarios.py:

  # Anomaly detection
  # Except for the first loop (k = 0), calculate the anomaly value.
  if k > 0:
      anomalyScore = anomalyDetection.anomalyScore(activeColumns=SP_output_for_simulation,
                                                   predictedColumns=predictedColumns)
      # If the anomaly score exceeds a threshold (ANOMALY_THRESHOLD = 0.5), it is considered an anomaly.
      if anomalyScore > ANOMALY_THRESHOLD:
          print "anomaly detected"

In Fill/anomalyDetection.py:

  def anomalyScore(activeColumns, predictedColumns):
      # You are required to implement anomalyDetection.
      return anomalyScore
Pseudo code of the overall HTM procedure

# Learning process
FOR (i = 0; i < NUM_DATASET; i++) DO:
    EncodedInput[i] = Encode(SensoryInput[i]);
FOR (i = 0; i < NUM_DATASET; i++) DO:
    SP_output[i] = SpatialPooling(EncodedInput[i], Learn=True);
FOR (i = 0; i < NUM_DATASET; i++) DO:
    TemporalPooling(SP_output[i], Learn=True, Prediction=False);

# Prediction & decoding process
t = 0;
WHILE (TRUE) DO:
    oneEncodedInput = Encode(SensoryInputFromUser);
    oneSP_output = SpatialPooling(oneEncodedInput, Learn=False);
    oneTP_output = TemporalPooling(oneSP_output, Learn=False, Prediction=True);
    Procedural[t+1] = Decode(oneTP_output);
    PRINT 'Next Predicted Event =', Procedural[t+1];
    t++;
Results
For the ‘pouring the contents of a bottle’ scenario:
At t = 0, the first input of the scenario, (Grasp | Red-vinegar), is given to HTM.
  Current state: Grasp a red-vinegar bottle
  Next predicted state: Move to a bowl
At t = 1, the second input of the scenario, (Move-to | Bowl), is given to HTM.
  Current state: Move to a bowl
  Next predicted state: Tilt