Pentti Kanerva
UC Berkeley,
Redwood Center for Theoretical Neuroscience
Stanford, CSLI
pkanerva@csli.stanford.edu
. Complex behavior
- Perception, learning
- Concepts, thought, language, ambiguity
- Flexibility, adaptivity
. Robustness
. Energy efficiency
- 20 W
- 40 billion (4 x 10^10) neurons
- 240 trillion (2.4 x 10^14) synapses
. Speed of learning
. Reasoning by analogy
. ...
KEY OBSERVATIONS
man ≉ lake
man ≈ fisherman ≈ fish ≈ lake
man ≈ plumber ≈ water ≈ lake
What is new?
AN EXAMPLE OF HD ALGORITHM:
Identify the Language
METHOD
DATA
COMPUTING a PROFILE
Rotation of coordinates: T is rotated two positions and H one position
before the three vectors are multiplied coordinatewise.

  T = (+1 -1 -1 +1 -1 -1 . . . +1 +1 -1 -1)
  H = (+1 -1 +1 +1 +1 +1 . . . +1 -1 +1 -1)
  E = (+1 +1 +1 -1 -1 +1 . . . +1 -1 +1 +1)
-------------------------------------------------
THE = (+1 +1 -1 +1 . . . . . . +1 +1 -1 -1)
In symbols:
THE = rrT * rH * E
where
cos(X, X) = 1
cos(X,-X) = -1
cos(X, Y) = 0 if X and Y are orthogonal
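The trigram encoding and the cosine properties above can be sketched in NumPy. This is a minimal illustration, not the talk's actual implementation: the letter vectors, the dimensionality, and the use of np.roll for the rotation operator r are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality of the hypervectors (assumed)

# A random bipolar (+1/-1) vector for each letter of an illustrative alphabet.
letters = {ch: rng.choice([-1, 1], size=D) for ch in "THEA"}

def rot(v, k=1):
    """Rotate coordinates: shift the vector k positions (the 'r' operator)."""
    return np.roll(v, k)

def cos(x, y):
    return float(x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))

# Trigram vector: THE = rrT * rH * E (coordinatewise product)
THE = rot(letters["T"], 2) * rot(letters["H"], 1) * letters["E"]

print(cos(THE, THE))    # 1: identical vectors
print(cos(THE, -THE))   # -1: opposite vectors
# A different trigram is pseudo-orthogonal: cosine near 0
THA = rot(letters["T"], 2) * rot(letters["H"], 1) * letters["A"]
print(cos(THE, THA))
```

Because the letter vectors are random in 10,000 dimensions, the cosine between two different trigram vectors is close to 0, exactly as the cosine rules above require for (pseudo-)orthogonal vectors.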
[Figure: two-dimensional map of language profiles for 21 European languages
(Italian, Romanian, Portuguese, Spanish, Slovene, French, Bulgarian, Czech,
Slovak, English, Greek, Polish, Lithuanian, Latvian, Estonian, Finnish,
Hungarian, Dutch, Danish, German, Swedish); related languages fall near
one another.]
X ≈ E, #, A, I, R, O
Summary of Algorithm
Speed
Reference
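The algorithm summarized above can be sketched end to end: sum the trigram vectors of a text into a profile, then identify a test text by the most similar profile. A minimal sketch under assumptions: the alphabet, the toy training texts, and all function names are illustrative, not the corpus or code from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

# One random bipolar vector per character (illustrative alphabet).
alphabet = "abcdefghijklmnopqrstuvwxyz "
item = {ch: rng.choice([-1, 1], size=D) for ch in alphabet}

def trigram(a, b, c):
    # rrA * rB * C : rotate, rotate, multiply (as in the THE example)
    return np.roll(item[a], 2) * np.roll(item[b], 1) * item[c]

def profile(text):
    """A language profile is the sum of all trigram vectors in the text."""
    text = "".join(ch for ch in text.lower() if ch in item)
    p = np.zeros(D)
    for i in range(len(text) - 2):
        p += trigram(text[i], text[i + 1], text[i + 2])
    return p

def cos(x, y):
    return float(x @ y) / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12)

# Toy training texts (placeholders, not the actual corpus).
train = {"english": "the quick brown fox jumps over the lazy dog " * 20,
         "pseudo":  "zxq vrk pzl qwx jvz " * 40}
profiles = {lang: profile(t) for lang, t in train.items()}

def identify(text):
    """Return the language whose profile is most similar to the text's."""
    return max(profiles, key=lambda lang: cos(profile(text), profiles[lang]))

print(identify("the dog jumps over the fox"))  # -> english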
- Computing in superposition
. "ADD" vectors
- Coordinatewise majority: A = [B + C + D]
. "MULTIPLY" vectors
Example: h = {x = a, y = b, z = c}

ENCODING h = {x = a, y = b, z = c}

X   = 10010...01
A   = 00111...11
----------------
X*A = 10101...10 -> 1 0 1 0 1 ... 1 0   x = a

Y   = 10001...10
B   = 11111...00
----------------
Y*B = 01110...10 -> 0 1 1 1 0 ... 1 0   y = b

Z   = 01101...01
C   = 10001...01
----------------
Z*C = 11100...00 -> 1 1 1 0 0 ... 0 0   z = c
----------------------------
Sum      = 2 2 3 1 1 ... 2 0
Majority = 1 1 1 0 0 ... 1 0 = H
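The encoding step can be sketched with binary vectors, where "multiply" is coordinatewise XOR and "add" is the coordinatewise majority rule. A minimal sketch; the random seeds and variable names are illustrative, the vectors are not the specific ones on the slide.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 10_000

# Random binary hypervectors for roles (X, Y, Z) and fillers (A, B, C).
X, Y, Z, A, B, C = (rng.integers(0, 2, size=D) for _ in range(6))

# Bind each role to its filler with coordinatewise XOR ("multiply").
bound = [X ^ A, Y ^ B, Z ^ C]

# Superpose with the coordinatewise majority rule: H = [X*A + Y*B + Z*C].
total = np.sum(bound, axis=0)
H = (total >= 2).astype(int)   # majority of 3 votes

def rel_dist(u, v):
    """Relative Hamming distance between two binary vectors."""
    return float(np.mean(u != v))

# H is similar to each bound pair (distance ~0.25) but unrelated
# vectors sit at distance ~0.5.
print(rel_dist(H, X ^ A))  # ~0.25
print(rel_dist(X, Y))      # ~0.5
```

Each bit of H agrees with each bound pair with probability 3/4 (the pair's bit wins the majority unless both other pairs vote against it), which is why the record stays recognizably similar to everything stored in it.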
"Un"bind H H = 1 1 1 0 0 ... 1 0
with the X = 1 0 0 1 0 ... 0 1
inverse ------------------------
of X X*H = 0 1 1 1 0 ... 1 1 = A' ≈ A
|
v
.----------------------.
| |
| ASSOCIATIVE MEMORY |
| finds |
| nearest neighbor |
| among stored vectors |
| |
'----------------------'
|
v
0 0 1 1 1 ... 1 1 = A
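The decoding step above — unbind with the inverse of X, then clean up the noisy result in associative memory — can be sketched the same way. A minimal sketch, assuming XOR binding (which is its own inverse) and a dictionary standing in for the associative memory; the names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
D = 10_000

# Re-create the encoded record H = [X*A + Y*B + Z*C] with binary vectors.
X, Y, Z, A, B, C = (rng.integers(0, 2, size=D) for _ in range(6))
H = (np.sum([X ^ A, Y ^ B, Z ^ C], axis=0) >= 2).astype(int)

# "Un"bind with the inverse of X (XOR is its own inverse): A' ~ A, plus noise.
A_noisy = X ^ H

# Associative memory: find the nearest neighbor among the stored vectors.
memory = {"A": A, "B": B, "C": C, "X": X, "Y": Y, "Z": Z}

def cleanup(v):
    """Return the name of the stored vector nearest to v (Hamming distance)."""
    return min(memory, key=lambda name: int(np.sum(memory[name] != v)))

print(cleanup(A_noisy))  # -> A
```

A_noisy sits at relative distance ~0.25 from A but ~0.5 from every other stored vector, so the nearest-neighbor search recovers A reliably.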
In symbols:
Encoding of h = {x = a, y = b, z = c}
with majority rule:
H = [X*A + Y*B + Z*C]
- Addition
- Multiplication
- Permutation
[Figure: binomial distribution of relative Hamming distances from an
arbitrary point of the space (curve drawn for N = 15). In high dimensions
the distribution concentrates around 0.5: a 0.999999998 fraction of the
space lies between relative distances 0.47 and 0.53 from any given point,
leaving only 0.000000001 closer than 0.47 and 0.000000001 farther than
0.53.]
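The concentration of distances is easy to verify empirically: sample random binary vectors and measure their relative distances from a fixed point. A minimal sketch; the dimensionality and sample count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
D = 10_000

# Relative Hamming distances from one point to many random points
# concentrate tightly around 0.5 as the dimensionality grows.
point = rng.integers(0, 2, size=D)
others = rng.integers(0, 2, size=(1000, D))
dists = np.mean(others != point, axis=1)

print(dists.mean())   # ~0.5
print(dists.std())    # ~0.005: nearly everything sits near 0.5
print(float(np.mean((dists > 0.47) & (dists < 0.53))))  # fraction in (0.47, 0.53)
```

With D = 10,000 the standard deviation of the relative distance is about 0.005, so the band from 0.47 to 0.53 is six standard deviations wide on each side, which is where the 0.999999998 figure comes from.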
Major similarities
Major differences
- transparent
- scalable
SUMMARY
- Autonomous robots
WHAT NEXT?
. Explore HD algorithms
- Machine learning
- Language
- Big Data
. Explore nanomaterials
Thank You!
Supported by
Abstract
Circuits for computing in high dimensions are thousands of bits wide, but
the components need be neither ultra-reliable nor fast. Thus the
architecture is a good match for emerging nanotechnology, with
applications in many areas of machine learning. I will demonstrate
high-dimensional computing with a simple algorithm for identifying
languages.