This document discusses sequences and their importance in learning. It works through examples of predicting the next element in a sequence, assuming no prior memory. It also includes musical sequences, such as a Bach invention and the lyrics to Eminem's "Lose Yourself". Finally, it lists flexibility, autonomy, and robustness as the key properties of Sequence Memory.
6. A B C A B C A B
Past Present Future
Sequence Memory
7. A B C D X B C Y
A B C D X B C Y
A B C D X B C ?
Goal: Predict next element in this sequence
Assume: No prior memory
8. A B C D X B C Y
A B C D X B C Y
A B C D X B C Y
Goal: Predict next element in this sequence
Assume: No prior memory
9. E J E G E G J I F J B J E D A D D B I I C B F D C B J
J F F D G I F A J J C F H G I F C F G C G G F E J E
G E G J I F J B J E D A D D B I I C B F D C B J J F F
D G I F A J J C F H G I F C F G C G G F E J E G E G
J I F J B J E D A D D B I I C B F D C B J J F F D G I
F A J J C F H G I F C F G C G G F E J E G ?
14. His palms are sweaty, knees weak, arms are heavy
There's vomit on his sweater already, mom's spaghetti
He's nervous, but on the surface he looks calm and ready to drop bombs,
But he keeps on forgetting what he wrote down,
The whole crowd goes so loud
He opens his mouth, but the words won't come out
He's choking how, everybody's joking now
The clock's run out, time's up, over, bloah!
Snap back to reality, Oh there goes gravity
Oh, there goes Rabbit, he choked
He's so mad, but he won't give up that
Easy, no
He won't have it, he knows his whole back's to these ropes
It don't matter, he's dope
He knows that but he's broke
He's so stagnant, he knows
When he goes back to his mobile home, that's when it's
Back to the lab again, yo
This whole rhapsody
He better go capture this moment and hope it don't pass him
You better lose yourself in the music, the moment
You own it, you better never let it go (go)
You only get one shot, do not miss your chance to blow
This opportunity comes once in a lifetime yo
You better lose yourself in the music, the moment
You own it, you better never let it go (go)
You only get one shot, do not miss your chance to blow
This opportunity comes once in a lifetime yo
Lose Yourself
#2: My name is Rian
And for my demo
we are going to be doing
some sequence hacking.
#3: Sequence
A sequence is
A collection of
Ordered elements.
Numbers are sequences.
Words are sequences.
My speech is a sequence.
Sequences are important
Because learning occurs sequentially.
Think about anything you have ever learned:
counting, writing, programming,
walking, singing, dancing,
brushing your teeth, getting dressed.
These are all sequences
that involve a start, an end,
and a sequence of steps in between.
#4: Take a look at this sequence.
A B C A B C A ?
If I were to ask you to predict the next element in this sequence, assuming no prior memory, what would you predict?
No prior memory -> no knowledge of the alphabet or of what these symbols represent.
#5: No prior memory -> no knowledge of the alphabet or of what these symbols represent.
Cut to the demo here
#6: Why do we predict B?
We use our knowledge of the present (in this case, the blue A)
and our knowledge of the past (the repetitive ABC structure)
To generate a prediction about what the future might look like (B).
If we were to expose Sequence Memory
to this same sequence
and also ask it to
predict the next element in the sequence
we would also expect it
to predict B because
Sequence memory learns
to predict the future
based on the past and present.
Let's test this out. (demo)
During Demo:
-Pre-run
The programming language I am using for this demo is called Racket.
Racket is a dialect of Lisp, which was traditionally used for AI applications.
Lisp is known for its use of parentheses.
The input is ABCABCA.
The name of the function
that will operate on this sequence
to make predictions is sequence memory.
The display function displays the output.
-Post-run
The first column is the input data (raw values)
The second column is the predicted output (expected values)
Each row is input and output at a timestep.
First, when this sequence memory algorithm
is exposed to A, it doesn't make a prediction,
because A is a novel input it has never been exposed to before.
The same is true for B and C,
but when Sequence Memory is exposed to A for the second time, it predicts B.
-Conclusion
What we have learned from this example is that Sequence Memory can learn to make predictions from a short, simple sequence.
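The behavior described in these notes can be sketched in a few lines. This is a toy first-order predictor written in Python for illustration (the demo itself is in Racket, and the real Sequence Memory tracks more context than a single element): it remembers, for each input, what followed it last time, and predicts nothing for novel inputs.

```python
# A toy first-order sketch of the demo's behavior, not the actual
# Racket implementation or the full Sequence Memory algorithm.

def sequence_memory(sequence):
    """Return (input, prediction) pairs, one per timestep."""
    memory = {}      # element -> the element that followed it last time
    prev = None
    rows = []
    for element in sequence:
        rows.append((element, memory.get(element)))  # None = no prediction
        if prev is not None:
            memory[prev] = element                   # learn the transition
        prev = element
    return rows

for inp, pred in sequence_memory("ABCABCA"):
    print(inp, pred if pred is not None else "-")
```

On ABCABCA this reproduces the table from the demo: no prediction for the first A, B, and C (novel inputs), then B predicted when A appears for the second time.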
#8: No prior memory -> no knowledge of the alphabet or of what these symbols represent.
#9: No prior memory -> no knowledge of the alphabet or of what these symbols represent.
#10: Let's move on
to the next example,
which is even longer and even more complicated.
If I were to show you this sequence
(point to sequence) and ask you
to predict the next element in this sequence
based on the present element,
which in this case is G (in blue), and
the entire past, which in this case is everything that precedes the blue G,
what would you predict,
if it is at all possible
to make a prediction
from this input sequence?
This example is more challenging
than the last couple of examples.
But if I were to show you this differently…
#12: Johann Sebastian Bach
is one of the greatest
composers and keyboard players ever.
#13: Before Demo
This melody, called Invention 1,
is one of his shorter pieces.
What I am going to show now
is the SAME Sequence Memory algorithm
Learning to play this song.
All I am doing is feeding in data
Corresponding to
What the note is (C, D, E, B-flat)
The duration of the note (how long it’s pressed)
And where in the sequence the note occurs.
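The data format described above might look something like the following Python sketch (the tuples are illustrative placeholders, not the demo's actual encoding or a transcription of Invention 1). Feeding the sequence through twice shows why "hearing the song twice" is enough for a simple memorizing predictor: the first pass is all novel inputs, the second pass is recalled note for note.

```python
# Hypothetical encoding of the melody data: each note is a tuple of
# (pitch, duration in beats). These are illustrative placeholder values.

def predict_next(sequence):
    memory, prev, rows = {}, None, []
    for element in sequence:
        rows.append((element, memory.get(element)))  # None = no prediction
        if prev is not None:
            memory[prev] = element                   # learn the transition
        prev = element
    return rows

notes = [("C", 0.25), ("D", 0.25), ("E", 0.25), ("F", 0.25)]

# "Hearing the song twice": the first pass is all novel inputs,
# the second pass is predicted note for note.
rows = predict_next(notes + notes)
```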
After Demo
This is remarkable because
the same algorithm that just a few slides ago
was learning 123s and ABCs,
that started with no prior knowledge,
or, in computer science lingo,
as an empty list,
has learned the sequence of these notes
and the overall structure of the song
(where it starts and ends)
after being exposed to the song,
i.e. hearing it, only twice,
in about 20 seconds.
I used to play the piano,
In fact I was classically trained
for about 10 years
And in my first 20 seconds
of learning to play the piano I could not play this.
No one can, not even the great Bach
To play a song like this well
takes at least 3-4 years of study
if you are good and practice a lot.
Re-Emphasize
So I want to re-emphasize that
this is the same model,
the same exact algorithm used in the past examples.
So now let's move on
from one legendary musician
to another…
#14: Marshall Bruce Mathers III
Aka Eminem
Eminem is considered to be
one of the greatest lyricists and rappers
Of all time.
Those of you familiar with Eminem and his music
know that one of his most famous songs
is called…
#15: Before Demo
Lose Yourself.
These are the lyrics to Lose Yourself.
I wanted to find out if I could feed these lyrics
into the Sequence Memory algorithm and have it
learn to rap this song, Lose Yourself.
After Demo
This is also very impressive because,
after being exposed to this sequence of lyrics
just twice, it learned it almost perfectly.
Another nice feature about Sequence Memory
Is that unlike a biological cortex
It won’t ever forget a sequence unless you tell it to.
#16: Flexible Sequence Learning
Sequence Memory should be used because
it performs flexible sequence learning,
meaning it can learn
an arbitrary number of sequences
of arbitrary type, arbitrary length,
and arbitrary context
(assuming your computer has enough memory capacity).
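This flexibility has a simple illustration: a predictor that only compares elements for equality doesn't care whether those elements are letters, numbers, words, or note tuples. A toy Python sketch (not the demo's Racket implementation):

```python
# The predictor compares elements only for equality, so the element type
# is irrelevant: letters, words, numbers, and tuples all work the same way.

def predict_next(sequence):
    memory, prev, rows = {}, None, []
    for element in sequence:
        rows.append((element, memory.get(element)))  # None = no prediction
        if prev is not None:
            memory[prev] = element                   # learn the transition
        prev = element
    return rows

lyric = "you better lose yourself in the music".split()
rows = predict_next(lyric + lyric)
# First pass: every word is novel, so every prediction is None.
# Second pass: every word is predicted from what the first pass learned.
```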
Autonomous Sequence Learning
Sequence Memory performs autonomous sequence learning,
meaning it can learn
sequences with little to no human supervision.
This is unsupervised machine learning.
Robust Sequence Learning
Sequence Memory performs robust sequence learning,
meaning it is fault tolerant
and can handle noisy data well.
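One simple way to picture this kind of fault tolerance (a sketch of the general idea, not the mechanism Sequence Memory actually uses): count successors instead of overwriting them, and predict the most frequent one, so a single corrupted element doesn't erase what was learned.

```python
# Sketch of noise-tolerant prediction: each element keeps counts of its
# successors, and the prediction is the most frequent successor seen so
# far. An illustration of the idea, not Sequence Memory itself.
from collections import Counter, defaultdict

def robust_predict(sequence):
    counts = defaultdict(Counter)   # element -> Counter of its successors
    prev, rows = None, []
    for element in sequence:
        succ = counts[element]
        rows.append((element, succ.most_common(1)[0][0] if succ else None))
        if prev is not None:
            counts[prev][element] += 1
        prev = element
    return rows

clean = "ABCABCABC"
noisy = clean[:4] + "Z" + clean[5:]     # one corrupted element: ABCAZCABC
rows = robust_predict(clean + noisy)
# Even after seeing A followed by Z once, A still predicts B,
# because B has been its successor far more often.
```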