# Recurrent Neural Network Writes Music and Shakespeare Novels | Two Minute Papers #19

## Metadata

- **Channel:** Two Minute Papers
- **YouTube:** https://www.youtube.com/watch?v=Jkkjy7dVdaY
- **Date:** 23.10.2015
- **Duration:** 3:53
- **Views:** 34,302
- **Source:** https://ekstraktznaniy.ru/video/14934

## Description

Artificial neural networks are powerful machine learning techniques that can learn to recognize images or paint in the style of Van Gogh. Recurrent neural networks offer a more general model that can learn input sequences and produce output sequences. The resulting technique (Long Short-Term Memory in these examples) can write prose in the style of Tolstoy or Shakespeare, or even compose its own music.

________________________

Andrej Karpathy's original article is available here:
http://karpathy.github.io/2015/05/21/rnn-effectiveness/

Source code: https://github.com/karpathy/char-rnn

The paper "Long Short-Term Memory" by Sepp Hochreiter and Jürgen Schmidhuber is available here:
http://www.bioinf.jku.at/publications/older/2604.pdf
http://deeplearning.cs.cmu.edu/pdfs/Hochreiter97_lstm.pdf

Continuing "Let It Go" from Disney with a recurrent neural network:
https://ericye16.com/music-rnn/

Recommended for you:
Artificial Neural Networks and Deep Learning - https://www.youtube.com/watch?v=rCW

## Transcript

### Introduction [0:00]

Dear Fellow Scholars, this is Two Minute Papers with Károly Zsolnai-Fehér. Artificial neural networks are very useful tools that are able to learn and

### Recurrent Neural Networks [0:08]

recognize objects in images, or learn the style of Van Gogh and paint new pictures in his style. Today we're going to talk about recurrent neural networks. So what does the recurrent part mean? With an artificial neural network, we usually have a one-to-one relation between the input and the output. This means that one image comes in and one classification result comes out, for instance whether the image depicts a human face or a train. With recurrent neural networks, we can have a one-to-many relation between the input and the output: the input would still be an image, but the output would not be a single word but a sequence of words, a sentence that describes what we see in the image. For a many-to-one relation, a good example is
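What makes all of these relations possible is the recurrence itself: the same cell is applied at every time step, and its hidden state carries information from one step to the next. The following is a minimal numpy sketch of a vanilla RNN step, not Karpathy's actual Torch/Lua implementation (which uses LSTM cells); all sizes, names, and weights here are hypothetical.

```python
import numpy as np

# Hypothetical tiny dimensions for illustration only.
hidden_size, vocab_size = 8, 5
rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.01, (hidden_size, vocab_size))   # input  -> hidden
Whh = rng.normal(0, 0.01, (hidden_size, hidden_size))  # hidden -> hidden (the "recurrent" part)
Why = rng.normal(0, 0.01, (vocab_size, hidden_size))   # hidden -> output scores

def rnn_step(x, h):
    """One recurrence step: the new hidden state depends on the
    current input AND the previous hidden state."""
    h_new = np.tanh(Wxh @ x + Whh @ h)
    y = Why @ h_new  # unnormalized scores over the vocabulary
    return y, h_new

# A sequence of inputs runs through the SAME cell, threading the hidden
# state along -- this is what lets sequences map to sequences.
h = np.zeros(hidden_size)
for t in range(3):
    x = np.eye(vocab_size)[t]  # one-hot input at time step t
    y, h = rnn_step(x, h)
```

Because the hidden state `h` is passed forward, the output at each step can depend on everything the network has seen so far, which is exactly what one-to-many and many-to-many mappings require.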

### Sentiment Analysis [0:48]

sentiment analysis. This means that a sequence of inputs, for instance a sentence, is classified as either negative or positive. This is very useful for processing movie reviews, where we'd like to know whether the user liked or hated the movie without reading pages and pages of discussion. And finally, recurrent neural networks can also deal with many-to-many relations, translating an input sequence into an output sequence. An example of this is machine translation, which takes an input sentence and translates it into an output sentence in a different language. For another example of a many-to-many relation, let's see what the algorithm learned after reading Tolstoy's War and Peace novel, by asking it to write exactly in that style. It should be noted that generating a new novel happens letter by letter, so the algorithm is not allowed to memorize words. Let's look at the results at different stages of the training process. The initial results are, well, gibberish, but the algorithm seems to recognize immediately that words are basically a big bunch of letters that are separated by spaces. If we wait a bit more, we see that it starts to get a very rudimentary understanding of structure: for instance, a quotation mark that you have opened must be closed, and a sentence can be closed by a period and is followed by an uppercase letter. Later it starts to learn shorter and more common words, such as "fall", "that", "the", "for", "me". If we wait for longer, we see that it already gets a grasp of longer words, and smaller parts of sentences actually start to make sense. Here's a piece of Shakespeare that
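The letter-by-letter generation described above works by sampling: at each step the network outputs scores over the alphabet, one character is drawn from the resulting distribution, and that character is fed back in as the next input. Here is a self-contained sketch of that sampling loop; the random scores are a stand-in for a trained model's output, and the alphabet is hypothetical.

```python
import numpy as np

vocab = list("abcdefg ")  # toy alphabet; char-rnn uses every character in the corpus
rng = np.random.default_rng(42)

def sample_char(scores):
    """Softmax the scores into probabilities and draw one character index."""
    probs = np.exp(scores - scores.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return rng.choice(len(vocab), p=probs)

# Generate text one letter at a time. In a real model, the previous
# character and hidden state would determine the next scores.
text = []
for _ in range(20):
    scores = rng.normal(size=len(vocab))  # stand-in for the RNN's output
    text.append(vocab[sample_char(scores)])
generated = "".join(text)
```

Early in training, the score distribution is close to uniform, which is why the output looks like gibberish; as training progresses, the probabilities sharpen around letter sequences that form real words and punctuation patterns.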

### Shakespeare [2:25]

was written by the algorithm after reading all of his works. You see names that make sense, and you really have to check the text thoroughly to conclude that it's indeed not the real deal. It can also try to write math papers. I had to look for quite a bit until I realized that something is fishy here. It is not unreasonable to think that it can very easily deceive a non-expert reader. Can you believe this? This is insanity. It is also capable of learning the source code of the Linux operating system and generating new code that looks quite sensible. It can also try to continue the song "Let It Go" from the famous Disney movie Frozen. So, recurrent neural networks are amazing tools that open up completely new horizons for solving problems where either the inputs or the outputs are not one thing but a sequence of things. And now, signing off with a piece of recurrent neural network wisdom: "Well, your wit is in the care of side and debt. Bear this in mind wherever you go." Thanks for watching, and I'll see you next time!
