Post by dominic on Nov 30, 2016 21:13:48 GMT
I will start compiling the poster and report in this thread, and I will update this post with more info soon. Below are links to some useful pages that I will reference for the write-ups.
The two links we have been referring to most:
karpathy.github.io/2015/05/21/rnn-effectiveness/
www.tensorflow.org/versions/r0.12/tutorials/seq2seq/index.html#sequence-to-sequence-models
Dependency Recurrent Neural Language Models for Sentence Completion
cs.nyu.edu/~mirowski/pub/MirowskiVlachos_ACL2015_DependencyTreeRNN.pdf
A Neural Conversational Model (seq2seq applied to dialogue)
arxiv.org/pdf/1506.05869.pdf
Understanding LSTM Networks
colah.github.io/posts/2015-08-Understanding-LSTMs/
Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models
arxiv.org/abs/1507.04808
DEEP LEARNING FOR CHATBOTS
www.wildml.com/2016/04/deep-learning-for-chatbots-part-1-introduction/
www.wildml.com/2016/07/deep-learning-for-chatbots-2-retrieval-based-model-tensorflow/
The next two papers are referenced from the chatbot articles above:
Attention with Intention for a Neural Network Conversation Model
arxiv.org/abs/1510.08565
A Persona-Based Neural Conversation Model
arxiv.org/abs/1603.06155
I think that's enough links and papers to satisfy the graduate requirements, write the report, and pull reference material for the poster. I will read through these and start a document so we can get the poster and report done ASAP.
Recurrent Neural Network Regularization
arxiv.org/pdf/1409.2329.pdf
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
arxiv.org/pdf/1406.1078.pdf
Sequence to Sequence Learning with Neural Networks
arxiv.org/pdf/1409.3215.pdf
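For the report write-up, the LSTM gating equations from the colah post can be sketched in a few lines of NumPy. This is just an illustrative toy (the function and variable names are my own, not from any of the papers above), assuming the common formulation with a single weight matrix over the concatenated input and previous hidden state:

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step. W maps [x; h_prev] to the four stacked gate pre-activations."""
    z = W @ np.concatenate([x, h_prev]) + b
    H = h_prev.size
    i = 1 / (1 + np.exp(-z[:H]))         # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))      # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))    # output gate
    g = np.tanh(z[3*H:])                 # candidate cell update
    c = f * c_prev + i * g               # new cell state
    h = o * np.tanh(c)                   # new hidden state
    return h, c

rng = np.random.default_rng(0)
X, H = 8, 16
W = rng.normal(scale=0.1, size=(4 * H, X + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for _ in range(5):                       # run a short input sequence
    h, c = lstm_step(rng.normal(size=X), h, c, W, b)
```

In a seq2seq setup like the Sutskever et al. paper, the final (h, c) of the encoder loop would seed the decoder's state.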
-------------------------THE REPORT------------------------------
1drv.ms/w/s!AoNIuW-GkoxOgcMUmNPvtos2oGvU3w
---------------------------------------------------------------------
NIPS style format
www.sharelatex.com/templates/57591dd15174a1b0103c9973/v/0/pdf?inline=true&name=Neural%20Information%20Processing%20Systems%20(NIPS)%20Conference%202016