Update 2019-05-25: Google has integrated our Neural Turing Machine implementation into TensorFlow.

We received the best-paper award at the 27th International Conference on Artificial Neural Networks (ICANN 2018) for our paper Implementing Neural Turing Machines. Our student and co-author of the paper, Mark Collier, was at ICANN to present our work on implementing a Neural Turing Machine.

ICANN 2018 Best Paper Award – Implementing Neural Turing Machines

Neural Turing Machines (NTMs) are an instance of Memory Augmented Neural Networks, a new class of recurrent neural networks that decouples computation from memory by introducing an external memory unit. NTMs have demonstrated superior performance over Long Short-Term Memory cells in several sequence learning tasks. A number of open-source implementations of NTMs exist, but they are unstable during training and/or fail to replicate the reported performance of NTMs. Our paper presents the details of our successful implementation of a Neural Turing Machine, which learns to solve three sequential learning tasks from the original NTM paper. We find that the choice of memory contents initialization scheme is crucial to successfully implementing an NTM: networks with memory contents initialized to small constant values converge on average two times faster than with the next best memory contents initialization scheme.
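To illustrate the initialization finding, here is a minimal sketch (not the authors' code) of the constant memory initialization scheme in TensorFlow. The names, shapes, and the particular constant value are illustrative assumptions; the key idea is simply that the external memory is filled with a small constant at the start of each sequence rather than being learned or randomly initialized.

```python
import tensorflow as tf

# Illustrative sketch: fill the NTM's external memory with a small constant
# value at the start of each sequence. Shapes and the constant are assumptions.
MEMORY_SLOTS = 128   # number of memory locations (rows)
WORD_SIZE = 20       # width of each memory vector (columns)
INIT_VALUE = 1e-6    # small constant used to initialize every memory cell

def initial_memory(batch_size):
    """Return constant-initialized memory contents for a batch of sequences."""
    return tf.fill([batch_size, MEMORY_SLOTS, WORD_SIZE], INIT_VALUE)

# Usage: memory for a batch of 4 sequences, every cell set to 1e-6.
memory = initial_memory(batch_size=4)
print(memory.shape)  # (4, 128, 20)
```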

Read more about our paper here…


Joeran Beel

Please visit https://isg.beel.org/people/joeran-beel/ for more details about me.
