Natural Language Processing

Published: 09 Oct 2015 Category: deep_learning

Tutorials

Practical Neural Networks for NLP

Structured Neural Networks for NLP: From Idea to Code

Understanding Deep Learning Models in NLP

http://nlp.yvespeirsman.be/blog/understanding-deeplearning-models-nlp/

Deep learning for natural language processing, Part 1

https://softwaremill.com/deep-learning-for-nlp/

Neural Models

Unifying Visual-Semantic Embeddings with Multimodal Neural Language Models

Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

Visualizing and Understanding Neural Models in NLP

Character-Aware Neural Language Models

Skip-Thought Vectors

A Primer on Neural Network Models for Natural Language Processing

Neural Variational Inference for Text Processing

Sequence to Sequence Learning

Generating Text with Deep Reinforcement Learning

MUSIO: A Deep Learning based Chatbot Getting Smarter

Translation

Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation

Neural Machine Translation by Jointly Learning to Align and Translate

Multi-Source Neural Translation

Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism

Modeling Coverage for Neural Machine Translation

A Character-level Decoder without Explicit Segmentation for Neural Machine Translation

NEMATUS: Attention-based encoder-decoder model for neural machine translation

Variational Neural Machine Translation

Neural Network Translation Models for Grammatical Error Correction

Linguistic Input Features Improve Neural Machine Translation

Sequence-Level Knowledge Distillation

Neural Machine Translation: Breaking the Performance Plateau

Tips on Building Neural Machine Translation Systems

Semi-Supervised Learning for Neural Machine Translation

EUREKA-MangoNMT: A C++ toolkit for neural machine translation for CPU

Deep Character-Level Neural Machine Translation

Neural Machine Translation Implementations

Google’s Neural Machine Translation System: Bridging the Gap between Human and Machine Translation

Learning to Translate in Real-time with Neural Machine Translation

Is Neural Machine Translation Ready for Deployment? A Case Study on 30 Translation Directions

Fully Character-Level Neural Machine Translation without Explicit Segmentation

Navigational Instruction Generation as Inverse Reinforcement Learning with Neural Machine Translation

Neural Machine Translation in Linear Time

Neural Machine Translation with Reconstruction

A Convolutional Encoder Model for Neural Machine Translation

Toward Multilingual Neural Machine Translation with Universal Encoder and Decoder

MXNMT: MXNet based Neural Machine Translation

Doubly-Attentive Decoder for Multi-modal Neural Machine Translation

Massive Exploration of Neural Machine Translation Architectures

Depthwise Separable Convolutions for Neural Machine Translation

Deep Architectures for Neural Machine Translation

Marian: Fast Neural Machine Translation in C++

Sockeye
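
Most of the neural MT systems above are attention-based encoder-decoders. As a point of reference, here is a minimal numpy sketch of the core operation, (scaled) dot-product attention over encoder states; shapes, names, and the random toy data are illustrative assumptions, not any particular system's API:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    """Scaled dot-product attention over a source sequence.

    query:  (d,)   current decoder state
    keys:   (T, d) encoder states used for addressing
    values: (T, d) encoder states that are read (often identical to keys)
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = keys @ query / np.sqrt(query.shape[0])  # (T,)
    weights = softmax(scores)                        # sums to ~1.0
    context = weights @ values                       # (d,)
    return context, weights

rng = np.random.default_rng(0)
T, d = 5, 8
enc = rng.normal(size=(T, d))   # toy encoder states
q = rng.normal(size=d)          # toy decoder state
ctx, w = attention(q, enc, enc)
```

At each decoding step the context vector is concatenated with (or added to) the decoder state before predicting the next target word; Bahdanau-style models use an extra feed-forward scorer instead of the dot product, but the addressing-then-weighted-read structure is the same.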

Summarization

Extraction of Salient Sentences from Labelled Documents

A Neural Attention Model for Abstractive Sentence Summarization

A Convolutional Attention Network for Extreme Summarization of Source Code

Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond

textsum: Text summarization with TensorFlow

How to Run Text Summarization with TensorFlow

Reading Comprehension

Text Understanding with the Attention Sum Reader Network

A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task

Consensus Attention-based Neural Networks for Chinese Reading Comprehension

Separating Answers from Queries for Neural Reading Comprehension

Attention-over-Attention Neural Networks for Reading Comprehension

Teaching Machines to Read and Comprehend CNN News and Children Books using Torch

Reasoning with Memory Augmented Neural Networks for Language Comprehension

Bidirectional Attention Flow for Machine Comprehension

NewsQA: A Machine Comprehension Dataset

Gated-Attention Readers for Text Comprehension

Get To The Point: Summarization with Pointer-Generator Networks
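
Several of the readers above (the Attention Sum Reader, gated-attention readers, the pointer-generator) answer by pointing at document tokens rather than generating from a vocabulary: token-position scores are softmaxed, and the probabilities of a candidate's repeated mentions are summed. A toy numpy sketch of that pointer-sum step; the document, scores, and candidate set are made-up illustrations:

```python
import numpy as np

def attention_sum(token_scores, doc_tokens, candidates):
    """Pointer-sum attention: P(answer=c) = sum of attention mass
    over every document position where candidate c occurs."""
    e = np.exp(token_scores - token_scores.max())
    attn = e / e.sum()                      # softmax over document positions
    probs = {}
    for c in candidates:
        mask = np.array([t == c for t in doc_tokens])
        probs[c] = attn[mask].sum()         # aggregate repeated mentions
    return probs

doc = ["obama", "met", "putin", "then", "obama", "left"]
scores = np.array([2.0, 0.1, 1.5, 0.0, 1.0, 0.2])  # e.g. dot(token_state, query)
p = attention_sum(scores, doc, {"obama", "putin"})
best = max(p, key=p.get)
```

The aggregation over repeated mentions is what distinguishes this from plain argmax attention: a frequently mentioned entity can win even if no single mention has the top score.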

Language Understanding

Recurrent Neural Networks with External Memory for Language Understanding

Neural Semantic Encoders

Neural Tree Indexers for Text Understanding

Better Text Understanding Through Image-To-Text Transfer

Text Classification

Convolutional Neural Networks for Sentence Classification

Recurrent Convolutional Neural Networks for Text Classification

Character-level Convolutional Networks for Text Classification

A C-LSTM Neural Network for Text Classification

Rationale-Augmented Convolutional Neural Networks for Text Classification

Text classification using DIGITS and Torch7

Recurrent Neural Network for Text Classification with Multi-Task Learning

Deep Multi-Task Learning with Shared Memory

Virtual Adversarial Training for Semi-Supervised Text Classification

Adversarial Training Methods for Semi-Supervised Text Classification

Sentence Convolution Code in Torch: Text classification using a convolutional neural network

Bag of Tricks for Efficient Text Classification

Actionable and Political Text Classification using Word Embeddings and LSTM

Implementing a CNN for Text Classification in TensorFlow

fancy-cnn: Multiparadigm Sequential Convolutional Neural Networks for text classification

Convolutional Neural Networks for Text Categorization: Shallow Word-level vs. Deep Character-level

Tweet Classification using RNN and CNN

Hierarchical Attention Networks for Document Classification

AC-BLSTM: Asymmetric Convolutional Bidirectional LSTM Networks for Text Classification

Generative and Discriminative Text Classification with Recurrent Neural Networks

Adversarial Multi-task Learning for Text Classification

Deep Text Classification Can be Fooled

Deep neural network framework for multi-label text classification

Multi-Task Label Embedding for Text Classification
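
Many entries in this section follow the same convolutional recipe (word embeddings, a window convolution, max-over-time pooling, a softmax classifier). A forward-pass-only numpy sketch of that pipeline, with toy random weights and hyperparameters chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
V, d, n_filters, width, n_classes = 100, 16, 8, 3, 2

E = rng.normal(0, 0.1, (V, d))                   # word embedding table
W = rng.normal(0, 0.1, (n_filters, width * d))   # convolution filters
Wo = rng.normal(0, 0.1, (n_classes, n_filters))  # output layer

def forward(token_ids):
    x = E[token_ids]                              # (T, d) embedded sentence
    T = len(token_ids)
    # convolution: each filter scores every width-word window
    windows = np.stack([x[i:i + width].ravel() for i in range(T - width + 1)])
    feat = np.maximum(windows @ W.T, 0)           # ReLU, (T-width+1, n_filters)
    pooled = feat.max(axis=0)                     # max-over-time pooling
    logits = Wo @ pooled
    e = np.exp(logits - logits.max())
    return e / e.sum()                            # class probabilities

p = forward([4, 17, 23, 56, 9])
```

Max-over-time pooling is what makes the classifier length-invariant: each filter contributes only its strongest activation, wherever in the sentence it fired.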

Text Clustering

Self-Taught Convolutional Neural Networks for Short Text Clustering

Alignment

Aligning Books and Movies: Towards Story-like Visual Explanations by Watching Movies and Reading Books

Dialog

Visual Dialog

Papers, code and data from FAIR for various memory-augmented nets with application to text understanding and dialogue.

Neural Emoji Recommendation in Dialogue Systems

Memory Networks

Neural Turing Machines

Memory Networks

End-To-End Memory Networks

Reinforcement Learning Neural Turing Machines - Revised

Learning to Transduce with Unbounded Memory

How to Code and Understand DeepMind’s Neural Stack Machine

Ask Me Anything: Dynamic Memory Networks for Natural Language Processing

Ask Me Even More: Dynamic Memory Tensor Networks (Extended Model)

Structured Memory for Neural Turing Machines

Dynamic Memory Networks for Visual and Textual Question Answering

Neural GPUs Learn Algorithms

Hierarchical Memory Networks

Convolutional Residual Memory Networks

NTM-Lasagne: A Library for Neural Turing Machines in Lasagne

Evolving Neural Turing Machines for Reward-based Learning

Hierarchical Memory Networks for Answer Selection on Unknown Words

Gated End-to-End Memory Networks

Can Active Memory Replace Attention?

A Taxonomy for Neural Memory Networks
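
The memory-network papers above share one core mechanism: soft-address a set of memory slots with the current query, read a weighted sum, and fold it back into the controller state. A single end-to-end memory network (MemN2N) hop sketched in numpy; the slot count, dimensions, and random data are illustrative assumptions:

```python
import numpy as np

def memn2n_hop(query, memory_in, memory_out):
    """One hop of an end-to-end memory network:
    address memories with the query, read a weighted sum of contents."""
    scores = memory_in @ query
    e = np.exp(scores - scores.max())
    p = e / e.sum()                  # soft attention over memory slots
    o = p @ memory_out               # read vector
    return query + o                 # updated controller state

rng = np.random.default_rng(1)
slots, d = 6, 10
m_in = rng.normal(size=(slots, d))   # input (addressing) embeddings
m_out = rng.normal(size=(slots, d))  # output (content) embeddings
q = rng.normal(size=d)
u = memn2n_hop(q, m_in, m_out)
```

Multi-hop reasoning is just this function applied repeatedly, feeding each hop's updated state back in as the next query; everything is differentiable, so the whole stack trains end-to-end.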

Papers

Globally Normalized Transition-Based Neural Networks

A Decomposable Attention Model for Natural Language Inference

Improving Recurrent Neural Networks For Sequence Labelling

Recurrent Memory Networks for Language Modeling

Tweet2Vec: Learning Tweet Embeddings Using Character-level CNN-LSTM Encoder-Decoder

Learning text representation using recurrent convolutional neural network with highway layers

Ask the GRU: Multi-task Learning for Deep Text Recommendations

From phonemes to images: levels of representation in a recurrent neural model of visually-grounded language learning

Visualizing Linguistic Shift

A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks

Deep Learning applied to NLP

https://arxiv.org/abs/1703.03091

Attention Is All You Need

Recent Trends in Deep Learning Based Natural Language Processing

HotFlip: White-Box Adversarial Examples for NLP

No Metrics Are Perfect: Adversarial Reward Learning for Visual Storytelling

Interesting Applications

Data-driven HR - Résumé Analysis Based on Natural Language Processing and Machine Learning

sk_p: a neural program corrector for MOOCs

Neural Generation of Regular Expressions from Natural Language with Minimal Domain Knowledge

emoji2vec: Learning Emoji Representations from their Description

Inside-Outside and Forward-Backward Algorithms Are Just Backprop (Tutorial Paper)

Cruciform: Solving Crosswords with Natural Language Processing

Smart Reply: Automated Response Suggestion for Email

Deep Learning for RegEx

Learning Python Code Suggestion with a Sparse Pointer Network

End-to-End Prediction of Buffer Overruns from Raw Source Code via Neural Memory Networks

https://arxiv.org/abs/1703.02458

Convolutional Sequence to Sequence Learning

DeepFix: Fixing Common C Language Errors by Deep Learning

Hierarchically-Attentive RNN for Album Summarization and Storytelling

Project

TheanoLM - An Extensible Toolkit for Neural Network Language Modeling

NLP-Caffe: natural language processing with Caffe

DL4NLP: Deep Learning for Natural Language Processing

Combining CNN and RNN for spoken language identification

Character-Aware Neural Language Models: LSTM language model with CNN over characters in TensorFlow

Neural Relation Extraction with Selective Attention over Instances

deep-simplification: Text simplification using RNNs

lamtram: A toolkit for language and translation modeling using neural networks

Lango: Language Lego

Sequence-to-Sequence Learning with Attentional Neural Networks

harvardnlp code

Seq2seq: Sequence to Sequence Learning with Keras

debug seq2seq

Recurrent & convolutional neural network modules

Datasets

Datasets for Natural Language Processing

Blogs

How to read: Character level deep learning

Heavy Metal and Natural Language Processing

Sequence To Sequence Attention Models In PyCNN

https://talbaumel.github.io/Neural+Attention+Mechanism.html

Source Code Classification Using Deep Learning

http://blog.aylien.com/source-code-classification-using-deep-learning/

My Process for Learning Natural Language Processing with Deep Learning

https://medium.com/@MichaelTeifel/my-process-for-learning-natural-language-processing-with-deep-learning-bd0a64a36086

Convolutional Methods for Text

https://medium.com/@TalPerry/convolutional-methods-for-text-d5260fd5675f

Word2Vec

Word2Vec Tutorial - The Skip-Gram Model

http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/

Word2Vec Tutorial Part 2 - Negative Sampling

http://mccormickml.com/2017/01/11/word2vec-tutorial-part-2-negative-sampling/

Word2Vec Resources

http://mccormickml.com/2016/04/27/word2vec-resources/
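
The two tutorials above cover skip-gram and negative sampling; a minimal numpy sketch of one SGD step of skip-gram with negative sampling (SGNS), with toy vocabulary size, dimensions, and learning rate of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
V, d, lr = 50, 8, 0.1
W_in = rng.normal(0, 0.1, (V, d))   # "input" (center-word) vectors
W_out = rng.normal(0, 0.1, (V, d))  # "output" (context-word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives):
    """One SGD step maximizing
    log sigma(v_c . u_o) + sum_k log sigma(-v_c . u_neg_k)."""
    v = W_in[center].copy()
    ids = [context] + list(negatives)    # positive first, then sampled negatives
    labels = np.array([1.0] + [0.0] * len(negatives))
    u = W_out[ids]                       # (k+1, d)
    p = sigmoid(u @ v)
    g = p - labels                       # gradient of logistic loss wrt scores
    W_out[ids] -= lr * g[:, None] * v    # update context vectors
    W_in[center] -= lr * (g @ u)         # update center vector
    return -np.log(p[0] + 1e-12) - np.sum(np.log(1.0 - p[1:] + 1e-12))

l0 = sgns_step(3, 7, [12, 25, 40])
l1 = sgns_step(3, 7, [12, 25, 40])
# loss should drop when the same (center, context) pair is repeated
```

In a real trainer the negatives are drawn from a smoothed unigram distribution (the 3/4-power trick discussed in the negative-sampling tutorial) rather than fixed as here.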

Demos

AskImage.org - Deep Learning for Answering Questions about Images

Talks / Videos

Navigating Natural Language Using Reinforcement Learning

Resources

So, you need to understand language data? Open-source NLP software can help!

Curated list of resources on building bots

Notes for deep learning on NLP

https://medium.com/@frank_chung/notes-for-deep-learning-on-nlp-94ddfcb45723#.iouo0v7m7