19-475-0402 NATURAL LANGUAGE PROCESSING WITH DEEP LEARNING
Core/Elective: Elective        Semester: 4        Credits: 4
Course Description
Natural language processing (NLP) is one of the most important technologies of the information
age. Applications of NLP are everywhere because people communicate almost everything in
language: web search, advertising, email, customer service, language translation, radiology
reports, and more. Recently, deep learning approaches have achieved very high performance across
many different NLP tasks. In this course, students learn to implement, train, debug, visualize,
and invent their own neural network models. The course provides a deep excursion into
cutting-edge research in deep learning applied to NLP.
Course Objectives
To understand the neural network approach to learning and processing natural language data
To know advanced concepts in natural language processing
To learn to implement, train, debug, and visualize deep neural network models for language processing
Course Content
Module I
Word Vectors - Singular Value Decomposition - Skip-gram - Continuous Bag of Words (CBOW) -
Negative Sampling - Distributed Representations of Words and Phrases and their Compositionality -
Efficient Estimation of Word Representations in Vector Space - Advanced word vector
representations - Language models - Softmax - Single-layer networks
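
To make the skip-gram with negative sampling objective concrete, here is a minimal NumPy sketch
of a single SGD update for one (center, context) pair. The vocabulary size, dimensions, learning
rate, and uniform negative sampling are illustrative assumptions, not a reference implementation.

    import numpy as np

    rng = np.random.default_rng(0)
    V, d = 5000, 100                              # vocabulary size, embedding dim (arbitrary)
    W_in = rng.normal(scale=0.01, size=(V, d))    # center-word ("input") vectors
    W_out = rng.normal(scale=0.01, size=(V, d))   # context-word ("output") vectors

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sgns_step(center, context, negatives, lr=0.025):
        """One SGD step of skip-gram with negative sampling for a single
        (center, context) pair plus k sampled negative words."""
        v = W_in[center]                              # (d,)
        ids = np.concatenate(([context], negatives))  # true context first, then negatives
        labels = np.zeros(len(ids)); labels[0] = 1.0  # 1 for the real pair, 0 for negatives
        u = W_out[ids]                                # (k+1, d)
        g = sigmoid(u @ v) - labels                   # gradient of the logistic loss wrt scores
        W_in[center] -= lr * (g @ u)                  # update the center vector
        W_out[ids] -= lr * np.outer(g, v)             # update context/negative vectors

    # toy usage: word 10 as center, word 42 as observed context, 5 uniform negatives
    sgns_step(10, 42, rng.integers(0, V, size=5))

In practice, negatives are drawn from a unigram distribution raised to the 3/4 power rather
than uniformly, as in Mikolov et al.'s paper on distributed representations listed above.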
Module II
Neural networks and backpropagation for named entity recognition - A Neural Network for Factoid
Question Answering over Paragraphs - Grounded Compositional Semantics for Finding and Describing
Images with Sentences - Deep Visual-Semantic Alignments for Generating Image Descriptions -
Recursive Deep Models for Semantic Compositionality over a Sentiment Treebank
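
The window-based NER model is a natural place to see backpropagation written out by hand.
The sketch below assumes a hypothetical pre-concatenated window vector and arbitrary layer
sizes; it computes the softmax cross-entropy loss and exact gradients for one example.

    import numpy as np

    rng = np.random.default_rng(0)
    d_in, d_h, n_tags = 150, 100, 5   # window vector size, hidden units, NER tags (arbitrary)
    W1 = rng.normal(scale=0.1, size=(d_h, d_in)); b1 = np.zeros(d_h)
    W2 = rng.normal(scale=0.1, size=(n_tags, d_h)); b2 = np.zeros(n_tags)

    def forward_backward(x, y, lr=0.01):
        """Forward pass plus backprop for one window vector x and gold tag index y."""
        h = np.tanh(W1 @ x + b1)                        # hidden layer
        logits = W2 @ h + b2
        p = np.exp(logits - logits.max()); p /= p.sum() # softmax probabilities
        loss = -np.log(p[y])                            # cross-entropy loss
        dlogits = p.copy(); dlogits[y] -= 1.0           # dL/dlogits = p - onehot(y)
        dW2 = np.outer(dlogits, h); db2 = dlogits
        dh = W2.T @ dlogits
        dz = (1.0 - h**2) * dh                          # derivative of tanh
        dW1 = np.outer(dz, x); db1 = dz
        for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
            param -= lr * grad                          # in-place SGD update
        return loss

    x = rng.normal(size=d_in)   # stand-in for concatenated window word embeddings
    print(forward_backward(x, y=2))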
Module III
Introduction to TensorFlow - Large-Scale Machine Learning on Heterogeneous Distributed Systems -
Recurrent neural networks for language modeling - Extensions of the recurrent neural network
language model - Opinion Mining with Deep Recurrent Neural Networks
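
As a small bridge between the TensorFlow introduction and recurrent language modeling, the
following sketch defines a next-word prediction model with the TensorFlow 2 Keras API. The
vocabulary size, dimensions, and shifted-copy targets are toy assumptions for illustration.

    import numpy as np
    import tensorflow as tf

    V, d, h = 10000, 128, 256   # vocab size, embedding dim, hidden size (arbitrary)

    # Token-level RNN language model: predict the next word at every position.
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(V, d),
        tf.keras.layers.SimpleRNN(h, return_sequences=True),
        tf.keras.layers.Dense(V),                 # logits over the vocabulary
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    )

    # toy usage: targets are the inputs shifted left by one position
    x = np.random.randint(0, V, size=(32, 20))    # 32 sequences of 20 token ids
    y = np.roll(x, -1, axis=1)                    # next-token targets (wraps at the end; toy)
    model.fit(x, y, epochs=1, verbose=0)

Swapping SimpleRNN for LSTM or GRU is a one-line change, which is the jumping-off point for
Module IV.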
Module IV
GRUs and LSTMs for machine translation - Recursive neural networks for parsing - Parsing with
Compositional Vector Grammars - Subgradient Methods for Structured Prediction - Parsing Natural
Scenes and Natural Language with Recursive Neural Networks - Recursive Deep Models for Semantic
Compositionality over a Sentiment Treebank - Dynamic Pooling and Unfolding Recursive Autoencoders
for Paraphrase Detection - Improved Semantic Representations from Tree-Structured Long Short-Term
Memory Networks
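
The recursive models in this module share one core operation: composing two child vectors into
a parent vector over a parse tree. A minimal NumPy sketch of that composition follows; the
dimension, initialization, and toy embedding table are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 50                                           # composition dimension (arbitrary)
    W = rng.normal(scale=0.1, size=(d, 2 * d)); b = np.zeros(d)

    def compose(tree, embed):
        """Recursively compose a binary parse tree into one vector:
        parent = tanh(W [left; right] + b), as in basic recursive neural networks."""
        if isinstance(tree, str):                    # leaf: look up the word vector
            return embed[tree]
        left, right = tree
        l, r = compose(left, embed), compose(right, embed)
        return np.tanh(W @ np.concatenate([l, r]) + b)

    # toy usage on the tree ((the movie) (was great))
    embed = {w: rng.normal(scale=0.1, size=d) for w in ["the", "movie", "was", "great"]}
    phrase_vec = compose((("the", "movie"), ("was", "great")), embed)

The resulting phrase vector can then be fed to a softmax classifier at every node, which is
how the sentiment treebank models listed above are trained.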
Module V
Convolutional neural networks for sentence classification - Sequence to Sequence Learning with
Neural Networks - Neural Machine Translation by Jointly Learning to Align and Translate -
Dynamic Memory Networks for NLP
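
The alignment mechanism of "Neural Machine Translation by Jointly Learning to Align and
Translate" reduces to a small computation. The NumPy sketch below scores encoder states against
the previous decoder state with additive attention; dimensions and initializations are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    d_h, d_a, T = 64, 32, 7                          # hidden size, attention size, source length
    W_a = rng.normal(scale=0.1, size=(d_a, d_h))     # projects the decoder state
    U_a = rng.normal(scale=0.1, size=(d_a, d_h))     # projects each encoder state
    v_a = rng.normal(scale=0.1, size=d_a)

    def attend(s_prev, H):
        """Additive attention: score each encoder state h_t against the previous
        decoder state s_prev, softmax the scores, return the context vector."""
        e = np.array([v_a @ np.tanh(W_a @ s_prev + U_a @ h) for h in H])   # (T,) scores
        a = np.exp(e - e.max()); a /= a.sum()                              # alignment weights
        return a @ H, a                                                    # context, weights

    H = rng.normal(size=(T, d_h))                    # encoder hidden states, one source sentence
    context, weights = attend(rng.normal(size=d_h), H)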
REFERENCES
1. Yoav Goldberg, Neural Network Methods for Natural Language Processing, Morgan & Claypool Publishers, 1st ed., 2017
2. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, 1st ed., MIT Press, 2017
3. Nikhil Buduma and Nicholas Locascio, Fundamentals of Deep Learning: Designing Next-Generation Machine Intelligence Algorithms, 1st ed., Shroff/O'Reilly, 2017
4. Josh Patterson and Adam Gibson, Deep Learning: A Practitioner's Approach, 1st ed., Shroff/O'Reilly, 2017