Natural Language Processing with Deep Learning

Overview

Natural language processing (NLP) addresses a key artificial intelligence challenge: understanding complex human language. This lecture series provides a thorough introduction to cutting-edge research in deep learning applied to NLP, an approach that has recently achieved very high performance on many NLP tasks, including question answering and machine translation.

Level: Beginner
Language: English
Created by Admin corner
Last updated Wed, 08-Jun-2022
Course overview
Lecture Details

Lecture 1 introduces the concept of Natural Language Processing (NLP) and the problems NLP faces today. The concept of representing words as numeric vectors is then introduced, and popular approaches to designing word vectors are discussed.

Key phrases: Natural Language Processing, Word Vectors, Singular Value Decomposition, Skip-gram, Continuous Bag of Words (CBOW), Negative Sampling, Hierarchical Softmax, Word2Vec.
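Several of these key phrases name count-based approaches to designing word vectors. As an illustrative sketch only (a made-up toy corpus, not course material), word vectors can be derived from a word-word co-occurrence matrix via a truncated singular value decomposition:

```python
# Illustrative sketch: count-based word vectors via truncated SVD.
# The corpus below is an invented toy example, not from the lectures.
import numpy as np

corpus = [
    "i like deep learning",
    "i like nlp",
    "i enjoy flying",
]
tokens = sorted({w for line in corpus for w in line.split()})
idx = {w: i for i, w in enumerate(tokens)}

# Symmetric co-occurrence counts within a window of 1
M = np.zeros((len(tokens), len(tokens)))
for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - 1), min(len(words), i + 2)):
            if j != i:
                M[idx[w], idx[words[j]]] += 1

# Truncated SVD: keep the top-k singular directions as word vectors
U, s, Vt = np.linalg.svd(M)
k = 2
vectors = U[:, :k] * s[:k]  # each row is a k-dimensional word vector
print(vectors.shape)        # (7, 2): one 2-d vector per vocabulary word
```

The skip-gram and CBOW methods listed above learn such vectors by prediction instead of by factorizing global counts; the lecture compares the two families.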

-------------------------------------------------------------------------------

Natural Language Processing with Deep Learning

Instructors:
- Chris Manning
- Richard Socher

The course emphasizes how to implement, train, debug, visualize, and design neural network models, covering the main technologies of word vectors, feed-forward models, recurrent neural networks, recursive neural networks, convolutional neural networks, and recent models involving a memory component.

For additional learning opportunities please visit:
http://stanfordonline.stanford.edu/
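To make the "implement and train" emphasis concrete, here is a hypothetical minimal sketch (toy sizes, plain NumPy, not the course's own code) of a single skip-gram update with negative sampling, one of the word2vec training techniques the lectures cover:

```python
# Hypothetical sketch of one skip-gram negative-sampling (SGNS) SGD step.
# All sizes and indices are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
V, d = 10, 4                                  # vocabulary size, embedding dim
W_in = rng.normal(scale=0.1, size=(V, d))     # center-word ("input") vectors
W_out = rng.normal(scale=0.1, size=(V, d))    # context-word ("output") vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(center, context, negatives):
    """J = -log sigma(u_o . v_c) - sum_n log sigma(-u_n . v_c)"""
    v = W_in[center]
    l = -np.log(sigmoid(W_out[context] @ v))
    for n in negatives:
        l -= np.log(sigmoid(-W_out[n] @ v))
    return l

def sgns_step(center, context, negatives, lr=0.1):
    """One SGD step on the SGNS objective for a single (center, context) pair."""
    v = W_in[center].copy()
    u = W_out[context].copy()
    g = sigmoid(u @ v) - 1.0          # push the positive pair's score toward 1
    grad_v = g * u
    W_out[context] -= lr * g * v
    for n in negatives:
        u_n = W_out[n].copy()
        g_n = sigmoid(u_n @ v)        # push each negative sample's score toward 0
        grad_v += g_n * u_n
        W_out[n] -= lr * g_n * v
    W_in[center] -= lr * grad_v

before = sgns_loss(0, 1, [2, 3])
sgns_step(0, 1, [2, 3])
after = sgns_loss(0, 1, [2, 3])
print(after < before)                 # the step reduces the loss on this pair
```

In practice, negative samples are drawn from a smoothed unigram distribution over the vocabulary; fixed indices are used here only to show the update itself.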

Curriculum for this course
19 Lessons, 25:03:49 Hours
  • Natural Language Processing with Deep Learning
    01:11:41
  • Word Vector Representations: word2vec
    01:18:17
  • GloVe: Global Vectors for Word Representation
    01:18:40
  • Word Window Classification and Neural Networks
    01:16:43
  • Backpropagation and Project Advice
    01:18:20
  • Dependency Parsing
    01:23:07
  • Introduction to TensorFlow
    01:12:33
  • Recurrent Neural Networks and Language Models
    01:18:03
  • Machine Translation and Advanced Recurrent LSTMs and GRUs
    01:20:28
  • Review Session: Midterm Review
    01:25:01
  • Neural Machine Translation and Models with Attention
    01:21:24
  • Gated Recurrent Units and Further Topics in NMT
    01:20:00
  • End-to-End Models for Speech Processing
    01:16:35
  • Convolutional Neural Networks
    01:22:11
  • Tree Recursive Neural Networks and Constituency Parsing
    01:22:08
  • Coreference Resolution
    01:20:45
  • Dynamic Neural Networks for Question Answering
    01:18:14
  • Issues in NLP and Possible Architectures for NLP
    01:18:58
  • Tackling the Limits of Deep Learning for NLP
    01:20:41
Price: Free