Deep Learning for NLP: Memory-based Networks
Natural Language Processing | Intermediate
- 12 videos | 1h 27m 5s
- Includes Assessment
- Earns a Badge
In the journey to understand deep learning models for natural language processing (NLP), the next step is memory-based networks, which are far better at handling extended context in language. While basic neural networks outperform classical machine learning (ML) models, they still fall short on larger, more complex language data problems. In this course, you will learn about memory-based networks such as the gated recurrent unit (GRU) and long short-term memory (LSTM). Explore their architectures and variants, and where they work and fail for NLP. Then, implement them on product classification data and compare the results to understand each architecture's effectiveness. Upon completing this course, you will know the basics of memory-based networks and how to implement them in TensorFlow, and you will understand the effect of memory and longer context on NLP datasets.
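As a rough orientation to the kind of model the course builds, here is a minimal sketch of an LSTM text classifier in TensorFlow/Keras. The vocabulary size, layer widths, and binary sigmoid output are illustrative assumptions, not the course's actual notebook.

```python
# Minimal sketch: an Embedding layer feeding an LSTM for binary text classification.
# Vocabulary size, layer widths, and the sigmoid output are illustrative assumptions.
import tensorflow as tf

VOCAB_SIZE = 10_000   # assumed vocabulary size after tokenization
EMBED_DIM = 64        # assumed embedding width

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),   # token ids -> dense vectors
    tf.keras.layers.LSTM(64),                           # memory cell carries long-range context
    tf.keras.layers.Dense(1, activation="sigmoid"),     # e.g., positive vs. negative review
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```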
WHAT YOU WILL LEARN
- Discover the key concepts covered in this course
- Outline the importance of memory-based learning and the different networks it supports
- Outline the gated recurrent unit (GRU) and how it differs from recurrent neural networks (RNNs)
- Outline long short-term memory (LSTM) networks and how they differ from RNNs
- Illustrate how LSTM networks work better and solve the vanishing gradient problem
- Illustrate different types of LSTM networks
- Perform data preparation for LSTM and GRU networks
- Perform review classification using GRU
- Perform review classification using LSTM
- Perform review classification using bidirectional long short-term memory (Bi-LSTM)
- Compare results of important features across different networks (a comparison sketch follows this list)
- Summarize the key concepts covered in this course
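The last objectives hinge on training the three architectures on the same data and comparing them. A minimal sketch of how such a comparison could be organized in TensorFlow/Keras, where only the recurrent layer is swapped; the layer sizes, vocabulary limit, and training settings are assumptions, not the course's exact configuration.

```python
import tensorflow as tf

def build_classifier(recurrent_layer, vocab_size=10_000, embed_dim=64):
    """Wrap a given recurrent layer in a small text classifier."""
    return tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, embed_dim),
        recurrent_layer,
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

# The three architectures compared in this course, differing only in the recurrent layer.
variants = {
    "GRU": tf.keras.layers.GRU(64),
    "LSTM": tf.keras.layers.LSTM(64),
    "Bi-LSTM": tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
}

for name, layer in variants.items():
    model = build_classifier(layer)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    # Each variant would be trained on the same prepared review sequences and its
    # validation accuracy compared against the others.
    print(f"{name}: compiled")
```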
IN THIS COURSE
-
1m 17s
-
4m 28s
In this video, you will outline the importance of memory-based learning and the different types of memory it supports.
-
9m 9s
In this video, you will outline the gated recurrent unit (GRU) and how it differs from recurrent neural networks (RNNs).
-
8m 40s
In this video, you will outline long short-term memory (LSTM) networks and how they differ from traditional recurrent neural networks (RNNs).
-
1m 53s
After completing this video, you will be able to illustrate how LSTM networks work better and solve the vanishing gradient problem.
-
4m 39s
Upon completion of this video, you will be able to illustrate different types of LSTM networks.
-
10m 35s
In this video, learn how to prepare data for LSTM and GRU networks (a data-preparation sketch follows this list).
-
10m 16s
Learn how to classify reviews using GRU.
-
10m 58s
Find out how to classify reviews using LSTM.
-
12m 9s
Learn how to classify reviews using bidirectional long short-term memory (Bi-LSTM).
-
11m 35s
Learn how to compare the results of important features across different networks.
-
1m 26s
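The data-preparation video listed above turns raw review text into fixed-length integer sequences that GRU and LSTM layers can consume. A minimal sketch of that step using Keras utilities follows; the example reviews, vocabulary limit, and sequence length are illustrative assumptions, not the course dataset.

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Hypothetical raw reviews; the course uses product classification data instead.
reviews = [
    "great product, works exactly as described",
    "stopped working after a week, very disappointed",
]

tokenizer = Tokenizer(num_words=10_000, oov_token="<OOV>")
tokenizer.fit_on_texts(reviews)                       # build the word index
sequences = tokenizer.texts_to_sequences(reviews)     # words -> integer ids
padded = pad_sequences(sequences, maxlen=100, padding="post")  # uniform length for batching
print(padded.shape)   # (2, 100)
```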
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you with the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.