Final Exam: Natural Language Processing
Natural Language Processing | Intermediate
- 1 video | 32s
- Includes Assessment
Final Exam: Natural Language Processing will test your knowledge and application of the topics presented throughout the Skillsoft Aspire Natural Language Processing Journey.
WHAT YOU WILL LEARN
- Illustrate phonemes, morphemes, and lexemes
- Describe syntactic and semantic analysis for NLP
- Describe heuristic approaches to solving NLP tasks
- Specify challenges with NLP in real-world problem solving
- Illustrate various NLP tools used across different industries
- Explain the basics of the NLTK ecosystem
- Describe the differences between spaCy and NLTK
- Give an overview of spaCy models and language support
- Perform spaCy installation and model loading; work with part-of-speech tagging, morphology, and lemmatization (sketched below)
- Demonstrate dependency parsing, named entities, entity linking, tokenization, and merging & splitting
- Demonstrate sentence segmentation and sentence similarity
- Illustrate and extract synonyms and identify WordNet hierarchies: hypernyms and hyponyms (sketched below)
- Demonstrate the Python re module: search, findall, finditer, groups, find and replace, and split (sketched below)
- Recall and identify anchors, character classes, greedy vs. lazy matching, backtracking, and performance
- Explore meronyms and holonyms
- Illustrate one-hot encoding, bag of words, n-grams, and TF-IDF (sketched below)
- Demonstrate data loading and a basic overview of columns
- Demonstrate simple model building and evaluation
- Restate logistic regression, SVM, naive Bayes, and boosting models
- Explain Polyglot and TextBlob, their benefits over NLTK and spaCy, and their use cases
- Explain Gensim and what it is used for
- Demonstrate query similarity using Gensim (sketched below)
- Demonstrate building an LDA model for topic modeling using Gensim
- Explain NLP with deep learning
- Illustrate various NLP use cases across different industries
- Explain the basics of the spaCy and TensorFlow ecosystems
- Illustrate an overview of the data
- Illustrate the basics of data loading and columns
- Illustrate the single-layer and multi-layer perceptron neural network architectures
- Describe the recurrent neural network architecture and how it captures context in language
- Illustrate different applications of basic neural network architectures
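The spaCy objectives above (installation, part-of-speech tagging, morphology, lemmatization, dependency parsing, named entities, and sentence segmentation) can be exercised in a few lines. A minimal sketch, assuming spaCy and the small English model `en_core_web_sm` are installed, with an invented example sentence:

```python
# pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is looking at buying a U.K. startup for $1 billion.")

# Part of speech, morphology, lemmatization, and dependency label per token
for token in doc:
    print(token.text, token.pos_, token.morph, token.lemma_, token.dep_)

# Named entities found by the default pipeline
for ent in doc.ents:
    print(ent.text, ent.label_)

# Sentence segmentation
for sent in doc.sents:
    print(sent.text)
```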
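For the regular-expression objectives, the sketch below walks through the named `re` functions plus greedy vs. lazy quantifiers; the patterns and text are invented for illustration:

```python
import re

text = "Order #123 shipped on 2023-05-01; order #456 is pending."

# re.search: first match only, with groups
m = re.search(r"#(\d+)", text)
print(m.group(0), m.group(1))         # '#123', '123'

# re.findall: all captured groups as strings
print(re.findall(r"#(\d+)", text))    # ['123', '456']

# re.finditer: iterator of match objects with positions
for m in re.finditer(r"#(\d+)", text):
    print(m.start(), m.group(1))

# Find and replace with re.sub, and splitting with re.split
print(re.sub(r"#\d+", "#XXX", text))
print(re.split(r"[;.]\s*", text))

# Anchors and character classes; greedy vs. lazy quantifiers
print(re.findall(r"^\w+", text))                  # word anchored at the start
print(re.search(r"<.*>", "<a><b>").group())       # greedy: '<a><b>'
print(re.search(r"<.*?>", "<a><b>").group())      # lazy: '<a>'
```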
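The WordNet objectives (synonyms, hypernyms, hyponyms, meronyms, and holonyms) map directly onto NLTK's WordNet corpus reader. A minimal sketch, assuming the `wordnet` corpus can be downloaded:

```python
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)

dog = wn.synsets("dog")[0]            # first sense of "dog"

# Synonyms: lemma names of the synset
print(dog.lemma_names())

# WordNet hierarchies: hypernyms (more general) and hyponyms (more specific)
print(dog.hypernyms())
print(dog.hyponyms()[:5])

# Meronyms (parts) and holonyms (wholes)
tree = wn.synsets("tree")[0]
print(tree.part_meronyms())
print(tree.member_holonyms())
```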
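For the feature-engineering and simple model-building objectives, the following sketch uses scikit-learn (an assumption; the Journey may use a different stack) with a tiny invented dataset to show bag-of-words/TF-IDF vectorization, a logistic regression model, and a basic evaluation:

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

texts = ["great product, loved it", "terrible, broke in a day",
         "works as described", "waste of money",
         "excellent value", "very disappointed"]
labels = [1, 0, 1, 0, 1, 0]           # 1 = positive, 0 = negative

# Bag of words with unigrams and bigrams (n-grams)
bow = CountVectorizer(ngram_range=(1, 2)).fit_transform(texts)
print(bow.shape)

# TF-IDF weighting for the classifier
X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(texts)

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.33, random_state=0, stratify=labels)

model = LogisticRegression().fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```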
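The Gensim objectives (query similarity and LDA topic modeling) can be sketched as follows; the three toy documents and the query are invented for illustration:

```python
from gensim import corpora, models, similarities

docs = [["machine", "learning", "with", "python"],
        ["deep", "learning", "for", "nlp"],
        ["cooking", "recipes", "for", "beginners"]]

dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(doc) for doc in docs]

# Query similarity via TF-IDF weighting and a cosine-similarity index
tfidf = models.TfidfModel(corpus)
index = similarities.MatrixSimilarity(tfidf[corpus], num_features=len(dictionary))
query = tfidf[dictionary.doc2bow(["python", "nlp", "learning"])]
print(list(index[query]))             # similarity of the query to each document

# LDA topic modeling on the same bag-of-words corpus
lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)
for topic_id, words in lda.print_topics():
    print(topic_id, words)
```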
- Describe the steps to load the Amazon product reviews dataset into Google Colaboratory
- Explain the importance of memory-based learning and the different networks it supports
- Perform review classification using a GRU (sketched below)
- Describe the gated recurrent unit and how it differs from an RNN
- Illustrate different types of LSTM networks
- Explain what transfer learning is and how it helps produce better results; explore different types of learning and of transfer learning
- Specify the advantages and challenges of transfer learning in real-world problem solving
- Describe and explore fastText and word2vec
- Implement and fine-tune a language model using ULMFiT
- Describe and explore ELMo
- Explain the GitHub bug data and problem statement; perform library loading, data loading, and a basic overview of columns
- Demonstrate exploratory data analysis (EDA): word count analysis and label analysis
- Demonstrate EDA: punctuation analysis, stop word analysis, and word clouds
- Demonstrate the fundamentals of the transformer architecture
- Discover how the attention mechanism improves NLP model results
- Explain the encoder-decoder network architecture and its applications
- Demonstrate the fundamentals of the self-attention layer in the transformer architecture
- Explore BERT transfer learning approaches
- Perform model evaluation and predict sentiment for raw text
- Discover different types of BERT models and their significance
- Perform data pre-processing and BERT tokenization (sketched below)
- Discover the fundamentals of language models and explain the basics of GPT models
- Describe the few-shot learning concept used in GPT-3
- Discover different industry use cases and challenges for GPT
- Compare greedy search, beam search, and basic sampling (sketched below)
- Explore the machine translation problem and its applications
- Use the transformer encoder to produce a new representation of the input sequence
- Frame the problem end to end and understand how single-sentence translation works
- Use the transformer decoder to produce the output target sequence
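Review classification with a GRU can be sketched with Keras as below; the six example reviews and the hyperparameters are invented stand-ins for the Amazon product reviews dataset used in the Journey:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

texts = np.array(["loved this product", "do not buy", "works great",
                  "arrived broken", "excellent quality", "very poor"])
labels = np.array([1, 0, 1, 0, 1, 0])

# Map raw strings to fixed-length integer sequences
vectorizer = layers.TextVectorization(max_tokens=1000, output_sequence_length=8)
vectorizer.adapt(texts)
X = vectorizer(texts)

model = tf.keras.Sequential([
    layers.Embedding(input_dim=1000, output_dim=16),
    layers.GRU(32),                              # gated recurrent unit layer
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=10, verbose=0)

# Predict sentiment for raw text
print(model.predict(vectorizer(np.array(["really great product"]))))
```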
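BERT tokenization is typically done with a Hugging Face `transformers` tokenizer; the sketch below uses the public `bert-base-uncased` checkpoint, which is an assumption about the exact model the course uses:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

encoded = tokenizer(
    "Transformers make transfer learning easy.",
    padding="max_length",
    truncation=True,
    max_length=16,
    return_tensors="pt",
)

print(encoded["input_ids"])        # WordPiece token ids, with [CLS]/[SEP] added
print(encoded["attention_mask"])   # 1 for real tokens, 0 for padding
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"][0]))
```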
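Greedy search, beam search, and basic sampling can be contrasted with the `generate` method of a causal language model; the sketch below uses the public GPT-2 checkpoint and an invented prompt as stand-ins for whatever model the course works with:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Natural language processing is", return_tensors="pt")

# Greedy search: always pick the single most likely next token
greedy = model.generate(**inputs, max_new_tokens=20, do_sample=False)

# Beam search: keep the 5 most likely partial sequences at each step
beam = model.generate(**inputs, max_new_tokens=20, num_beams=5, do_sample=False)

# Basic sampling: draw the next token from the model's full distribution
sampled = model.generate(**inputs, max_new_tokens=20, do_sample=True, top_k=0)

for name, out in [("greedy", greedy), ("beam", beam), ("sample", sampled)]:
    print(name, tokenizer.decode(out[0], skip_special_tokens=True))
```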