Advanced NLP
Dive into RNNs, GPU-efficient code, & more.
About
Before starting the course, let’s go over what it covers and what you should know going in.
This course covers more advanced topics in NLP, such as subword tokenization (what ChatGPT uses), optimizing models for GPU performance, and RNNs (recurrent neural networks).
Is this course for you?
Before jumping into this course, I would recommend understanding most of the Generative LLMs course. If you’re iffy on a couple of topics, don’t worry; we’ll do a quick review first.
If you have an understanding of how ChatGPT works and want to dive deeper into NLP, this course is for you.
Topics
- Advanced Tokenization Techniques (e.g. Byte Pair Encoding; see the sketch after this list)
- Types of Attention (more than just Self-Attention)
- Sampling Algorithms (Beam Search, Top-K, etc.)
- Optimizing code for GPU performance (e.g. vectorization)
- Optimizers (e.g. Adam)
- RNNs (Recurrent Neural Networks)
- LSTMs and GRUs (better RNNs)
- Transformer Modifications (e.g. Mixture of Experts)
- RLHF (Reinforcement Learning from Human Feedback)
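To give a flavor of the first topic, here is a minimal Byte Pair Encoding sketch: start from characters, repeatedly find the most frequent adjacent pair, and merge it into one new symbol. The helper names and the tiny example string are illustrative choices for this teaser, not code from the course modules.

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent symbol pairs and return the most common one.
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0]

def merge_pair(tokens, pair):
    # Replace every occurrence of `pair` with a single merged symbol.
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

# Illustrative toy corpus: start from characters and apply a few merges.
tokens = list("lower lowest")
for _ in range(4):
    tokens = merge_pair(tokens, most_frequent_pair(tokens))
print(tokens)  # characters have fused into subwords like "lower" and "lowe"
```

A real tokenizer also records the order of merges so that new text can be split with the same learned vocabulary.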
Modules coming soon.