Step-by-step NLP examples using PyTorch and Lightning
- Character-level RNN trained on a COVID-19 FAQ file
- After training, it generates text from a prime input
- Generated text and loss values are recorded and observed during training
- Same as above, but using an LSTM block instead of a vanilla RNN
- Seq2Seq training and inference using an LSTM layer
- Same as above, but using a GRU instead of an LSTM, with additional inputs fed to the GRU and to the decoder's output layer
- Same as above, but with Bahdanau attention (not self-attention) over a bidirectional GRU encoder
- Same as above, but using pack_padded_sequence, pad_packed_sequence, and source masking in the attention to handle padded positions efficiently
- A basic implementation of the transformer architecture for a translation problem
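As a rough sketch of the character-RNN idea in the first example (all class and function names here are illustrative, not taken from the repo): a model maps character indices to logits over the vocabulary, and generation feeds a prime sequence and then samples one character at a time.

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """Minimal character-level RNN: embed -> RNN -> linear over vocab."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, hidden=None):
        # x: (batch, seq_len) of character indices
        out, hidden = self.rnn(self.embed(x), hidden)
        return self.fc(out), hidden

def generate(model, prime_idx, steps=20):
    """Feed a prime sequence, then greedily sample `steps` more characters."""
    model.eval()
    with torch.no_grad():
        logits, hidden = model(torch.tensor([prime_idx]))
        out = list(prime_idx)
        next_idx = logits[0, -1].argmax().item()
        for _ in range(steps):
            out.append(next_idx)
            logits, hidden = model(torch.tensor([[next_idx]]), hidden)
            next_idx = logits[0, -1].argmax().item()
    return out

vocab_size = 30
model = CharRNN(vocab_size)
sample = generate(model, prime_idx=[1, 2, 3], steps=10)  # prime + 10 new chars
```

The untrained model produces gibberish indices; after training on the FAQ text the same loop yields readable continuations.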
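The Bahdanau-attention example can be sketched as an additive scoring module; this is a generic illustration (dimensions and names are placeholders), assuming the bidirectional encoder states are already projected to a single `enc_dim`.

```python
import torch
import torch.nn as nn

class BahdanauAttention(nn.Module):
    """Additive attention: score = v^T tanh(W [decoder_state; encoder_state])."""
    def __init__(self, enc_dim, dec_dim, attn_dim=32):
        super().__init__()
        self.W = nn.Linear(enc_dim + dec_dim, attn_dim)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_hidden, enc_outputs, src_mask):
        # dec_hidden: (batch, dec_dim); enc_outputs: (batch, src_len, enc_dim)
        src_len = enc_outputs.size(1)
        dec = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        energy = self.v(torch.tanh(self.W(torch.cat((dec, enc_outputs), dim=2))))
        energy = energy.squeeze(2)                              # (batch, src_len)
        energy = energy.masked_fill(~src_mask, float("-inf"))   # ignore padding
        return torch.softmax(energy, dim=1)                     # attention weights

batch, src_len, enc_dim, dec_dim = 2, 5, 16, 8
attn = BahdanauAttention(enc_dim, dec_dim)
mask = torch.tensor([[1, 1, 1, 0, 0], [1, 1, 1, 1, 1]], dtype=torch.bool)
weights = attn(torch.randn(batch, dec_dim),
               torch.randn(batch, src_len, enc_dim), mask)
```

Masking before the softmax is what makes the later padding-aware example work: padded positions receive exactly zero weight.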
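The packing example above uses PyTorch's `pack_padded_sequence` / `pad_packed_sequence` utilities; a minimal sketch (dimensions are arbitrary) of how they combine with a boolean source mask:

```python
import torch
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

# Packing lets the RNN skip computation on padded positions.
rnn = torch.nn.GRU(input_size=4, hidden_size=6, batch_first=True)
batch = torch.randn(3, 7, 4)       # (batch, max_len, features), zero-padded
lengths = torch.tensor([7, 5, 2])  # true lengths of each sequence

packed = pack_padded_sequence(batch, lengths, batch_first=True,
                              enforce_sorted=False)
packed_out, hidden = rnn(packed)
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True)
# out: (3, 7, 6); positions beyond each true length are zero-filled

# Boolean mask over valid source positions, usable in the attention step
src_mask = torch.arange(7).unsqueeze(0) < lengths.unsqueeze(1)  # (3, 7)
```

With `enforce_sorted=False` the sequences need not be pre-sorted by length; PyTorch sorts and unsorts internally.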
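For the final transformer example, a minimal sketch of wiring `nn.Transformer` for translation (vocabulary sizes and dimensions are placeholders, and real training would add positional encodings and padding masks):

```python
import torch
import torch.nn as nn

src_vocab, tgt_vocab, d_model = 100, 120, 32
src_embed = nn.Embedding(src_vocab, d_model)
tgt_embed = nn.Embedding(tgt_vocab, d_model)
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       dim_feedforward=64, batch_first=True)
out_proj = nn.Linear(d_model, tgt_vocab)

src = torch.randint(0, src_vocab, (2, 9))   # (batch, src_len)
tgt = torch.randint(0, tgt_vocab, (2, 7))   # (batch, tgt_len)

# Causal mask so each target position attends only to earlier positions
tgt_mask = nn.Transformer.generate_square_subsequent_mask(7)

out = model(src_embed(src), tgt_embed(tgt), tgt_mask=tgt_mask)
logits = out_proj(out)                      # (batch, tgt_len, tgt_vocab)
```

Training would compare `logits` against the target shifted by one position with cross-entropy, as in the seq2seq examples.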
References
- seq2seq examples by Ben Trevett https://github.com/bentrevett/pytorch-seq2seq
- Deep Learning Course by Prof. Sebastian Raschka https://www.youtube.com/watch?v=1nqCZqDYPp0&list=PLTKMiZHVd_2KJtIXOW0zFhFfBaJJilH51
- Deep Learning Tutorials by UvA https://uvadlc-notebooks.readthedocs.io/en/latest/index.html
- TSAI-DeepNLP-END2.0 by extensive-nlp https://github.com/extensive-nlp/TSAI-DeepNLP-END2.0/