Getting started with Transformer-based natural language processing (NLP)
Natural Language Processing with Transformers. This project is intended for:
- Beginners in NLP and Transformer models
- Readers with basic Python and PyTorch programming skills
- Readers interested in cutting-edge Transformer models
- Readers who understand simple deep learning models
The vision of this project:
We hope to combine vivid explanations of the underlying principles with multiple hands-on projects to help beginners quickly get started with NLP in the deep learning era.
The main reference materials for this project are:
- The Huggingface/Transformers code library
- Many excellent explanations and tutorials on Transformers
Project Members:
- erenup (Duoduo Notes), Peking University, project lead
- Zhang Fan, Datawhale, Tianjin University, Chapter 4
- Zhang Xian, Harbin Institute of Technology, Chapter 2
- Li Liqiu, Zhejiang University, Chapter 3
- Cai Jie, Peking University, Chapter 4
- hlzhang, McGill University, Chapter 4
- Tai Yunpeng, Chapter 2
- Zhang Hongxu, Chapter 2
This project draws on many excellent documents and tutorials, and sources are credited in each chapter. If anything infringes your rights, please contact the project members promptly. If you find this project helpful, please give it a Star on GitHub. Thank you.
Project content
Chapter 1 - Preface
- 1.0 - Setting up a local reading and code-running environment
- 1.1 - The rise of Transformers in NLP
Chapter 2 - Transformer Principles
- 2.1 - Illustrated attention
- 2.2 - Illustrated Transformer
- 2.2.1 - Writing a Transformer in PyTorch
- 2.2.2 - Writing a Transformer in PyTorch (optional reading)
- 2.3 - Illustrated BERT
- 2.4 - Illustrated GPT
- 2.5 - Chapter test
Chapter 3 - Writing a Transformer model: BERT
- 3.1 - How to implement BERT
- 3.2 - How to apply BERT
- 3.3 - Chapter test
Chapter 4 - Using Transformers to solve NLP tasks
- 4.0 - Foreword
- 4.1 - Text classification
- 4.2 - Sequence labeling
- 4.3 - Question answering: extractive QA
- 4.4 - Question answering: multiple choice
- 4.5 - Generation: language modeling
- 4.6 - Generation: machine translation
- 4.7 - Generation: summarization
- 4.8 - Chapter test
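As a small taste of the attention mechanism covered in Chapter 2, here is a minimal sketch of scaled dot-product attention in plain Python. The `softmax` and `attention` helpers below are illustrative, not code from the project's chapters (which use PyTorch):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention on plain lists of vectors.

    Q: list of query vectors; K, V: equal-length lists of key/value vectors.
    Returns one output vector per query.
    """
    d_k = len(K[0])
    outputs = []
    for q in Q:
        # Score each key against the query, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out = [sum(w * v[j] for w, v in zip(weights, V))
               for j in range(len(V[0]))]
        outputs.append(out)
    return outputs

# The query matches the first key, so the output leans toward the first value.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))
```

Because the attention weights sum to 1, each output vector is a convex combination of the value vectors; Chapter 2 builds this intuition up to multi-head attention and the full Transformer.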