This repository contains code for the O'Reilly Live Online Training for BERT.
This training focuses on how BERT is used for a wide variety of NLP tasks, including text classification, question answering, and machine translation. It begins with an introduction to the necessary concepts, including language models and transformers, and then builds on those concepts to introduce the BERT architecture. From there, we will look at how BERT is designed as a language model that can be adapted to multiple natural language processing tasks, with hands-on examples of fine-tuning pre-trained BERT models.
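To give a sense of what the hands-on portion looks like, here is a minimal sketch of fine-tuning a pre-trained BERT model for text classification with the Hugging Face `transformers` and `datasets` libraries. The IMDb dataset, the `bert-base-uncased` checkpoint, and the hyperparameters below are illustrative choices for this sketch, not necessarily what the notebooks use.

```python
# A minimal fine-tuning sketch; assumes `transformers` and `datasets` are installed.
from transformers import (
    BertTokenizerFast,
    BertForSequenceClassification,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Load a pre-trained BERT checkpoint with a fresh classification head.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Illustrative dataset choice (IMDb sentiment), not the notebooks' own data.
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], padding="max_length", truncation=True)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bert-finetuned",
    num_train_epochs=1,
    per_device_train_batch_size=8,
)

trainer = Trainer(
    model=model,
    args=args,
    # Small random subset so the demo runs quickly.
    train_dataset=dataset["train"].shuffle(seed=42).select(range(1000)),
)
trainer.train()
```

The same pattern (pre-trained encoder plus a task-specific head) carries over to the other tasks covered in the training; only the head and the data preparation change.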
BERT is one of the most relevant NLP architectures today, and it is closely related to other important deep learning language models such as GPT-3. Both models are derived from the transformer architecture and represent an inflection point in how machines process language and context.
Notebook 1: [Introduction to BERT](Introduction%20to%20BERT.ipynb)
Notebook 2: [Pre-training and fine-tuning BERT](Fine-tuning%20BERT.ipynb)
Sinan Ozdemir is currently the Director of Data Science at Directly, where he manages the AI and machine learning models that power the company's intelligent customer support platform. Sinan is a former lecturer of data science at Johns Hopkins University and the author of multiple textbooks on data science and machine learning. He is also the founder of the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. He holds a master's degree in pure mathematics from Johns Hopkins University and is based in San Francisco, CA.