What is BERT? | Deep Learning Tutorial 46 (Tensorflow, Keras & Python)

What is BERT (Bidirectional Encoder Representations from Transformers), and how is it used to solve NLP tasks? This video provides a very simple explanation. I am not going to go into the details of how transformer-based architectures work; instead I will give an overview so you understand how BERT is used in NLP tasks. In the coding section we will generate sentence and word embeddings using BERT for some sample text.

We will cover topics such as:
* Word2vec vs BERT
* How BERT is trained on the masked language model and next sentence prediction tasks

⭐️ Timestamps ⭐️
00:00 Introduction
00:39 Theory
11:00 Coding in TensorFlow

Code:
BERT article:
Word2Vec video:
Deep learning playlist:
Machine learning playlist: