Webinar on “Let’s explore Language Model: Google BERT”

Organized By: WIE IEEE AU SB

Total number of participants: 50 (approx.)

Date: 19th July 2020

Venue: Google Meet

Speaker: Krupa Galiya

Topics Covered: 

  • What NLP is and the basics of NLP
  • Introduction to BERT: Bidirectional Encoder Representations from Transformers (BERT) is a pre-training technique for Natural Language Processing (NLP) developed by Google.
  • History of BERT: BERT has its origins in pre-training contextual representations, including Semi-supervised Sequence Learning, Generative Pre-Training, ELMo, and ULMFiT. Unlike previous models, BERT is a deeply bidirectional, unsupervised language representation, pre-trained using only a plain-text corpus.
  • How to apply the BERT model to search
  • Comparison of the BERT model with other models
  • Applications of BERT: applying BERT models to both ranking and featured snippets in Search
  • BERT using TensorFlow (a minimal code sketch follows this list)