
# Intent_Recognition_with_BERT-TPU

Intent recognition with a BERT model on a Google Colab TPU, using TensorFlow.

BERT (Bidirectional Encoder Representations from Transformers) is a pretrained model for Natural Language Processing (NLP) tasks, introduced by the Google AI team in October 2018. It is trained on Wikipedia and the BookCorpus dataset, which gives it a solid grasp of language and context. It comes in two versions: Base (12 encoder layers) and Large (24 encoder layers). BERT builds on several clever ideas from the NLP community, such as ELMo, the OpenAI Transformer, and the Transformer architecture.
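As a rough illustration of the idea behind this repo, here is a minimal sketch (not the repo's exact code) of fine-tuning BERT-Base for intent classification on a Colab TPU with TensorFlow 2 and TensorFlow Hub. The TF Hub handle, sequence length, and number of intent classes are assumptions for the example.

```python
# Minimal sketch: fine-tune BERT for intent classification on a Colab TPU.
# Assumptions: TF Hub BERT-Base handle, MAX_SEQ_LEN, and NUM_INTENTS below.
import tensorflow as tf
import tensorflow_hub as hub

# Connect to the Colab TPU and create a distribution strategy.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

MAX_SEQ_LEN = 128   # assumed maximum token length
NUM_INTENTS = 7     # assumed number of intent classes

def build_model():
    # BERT-Base encoder from TF Hub (any compatible handle works).
    encoder = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4",
        trainable=True)

    input_word_ids = tf.keras.Input(shape=(MAX_SEQ_LEN,), dtype=tf.int32, name="input_word_ids")
    input_mask = tf.keras.Input(shape=(MAX_SEQ_LEN,), dtype=tf.int32, name="input_mask")
    input_type_ids = tf.keras.Input(shape=(MAX_SEQ_LEN,), dtype=tf.int32, name="input_type_ids")

    outputs = encoder({
        "input_word_ids": input_word_ids,
        "input_mask": input_mask,
        "input_type_ids": input_type_ids,
    })
    # Classify the intent from the pooled [CLS] representation.
    probs = tf.keras.layers.Dense(NUM_INTENTS, activation="softmax")(outputs["pooled_output"])

    model = tf.keras.Model(
        inputs=[input_word_ids, input_mask, input_type_ids], outputs=probs)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"])
    return model

# Build and compile the model under the TPU strategy scope.
with strategy.scope():
    model = build_model()
# model.fit(train_dataset, validation_data=val_dataset, epochs=3)
```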

If you want more information about BERT, follow the links below.