In the last post, we talked about the Transformer pipeline and the inner workings of the all-important tokenizer module, and at the end we made predictions using existing pre-trained models. During fine-tuning, we can adjust the weights of the model in the following two ways: update the weights of the pre-trained BERT model along…
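The excerpt is cut off, but a minimal PyTorch sketch can illustrate the first fine-tuning mode it names (updating the pre-trained BERT weights) alongside the common alternative of freezing the body and training only the new head. The model name, label count, and learning rates below are illustrative assumptions, not taken from the post:

```python
import torch
from transformers import AutoModelForSequenceClassification

# Model name and label count are illustrative assumptions, not from the post.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Mode named in the excerpt: update all weights during fine-tuning,
# the pre-trained BERT body and the new classification head alike.
optimizer_full = torch.optim.AdamW(model.parameters(), lr=2e-5)

# A common alternative (assumed here, since the excerpt is truncated):
# freeze the pre-trained body so only the classification head is trained.
for param in model.bert.parameters():
    param.requires_grad = False
optimizer_head = torch.optim.AdamW(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)
```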
1.0 – Getting started with Transformers for NLP
In this post we will go through how to do a hands-on implementation with the Hugging Face transformers library to solve a few simple NLP tasks. We will mainly focus on the hands-on part; in case you are interested in learning more about transformers and the attention mechanism, below are a few resources – Getting started with Google Bert, Neural Machine…
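As a minimal sketch of the kind of hands-on usage the post covers, the library's `pipeline` API runs a pre-trained model on a simple NLP task in a few lines. The choice of the sentiment-analysis task and the input sentence are my own illustrative assumptions:

```python
from transformers import pipeline

# Load a default pre-trained model for sentiment analysis
# (the task and input text are illustrative, not from the post).
classifier = pipeline("sentiment-analysis")
result = classifier("Getting started with the transformers library is easy.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```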