The BERT error "Some weights of the model checkpoint at ... were not used when initializing BertModel" occurs when you load a pre-trained BERT checkpoint into a model class whose architecture does not match every weight stored in that checkpoint. Strictly speaking, it is a warning from the transformers library: weights in the checkpoint that have no counterpart in the class you are initializing (typically the pre-training heads) are dropped, and the library tells you so.
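You can reproduce the message yourself. Loading the standard checkpoint into the plain BertModel class drops the masked-language-modeling and next-sentence-prediction heads stored in the checkpoint, so the library reports them as unused:
from transformers import BertModel

# bert-base-uncased ships with pre-training heads (cls.*) that the plain
# BertModel has no layers for, so loading it prints the warning
model = BertModel.from_pretrained("bert-base-uncased")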
How to fix it?
Verify the pre-trained model checkpoint
Ensure the checkpoint you pass to from_pretrained() exists and matches the BERT variant you intend to use, whether it is a model name on the Hugging Face Hub (such as bert-base-uncased) or a path to a local directory containing the saved weights.
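As a quick sanity check, you can resolve just the model configuration first. If the checkpoint name or path is wrong, this fails immediately with a clear error instead of producing a weight mismatch later:
from transformers import AutoConfig

# Resolves the checkpoint's configuration from the Hugging Face Hub
# (or a local path); a wrong name raises an OSError here, before any
# weights are downloaded
config = AutoConfig.from_pretrained("bert-base-uncased")
print(config.model_type)  # "bert"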
Import the correct BERT model class
Depending on the task you are working on, you need a different BERT model class: BertForSequenceClassification for text classification, BertForTokenClassification for named entity recognition, BertForQuestionAnswering for question answering, and so on. Make sure you import and initialize the class that matches your task.
from transformers import BertTokenizer, BertForSequenceClassification
# Load pre-trained model and tokenizer
model_name = "bert-base-uncased"
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForSequenceClassification.from_pretrained(model_name)
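Even with the correct class, this snippet still prints two messages: the pre-training heads (cls.*) in the checkpoint are reported as unused, and the new classification head is reported as newly initialized. Both are expected until you fine-tune the model. As a quick sanity check that the model loaded correctly despite the warning, you can run a single sentence through it (the example sentence is arbitrary):
inputs = tokenizer("This movie was great!", return_tensors="pt")
outputs = model(**inputs)
# BertForSequenceClassification defaults to two labels
print(outputs.logits.shape)  # torch.Size([1, 2])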
Check the weights
If you still encounter the error after verifying the pre-trained checkpoint and importing the correct model class, check that the weights in the checkpoint actually match the model architecture.
If they do not, you have two options: adapt your model architecture so it lines up with the pre-trained checkpoint, or fine-tune the model on your specific dataset and save a custom checkpoint whose weights match the class you load it into.
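To see exactly which weights were dropped and which were freshly initialized, from_pretrained() can return its loading report via the output_loading_info flag. A minimal diagnostic sketch:
from transformers import BertForSequenceClassification

# output_loading_info=True returns the model together with a report
# of the mismatch between the checkpoint and the model architecture
model, info = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", output_loading_info=True
)
print("Unused checkpoint weights:", info["unexpected_keys"])  # e.g., cls.* heads
print("Newly initialized weights:", info["missing_keys"])     # e.g., classifier.*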
I hope this helps you resolve the error.
