The ImportError: cannot import name ‘AutoModelWithLMHead’ from ‘transformers’ error occurs when you try to import the AutoModelWithLMHead class from the transformers library, but that class has been deprecated and removed from recent versions of the library.
To fix the error, use one of its replacements instead: AutoModelForCausalLM (text generation), AutoModelForMaskedLM (fill-mask), or AutoModelForSeq2SeqLM (translation, summarization), depending on your use case.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and a causal language model (GPT-2 here)
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize the prompt and generate a continuation
input_text = "Once upon a time"
input_tokens = tokenizer.encode(input_text, return_tensors="pt")
output_tokens = model.generate(input_tokens, max_new_tokens=20)

# Decode the generated token ids back into text
output_text = tokenizer.decode(output_tokens[0], skip_special_tokens=True)
print(output_text)
You can also replace AutoModelForCausalLM with AutoModelForMaskedLM or AutoModelForSeq2SeqLM, depending on what your task requires.
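For instance, a fill-mask task would use AutoModelForMaskedLM instead. The sketch below assumes the bert-base-uncased checkpoint purely for illustration; any fill-mask model works the same way:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# bert-base-uncased is an illustrative choice, not a requirement
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Ask the model to fill in the [MASK] token
text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the masked position
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

The same pattern applies to AutoModelForSeq2SeqLM: load the model and tokenizer with from_pretrained, then call generate as in the causal-LM example.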
That’s it.

Krunal Lathiya is a seasoned Computer Science expert with over eight years in the tech industry. He has deep knowledge of Data Science and Machine Learning and is versed in Python, JavaScript, PHP, R, and Golang, with experience in frameworks like Angular and React and platforms such as Node.js. His expertise spans both front-end and back-end development, and his proficiency in Python stands as a testament to his versatility and commitment to the craft.