The error "ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers'" occurs when you try to import the AutoModelWithLMHead class from the transformers library. This class was deprecated and has since been removed from the library.
To fix the error, replace AutoModelWithLMHead with one of its task-specific successors: AutoModelForCausalLM (GPT-style text generation), AutoModelForMaskedLM (BERT-style masked-token prediction), or AutoModelForSeq2SeqLM (T5/BART-style sequence-to-sequence models), depending on your use case.
# Use AutoModelForCausalLM in place of the removed AutoModelWithLMHead
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize the prompt and generate a continuation
input_text = "Once upon a time"
input_tokens = tokenizer.encode(input_text, return_tensors="pt")
output_tokens = model.generate(input_tokens, max_new_tokens=20)
output_text = tokenizer.decode(output_tokens[0], skip_special_tokens=True)
print(output_text)
You can replace AutoModelForCausalLM with AutoModelForMaskedLM or AutoModelForSeq2SeqLM, depending on the architecture of the model you are loading.
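For example, if you were using AutoModelWithLMHead with a BERT-style model, the same one-line fix applies with AutoModelForMaskedLM. Below is a minimal sketch; the checkpoint name and prompt are illustrative, and it assumes the "bert-base-uncased" checkpoint can be downloaded from the Hugging Face Hub.

```python
# Use AutoModelForMaskedLM for masked-token prediction instead of
# the removed AutoModelWithLMHead
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "bert-base-uncased"  # illustrative checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token there
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
predicted_word = tokenizer.decode(predicted_id)
print(predicted_word)
```

The same pattern works for AutoModelForSeq2SeqLM with an encoder-decoder checkpoint such as a T5 model, using model.generate() as in the causal example above.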