AttributeError: ModulesToSaveWrapper has no attribute dense
#2326
I could not reproduce the error. This is what I tried:

```python
from transformers import EsmForSequenceClassification
from peft import OFTConfig, TaskType, get_peft_model, AutoPeftModelForSequenceClassification

model = EsmForSequenceClassification.from_pretrained("facebook/esm2_t6_8M_UR50D", num_labels=7)
config = OFTConfig(task_type=TaskType.SEQ_CLS, target_modules=["dense"])
model_OFT = get_peft_model(model, config)
model_OFT.save_pretrained("/tmp/peft/2326")

# no error:
model_peft = AutoPeftModelForSequenceClassification.from_pretrained("/tmp/peft/2326", num_labels=7)
```

Could you please check how this differs from what you're doing? Ideally, you can post a complete reproducer for me to check. Training the model should not be necessary for this type of error.
The code I use:
First of all, thank you very much for your reply, and I'm sorry for taking up your time. I ran the same code as you in Jupyter, and the same error occurred. The complete error message is as follows:
Is there something wrong with the peft/transformers package version I'm using? Here is my version:
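To pin down exactly which versions are in play, a small helper using only the standard library can report what is installed (this is a generic sketch, not code from the thread):

```python
from importlib.metadata import PackageNotFoundError, version


def report_versions(packages):
    """Map each package name to its installed version, or None if absent."""
    report = {}
    for name in packages:
        try:
            report[name] = version(name)
        except PackageNotFoundError:
            report[name] = None
    return report


print(report_versions(["peft", "transformers", "torch"]))
```

Pasting the resulting dict into an issue report makes version mismatches easy to spot.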
And I have always wanted to ask about the warning "Some weights of EsmForSequenceClassification were not initialized from the model checkpoint at model/esm2_35M and are newly initialized: ['classifier.dense.bias', 'classifier.dense.weight', 'classifier.out_proj.bias', 'classifier.out_proj.weight']"
Since you're using a local model, I can't reproduce. Is it different from the one from HF that I used?
If the same code errors for you, it could very well be the versions. Could you try updating both PEFT and transformers to their latest versions?
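Assuming a pip-managed environment, updating both libraries looks like this (a later comment in this thread reports that PEFT 0.14.0 resolved the issue, hence the optional pin):

```shell
# upgrade both libraries to their latest releases
pip install -U peft transformers

# or require at least the PEFT version that reportedly fixed this issue
pip install -U "peft>=0.14.0"
```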
Yes, PEFT should take care of that when you specify
@BenjaminBossan
I modified the Trainer using the following code:
But the following error occurred again:
But in the end, my problem was solved by using version 0.14.0 of PEFT. Thank you very much for your help.
Great to hear that the initial issue was solved by updating the PEFT version; I'll close this issue then. Regarding your new issue, I don't have an idea off the top of my head what the reason could be. If this continues to bother you, feel free to open a new issue, ideally with a full reproducer and using an open model. Thanks.
System Info
Original model architecture:
my code:
Peft model architecture:
adapter_config.json:
Who can help?
@BenjaminBossan
Information
Tasks
examples folder

Reproduction
After training, I load the model from the saved checkpoint using the following code:
Got this error:
Expected behavior
Find out the cause and solve the problem