Error when fine-tuning LLaVA #6
Comments
Fixed, thank you for your feedback.
Hello author, besides evaluating on the benchmarks, is there inference code available for reference? That is, code for simple single-sample inference with a pretrained or fine-tuned checkpoint. Or which repository/model's code should I refer to?
You can use:

from vlrlhf.eval.utils import load_model_and_processor

# Load the model, its processor, and the default generation settings from a checkpoint path.
model, processor, generation_kwargs = load_model_and_processor(YourModelPath, None)

image_path = 'a.jpg'
prompt = 'Describe this image'
# Wrap the raw prompt in the model's multimodal prompt template.
prompt = processor.format_multimodal_prompt(prompt, image_path)

# Tokenize the prompt and preprocess the image; drop training-only labels before generation.
inputs = processor(texts=[prompt], images_path=[image_path], check_format=False)
inputs.pop('label', None)
outputs = model.generate(**inputs, use_cache=True, **generation_kwargs)
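For completeness, a minimal sketch of turning the generated token ids back into text. This is not part of the original answer: it assumes the processor exposes a Hugging Face-style tokenizer and that generate returns the prompt tokens followed by the new tokens; adjust the attribute names to the actual VL-RLHF API.

# Hypothetical decoding step: strip the prompt tokens, then decode the newly
# generated ids, assuming a Hugging Face-style tokenizer on the processor.
new_tokens = outputs[:, inputs["input_ids"].shape[1]:]
response = processor.tokenizer.batch_decode(new_tokens, skip_special_tokens=True)[0]
print(response)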
I'd like to ask whether the current code repository supports KTO. I see KTO-related scripts under scripts, such as kto_qwenvl. If it is not supported yet, are there plans to add it later?
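For context, KTO (Kahneman-Tversky Optimization) needs only a binary desirable/undesirable label per response rather than paired preferences. Below is a minimal, repository-independent sketch of the KTO loss in PyTorch, following the published formulation; the tensor names (policy_logps, ref_logps, is_desirable) and hyperparameters are illustrative assumptions, not this repo's API.

import torch

def kto_loss(policy_logps, ref_logps, is_desirable,
             beta=0.1, lambda_d=1.0, lambda_u=1.0):
    """Illustrative KTO loss (Ethayarajh et al., 2024); not VL-RLHF's implementation.

    policy_logps / ref_logps: summed log-probs of each response under the policy
    and the frozen reference model, shape (batch,).
    is_desirable: bool tensor, True for desirable responses, shape (batch,).
    """
    # Implicit reward: log-ratio of policy to reference.
    rewards = policy_logps - ref_logps
    # Reference point z0: a batch-level KL estimate, clamped to be non-negative and
    # detached so it acts as a baseline. (Real implementations estimate it from
    # mismatched prompt/completion pairs; the in-batch mean here is a simplification.)
    z0 = rewards.mean().clamp(min=0).detach()
    desirable_v = lambda_d * torch.sigmoid(beta * (rewards - z0))
    undesirable_v = lambda_u * torch.sigmoid(beta * (z0 - rewards))
    values = torch.where(is_desirable, desirable_v, undesirable_v)
    lambdas = torch.where(is_desirable,
                          torch.full_like(rewards, lambda_d),
                          torch.full_like(rewards, lambda_u))
    # KTO minimizes E[lambda_y - v(x, y)].
    return (lambdas - values).mean()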
How can I resolve this error?