Thank you so much for open-sourcing your implementation and for your great work! I have two questions, and your help would be much appreciated!
I followed the decoding-with-pre-trained-models and evaluation steps described in your repo, but I was not able to reproduce the few-shot results in Table 2. There is a gap of up to ~5 BERTScore points for both the contrastive and common summaries, and a significant ROUGE gap of up to ~4 points. Are there any other hyperparameters I should fix? For comparison, the self-supervised results are much closer to what is reported in your paper.
My process keeps getting killed while trying to create train_comm_pair.jsonl at cocosum/prep.py, line 154 (commit 2a94132), and my machine has 192 GB of RAM. I was wondering if you could share the file, or help me figure out / optimize that line?
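For context, here is a minimal sketch of the kind of streaming rewrite I had in mind, assuming the bottleneck is materializing every pair in memory before dumping to JSON. The field names and the pairing logic are placeholders, not the actual prep.py code:

```python
import json
import itertools

def write_pairs_streaming(examples, out_path):
    """Write candidate pairs to a JSONL file one line at a time,
    instead of building the full list of pairs in memory first.

    `examples` is assumed to be a list of dicts; the "source" key and the
    all-pairs combination below are illustrative only.
    """
    with open(out_path, "w", encoding="utf-8") as f:
        # itertools.combinations is lazy, so only one pair is held at a time.
        for a, b in itertools.combinations(examples, 2):
            pair = {"source": a["source"], "target": b["source"]}
            f.write(json.dumps(pair) + "\n")

# Example usage with dummy data:
# write_pairs_streaming([{"source": "doc 1"}, {"source": "doc 2"}], "train_comm_pair.jsonl")
```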
Thank you so much.
belizgunel changed the title from "Out of memory while creating train_comm_pair.jsonl" to "Cannot reproduce few-shot results (Table 2) && Out of memory while creating train_comm_pair.jsonl" on Oct 26, 2022.