How long does inference take? #12

Open
CosimoRulli opened this issue Jan 14, 2021 · 1 comment

Comments

@CosimoRulli

Hello developers,
I followed the guidelines in your README to generate the dense representations for MS MARCO Document Ranking, using the MaxP checkpoint that you provide. My process has been running for more than 80 hours on a server with a Tesla T4 GPU and an Intel Xeon Platinum CPU (looking at htop, I can see it is running on a single thread). Is such a long inference time normal? Am I missing something that would speed up this process?
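As a rough sanity check on whether 80+ hours is plausible, the expected encoding time can be estimated from collection size and per-document throughput. This is a minimal sketch, not tied to the repository's actual code; the corpus size (~3.2M documents for MS MARCO Document Ranking) and the example throughput figures are assumptions for illustration:

```python
def eta_hours(n_docs: int, docs_per_sec: float) -> float:
    """Estimate total encoding time in hours for a corpus,
    given an observed per-document throughput."""
    return n_docs / docs_per_sec / 3600.0


# MS MARCO Document Ranking has roughly 3.2M documents (assumed here).
N_DOCS = 3_200_000

# Hypothetical throughputs: a batched T4 run vs. an effectively
# single-item (unbatched) run.
print(f"batched  (~50 docs/s): {eta_hours(N_DOCS, 50):.1f} h")
print(f"unbatched (~10 docs/s): {eta_hours(N_DOCS, 10):.1f} h")
```

If the observed throughput implies a runtime far beyond these estimates, the batch size and whether the model is actually on the GPU are the first things worth checking.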

@juyongjiang

Hi, did you figure out the BM25 index generation?
