
Issue with LlamaParse Vendor Model Gemini 1.5 Pro #594

Open
qasim-mobi opened this issue Jan 29, 2025 · 0 comments
Labels
bug Something isn't working

Comments

qasim-mobi commented Jan 29, 2025

Describe the bug
Previously, LlamaParse correctly extracted content from files using gemini-1.5-pro as the vendor model. Now, however, for every file I test it returns 'NO_CONTENT_HERE'.
I have tried multiple files, but the result remains the same.

Job ID
84d8517e-9a44-4b25-95d4-4ff11cee2b90
173d9800-7a87-432c-96b8-4f48b3a52279
b1b7c9e9-6c3f-4d45-9a20-d7119b169016
faff6482-5396-4316-a05f-3f831cb68c9f
e9cfbaa1-8cf0-4b9f-8d79-a4b306d2af0a

Client:

  • Python Library
  • Frontend (cloud.llamaindex.ai)

Options I use:

```
result_type="markdown",
verbose=True,
num_workers=7,
use_vendor_multimodal_model=True,
vendor_multimodal_model_name="gemini-1.5-pro",
formatting_instruction="""PROMPT"""
```
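For reference, a minimal sketch of how these options would be passed to the LlamaParse Python client. The file path and prompt text are placeholders, and the live parser call is commented out because it needs the `llama_parse` package and a valid `LLAMA_CLOUD_API_KEY`:

```python
# Options reported in this issue, collected as keyword arguments.
parse_options = {
    "result_type": "markdown",
    "verbose": True,
    "num_workers": 7,
    "use_vendor_multimodal_model": True,
    "vendor_multimodal_model_name": "gemini-1.5-pro",
    "formatting_instruction": "PROMPT",  # placeholder for the actual prompt
}

# Live usage (requires llama_parse and an API key):
# from llama_parse import LlamaParse
# parser = LlamaParse(**parse_options)
# documents = parser.load_data("example.pdf")  # hypothetical input file

print(parse_options["vendor_multimodal_model_name"])
```

Reproducing with these exact options should make it easy to confirm whether the 'NO_CONTENT_HERE' result is tied to the gemini-1.5-pro vendor model specifically.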

qasim-mobi added the bug label on Jan 29, 2025