Hello,
I'm trying to implement a way to question PDFs locally and get answers based only on data from the documents. I have already found a way to embed the data into a vector DB (using Chroma) and then retrieve the chunks most relevant to a query with `similarity_search`. I would now like to find a way to give this context to my model so it generates an answer from it, maybe by passing a prompt in the generate call?
query = "What is the date of the start of the battle ?"
docs = db.similarity_search(query)
print(docs[0].page_content)
llm = Ollama(
model=llm_model_name,
callbacks=[StreamingStdOutCallbackHandler()],
)
my_retriever = db.as_retriever(search_kwargs={"k": 8})
response = ollama.generate(
"model": llm_model_name,
)```
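Concretely, is something like this the right approach? A minimal sketch of what I have in mind (untested; the `answer_question` helper and the prompt wording are just my own guesses, reusing `db`, `llm_model_name` and the `ollama` client from the snippet above):

```python
def answer_question(query: str) -> str:
    # 1. Retrieve the most relevant chunks from the Chroma store
    docs = db.similarity_search(query, k=8)
    context = "\n\n".join(doc.page_content for doc in docs)

    # 2. Stuff the retrieved chunks into the prompt so the model
    #    only answers from the documents
    prompt = (
        "Answer the question using only the context below. "
        "If the answer is not in the context, say you don't know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\n"
        "Answer:"
    )

    # 3. Generate the answer with the local Ollama model
    result = ollama.generate(model=llm_model_name, prompt=prompt)
    return result["response"]

print(answer_question("What is the date of the start of the battle?"))
```

Or would LangChain's `RetrievalQA` chain with `my_retriever` be the cleaner way to do this stuffing automatically?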
Thank you for your help!