System Message causing no answer from Assistant #72
Comments
This is verbiage that is part of PR #64:

> For general use as shown in most examples, you should have a local ollama server running to be able to continue. To do this:
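As a quick sanity check that a local server is actually reachable before running the examples (a minimal sketch, assuming the default host used by the `ollama` Python client):

```python
import ollama

# Lists locally installed models; if no server is running at the
# default address, this call fails with a connection error.
print(ollama.list())
```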
It is also worth noting that you appear to be using an async client. For a non-async client you do not need `await`:

```python
import ollama

response = ollama.chat(model='llama2', messages=[
    {
        'role': 'user',
        'content': 'Why is the sky blue?',
    },
])
print(response['message']['content'])
```

For an async client, you should use `await`:

```python
import asyncio
from ollama import AsyncClient

async def chat():
    message = {'role': 'user', 'content': 'Why is the sky blue?'}
    response = await AsyncClient().chat(model='llama2', messages=[message])
    print(response['message']['content'])

asyncio.run(chat())
```
@connor-makowski Thanks for your feedback. I tried both solutions (the sync and async clients). The problem is that when the messages include one with role `system`, the LLM does not give an answer.
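For reference, a system message is normally passed as the first entry in `messages`. A minimal sketch of the pattern being described, assuming `llama2` and a hypothetical system prompt:

```python
import ollama

# The system message sets the assistant's behavior; the user message
# carries the question. The prompt text here is a hypothetical example.
response = ollama.chat(model='llama2', messages=[
    {'role': 'system', 'content': 'You are a concise assistant.'},
    {'role': 'user', 'content': 'Why is the sky blue?'},
])
print(response['message']['content'])
```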
What model are you using? Your snippet doesn't stream, so is it possible the LLM is responding but hasn't completed yet? In this mode, ollama will wait until it has the full response before returning from the call. This could look like a non-response if it is also generating tokens at a slow rate (due to hardware limitations).
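One way to rule that out is to stream the response so tokens print as they are generated. A minimal sketch, assuming `llama2` and a hypothetical system prompt:

```python
import ollama

# With stream=True, chat() returns an iterator of partial responses,
# so slow token generation is visible immediately.
stream = ollama.chat(
    model='llama2',
    messages=[
        {'role': 'system', 'content': 'You are a concise assistant.'},
        {'role': 'user', 'content': 'Why is the sky blue?'},
    ],
    stream=True,
)
for chunk in stream:
    print(chunk['message']['content'], end='', flush=True)
```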
Hello all,
I'm trying to use the system message as described below. Every time I use it, I don't get any answer from the LLM.
I was trying to find out whether this issue had already been reported, but I didn't find it. Can someone help me with this?
Thanks