BUG: ollama.create() fails with an ollama server message indicating a missing path or modelfile, but the CLI client works with the same information #451
Comments
Working Modelfile is attached for reference. |
Are you trying to create a model? Based on your kwargs it seems that you are looking for https://github.com/ollama/ollama-python/blob/main/examples/chat.py |
I can't speak to the internal workings. I just know that, yes, I am trying
to make a model. I am using `ollama.create()` as described in the README,
and this is the end result and backtrace.
|
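(For reference, a minimal sketch of the kind of `ollama.create()` call described above, assuming the newer keyword-based signature that the 0.4.x Python client moved to; the parameter names `from_` and `system` and the model names are assumptions here and may differ in your installed version.)

```python
import ollama

# Hedged sketch of a create() call in the newer keyword style.
# Model names and parameters are illustrative, not taken from the report.
response = ollama.create(
    model='my-assistant',      # name of the new model to create
    from_='llama3.2',          # base model to build from (assumed keyword)
    system='You are a helpful assistant.',
    stream=False,
)
print(response)
```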
To be clear, I am not using `ollama.chat()` as shown in the example you
linked and do not want to.
|
Think I see the issue - can you make sure your ollama version is at 0.5.12? - https://github.com/ollama/ollama/releases/tag/v0.5.12 |
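(One quick way to confirm what version the server itself reports, assuming the standard Ollama REST endpoint `/api/version` on the default host and port:)

```python
import httpx

# Ask the local Ollama server which version it is running.
# Assumes the default address and the /api/version endpoint.
resp = httpx.get("http://localhost:11434/api/version")
resp.raise_for_status()
print(resp.json())  # e.g. {"version": "0.5.12"}
```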
The ollama chat and chatResponse functions are broken after Ollama was updated recently; so far only Client works. I see a bunch of issues here all related to this bug. Please ping me so I can confirm it for you, and you can associate all the open issues under one. I also reached out on the Discord. |
If it wasn't clear, this issue is related, which is why I am commenting here. |
On NixOS 24.11, with ollama 0.3.12 for the CLI client and server and the Python module ollama 0.4.7, I am receiving the following traceback when executing my program:
The input immediately preceding the failure is:
The following is printed from the server logs:
This error does not occur when using the CLI utility. There are two problems here. First, according to the documentation, my call is valid and should not produce an error. Second, the error message is too vague for me to fix my code on my own.
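(While debugging, one way to surface the full server-side message is to catch the client's `ResponseError`, assuming recent ollama-python releases expose `error` and `status_code` attributes on it; the `create()` arguments below are placeholders, not the reporter's actual call.)

```python
import ollama

try:
    # Placeholder arguments; substitute the actual create() call that fails.
    ollama.create(model='my-assistant', from_='llama3.2')
except ollama.ResponseError as e:
    # Print the raw server error text and HTTP status to aid debugging.
    print('server error:', e.error)
    print('status code:', e.status_code)
```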