
error: undefined - MacOS Ollama #224

Open · nicolas-g opened this issue Jun 25, 2024 · 27 comments
Labels: bug (Something isn't working)

nicolas-g commented Jun 25, 2024
Describe the bug
I'm not able to search when running Perplexica locally as a Docker container with Ollama.

To Reproduce

Ollama installation

$ brew install ollama 
$ brew services start ollama
$ ollama pull llama3:latest

Confirm the service is working:

$ curl http://127.0.0.1:11434

Ollama is running

config file:

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768" # SearxNG API URL
OLLAMA = "http://host.docker.internal:11434"

Open http://localhost:3000 and do any search.

Expected behavior
Perplexica should be able to return answers for any query with no errors.

Additional context

Anytime I try to search for something I see error: undefined in the log entries:

docker logs perplexica-perplexica-backend-1 -f
yarn run v1.22.19
$ node dist/app.js
info: WebSocket server started on port 3001
info: Server is running on port 3001


error: undefined
error: undefined
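
A quick sanity check, not from the thread itself: curl the Ollama endpoint from inside the backend container (the container name is taken from the docker logs command above, and this assumes curl is available in the image). A reachable Ollama returns the same banner as above:

$ docker exec perplexica-perplexica-backend-1 curl http://host.docker.internal:11434
Ollama is running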
nicolas-g added the bug label on Jun 25, 2024
ItzCrazyKns (Owner) commented Jun 25, 2024

Tried reproducing but couldn't see anything. Can you provide more information? Can you provide a screenshot of the settings menu?

delfireinoso commented Jun 26, 2024

I run Perplexica on a Mac. Ollama is running via the Mac app; I can tell because the menu-bar icon shows when Ollama has a pending update.

If you follow the Perplexica compose installation and put OLLAMA = "http://host.docker.internal:11434" in the config.toml file, you should have Perplexica running.

By default the Perplexica frontend is on localhost:3000; check whether anything is listening there.
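
To check what is listening on port 3000, a standard macOS/Linux invocation (offered here as a sketch, not something from the thread):

$ lsof -nP -iTCP:3000 -sTCP:LISTEN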

Arun-chaitanya commented

Getting the same bug.

Arun-chaitanya commented

[Screenshot 2024-06-29 at 3:00:41 PM]

When I try to run it for the first time.

ItzCrazyKns (Owner) commented

> When I try to run it for the first time.

What are the logs from the backend container?

Arun-chaitanya commented Jun 29, 2024

2024-06-29 15:08:06 yarn run v1.22.19
2024-06-29 15:08:06 $ node dist/app.js
2024-06-29 15:08:06 info: WebSocket server started on port 3001
2024-06-29 15:08:06 info: Server is running on port 3001
2024-06-29 15:08:15 error: undefined
2024-06-29 15:08:16 error: undefined
2024-06-29 15:08:25 node:internal/process/promises:289
2024-06-29 15:08:25             triggerUncaughtException(err, true /* fromPromise */);
2024-06-29 15:08:25             ^
2024-06-29 15:08:25 
2024-06-29 15:08:25 TypeError: Invalid URL
2024-06-29 15:08:25     at new URL (node:internal/url:775:36)
2024-06-29 15:08:25     at OpenAI.buildURL (/home/perplexica/node_modules/openai/core.js:318:15)
2024-06-29 15:08:25     at OpenAI.buildRequest (/home/perplexica/node_modules/openai/core.js:199:26)
2024-06-29 15:08:25     at OpenAI.makeRequest (/home/perplexica/node_modules/openai/core.js:274:44)
2024-06-29 15:08:25     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2024-06-29 15:08:25     at async /home/perplexica/node_modules/@langchain/openai/dist/chat_models.cjs:684:29
2024-06-29 15:08:25     at async RetryOperation._fn (/home/perplexica/node_modules/p-retry/index.js:50:12) {
2024-06-29 15:08:25   code: 'ERR_INVALID_URL',
2024-06-29 15:08:25   input: 'null/chat/completions'
2024-06-29 15:08:25 }
2024-06-29 15:08:25 
2024-06-29 15:08:25 Node.js v20.8.1
2024-06-29 15:08:25 error Command failed with exit code 1.
2024-06-29 15:08:25 info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
2024-06-29 15:08:25 yarn run v1.22.19
2024-06-29 15:08:25 $ node dist/app.js
2024-06-29 15:08:25 info: WebSocket server started on port 3001
2024-06-29 15:08:25 info: Server is running on port 3001

ItzCrazyKns (Owner) commented

> [backend logs quoted from the previous comment]

Have you configured the OpenAI API key, the Groq API key, or the Ollama address? One of them is required for Perplexica to work.
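
For context, this matches the stack trace above: when no provider is configured, the OpenAI-compatible client ends up with a null base URL, and JavaScript string interpolation turns it into the literal string seen in the error. A minimal sketch of the mechanism (illustrative only, not Perplexica's actual code):

// With no provider configured, the base URL is null.
const baseURL = null;

// Template interpolation stringifies null, producing "null/chat/completions" ...
const endpoint = `${baseURL}/chat/completions`;

// ... which new URL() rejects, matching ERR_INVALID_URL in the logs above.
new URL(endpoint); // TypeError [ERR_INVALID_URL]: Invalid URL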

Arun-chaitanya commented

Sorry for the late reply. Yes, I gave the Ollama Docker port address, which was running successfully on my machine, but it still did not work.

Arun-chaitanya commented
[GENERAL]
PORT = 3001 # Port to run the server on
SIMILARITY_MEASURE = "cosine" # "cosine" or "dot"

[API_KEYS]
OPENAI = "" # OpenAI API key - sk-1234567890abcdef1234567890abcdef
GROQ = "" # Groq API key - gsk_1234567890abcdef1234567890abcdef

[API_ENDPOINTS]
SEARXNG = "http://localhost:32768" # SearxNG API URL
OLLAMA = "http://host.docker.internal:11434" # Ollama API URL - http://host.docker.internal:11434

ItzCrazyKns (Owner) commented

> [config.toml quoted from the previous comment]

What OS are you on?

rb81 commented Jul 7, 2024

+1 - Getting the same issue. I'm running Ollama on Ubuntu Server. The IP address and port set in the Perplexica config work with all my other apps. Inference seems to be triggered, but after about 3 seconds Perplexica throws the following error:

Failed to connect to the server. Please try again later.

ItzCrazyKns (Owner) commented

> [rb81's report quoted from the previous comment]

What are the logs from the backend container?

Arun-chaitanya commented

> What OS are you on?

macOS Sonoma 14.3 (23D56)

rb81 commented Jul 8, 2024

> Have you configured the OpenAI API key, the Groq API key, or the Ollama address? One of them is required for Perplexica to work.

I have an OpenAI API key configured as well as the Ollama server URI, but switching to Ollama seems to require a fake Groq API key in order to work. Using sk-proj-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx gets it working for me. Just FYI, the error showing in the backend container logs was:

error: Error loading Groq models: Error: OpenAI or Azure OpenAI API key not found
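
In config.toml terms, the workaround amounts to giving GROQ any non-empty placeholder value. A sketch based on the [API_KEYS] layout shown earlier in the thread (the exact dummy value does not appear to matter, only that it is non-empty):

[API_KEYS]
OPENAI = "sk-..." # real OpenAI key
GROQ = "gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" # dummy value, workaround only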

ItzCrazyKns (Owner) commented

This issue is now resolved with 8539ce8. I'd recommend that everyone re-install Perplexica.

docker compose up -d --build

Note: Run this only after re-cloning the repository.
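
Spelled out, the recommended re-install looks roughly like this (the sample.config.toml file name is an assumption based on the project's setup docs; adjust to your checkout):

$ git clone https://github.com/ItzCrazyKns/Perplexica.git
$ cd Perplexica
$ cp sample.config.toml config.toml   # then edit the SEARXNG/OLLAMA endpoints
$ docker compose up -d --build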

rb81 commented Jul 8, 2024

> This issue is now resolved with 8539ce8. I'd recommend that everyone re-install Perplexica.

Did this, but now nothing works. Do I need to change or do anything else?

Update:

Even though I've selected OpenAI, it seems there are some Ollama-related errors occurring. My Ollama settings were correct and were working with the fake Groq API key before this last commit:

error: Error loading Ollama models: TypeError: fetch failed
error: Error loading Ollama embeddings model: TypeError: fetch failed
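
Since rb81 is on Ubuntu Server, one thing worth ruling out (an assumption, not confirmed in the thread): on Linux, host.docker.internal does not resolve inside containers by default, which makes fetch calls to it fail. Docker Compose can map it to the host gateway explicitly:

# docker-compose.yaml sketch; the service name is illustrative
services:
  perplexica-backend:
    extra_hosts:
      - "host.docker.internal:host-gateway"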

echeoquehaii commented

Also experiencing the same with Ollama.

rb81 commented Jul 9, 2024

@ItzCrazyKns - After your last commit, it seems that having any Ollama settings in place stops Perplexica from working at all. After removing the Ollama URI and the Groq API key, the OpenAI settings work. As another user mentioned, Ollama is no longer an option in the "Chat Model Provider" list - just OpenAI and "Custom_OpenAI".

ItzCrazyKns (Owner) commented

> [rb81's comment quoted above]

Just did another commit, 27e6f5b, that should help.

rb81 commented Jul 9, 2024

> Just did another commit, 27e6f5b, that should help.

How should Ollama work now? The option is still not in the dropdown. There's "Custom_OpenAI" and then the Ollama URI parameter separate from that. Earlier there was a separate option for Ollama in the model provider list...

echeoquehaii commented

I've re-cloned and re-set up everything, and now it works for me. I also had to clear the cache and cookies; on the first attempt it was showing the same error.

rb81 commented Jul 9, 2024

> I've re-cloned and re-set up everything, and now it works for me.

Are you able to get Ollama working?

echeoquehaii commented

> Are you able to get Ollama working?

Yes, it's working fine for me now!

totorofly commented

> How should Ollama work now? The option is still not in the dropdown.

The same error here.

echeoquehaii commented

> Are you able to get Ollama working?

Yes, it works for me with Ollama, after re-setting up everything and clearing the webpage's cache and cookies in my browser.

MehrCurry commented Aug 19, 2024

I was facing the same 'Invalid URL' error on my Mac.

I tried some solutions from this thread. Then I figured out that config.toml is baked right into the backend's Docker image, so every time you change it, you have to rebuild the Docker images with

docker compose up -d --build

After that I got it running.

There is still another problem with the SearXNG container (Cannot load engine "gentoo"), but since I am using my already existing instance, that does not matter for me.

See #257
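
Building on the point that config.toml is baked into the image: one way to confirm which config a running backend actually contains (the /home/perplexica path is inferred from the stack-trace paths above, so treat it as an assumption):

$ docker exec perplexica-perplexica-backend-1 cat /home/perplexica/config.toml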

MehrCurry commented

The problem re-appeared. Deleting the cookies solved it.
