local AI and chrome issues #164

First of all, it refuses to use my Ollama; I get connection errors no matter what I try.
Secondly, it never uses my browser and always makes me use VNC to watch it use its own.

Comments
How are you running Ollama? If the answer is Docker, you'll need to use your local IP rather than localhost, for instance http://192.168.1.3:11434. The model names also require the tag after the colon (:), so llama3.1:latest rather than llama3.1. Did you define the Chrome directory correctly in the .env file?

```
# Chrome settings
CHROME_PATH="C:\Program Files\Google\Chrome\Application\chrome.exe"
# Set to true to keep browser open between AI tasks
CHROME_PERSISTENT_SESSION=true
```
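Putting those pieces together, a minimal .env sketch for this setup might look like the following. Hedged: OLLAMA_ENDPOINT is the variable name mentioned later in this thread, the IP address is only an example, and LLM_MODEL is a hypothetical name for whichever variable your setup uses to select the model:

```
# Ollama reachable over the network: use the machine's LAN IP, not localhost
OLLAMA_ENDPOINT=http://192.168.1.3:11434

# Model name must include the tag after the colon
# (LLM_MODEL is a placeholder variable name; use whatever `ollama list` shows)
LLM_MODEL=llama3.1:latest

# Chrome settings
CHROME_PATH="C:\Program Files\Google\Chrome\Application\chrome.exe"
CHROME_PERSISTENT_SESSION=true
```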
EDIT: resolved by running Ollama as a Docker container, see #146 (comment). I have the same issue. I use Ollama in a Docker container and have it configured with the local IP.
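For reference, a sketch of running Ollama as a Docker container, following Ollama's own Docker instructions (the volume name and model tag are just examples):

```
# Run Ollama in a container, persisting models in a named volume
# and exposing the default API port 11434 on the host
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull a model inside the container (tag included)
docker exec -it ollama ollama pull llama3.1:latest
```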
If you're working with Ollama from inside Docker, you should update OLLAMA_ENDPOINT to "http://host.docker.internal:11434" in your .env file. This allows the Docker container to communicate with the Ollama application running on your host machine, since "localhost" inside a container refers to the container itself, not your computer. I also made a PR about this issue: #399.
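A quick way to check which endpoint is actually reachable is to query Ollama's /api/tags route, which lists the installed models. This is a sketch: it assumes curl is available in the container image, and the container name webui is a placeholder for your own:

```
# From the host: confirm Ollama itself is up
curl http://localhost:11434/api/tags

# From inside the web-ui container: localhost will fail here,
# while host.docker.internal (or the host's LAN IP) should succeed
docker exec -it webui curl http://host.docker.internal:11434/api/tags
```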