Browser refusing to open any page - httpx.ConnectError: [Errno 61] Connection refused #223

Closed
LuD1161 opened this issue Feb 2, 2025 · 3 comments

LuD1161 commented Feb 2, 2025

Hello team,

Thanks for the awesome work on this project. I am trying to run the web-ui locally, but I am getting the following error:

httpcore.ConnectError: [Errno 61] Connection refused

and the browser window just keeps on reloading the about:blank page.

I am pasting all the debug logs here:

(ash) ➜  web-ui git:(main) python webui.py --ip 127.0.0.1 --port 7788
INFO     [browser_use] BrowserUse logging setup complete with level debug
DEBUG    [htmldate.validators] minimum date setting: 1995-01-01 00:00:00
DEBUG    [telemetry] Telemetry disabled
* Running on local URL:  http://127.0.0.1:7788

To create a public link, set `share=True` in `launch()`.
Traceback (most recent call last):
  File "/Users/ash/.venv/lib/python3.11/site-packages/gradio/queueing.py", line 625, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/gradio/route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/gradio/blocks.py", line 2045, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/gradio/blocks.py", line 1604, in call_function
    prediction = await utils.async_iteration(iterator)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/gradio/utils.py", line 715, in async_iteration
    return await anext(iterator)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/gradio/utils.py", line 820, in asyncgen_wrapper
    response = await iterator.__anext__()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/Repos/web-ui/webui.py", line 433, in run_with_stream
    result = await run_browser_agent(
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/Repos/web-ui/webui.py", line 118, in run_browser_agent
    llm = utils.get_llm_model(
          ^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/Repos/web-ui/src/utils/utils.py", line 35, in get_llm_model
    handle_api_key_error(provider, env_var)
  File "/Users/ash/Repos/web-ui/src/utils/utils.py", line 173, in handle_api_key_error
    raise gr.Error(
gradio.exceptions.Error: '💥 OpenAI API key not found! 🔑 Please set the `OPENAI_API_KEY` environment variable or provide it in the UI.'
DEBUG    [browser] Initializing new browser
DEBUG    [browser] Initializing new browser context with id: 92f85c0e-8f79-4fe7-bf14-6ff1c5da34eb
DEBUG    [agent] Version: 0.1.29, Source: pip
INFO     [agent] 🚀 Starting task: go to google.com and type 'OpenAI' click search and give me the first url
DEBUG    [agent] Version: 0.1.29, Source: pip
DEBUG    [browser_use] --get_state Execution time: 0.00 seconds
DEBUG    [browser] Initializing browser context
DEBUG    [browser] New page opened: about:blank
DEBUG    [browser] Network stabilized for 1 seconds
DEBUG    [browser] --Page loaded in 2.96 seconds, waiting for additional 0.00 seconds
INFO     [src.agent.custom_agent]
📍 Step 1
DEBUG    [browser_use] --get_state Execution time: 0.00 seconds
DEBUG    [browser] Network stabilized for 1 seconds
DEBUG    [browser] --Page loaded in 1.03 seconds, waiting for additional 0.00 seconds
DEBUG    [message_manager] Messages in history: 2:
DEBUG    [message_manager] SystemMessage - Token count: 2949
DEBUG    [message_manager] HumanMessage - Token count: 894
DEBUG    [message_manager] Total input tokens: 3843
ERROR    [agent] ❌ Result failed 1/5 times:
 [Errno 61] Connection refused
Stacktrace:
Traceback (most recent call last):
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
    yield
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 250, in handle_request
    resp = self._pool.handle_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
    raise exc from None
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
    raise exc
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
    stream = self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
    with map_exceptions(exc_map):
  File "/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 61] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/ash/Repos/web-ui/src/agent/custom_agent.py", line 247, in step
    raise e
  File "/Users/ash/Repos/web-ui/src/agent/custom_agent.py", line 235, in step
    model_output = await self.get_next_action(input_messages)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/browser_use/utils.py", line 36, in wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/Repos/web-ui/src/agent/custom_agent.py", line 193, in get_next_action
    ai_message = self.llm.invoke(messages_to_process)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 284, in invoke
    self.generate_prompt(
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 860, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 690, in generate
    self._generate_with_cache(
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 925, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 701, in _generate
    final_chunk = self._chat_stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 602, in _chat_stream_with_aggregation
    for stream_resp in self._create_chat_stream(messages, stop, **kwargs):
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 589, in _create_chat_stream
    yield from self._client.chat(**chat_params)
  File "/Users/ash/.venv/lib/python3.11/site-packages/ollama/_client.py", line 163, in inner
    with self._client.stream(*args, **kwargs) as r:
  File "/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 137, in __enter__
    return next(self.gen)
           ^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 868, in stream
    response = self.send(
               ^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
  File "/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 61] Connection refused

DEBUG    [browser_use] --step Execution time: 1.15 seconds
DEBUG    [browser_use] --get_state Execution time: 0.00 seconds
DEBUG    [browser] Network stabilized for 1 seconds
DEBUG    [browser] --Page loaded in 1.02 seconds, waiting for additional 0.00 seconds
INFO     [src.agent.custom_agent]
📍 Step 1
DEBUG    [browser_use] --get_state Execution time: 0.00 seconds
INFO     [__main__] 🛑 Stop requested - the agent will halt at the next safe point
DEBUG    [browser] Network stabilized for 1 seconds
DEBUG    [browser] --Page loaded in 1.02 seconds, waiting for additional 0.00 seconds
DEBUG    [message_manager] Messages in history: 2:
DEBUG    [message_manager] SystemMessage - Token count: 2949
DEBUG    [message_manager] HumanMessage - Token count: 894
DEBUG    [message_manager] Total input tokens: 3843
ERROR    [agent] ❌ Result failed 2/5 times:
 [Errno 61] Connection refused
Stacktrace:
Traceback (most recent call last):
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
    yield
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 250, in handle_request
    resp = self._pool.handle_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
    raise exc from None
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
    raise exc
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
    stream = self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
    with map_exceptions(exc_map):
  File "/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 61] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/ash/Repos/web-ui/src/agent/custom_agent.py", line 247, in step
    raise e
  File "/Users/ash/Repos/web-ui/src/agent/custom_agent.py", line 235, in step
    model_output = await self.get_next_action(input_messages)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/browser_use/utils.py", line 36, in wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/Repos/web-ui/src/agent/custom_agent.py", line 193, in get_next_action
    ai_message = self.llm.invoke(messages_to_process)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 284, in invoke
    self.generate_prompt(
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 860, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 690, in generate
    self._generate_with_cache(
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 925, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 701, in _generate
    final_chunk = self._chat_stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 602, in _chat_stream_with_aggregation
    for stream_resp in self._create_chat_stream(messages, stop, **kwargs):
  File "/Users/ash/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 589, in _create_chat_stream
    yield from self._client.chat(**chat_params)
  File "/Users/ash/.venv/lib/python3.11/site-packages/ollama/_client.py", line 163, in inner
    with self._client.stream(*args, **kwargs) as r:
  File "/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 137, in __enter__
    return next(self.gen)
           ^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 868, in stream
    response = self.send(
               ^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
  File "/opt/homebrew/Cellar/[email protected]/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/Users/ash/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 61] Connection refused

DEBUG    [browser_use] --step Execution time: 1.11 seconds
INFO     [src.agent.custom_agent] 🛑 Stop requested by user
INFO     [src.agent.custom_agent] Created GIF at agent_history.gif
DEBUG    [browser] Closing browser context
/Users/ash/.venv/lib/python3.11/site-packages/gradio/components/video.py:340: UserWarning: Video does not have browser-compatible container or codec. Converting to mp4.
  warnings.warn(
^CKeyboard interruption in main thread... closing server.
^L%
• Python version: 3.11.11
warmshao (Collaborator) commented Feb 6, 2025

💥 OpenAI API key not found! 🔑 Please set the OPENAI_API_KEY environment variable or provide it in the UI

Your key is not set correctly.
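
A minimal sketch of how the key might be supplied, assuming web-ui reads OPENAI_API_KEY from the environment or from the project's .env file, as the error message suggests (the key value below is a hypothetical placeholder):

```bash
# Option 1: export the key in the shell before launching web-ui
export OPENAI_API_KEY="sk-REPLACE_ME"   # hypothetical placeholder, not a real key
python webui.py --ip 127.0.0.1 --port 7788

# Option 2: add it to the project's .env file instead
# OPENAI_API_KEY=sk-REPLACE_ME
```

Alternatively, the key can be pasted into the API key field in the UI, as the error message notes.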

LuD1161 (Author) commented Feb 6, 2025

Thanks @warmshao. Apologies, I didn't see that earlier.

LuD1161 closed this as completed Feb 6, 2025
AliYmn (Contributor) commented Mar 14, 2025

@LuD1161 If you're working with Ollama, you should update the OLLAMA_ENDPOINT to "http://host.docker.internal:11434" in your .env file. This allows the Docker container to communicate with the Ollama application running on your host machine, as "localhost" inside a container refers to the container itself, not your computer.
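
A minimal sketch of the relevant .env settings and a quick connectivity check, assuming Ollama's default port 11434 and its standard /api/tags model-listing endpoint; the [Errno 61] Connection refused in the log above is raised when langchain_ollama cannot reach this endpoint:

```bash
# In .env — pick the endpoint that matches how web-ui is run:
# OLLAMA_ENDPOINT=http://host.docker.internal:11434   # web-ui inside Docker, Ollama on the host
# OLLAMA_ENDPOINT=http://127.0.0.1:11434              # web-ui running directly on the host

# Quick check from wherever web-ui runs; "Connection refused" here means Ollama
# is not reachable at that address, which matches the httpx.ConnectError above.
curl http://127.0.0.1:11434/api/tags
```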

I also made a PR about this issue: #399
