Fix FastMCP integration tests and transport security #1001

Open · wants to merge 24 commits into base: main

Changes from all commits — 24 commits
242fea8
Fix FastMCP integration tests and transport security
spacelord16 Jun 21, 2025
5966a61
Fix merge conflict: adopt main branch concurrency test approach with …
spacelord16 Jun 30, 2025
e5af1d5
Merge origin/main into fix-fastmcp-integration-tests
spacelord16 Jul 6, 2025
96a5bce
Fix integration tests after merge - correct ClientSession API usage a…
spacelord16 Jul 6, 2025
551212e
Apply Ruff formatting to integration tests
spacelord16 Jul 6, 2025
7553bba
Merge branch 'main' into fix-fastmcp-integration-tests
spacelord16 Jul 7, 2025
a2638cd
fix: Handle BrokenResourceError on Windows Python 3.13
spacelord16 Jul 7, 2025
d25bafc
trigger: Re-run CI checks for Windows Python 3.13 fix
spacelord16 Jul 7, 2025
17a9867
style: Apply Ruff formatting to Windows stdio fixes
spacelord16 Jul 7, 2025
3343410
fix: Comprehensive Windows resource cleanup for ALL client transports
spacelord16 Jul 7, 2025
dcac243
fix: Improve streamable HTTP client stream cleanup with comprehensive…
spacelord16 Jul 24, 2025
2f568c0
Resolve merge conflicts from origin/main
spacelord16 Jul 24, 2025
236a041
fix: Resolve integration test issues and import problems
spacelord16 Jul 24, 2025
4abb5c2
style: Apply ruff formatting fixes from pre-commit
spacelord16 Jul 24, 2025
1283607
fix: Optimize test performance and resolve Windows parallelization is…
spacelord16 Jul 24, 2025
352065c
fix: Fix pytest import order for integration marker
spacelord16 Jul 24, 2025
565ea48
style: Apply Ruff formatting to fix pre-commit
spacelord16 Jul 24, 2025
d0ec057
fix: Add integration markers to all multiprocessing tests
spacelord16 Jul 24, 2025
f505572
style: Apply ruff formatting to integration test changes
spacelord16 Jul 24, 2025
35eae15
fix: Install CLI dependencies to prevent pytest collection hang
spacelord16 Jul 24, 2025
50fecda
fix: Add CLI dependencies to readme-snippets job as well
spacelord16 Jul 24, 2025
6611489
fix: Resolve import order and formatting issues
spacelord16 Jul 24, 2025
562a33b
fix: Configure pyright to skip unannotated to avoid multiprocessing i…
spacelord16 Jul 24, 2025
908cbf7
fix: Replace dynamic env.cache_id with github.run_id in workflow caches
spacelord16 Jul 24, 2025
3 changes: 1 addition & 2 deletions .github/workflows/publish-docs-manually.yml
@@ -21,10 +21,9 @@ jobs:
          enable-cache: true
          version: 0.7.2

-      - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
       - uses: actions/cache@v4
         with:
-          key: mkdocs-material-${{ env.cache_id }}
+          key: mkdocs-material-${{ github.run_id }}
           path: .cache
           restore-keys: |
             mkdocs-material-
3 changes: 1 addition & 2 deletions .github/workflows/publish-pypi.yml
@@ -70,10 +70,9 @@ jobs:
          enable-cache: true
          version: 0.7.2

-      - run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
       - uses: actions/cache@v4
         with:
-          key: mkdocs-material-${{ env.cache_id }}
+          key: mkdocs-material-${{ github.run_id }}
           path: .cache
           restore-keys: |
             mkdocs-material-
16 changes: 12 additions & 4 deletions .github/workflows/shared.yml
@@ -28,7 +28,7 @@ jobs:

   test:
     runs-on: ${{ matrix.os }}
-    timeout-minutes: 10
+    timeout-minutes: 15
     continue-on-error: true
     strategy:
       matrix:
@@ -45,10 +45,18 @@
           version: 0.7.2

       - name: Install the project
-        run: uv sync --frozen --all-extras --python ${{ matrix.python-version }}
+        run: uv sync --frozen --all-extras --group dev --python ${{ matrix.python-version }}

       - name: Run pytest
-        run: uv run --frozen --no-sync pytest
+        run: |
+          if [ "${{ matrix.os }}" = "windows-latest" ]; then
+            # Run integration tests without parallelization on Windows to avoid multiprocessing issues
+            uv run --frozen --no-sync pytest -m "not integration" --numprocesses auto
+            uv run --frozen --no-sync pytest -m integration --numprocesses 1
+          else
+            uv run --frozen --no-sync pytest
+          fi
+        shell: bash

       # This must run last as it modifies the environment!
       - name: Run pytest with lowest versions
@@ -68,7 +76,7 @@
           version: 0.7.2

       - name: Install dependencies
-        run: uv sync --frozen --all-extras --python 3.10
+        run: uv sync --frozen --all-extras --group dev --python 3.10

       - name: Check README snippets are up to date
         run: uv run --frozen scripts/update_readme_snippets.py --check
1 change: 1 addition & 0 deletions .pre-commit-config.yaml
@@ -39,6 +39,7 @@ repos:
       pass_filenames: false
       exclude: ^README\.md$
     - id: pyright
+      args: ["--skipunannotated"]
       name: pyright
       entry: uv run pyright
       language: system
5 changes: 5 additions & 0 deletions pyproject.toml
@@ -120,6 +120,11 @@ addopts = """
 --capture=fd
 --numprocesses auto
 """
+# Disable parallelization for integration tests that spawn subprocesses
+# This prevents Windows issues with multiprocessing + subprocess conflicts
+markers = [
+    "integration: marks tests as integration tests (may run without parallelization)",
+]
 filterwarnings = [
     "error",
     # This should be fixed on Uvicorn's side.
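The marker registered above can then be applied per test or at module scope. A minimal sketch of how such a marked test file might look (the test name here is illustrative, not from this PR):

```python
import pytest

# Module-scope marking, as the PR does in tests/client/test_stdio.py:
# every test in this file is treated as an integration test.
pytestmark = [pytest.mark.integration]


@pytest.mark.integration
def test_subprocess_roundtrip():
    # Hypothetical integration test body. On Windows CI these tests run
    # serially via `pytest -m integration --numprocesses 1`.
    assert True
```

Deselecting them is then `pytest -m "not integration"`, which is how the shared workflow splits the Windows run into a parallel and a serial pass.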
60 changes: 55 additions & 5 deletions src/mcp/client/sse.py
@@ -55,7 +55,9 @@ async def sse_client(
     try:
         logger.debug(f"Connecting to SSE endpoint: {remove_request_params(url)}")
         async with httpx_client_factory(
-            headers=headers, auth=auth, timeout=httpx.Timeout(timeout, read=sse_read_timeout)
+            headers=headers,
+            auth=auth,
+            timeout=httpx.Timeout(timeout, read=sse_read_timeout),
         ) as client:
             async with aconnect_sse(
                 client,
@@ -109,7 +111,16 @@ async def sse_reader(
                     logger.exception("Error in sse_reader")
                     await read_stream_writer.send(exc)
                 finally:
-                    await read_stream_writer.aclose()
+                    try:
+                        await read_stream_writer.aclose()
+                    except (
+                        anyio.ClosedResourceError,
+                        anyio.BrokenResourceError,
+                    ):
+                        # Stream already closed, ignore
+                        pass
+                    except Exception as exc:
+                        logger.debug(f"Error closing read_stream_writer in sse_reader: {exc}")

             async def post_writer(endpoint_url: str):
                 try:
@@ -129,7 +140,16 @@ async def post_writer(endpoint_url: str):
                 except Exception:
                     logger.exception("Error in post_writer")
                 finally:
-                    await write_stream.aclose()
+                    try:
+                        await write_stream.aclose()
+                    except (
+                        anyio.ClosedResourceError,
+                        anyio.BrokenResourceError,
+                    ):
+                        # Stream already closed, ignore
+                        pass
+                    except Exception as exc:
+                        logger.debug(f"Error closing write_stream in post_writer: {exc}")

             endpoint_url = await tg.start(sse_reader)
             logger.debug(f"Starting post writer with endpoint URL: {endpoint_url}")
@@ -140,5 +160,35 @@ async def post_writer(endpoint_url: str):
             finally:
                 tg.cancel_scope.cancel()
     finally:
-        await read_stream_writer.aclose()
-        await write_stream.aclose()
+        # Improved stream cleanup with comprehensive exception handling
+        try:
+            await read_stream_writer.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing read_stream_writer in SSE cleanup: {exc}")
+
+        try:
+            await write_stream.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing write_stream in SSE cleanup: {exc}")
+
+        try:
+            await read_stream.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing read_stream in SSE cleanup: {exc}")
+
+        try:
+            await write_stream_reader.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing write_stream_reader in SSE cleanup: {exc}")
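The SSE changes above repeat the same try/except pattern at every close site: treat "already closed" as a no-op and log anything else at debug level. A helper capturing that pattern might look like this (`safe_aclose` is a hypothetical name, not part of the PR, which inlines the pattern instead):

```python
import logging

import anyio

logger = logging.getLogger(__name__)


async def safe_aclose(stream, label: str) -> None:
    """Close a stream, ignoring 'already closed' errors and logging the rest."""
    try:
        await stream.aclose()
    except (anyio.ClosedResourceError, anyio.BrokenResourceError):
        pass  # Stream already closed, ignore
    except Exception as exc:
        logger.debug(f"Error closing {label}: {exc}")


async def main() -> None:
    send, receive = anyio.create_memory_object_stream(0)
    await safe_aclose(send, "send stream")
    await safe_aclose(send, "send stream")  # closing again is harmless
    await safe_aclose(receive, "receive stream")


anyio.run(main)
```

Inlining keeps each cleanup message site-specific (e.g. "in SSE cleanup" vs "in post_writer"), at the cost of the repetition visible in the diff.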
8 changes: 2 additions & 6 deletions src/mcp/client/stdio/__init__.py
@@ -158,7 +158,7 @@ async def stdout_reader():

                     session_message = SessionMessage(message)
                     await read_stream_writer.send(session_message)
-        except anyio.ClosedResourceError:
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
             await anyio.lowlevel.checkpoint()

     async def stdin_writer():
@@ -174,7 +174,7 @@ async def stdin_writer():
                         errors=server.encoding_error_handler,
                     )
                 )
-        except anyio.ClosedResourceError:
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
             await anyio.lowlevel.checkpoint()

     async with (
@@ -208,10 +208,6 @@ async def stdin_writer():
         except ProcessLookupError:
             # Process already exited, which is fine
             pass
-        await read_stream.aclose()
-        await write_stream.aclose()
-        await read_stream_writer.aclose()
-        await write_stream_reader.aclose()


 def _get_executable_command(command: str) -> str:
50 changes: 45 additions & 5 deletions src/mcp/client/streamable_http.py
@@ -413,8 +413,15 @@ async def handle_request_async():
             except Exception:
                 logger.exception("Error in post_writer")
             finally:
-                await read_stream_writer.aclose()
-                await write_stream.aclose()
+                # Only close the write stream here, read_stream_writer is shared
+                # and will be closed in the main cleanup
+                try:
+                    await write_stream.aclose()
+                except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+                    # Stream already closed, ignore
+                    pass
+                except Exception as exc:
+                    logger.debug(f"Error closing write_stream in post_writer cleanup: {exc}")

     async def terminate_session(self, client: httpx.AsyncClient) -> None:
         """Terminate the session by sending a DELETE request."""
@@ -502,8 +509,41 @@ def start_get_stream() -> None:
                 )
         finally:
             if transport.session_id and terminate_on_close:
-                await transport.terminate_session(client)
+                try:
+                    await transport.terminate_session(client)
+                except Exception as exc:
+                    logger.debug(f"Error terminating session: {exc}")
             tg.cancel_scope.cancel()
     finally:
-        await read_stream_writer.aclose()
-        await write_stream.aclose()
+        # Comprehensive stream cleanup with exception handling
+        try:
+            await read_stream_writer.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing read_stream_writer in main cleanup: {exc}")

+        try:
+            await write_stream.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing write_stream in main cleanup: {exc}")

+        try:
+            await read_stream.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing read_stream in main cleanup: {exc}")

+        try:
+            await write_stream_reader.aclose()
+        except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+            # Stream already closed, ignore
+            pass
+        except Exception as exc:
+            logger.debug(f"Error closing write_stream_reader in main cleanup: {exc}")
47 changes: 42 additions & 5 deletions src/mcp/client/websocket.py
@@ -19,7 +19,10 @@
 async def websocket_client(
     url: str,
 ) -> AsyncGenerator[
-    tuple[MemoryObjectReceiveStream[SessionMessage | Exception], MemoryObjectSendStream[SessionMessage]],
+    tuple[
+        MemoryObjectReceiveStream[SessionMessage | Exception],
+        MemoryObjectSendStream[SessionMessage],
+    ],
     None,
 ]:
     """
@@ -79,8 +82,42 @@ async def ws_writer():
         tg.start_soon(ws_reader)
         tg.start_soon(ws_writer)

-        # Yield the receive/send streams
-        yield (read_stream, write_stream)
+        try:
+            # Yield the receive/send streams
+            yield (read_stream, write_stream)
+        finally:
+            # Once the caller's 'async with' block exits, we shut down
+            tg.cancel_scope.cancel()

-        # Once the caller's 'async with' block exits, we shut down
-        tg.cancel_scope.cancel()
+            # Improved stream cleanup with comprehensive exception handling
+            try:
+                await read_stream.aclose()
+            except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+                # Stream already closed, ignore
+                pass
+            except Exception as exc:
+                logger.debug(f"Error closing read_stream in WebSocket cleanup: {exc}")

+            try:
+                await write_stream.aclose()
+            except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+                # Stream already closed, ignore
+                pass
+            except Exception as exc:
+                logger.debug(f"Error closing write_stream in WebSocket cleanup: {exc}")

+            try:
+                await read_stream_writer.aclose()
+            except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+                # Stream already closed, ignore
+                pass
+            except Exception as exc:
+                logger.debug(f"Error closing read_stream_writer in WebSocket cleanup: {exc}")

+            try:
+                await write_stream_reader.aclose()
+            except (anyio.ClosedResourceError, anyio.BrokenResourceError):
+                # Stream already closed, ignore
+                pass
+            except Exception as exc:
+                logger.debug(f"Error closing write_stream_reader in WebSocket cleanup: {exc}")
8 changes: 8 additions & 0 deletions src/mcp/server/transport_security.py
@@ -48,6 +48,10 @@ def _validate_host(self, host: str | None) -> bool:
             logger.warning("Missing Host header in request")
             return False

+        # Check for wildcard "*" first - allows any host
+        if "*" in self.settings.allowed_hosts:
+            return True
+
         # Check exact match first
         if host in self.settings.allowed_hosts:
             return True
@@ -70,6 +74,10 @@ def _validate_origin(self, origin: str | None) -> bool:
         if not origin:
             return True

+        # Check for wildcard "*" first - allows any origin
+        if "*" in self.settings.allowed_origins:
+            return True
+
         # Check exact match first
         if origin in self.settings.allowed_origins:
             return True
15 changes: 13 additions & 2 deletions tests/client/test_stdio.py
@@ -19,6 +19,9 @@
 from mcp.types import CONNECTION_CLOSED, JSONRPCMessage, JSONRPCRequest, JSONRPCResponse
 from tests.shared.test_win32_utils import escape_path_for_python

+# Mark all tests in this file as integration tests (spawn subprocesses)
+pytestmark = [pytest.mark.integration]
+
 # Timeout for cleanup of processes that ignore SIGTERM
 # This timeout ensures the test fails quickly if the cleanup logic doesn't have
 # proper fallback mechanisms (SIGINT/SIGKILL) for processes that ignore SIGTERM
@@ -411,7 +414,11 @@ async def test_nested_process_tree(self):
         await anyio.sleep(1.0)

         # Verify all are writing
-        for file_path, name in [(parent_file, "parent"), (child_file, "child"), (grandchild_file, "grandchild")]:
+        for file_path, name in [
+            (parent_file, "parent"),
+            (child_file, "child"),
+            (grandchild_file, "grandchild"),
+        ]:
             if os.path.exists(file_path):
                 initial_size = os.path.getsize(file_path)
                 await anyio.sleep(0.3)
@@ -425,7 +432,11 @@

         # Verify all stopped
         await anyio.sleep(0.5)
-        for file_path, name in [(parent_file, "parent"), (child_file, "child"), (grandchild_file, "grandchild")]:
+        for file_path, name in [
+            (parent_file, "parent"),
+            (child_file, "child"),
+            (grandchild_file, "grandchild"),
+        ]:
             if os.path.exists(file_path):
                 size1 = os.path.getsize(file_path)
                 await anyio.sleep(0.3)
4 changes: 3 additions & 1 deletion tests/issues/test_188_concurrency.py
@@ -3,7 +3,9 @@
 from pydantic import AnyUrl

 from mcp.server.fastmcp import FastMCP
-from mcp.shared.memory import create_connected_server_and_client_session as create_session
+from mcp.shared.memory import (
+    create_connected_server_and_client_session as create_session,
+)


 @pytest.mark.anyio