feat: wire MCP server to dedicated dead-code and impact API endpoints#32

Merged
jonathanpopham merged 4 commits into supermodeltools:main from jonathanpopham:feat/mcp-real-api-endpoints
Apr 3, 2026

Conversation

jonathanpopham (Contributor) commented Apr 3, 2026

Closes #27

Summary

Replace inline graph traversal in the MCP server with calls to the real API endpoints. The MCP dead_code and blast_radius tools now use client.DeadCode() and client.Impact() — same data quality as the CLI commands.

What changed

1 file: internal/mcp/server.go — net -4 lines (157 added, 161 removed)

Removed: findDeadFunctions, findAffected, isEntryPoint, pathMatches, deadFn, affected — ~100 lines of inline graph traversal

Added: toolDeadCode → client.DeadCode(), toolBlastRadius → client.Impact(), ensureZip helper, intArg helper

Updated tool schemas:

  • dead_code: removed include_exports, added min_confidence and limit
  • blast_radius: removed depth, file is now optional (omit for global coupling map)
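The intArg helper added here isn't shown in the diff excerpts reviewed below. As a rough sketch of what such a helper typically looks like (a hypothetical reconstruction, not the shipped code): since encoding/json decodes every JSON number into float64, the helper asserts that type and falls back to zero when the key is absent or mistyped.

```go
package main

import "fmt"

// intArg reads an integer tool argument from the JSON-decoded args map.
// JSON numbers arrive as float64, so the helper asserts that type and
// truncates to int; absent or mistyped keys yield 0.
// Hypothetical reconstruction of the helper named above, not the actual
// code in internal/mcp/server.go.
func intArg(args map[string]any, key string) int {
	if v, ok := args[key].(float64); ok {
		return int(v)
	}
	return 0
}

func main() {
	args := map[string]any{"limit": float64(25), "min_confidence": "high"}
	fmt.Println(intArg(args, "limit"))   // 25
	fmt.Println(intArg(args, "missing")) // 0 (silent default)
}
```

Note the silent-zero fallback on a missing or mistyped key; the review comments below flag exactly this behavior.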

Before (inline graph traversal)

717 unreachable function(s):
- __add__
- __aenter__
...

After (real API)

119 dead code candidate(s) out of 1131 total declarations:
- [high] src/.../MeteredUsage.java:9 MeteredUsage — Type/interface with no references
- [high] src/.../RequiresScopes.java:8 RequiresScopes — Type/interface with no references
...

Test plan

  • go test ./... — all pass
  • Build succeeds

Summary by CodeRabbit

  • Updates

    • Tool parameters changed: dead-code now uses min_confidence and limit; blast-radius accepts an optional file (omitting it triggers global analysis).
    • Analysis outputs expanded to include confidence, file/line context, risk scores, dependents, and detailed impact metrics.
  • New Features

    • Dedicated analysis/reporting handlers with idempotent repo packaging and formatted dead-code/impact reports.
  • Refactor

    • Simplified tool dispatch with clearer error handling for unknown tools.
  • Tests

    • Added unit tests covering formatted report output and edge cases.

…points

Replace inline graph traversal (findDeadFunctions, findAffected) with
calls to client.DeadCode() and client.Impact(). MCP tools now return
the same rich results as the CLI commands — confidence levels, risk
scores, line numbers, reasons, entry points.

Removed ~100 lines of inline helpers (findDeadFunctions, findAffected,
isEntryPoint, pathMatches, deadFn, affected types).

Updated tool schemas to match new params (min_confidence, limit for
dead_code; file optional for blast_radius).

Closes supermodeltools#27
coderabbitai bot commented Apr 3, 2026

Walkthrough

Rewires MCP tool dispatch to dedicated handlers, removes local graph-analysis helpers, and calls analysis APIs via client.DeadCode and client.Impact using repo zips and idempotency keys; adds formatting helpers and tests for formatted outputs. Changes primarily in internal/mcp/server.go plus new tests.

Changes

MCP server core (internal/mcp/server.go): Replaced inline dead-code/blast-radius graph logic with handlers: toolAnalyze, toolDeadCode, toolBlastRadius, toolGetGraph. Added ensureZip() (create zip + hash) and an intArg() helper, updated tool input schemas (dead_code: min_confidence, limit; blast_radius: optional file), and added an explicit unknown-tool error. Removed old graph helpers (findDeadFunctions, findAffected, isEntryPoint, etc.).
API formatting & wiring (internal/mcp/server.go, formatting helpers): toolDeadCode now calls client.DeadCode(...) with an idempotency key and formats via formatDeadCode; toolBlastRadius calls client.Impact(...) and formats via formatImpact; toolGetGraph is retained via a wrapper. Added formatDeadCode and formatImpact.
Tests (internal/mcp/server_test.go): New unit tests for formatDeadCode and formatImpact covering empty results, populated result formatting, and conditional rendering (omitting an empty "Entry points" section).

Sequence Diagram(s)

sequenceDiagram
  participant MCP as MCP Server
  participant FS as Repo/Filesystem
  participant API as Analysis API
  participant Cache as Graph Cache

  MCP->>FS: ensureZip() — create repo zip, compute hash
  FS-->>MCP: zipPath, zipHash
  MCP->>API: client.DeadCode(zipPath, idempotencyKey=zipHash, min_confidence, limit)
  API-->>MCP: DeadCodeResults (candidates: file, line, confidence, reason)
  MCP->>API: client.Impact(zipPath, idempotencyKey=zipHash+file?, file?)
  API-->>MCP: ImpactResults (risk, affected files, entry points, factors)
  MCP->>Cache: use/reuse cached graph for analyze/get_graph requests
  MCP->>MCP: formatDeadCode / formatImpact -> return tool response

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Suggested reviewers

  • greynewell

Poem

Zip it up, call the API, then read the signs,
Confidence and risk come back with files and lines.
Old graph tucked away, new handlers steer the ride,
Tests sing the format — outputs bona fide. 🚀

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

Docstring Coverage (⚠️ Warning): Docstring coverage is 20.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.
✅ Passed checks (4 passed)
Title check (✅ Passed): The title accurately reflects the main change: wiring the MCP server to use dedicated API endpoints instead of inline graph traversal for dead-code and impact analysis.
Description check (✅ Passed): The description covers the key aspects: what changed (file and line counts), which functions were removed/added, the updated tool schemas, before/after examples, and test results.
Linked Issues check (✅ Passed): The PR fully addresses issue #27: replaces inline dead-code and impact traversal with client.DeadCode() and client.Impact() calls, updates tool schemas with new parameters, and keeps changes isolated to internal/mcp/server.go.
Out of Scope Changes check (✅ Passed): All changes are directly scoped to the objectives: refactoring MCP tool handlers and updating schemas; no unrelated formatting, dependency, or system changes detected.



coderabbitai bot left a comment

Actionable comments posted: 4

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@internal/mcp/server.go`:
- Around line 248-255: toolDeadCode and toolBlastRadius recreate the repo
zip/hash independently of the cached in-memory graph (served by toolGetGraph),
causing snapshot drift; fix by having these handlers call s.ensureZip() and then
compare the returned hash to the server's cached graph snapshot (e.g., a new
s.graphHash field) and if they differ, invalidate s.graph (set to nil) or update
s.graphHash to the new hash so all handlers use the same snapshot; apply the
same fix to the other affected handlers (the functions around the noted ranges
and any use of ensureZip()/s.graph) so any change in working-tree hash either
resets s.graph or synchronizes it to the new hash before using it.
- Around line 95-100: The MCP endpoint currently only accepts and forwards a
"file" argument and formats file-level output; update it to accept a generic
target (file or function), forward that target to Impact(...targets...) instead
of only args["file"], and format responses using ImpactTargetInfo{Name, Line,
Type} and AffectedFunctions so function-level results are preserved. Concretely:
change the request parsing to accept a "target" (or reuse args["target"] /
args["file_or_function"]) and pass that value into the call to Impact(...),
update any marshal/response builders that previously assumed file-only output to
iterate/serialize ImpactTargetInfo and AffectedFunctions, and ensure callers
that construct the request populate the new field. Use the existing types
Impact(...targets...), ImpactTargetInfo{Name, Line, Type}, and AffectedFunctions
to implement the forwarding and response formatting.
- Around line 256-257: The parsers currently silently coerce bad/missing entries
from the args map (e.g., min_confidence via min_confidence, limit via intArg,
and file/blast_radius/dead_code uses) to zero values which widens requests;
instead validate types and fail fast: update the code paths that read args
(referencing min_confidence, intArg, and the args map) to check the type
assertion results and return an explicit error/HTTP 400 when an arg is the wrong
type or missing, and modify intArg to surface parsing/validation errors rather
than returning 0; apply the same pattern to the other occurrences noted (around
the sections you highlighted at 287-294 and 452-455) so bad client input is
rejected rather than treated as default values.
- Around line 406-421: The linter flags an unnamedResult; fix by naming the
return values on ensureZip (e.g. change signature to func (s *server)
ensureZip() (zipPath string, hash string, err error)) and then assign to those
named variables (replace the short-declarations like zipPath, err :=
createZip(s.dir) and hash, err := cache.HashFile(zipPath) with simple
assignments zipPath, err = createZip(s.dir) and hash, err =
cache.HashFile(zipPath)) so the function uses the named returns consistently and
returns the named variables.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: c9b0c528-9ea6-48ff-8819-96d4c7f5c8a7

📥 Commits

Reviewing files that changed from the base of the PR and between edb53fe and d5165cf.

📒 Files selected for processing (1)
  • internal/mcp/server.go

Comment on lines +95 to 100
 Description: "Analyze the impact of changing a file or function. Returns risk score, affected files and functions, entry points impacted, and risk factors.",
 InputSchema: toolSchema{
-	Type:     "object",
-	Required: []string{"file"},
+	Type: "object",
 	Properties: map[string]schemaProp{
-		"file":  {Type: "string", Description: "Repo-relative path to the file (e.g. internal/api/client.go)."},
-		"depth": {Type: "integer", Description: "Maximum traversal depth. 0 = unlimited."},
 		"force": {Type: "boolean", Description: "Re-analyze even if a cached result exists."},
+		"file": {Type: "string", Description: "Repo-relative path to the file (e.g. internal/api/client.go). Omit for global coupling map."},
 	},

⚠️ Potential issue | 🟠 Major

Blast radius still doesn’t expose function-level analysis.

The tool now advertises “file or function,” but the MCP surface only accepts file, forwards only args["file"], and formats only file-level output. The underlying API/result types are already generic (Impact(...targets...), ImpactTargetInfo{Name, Line, Type}, AffectedFunctions), so MCP clients still miss the function-level parity this change is aiming for.

Also applies to: 287-294, 316-327
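The schema change this comment asks for could look like the sketch below. The toolSchema and schemaProp shapes mirror the diff excerpt above; the "target" property is the reviewer's proposal, not shipped code:

```go
package main

import "fmt"

// schemaProp and toolSchema mirror the struct shapes visible in the PR diff.
type schemaProp struct {
	Type        string
	Description string
}

type toolSchema struct {
	Type       string
	Properties map[string]schemaProp
}

// blastRadiusSchema sketches the proposed fix: expose a "target" so the
// tool can address a function as well as a file. Property names other than
// "file" and "force" are illustrative.
func blastRadiusSchema() toolSchema {
	return toolSchema{
		Type: "object",
		Properties: map[string]schemaProp{
			"file":   {Type: "string", Description: "Repo-relative path to a file. Omit (along with target) for the global coupling map."},
			"target": {Type: "string", Description: "Function name to analyze instead of a whole file."},
			"force":  {Type: "boolean", Description: "Re-analyze even if a cached result exists."},
		},
	}
}

func main() {
	s := blastRadiusSchema()
	_, hasTarget := s.Properties["target"]
	fmt.Println(hasTarget) // true
}
```

The handler would then forward args["target"] into the Impact(...) call and serialize AffectedFunctions alongside the file-level fields.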


Comment on lines +248 to +255
// toolDeadCode calls the dedicated /v1/analysis/dead-code endpoint.
func (s *server) toolDeadCode(ctx context.Context, args map[string]any) (string, error) {
	zipPath, hash, err := s.ensureZip()
	if err != nil {
		return "", err
	}
	defer os.Remove(zipPath)


⚠️ Potential issue | 🟠 Major

These tools can drift onto a different repo snapshot than get_graph.

toolDeadCode and toolBlastRadius always zip/hash the working tree again, while toolGetGraph can keep serving the cached in-memory graph. After the repo changes mid-session, the server can return impact/dead-code data for one snapshot and graph data for another. Please make them share one snapshot/hash or invalidate s.graph when the working tree hash changes.

Also applies to: 281-285, 344-347, 404-421
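The invalidation the comment suggests can be sketched as a small hash check that every handler funnels through. The graph/graphHash fields are the hypothetical names proposed in the review, not the server's actual layout:

```go
package main

import "fmt"

type graph struct{ nodes int }

// server sketch: a cached in-memory graph plus the hash of the snapshot it
// was built from. Field names follow the review's suggestion (s.graphHash).
type server struct {
	graph     *graph // cached graph, nil until (re)built
	graphHash string // working-tree hash the cache corresponds to
}

// syncSnapshot drops the cached graph whenever the fresh working-tree hash
// differs from the one the cache was built against, so dead-code, impact,
// and get_graph answers all come from the same snapshot.
func (s *server) syncSnapshot(newHash string) {
	if s.graphHash != newHash {
		s.graph = nil // working tree changed: invalidate the stale cache
		s.graphHash = newHash
	}
}

func main() {
	s := &server{graph: &graph{nodes: 42}, graphHash: "aaa"}
	s.syncSnapshot("aaa")
	fmt.Println(s.graph != nil) // true: same snapshot, cache kept
	s.syncSnapshot("bbb")
	fmt.Println(s.graph == nil) // true: hash changed, cache dropped
}
```

Each handler would call this right after ensureZip() returns the fresh hash.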


Comment on lines +256 to +257
	minConfidence, _ := args["min_confidence"].(string)
	limit := intArg(args, "limit")

⚠️ Potential issue | 🟠 Major

Fail fast on bad args instead of widening the request.

These parsers silently fall back to zero values. A non-string file turns blast_radius into a global coupling-map call, and a bad limit turns dead_code into “return everything.” That makes client bugs hard to spot and can trigger much broader API work than the caller asked for.

Also applies to: 287-294, 452-455
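A fail-fast variant of the parsing this comment asks for might look like the sketch below: missing keys get an explicit default, but wrong types and non-integral numbers return an error instead of silently widening the request. The name and signature are illustrative, not the project's actual API:

```go
package main

import (
	"fmt"
	"math"
)

// strictIntArg validates instead of coercing: absent keys fall back to a
// caller-supplied default, but a non-number or non-integral JSON value is
// rejected with a descriptive error rather than becoming 0.
func strictIntArg(args map[string]any, key string, def int) (int, error) {
	v, ok := args[key]
	if !ok {
		return def, nil
	}
	f, ok := v.(float64) // JSON numbers always decode as float64
	if !ok {
		return 0, fmt.Errorf("%s: expected integer, got %T", key, v)
	}
	if f != math.Trunc(f) {
		return 0, fmt.Errorf("%s: expected integer, got %v", key, f)
	}
	return int(f), nil
}

func main() {
	n, err := strictIntArg(map[string]any{"limit": float64(10)}, "limit", 50)
	fmt.Println(n, err) // 10 <nil>
	_, err = strictIntArg(map[string]any{"limit": "ten"}, "limit", 50)
	fmt.Println(err != nil) // true: bad type is rejected, not coerced to 0
}
```

The same present-and-typed check would apply to min_confidence and file before the API call is built.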


coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (3)
internal/mcp/server.go (3)

250-251: ⚠️ Potential issue | 🟠 Major

These tools can still answer from different repo snapshots.

toolDeadCode and toolBlastRadius always zip/hash the current tree, but toolGetGraph can keep serving s.graph from an earlier analysis. If the working tree changes mid-session, MCP can return contradictory answers for the same repo. Please sync them through one snapshot/hash or invalidate s.graph when the hash changes.

Also applies to: 281-282, 344-346

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 250 - 251, The tools (toolDeadCode,
toolBlastRadius) use the current zip/hash from ensureZip but toolGetGraph may
serve a stale s.graph, causing inconsistent answers; modify ensureZip usage so
after calling ensureZip() you compare the returned hash to the stored snapshot
hash (e.g., a field like s.currentHash) and if they differ set s.graph = nil (or
otherwise invalidate it) and update s.currentHash, and also ensure toolGetGraph
reconstructs the graph when s.graph is nil or when its backing hash does not
match the new hash; key symbols: ensureZip, s.graph, toolGetGraph, toolDeadCode,
toolBlastRadius, and the returned hash from ensureZip.

95-100: ⚠️ Potential issue | 🟠 Major

blast_radius still stops at file-level.

The tool text says “file or function,” but the schema only accepts file, the request only forwards args["file"], and the formatter never renders affected functions. So MCP still can’t target a function, and it still falls short of the output it advertises.

Also applies to: 287-294, 312-337

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 95 - 100, The blast_radius tool
advertises "file or function" but only accepts and forwards a "file" field:
update the InputSchema (toolSchema Properties) to accept a "function" (or a
unified "target" with type/enum), change the code that builds the tool request
so it forwards args["function"] (or args["target"]) in addition to args["file"],
and update the formatter responsible for rendering affected items so it includes
and displays affected functions (not just files); search for the usage of
InputSchema/Properties for "file", the code path that reads args["file"] and
constructs the request, and the formatter function that renders affected files
and extend them to handle and render affected functions accordingly.

256-257: ⚠️ Potential issue | 🟠 Major

Malformed args still widen the analysis instead of failing fast.

A bad min_confidence just removes filtering, a non-numeric limit becomes 0 via intArg, and a non-string file turns blast_radius into a repo-wide call. That makes client bugs both expensive and easy to miss. Please validate these inputs and return a clear error instead of defaulting.

Also applies to: 287-287, 452-454

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 256 - 257, The current parsing of inputs
(min_confidence, limit via intArg, and file/blast_radius) silently falls back to
permissive defaults; update the request validation to fail fast: when extracting
min_confidence (variable minConfidence), ensure it's present and convertible to
the expected type or return a clear error, change intArg to validate that the
"limit" value is numeric and >0 (or return an error) instead of coercing to 0,
and validate the "file" argument is a string (reject or error when malformed)
before computing blast_radius; surface descriptive errors from the handler that
calls these values so callers receive immediate validation failures rather than
widened queries.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@internal/mcp/server.go`:
- Around line 299-307: The current branch in the function that checks if
len(result.Impacts) == 0 incorrectly falls back to
result.GlobalMetrics.MostCriticalFiles even when a specific target was
requested; change the logic in that branch to check the request's target (e.g.,
the variable target or request.Target) and only build/return the
MostCriticalFiles string when the caller omitted the target, otherwise return a
target-specific "no impact detected" message (or an error) so repo-wide hot
spots are not returned for targeted queries; update the conditional around
result.GlobalMetrics.MostCriticalFiles and the return path of that block
accordingly.
- Line 406: The function signature for ensureZip currently spells out types for
consecutive string params individually; update the signature in the ensureZip
declaration (function ensureZip) to combine identical types using Go shorthand
so it reads (zipPath, hash string, err error) instead of (zipPath string, hash
string, err error) to satisfy the paramTypeCombine lint rule.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: b4aa3819-91cb-4b1d-891a-1e6fa07cc40b

📥 Commits

Reviewing files that changed from the base of the PR and between d5165cf and 3372c34.

📒 Files selected for processing (1)
  • internal/mcp/server.go

Comment on lines +299 to +307
	if len(result.Impacts) == 0 {
		if len(result.GlobalMetrics.MostCriticalFiles) > 0 {
			var sb strings.Builder
			sb.WriteString("Most critical files (by dependent count):\n\n")
			for i := range result.GlobalMetrics.MostCriticalFiles {
				f := &result.GlobalMetrics.MostCriticalFiles[i]
				fmt.Fprintf(&sb, "- %s (%d dependents)\n", f.File, f.DependentCount)
			}
			return sb.String(), nil

⚠️ Potential issue | 🟠 Major

Don’t return repo-wide hot spots for a targeted query.

If target is set and the API returns no impacts, this branch still falls back to MostCriticalFiles, which is repo-wide data. That can look like a valid answer for the requested file/function even though it’s unrelated. Only use this fallback when the caller omitted target; otherwise return a target-specific “no impact detected.”

Suggested fix
 	if len(result.Impacts) == 0 {
-		if len(result.GlobalMetrics.MostCriticalFiles) > 0 {
+		if target == "" && len(result.GlobalMetrics.MostCriticalFiles) > 0 {
 			var sb strings.Builder
 			sb.WriteString("Most critical files (by dependent count):\n\n")
 			for i := range result.GlobalMetrics.MostCriticalFiles {
 				f := &result.GlobalMetrics.MostCriticalFiles[i]
 				fmt.Fprintf(&sb, "- %s (%d dependents)\n", f.File, f.DependentCount)
 			}
 			return sb.String(), nil
 		}
-		return "No impact detected.", nil
+		if target != "" {
+			return fmt.Sprintf("No impact detected for %s.", target), nil
+		}
+		return "No impact detected.", nil
 	}

coderabbitai bot left a comment

♻️ Duplicate comments (4)
internal/mcp/server.go (4)

248-251: ⚠️ Potential issue | 🟠 Major

These tools can still drift onto different repo snapshots.

toolDeadCode and toolBlastRadius always zip the current working tree, while toolGetGraph can keep serving the cached s.graph. After the repo changes mid-session, one MCP session can mix stale graph data with fresh impact/dead-code results. Please tie ensureZip() to s.hash or invalidate s.graph when the hash changes.

Also applies to: 281-285, 344-347, 404-421

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 248 - 251, toolDeadCode and
toolBlastRadius call ensureZip() which zips the current working tree while
toolGetGraph may serve a cached s.graph, causing mixed stale/fresh data when the
repo changes mid-session; modify the flow so ensureZip() is tied to s.hash (or
returns the hash) and whenever a new hash is produced compare it to s.hash and,
if different, set s.hash to the new value and invalidate s.graph (clear or
reload) before running toolDeadCode, toolBlastRadius, toolGetGraph (also apply
same fix to the other affected handlers referenced in the review), ensuring all
handlers use the same snapshot hash before proceeding.

256-257: ⚠️ Potential issue | 🟠 Major

Bad arg types silently broaden the API call.

A non-string file turns blast_radius into a repo-wide query, a bad min_confidence removes filtering, and a non-integer limit falls back to 0/all results. intArg() also truncates non-integral JSON numbers. This should fail fast instead of quietly widening the request.

Also applies to: 287-287, 452-455

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 256 - 257, The handler currently
broadens queries when arg types are wrong because it silently casts values
(e.g., min_confidence via minConfidence, limit via intArg(args, "limit"), and
file/blast_radius) — change these to validate input types and fail fast: check
that args["min_confidence"] is a string (or explicitly parse/validate a numeric
confidence) and return an error if not; ensure args["limit"] is an integer type
(reject floats or non-numeric values rather than truncating) by updating intArg
or replacing it with strict parsing that returns an error on non-integers; and
validate the file/blast_radius parameter types similarly so incorrect types do
not convert into repo-wide queries. Update the same checks used around the other
occurrences (the other min_confidence/limit/file handling sites) to return a
clear error to the caller when types are invalid.

94-100: ⚠️ Potential issue | 🟠 Major

blast_radius is still file-only on the MCP surface.

Line 95 says “file or function”, but Line 99 only exposes file, Line 287 only reads that key, and Lines 312-327 still serialize file-level fields while dropping AffectedFunctions. MCP clients still can’t request or receive the function-level impact data the API already returns.

Also applies to: 287-295, 312-327

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 94 - 100, The MCP "blast_radius" tool
advertises "file or function" but only exposes and reads the "file" property and
serializes only file-level results; update the tool InputSchema (tool name
"blast_radius" and its toolSchema Properties) to add a "function" string
property, update the request parsing code that currently reads the "file" key to
also read "function" (and pass it through to the underlying API call), and
restore/include the AffectedFunctions field when building/serializing the tool
response so MCP clients can request and receive function-level impact data.

299-309: ⚠️ Potential issue | 🟠 Major

Don’t return repo-wide hot spots for a targeted miss.

When the caller provides a specific file/target and the API returns no impacts, this branch still falls back to MostCriticalFiles, which is unrelated repo-wide data. That makes “no impact for X” look like a valid answer for X. Only use this fallback when no target was supplied.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 299 - 309, The current branch returns
repo-wide MostCriticalFiles whenever result.Impacts is empty, which surfaces
unrelated hotspots for a targeted query; update the conditional so the
MostCriticalFiles fallback is only used when no target was supplied (e.g., check
the request's target variable such as req.Target or local target string) — if
result.Impacts is empty and target is non-empty return a target-specific "no
impact" message (or the existing "No impact detected."), but if target is empty
and len(result.GlobalMetrics.MostCriticalFiles) > 0 build and return the
MostCriticalFiles string as before; adjust the logic around result.Impacts and
result.GlobalMetrics.MostCriticalFiles accordingly.
used when no target was supplied (e.g., check the request's target variable such
as req.Target or local target string) — if result.Impacts is empty and target is
non-empty return a target-specific "no impact" message (or the existing "No
impact detected."), but if target is empty and
len(result.GlobalMetrics.MostCriticalFiles) > 0 build and return the
MostCriticalFiles string as before; adjust the logic around result.Impacts and
result.GlobalMetrics.MostCriticalFiles accordingly.
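The snapshot-consistency fix in the first duplicate comment can be sketched as follows; the `server` struct fields mirror the `s.hash`/`s.graph` names the review uses, while the `ensureZip` body is a stub (the real helper zips the working tree and produces a content hash):

```go
package main

import "fmt"

// server mirrors the fields the review mentions; graph is the cached
// graph, nil when invalidated.
type server struct {
	hash  string
	graph map[string]any
}

// ensureZip is stubbed for illustration; the real helper would zip the
// repo and return a hash of the resulting snapshot.
func (s *server) ensureZip() (string, error) {
	return "deadbeef", nil
}

// refreshSnapshot ties the cached graph to the zip hash: whenever a new
// hash is produced, the cache is invalidated so every handler works
// against the same snapshot.
func (s *server) refreshSnapshot() error {
	h, err := s.ensureZip()
	if err != nil {
		return err
	}
	if h != s.hash {
		s.hash = h
		s.graph = nil // force reload from the fresh snapshot
	}
	return nil
}

func main() {
	s := &server{hash: "old", graph: map[string]any{"stale": true}}
	_ = s.refreshSnapshot()
	fmt.Println(s.hash, s.graph == nil)
}
```

Each handler (toolDeadCode, toolBlastRadius, toolGetGraph) would call refreshSnapshot before doing any work, so a repo change mid-session can never mix stale and fresh data.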

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 22d5e07c-96a4-4354-b567-6fdc25e87c53

📥 Commits

Reviewing files that changed from the base of the PR and between 3372c34 and 38ea79b.

📒 Files selected for processing (1)
  • internal/mcp/server.go


@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (2)
internal/mcp/server_test.go (1)

99-113: Consider adding a test for the "no affected files" case.

You've got TestFormatImpact_NoEntryPoints which is great. For symmetry, you might want a TestFormatImpact_NoAffectedFiles to verify that section is also omitted correctly. Same pattern:

func TestFormatImpact_NoAffectedFiles(t *testing.T) {
    result := &api.ImpactResult{
        Metadata: api.ImpactMetadata{TargetsAnalyzed: 1, TotalFiles: 50, TotalFunctions: 200},
        Impacts: []api.ImpactTarget{
            {
                Target:      api.ImpactTargetInfo{File: "src/util.ts", Type: "file"},
                BlastRadius: api.BlastRadius{DirectDependents: 2, RiskScore: "low"},
                // AffectedFiles intentionally empty
            },
        },
    }
    got := formatImpact(result)
    if strings.Contains(got, "Affected files") {
        t.Error("should not contain affected files section when none")
    }
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server_test.go` around lines 99 - 113, Add a new unit test
mirroring TestFormatImpact_NoEntryPoints to verify formatImpact omits the
"Affected files" section when there are no affected files: create
TestFormatImpact_NoAffectedFiles that builds an api.ImpactResult with one
api.ImpactTarget whose AffectedFiles slice is nil/empty (use api.ImpactResult,
api.ImpactTarget, api.ImpactMetadata as in the existing test), call
formatImpact(result) and assert the returned string does not contain "Affected
files" (t.Error if it does).
internal/mcp/server.go (1)

463-466: Helper enables a silent fallback; consider adding a strict validation variant.

The helper itself is fine for JSON number handling (numbers come in as float64). But as mentioned earlier, it silently returns 0 when the type assertion fails.

If you want to support strict validation, you could add a sibling function:

func intArgStrict(args map[string]any, key string) (int, error) {
    v, ok := args[key]
    if !ok {
        return 0, nil // key not present is OK
    }
    f, ok := v.(float64)
    if !ok {
        return 0, fmt.Errorf("%s must be a number, got %T", key, v)
    }
    return int(f), nil
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@internal/mcp/server.go` around lines 463 - 466, The current helper
intArg(args map[string]any, key string) int silently returns 0 when the type
assertion fails, which hides type errors; add a strict variant intArgStrict(args
map[string]any, key string) (int, error) that first checks presence of the key,
returns (0, nil) if absent, asserts the value to float64 and returns a
descriptive error if the assertion fails (include the key and actual type),
otherwise convert and return the int; update callers that need validation to use
intArgStrict and keep intArg for permissive code paths.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Nitpick comments:
In `@internal/mcp/server_test.go`:
- Around line 99-113: Add a new unit test mirroring
TestFormatImpact_NoEntryPoints to verify formatImpact omits the "Affected files"
section when there are no affected files: create
TestFormatImpact_NoAffectedFiles that builds an api.ImpactResult with one
api.ImpactTarget whose AffectedFiles slice is nil/empty (use api.ImpactResult,
api.ImpactTarget, api.ImpactMetadata as in the existing test), call
formatImpact(result) and assert the returned string does not contain "Affected
files" (t.Error if it does).

In `@internal/mcp/server.go`:
- Around line 463-466: The current helper intArg(args map[string]any, key
string) int silently returns 0 when the type assertion fails, which hides type
errors; add a strict variant intArgStrict(args map[string]any, key string) (int,
error) that first checks presence of the key, returns (0, nil) if absent,
asserts the value to float64 and returns a descriptive error if the assertion
fails (include the key and actual type), otherwise convert and return the int;
update callers that need validation to use intArgStrict and keep intArg for
permissive code paths.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 4ffd1626-2193-462a-ae39-6534e09bb73f

📥 Commits

Reviewing files that changed from the base of the PR and between 38ea79b and ba5eb35.

📒 Files selected for processing (2)
  • internal/mcp/server.go
  • internal/mcp/server_test.go

@jonathanpopham
Contributor Author

MCP dead_code: Before vs After

Before (main) — inline graph traversal

717 unreachable function(s):

- setExampleResponse  ()
- handleResponseStatusException  ()
- handleMissingRequestHeader  ()
- handleMissingServletRequestPart  ()
... (717 total, no files, no lines, no confidence, no reasons)

After (this PR) — real API endpoint

5 dead code candidate(s) out of 1131 total declarations:

- [high] src/.../MeteredUsage.java:9 MeteredUsage — Type/interface with no references
- [high] src/.../RequiresScopes.java:8 RequiresScopes — Type/interface with no references
- [high] src/.../SubscriptionStatusService.java:227 SubscriptionHealth — Type/interface with no references
- [high] src/dashboard/components/ui/alert.tsx:25 Alert — Exported but file never imported
- [high] src/dashboard/components/ui/alert.tsx:38 AlertTitle — Exported but file never imported

717 false-positive-heavy results → 5 high-confidence results with file:line and reasons. Same repo, same MCP protocol.

Unit tests: 6 new (MCP had 0 before). All 14 packages pass.

@jonathanpopham jonathanpopham marked this pull request as ready for review April 3, 2026 17:34
@jonathanpopham jonathanpopham merged commit 05db1cf into supermodeltools:main Apr 3, 2026
7 checks passed


Development

Successfully merging this pull request may close these issues.

feat: wire MCP server dead-code and impact tools to dedicated API endpoints
