fix: avoid ralph crash after a task is completed #256

frankbria merged 1 commit into frankbria:main
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@ralph_loop.sh`:
- Around line 1599-1601: The existing comment block that begins with "# Extract
session ID from stream-json output for session continuity" is stale; update it
to state that this block always runs in live mode to normalize NDJSON for
downstream processing, that it still writes the full stream to _stream.log and
extracts the session_id from the final "result" message, and remove any wording
implying it is only for session continuity; ensure the new comment briefly
explains why normalization is needed and what outputs are produced (normalized
NDJSON and _stream.log) so future readers understand the current behavior.
# Extract session ID from stream-json output for session continuity
# Stream-json format has session_id in the final "result" type message
# Keep full stream output in _stream.log, extract session data separately
Update stale comment to reflect new behavior.
The comment still frames this block as session-continuity-only, but it now always runs in live mode to normalize NDJSON for downstream processing.
✏️ Suggested comment update
- # Extract session ID from stream-json output for session continuity
- # Stream-json format has session_id in the final "result" type message
- # Keep full stream output in _stream.log, extract session data separately
+ # Normalize stream-json NDJSON into a single result JSON object for downstream analysis
+ # and keep the full stream in _stream.log for debugging/session fallback.
+ # Session ID is extracted from result, with system-message fallback on truncation.

As per coding guidelines, "Update inline comments in bash scripts immediately when implementation changes, remove outdated comments".
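The behavior the updated comment describes can be sketched roughly as follows. This is an illustrative reconstruction, not the actual `ralph_loop.sh` code: the file name, the grep/sed approach, and the sample stream contents are all assumptions.

```shell
# Sketch: take session_id from the final "result" message, falling back
# to the initial "system" message if the stream was truncated.
stream_log="demo_stream.log"
cat > "$stream_log" <<'EOF'
{"type":"system","session_id":"abc-123"}
{"type":"stream_event","event":"content_block_delta"}
{"type":"result","session_id":"abc-123","result":"done"}
EOF

session_id=$(grep '"type":"result"' "$stream_log" | tail -n 1 |
  sed -n 's/.*"session_id":"\([^"]*\)".*/\1/p')
if [ -z "$session_id" ]; then
  # Truncated stream: the final "result" message never arrived.
  session_id=$(grep -m1 '"type":"system"' "$stream_log" |
    sed -n 's/.*"session_id":"\([^"]*\)".*/\1/p')
fi
echo "$session_id"   # prints: abc-123
```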
fix: extract result JSON from stream output regardless of SESSION_CONTINUITY
Problem

When `SESSION_CONTINUITY=false` and `LIVE_OUTPUT=true` (live mode), Ralph crashes silently during `analyze_response` after each successful Claude Code execution. The loop ends at "Analyzing Claude Code response..." with no error output, and the Ralph process dies. Status JSON is never updated (stuck at `"status": "running"`), exit signals are never written, and Ralph never proceeds to the next loop.

Root Cause
In `ralph_loop.sh` at line ~1602, the stream-json result extraction is guarded by a `CLAUDE_USE_CONTINUE` check. When `SESSION_CONTINUITY=false`, this entire block is skipped. The block does two things:

- Saves session data for `--continue` (only needed when `SESSION_CONTINUITY=true`)
- Converts `output_file` from raw NDJSON to a single result JSON line (needed by `analyze_response` always)

Without the extraction, `analyze_response` receives the full multi-MB NDJSON stream (every `stream_event`, `content_block_delta`, etc.) instead of the clean `{"type":"result", ...}` object. This causes a silent crash during response parsing.

Reproduction
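For illustration, the normalization the skipped block performs might look like this. A minimal sketch under assumptions: the file names and the sample stream contents are invented, and the real script's extraction logic may differ.

```shell
# Keep the full NDJSON stream as a _stream.log backup, then reduce
# output_file to the single {"type":"result", ...} line that
# analyze_response expects.
output_file="claude_output.json"
cat > "$output_file" <<'EOF'
{"type":"system","session_id":"abc-123"}
{"type":"stream_event","event":"content_block_delta"}
{"type":"result","session_id":"abc-123","result":"task complete"}
EOF

cp "$output_file" "${output_file%.json}_stream.log"   # full stream backup
grep '"type":"result"' "${output_file%.json}_stream.log" |
  tail -n 1 > "$output_file"                          # single result object
```

Note that the single-line extraction reads from the backup, not from the file being rewritten, so the original stream is never lost.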
- `SESSION_CONTINUITY=false` in `.ralphrc`
- `ralph --live --verbose`
- `_stream.log` backup is created

Fix
Remove the `CLAUDE_USE_CONTINUE` guard so the result extraction always runs in live mode.

This is safe because:
- Saving session data is harmless when `SESSION_CONTINUITY=false`; it's only read back when `CLAUDE_USE_CONTINUE=true` (line ~1456)
- The `_stream.log` backup is useful for debugging regardless of session mode
- `analyze_response` and `save_claude_session` both expect a single JSON object, not raw NDJSON (as noted in the code comment at line ~1616: "save_claude_session and analyze_response expect JSON format")

Impact
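The resulting control flow can be sketched as below. This is a hypothetical reconstruction of the shape of the fix; the variable names, file names, and the `.ralph_session` path are illustrative assumptions, not code from `ralph_loop.sh`.

```shell
output_file="out.json"
printf '%s\n' '{"type":"result","session_id":"s1"}' > "$output_file"
CLAUDE_USE_CONTINUE=false

# Always: back up the stream and normalize to a single result object,
# regardless of session mode (this is the part the old guard skipped).
cp "$output_file" "${output_file%.json}_stream.log"
grep '"type":"result"' "${output_file%.json}_stream.log" |
  tail -n 1 > "$output_file"

# Only with continuity: persist the session ID for a later --continue run.
if [ "$CLAUDE_USE_CONTINUE" = "true" ]; then
  sed -n 's/.*"session_id":"\([^"]*\)".*/\1/p' "$output_file" > .ralph_session
fi
```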
Without this fix, any user running `SESSION_CONTINUITY=false` with `--live` will have Ralph crash after every successful Claude execution. Ralph completes the task (commits are made, tests pass) but can never proceed to the next task or exit gracefully.