feat: debug log in LLM response#606

Open
afjcjsbx wants to merge 1 commit into sipeed:main from afjcjsbx:feat/llm-debug-response
Conversation

@afjcjsbx
📝 Description

Added a debug log inside the agent loop (runLLMIteration) that captures and prints the complete LLM response. The response payload is parsed and pretty-printed as indented JSON for clear, structured readability.
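A minimal sketch of the pretty-printing step described above, assuming the response arrives as raw JSON bytes. The `prettyJSON` helper name is hypothetical (the PR diff is not shown here); `encoding/json`'s `json.Indent` is used because it re-indents without reordering keys, and the helper falls back to the raw string on malformed input so the debug line is never lost.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// prettyJSON re-indents a raw JSON payload for debug logging.
// On parse failure it returns the input unchanged, so a malformed
// response still shows up in the log instead of being dropped.
// NOTE: helper name is illustrative, not taken from the actual diff.
func prettyJSON(raw []byte) string {
	var buf bytes.Buffer
	if err := json.Indent(&buf, raw, "", "  "); err != nil {
		return string(raw)
	}
	return buf.String()
}

func main() {
	// Stand-in for the provider response received in runLLMIteration;
	// in the agent loop this would feed a logger.DebugCF call.
	resp := []byte(`{"content":"hi","finish_reason":"stop","usage":{"total_tokens":12}}`)
	fmt.Println(prettyJSON(resp))
}
```

Because the helper only formats when debug logging is active, the extra unmarshal/indent cost is paid solely in debug mode.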

🗣️ Type of Change

  • 🐞 Bug fix (non-breaking change which fixes an issue)
  • ✨ New feature (non-breaking change which adds functionality)
  • 📖 Documentation update
  • ⚡ Code refactoring (no functional changes, no api changes)

🤖 AI Code Generation

  • 🤖 Fully AI-generated (100% AI, 0% Human)
  • 🛠️ Mostly AI-generated (AI draft, Human verified/modified)
  • 👨‍💻 Mostly Human-written (Human lead, AI assisted or none)

🔗 Related Issue

📚 Technical Context (Skip for Docs)

  • Reference URL:
  • Reasoning: A debug-level log already shows the full content of the call to the LLM (the messages and tools sent), but nothing tracked the response received from the LLM. This addition helps developers who need to debug exactly what the model replied (tool arguments, formatting, internal reasoning) to better understand the assistant's behavior and decisions. Since it uses logger.DebugCF, the log is printed to the console only when debug mode is active, keeping the terminal clean during standard usage.

🧪 Test Environment

Hardware: Apple Silicon Mac
OS: Tahoe 26.3
Model/Provider: glm-4.7-flash
Channels: Telegram

Logs/Screenshots

Example output with debug mode enabled:

2026/02/21 17:54:46 [2026-02-21T16:54:46Z] [DEBUG] agent: Full LLM response {content={
  "content": "Based on the available forecast data for Rome, tomorrow (February 22, 2026) is expected to have:\n\n**🌡️ Temperature:**\n- High: 15-18°C (59-64°F)\n- Low: 3-6°C (38-42°F)\n\n**🌧️ Conditions:**\n- Rain is likely due to unusual weather patterns in February 2026\n- Record rainfall has been occurring daily following an exceptionally wet January\n- Pack an umbrella and warm clothing\n\n**💡 Tip:** Since Rome has been experiencing unusual weather patterns this February with near-daily downpours, you should be prepared for rain tomorrow. The temperatures will be mild during the day but quite cool in the evenings.\n\nFor the most precise and up-to-date forecast closer to tomorrow, I'd recommend checking AccuWeather, Weather.com, or the Italian Meteorological Service (Protezione Civile) directly, as forecast accuracy improves within 24-48 hours. 🌤️",
  "finish_reason": "stop",
  "usage": {
    "prompt_tokens": 10581,
    "completion_tokens": 280,
    "total_tokens": 10861
  }
}, agent_id=main, iteration=6}

☑️ Checklist

  • My code/docs follow the style of this project.
  • I have performed a self-review of my own changes.
  • I have updated the documentation accordingly.

@afjcjsbx afjcjsbx changed the title feat: debug log in llm response feat: debug log in LLM response Feb 21, 2026