'charmap' codec can't decode byte 0x81 in position 1980: character maps to <undefined> #433
Hey @Damil001, thanks for reporting this! That error happens because Python is using your system's default text encoding (on Windows that's often cp1252), and it ran into a byte that encoding can't handle. Quick fix: wherever you have

```python
with open(path, "r") as f:
    data = json.load(f)
```

change it to

```python
with open(path, "r", encoding="utf-8") as f:
    data = json.load(f)
```

That forces UTF-8 decoding and avoids the "charmap" error. I'm also rolling up a small PR right now to add …
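For anyone curious why cp1252 in particular chokes here: byte 0x81 is one of a handful of bytes that Windows cp1252 leaves undefined, while it appears routinely inside multi-byte UTF-8 sequences. A minimal reproduction (illustrative only, not from the project's code):

```python
# 'ā' (U+0101) encodes to the two UTF-8 bytes C4 81.
raw = "ā".encode("utf-8")
print(raw)                  # b'\xc4\x81'
print(raw.decode("utf-8"))  # round-trips fine as UTF-8

try:
    raw.decode("cp1252")    # 0x81 has no mapping in cp1252
except UnicodeDecodeError as exc:
    print(exc)              # 'charmap' codec can't decode byte 0x81 ...
```

So the file contents are fine; it's the decoder choice that fails, which is why passing `encoding="utf-8"` fixes it.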
Thanks. Please share your PR and we will review it once ready.
@hangfei I've just opened PR #439 to ensure all … Please take a look when you get a chance. Thanks!
Hi @AlankritVerma01, I just wanted to clarify the issue: the agent works fine when I use a Gemini model, but when I swap in another LLM via a LiteLlm object I get this error. I've been stuck on this for two days now; any help would be much appreciated. Thanks
@Damil001 thanks for the additional context! I'm not entirely sure what's going on yet. Could you share a minimal snippet of the code you're running (including how you instantiate and call LiteLlm) that reproduces this decode error? It'd also help to see exactly what the JSON reader in lite_llm.py is seeing: can you print out the raw bytes (or their repr) around lines 131–132? And just to double-check: you've already pulled in the PR that adds encoding="utf-8" to all our open() calls and reinstalled, but the problem persists? If so, we may need to dig into whether LiteLLM itself is doing any file reads (or string handling) under the hood that aren't using UTF-8. Let me know what you find and we'll take it from there!
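One way to do that byte-level inspection (a sketch; `dump_bytes` and the throwaway sample file are mine, not part of ADK): open the file in binary mode so no codec runs at all, then look at the repr of the bytes around the failing offset.

```python
import os
import tempfile

def dump_bytes(path: str, pos: int, window: int = 10) -> bytes:
    """Return the raw bytes around `pos`, with no decoding applied."""
    with open(path, "rb") as f:  # binary mode: no codec, so no charmap error
        raw = f.read()
    return raw[max(0, pos - window):pos + window]

# Example: a throwaway file with the offending 0x81 byte at offset 4.
fd, tmp = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"abc \x81 def")
print(repr(dump_bytes(tmp, 4)))  # the \x81 byte shows up in the repr
os.remove(tmp)
```

For the real case you'd call `dump_bytes(path, 1980)`, since the traceback reports position 1980.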
Whenever I try to use LlmAgent with LiteLlm I get this error:
'charmap' codec can't decode byte 0x81 in position 1980: character maps to <undefined>
I have pasted the code below; any help would be much appreciated.
```python
import os
import asyncio
import warnings
import logging

from google.adk.agents import Agent
from google.adk.agents import LlmAgent
from google.adk.models.lite_llm import LiteLlm  # For multi-model support
from google.adk.sessions import InMemorySessionService
from google.adk.runners import Runner
from google.genai import types  # For creating message Content/Parts

# Ignore all warnings
warnings.filterwarnings("ignore")

# Configure logging
logging.basicConfig(level=logging.ERROR)

# Set API keys
os.environ['OPENAI_API_KEY'] = "My key here"

print("API Keys Set:")
print(f"Google API Key set: {'Yes' if os.environ.get('GOOGLE_API_KEY') and os.environ['GOOGLE_API_KEY'] != 'YOUR_GOOGLE_API_KEY' else 'No (REPLACE PLACEHOLDER!)'}")
print(f"OpenAI API Key set: {'Yes' if os.environ.get('OPENAI_API_KEY') and os.environ['OPENAI_API_KEY'] != 'YOUR_OPENAI_API_KEY' else 'No (REPLACE PLACEHOLDER!)'}")
print(f"Anthropic API Key set: {'Yes' if os.environ.get('ANTHROPIC_API_KEY') and os.environ['ANTHROPIC_API_KEY'] != 'YOUR_ANTHROPIC_API_KEY' else 'No (REPLACE PLACEHOLDER!)'}")

# Configure ADK to use API keys directly (not Vertex AI for this multi-model setup)
os.environ["GOOGLE_GENAI_USE_VERTEXAI"] = "False"

# Models
MODEL_GEMINI_2_0_FLASH = "gemini-2.0-flash"
MODEL_GPT_4O = "openai/gpt-4o"
MODEL_CLAUDE_SONNET = "anthropic/claude-3-sonnet-20240229"

print("\nEnvironment configured.")

# Define tool
def get_weather(city: str) -> dict:
    """Retrieves the current weather report for a specified city."""
    # Stub body so the snippet runs as pasted; the original was truncated here.
    return {"status": "success", "report": f"The weather in {city} is sunny."}

# Example tool usage
print(get_weather("Mumbai"))

# Set up agent
AGENT_MODEL = MODEL_GPT_4O
weather_agent = LlmAgent(
    name="weather_agent_v1",
    model=LiteLlm(AGENT_MODEL),
    description="Provides weather information for specific cities.",
    instruction=(
        "You are a helpful weather assistant. Your primary goal is to provide current weather reports. "
        "When the user asks for the weather in a specific city, "
        "you MUST use the 'get_weather' tool to find the information. "
        "Analyze the tool's response: if the status is 'error', inform the user politely about the error message. "
        "If the status is 'success', present the weather 'report' clearly and concisely to the user. "
        "Only use the tool when a city is mentioned for a weather request."
    ),
    tools=[get_weather],
)
print(f"Agent '{weather_agent.name}' created using model '{AGENT_MODEL}'.")
```