In Testbed, asking 50 Q&A with text-embedding-3-small, I get this exception server-side for each generated Q&A:
15:44:50 - LiteLLM:INFO: cost_calculator.py:588 - selected model name for cost calculation: openai/gpt-4o-mini-2024-07-18
15:44:50 - LiteLLM:ERROR: litellm_logging.py:3538 - Error creating standard logging object - __annotations__
Traceback (most recent call last):
  File "/Users/cdebari/Documents/GitHub/oaim-sandbox-cdb/src/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/litellm_logging.py", line 3520, in get_standard_logging_object_payload
    model_parameters=ModelParamHelper.get_standard_logging_model_parameters(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cdebari/Documents/GitHub/oaim-sandbox-cdb/src/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 28, in get_standard_logging_model_parameters
    ModelParamHelper._get_relevant_args_to_use_for_logging()
  File "/Users/cdebari/Documents/GitHub/oaim-sandbox-cdb/src/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 45, in _get_relevant_args_to_use_for_logging
    all_openai_llm_api_params = ModelParamHelper._get_all_llm_api_params()
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cdebari/Documents/GitHub/oaim-sandbox-cdb/src/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 65, in _get_all_llm_api_params
    ModelParamHelper._get_litellm_supported_transcription_kwargs()
  File "/Users/cdebari/Documents/GitHub/oaim-sandbox-cdb/src/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/model_param_helper.py", line 126, in _get_litellm_supported_transcription_kwargs
    return set(TranscriptionCreateParams.__annotations__.keys())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cdebari/miniconda3/lib/python3.11/typing.py", line 1288, in __getattr__
    raise AttributeError(attr)
AttributeError: __annotations__
2025-Mar-27 15:44:52 - INFO - (httpx): HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
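For what it's worth, the last frame is typing.py's `__getattr__`, which suggests that at this point `TranscriptionCreateParams` is not a plain TypedDict but a typing construct, presumably a `Union` of the streaming/non-streaming transcription param TypedDicts in newer openai releases (I haven't confirmed which version introduced that). A bare `Union` has no `__annotations__`, which would explain the AttributeError. Below is a minimal sketch of that failure mode and a defensive way to collect the keys; the TypedDict names are invented for illustration and are not the real openai classes, and this is not litellm's actual code:

```python
# Sketch of the suspected failure mode, assuming TranscriptionCreateParams is a
# typing.Union of two TypedDicts in newer openai releases. The "Fake*" names
# below are invented for illustration only.
from typing import TypedDict, Union, get_args


class FakeTranscriptionParamsNonStreaming(TypedDict, total=False):
    model: str
    file: bytes


class FakeTranscriptionParamsStreaming(TypedDict, total=False):
    model: str
    stream: bool


FakeTranscriptionCreateParams = Union[
    FakeTranscriptionParamsNonStreaming, FakeTranscriptionParamsStreaming
]

# Reproduces the error in the log: a bare Union has no __annotations__,
# so typing's __getattr__ raises AttributeError("__annotations__").
try:
    set(FakeTranscriptionCreateParams.__annotations__.keys())
except AttributeError as exc:
    print("reproduced:", exc)

# Defensive alternative: walk the Union members (if any) and merge their keys.
members = get_args(FakeTranscriptionCreateParams) or (FakeTranscriptionCreateParams,)
keys = set()
for member in members:
    keys |= set(getattr(member, "__annotations__", {}).keys())
print(sorted(keys))  # ['file', 'model', 'stream']
```

If that reading is right, this looks like a version-compatibility issue between the installed openai and litellm packages rather than something in the Testbed code; pinning openai to an older release, or upgrading litellm to a version that handles the Union, would presumably avoid it, though I haven't verified which versions are affected.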