Currently, the wp-ai-client library does not provide a way to retrieve token usage information (e.g. prompt tokens, completion tokens, total tokens) from API responses.
Having access to token usage is important for:
- Monitoring and controlling costs
- Debugging and optimizing prompts
- Displaying usage metrics in WordPress dashboards or logs
## Request
Please consider exposing token usage data returned by the underlying AI provider (when available) as part of the response object, or through an optional accessor/helper method.
For example:
- `usage.prompt_tokens`
- `usage.completion_tokens`
- `usage.total_tokens`
## Expected behavior
- Token usage should be returned in a consistent, documented structure.
- If a provider does not support token usage, the field should be `null` or omitted gracefully.
- This change should be backward-compatible with existing implementations.
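To make the request concrete, here is a minimal sketch of what such a structure could look like. Every class, property, and method name below (`Usage`, `from_provider_array`, etc.) is an assumption for illustration only, not the library's actual API; it simply shows how missing provider data could degrade to `null` while keeping the shape consistent:

```php
<?php
// Hypothetical sketch only: a Usage value object that wp-ai-client could
// attach to its response objects. Names are assumptions, not the real API.
final class Usage {
	public function __construct(
		public readonly ?int $prompt_tokens = null,
		public readonly ?int $completion_tokens = null,
		public readonly ?int $total_tokens = null,
	) {}

	/**
	 * Build from a raw provider payload; fields the provider does not
	 * report simply stay null, keeping the structure backward-compatible.
	 */
	public static function from_provider_array( array $data ): self {
		return new self(
			$data['prompt_tokens'] ?? null,
			$data['completion_tokens'] ?? null,
			$data['total_tokens'] ?? null,
		);
	}
}

// Example: an OpenAI-style usage payload maps directly onto the object.
$usage = Usage::from_provider_array( array(
	'prompt_tokens'     => 12,
	'completion_tokens' => 34,
	'total_tokens'      => 46,
) );

// Example: a provider that reports nothing yields all-null fields.
$empty = Usage::from_provider_array( array() );
```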
## Additional context
Many AI SDKs (e.g. OpenAI, Azure OpenAI) already return token usage metadata, and exposing it in wp-ai-client would make the library more useful for production environments where cost awareness is critical.