Langchain Vertex integration #4585

@CharlesAntoineParent

Description

How do you use Sentry?

Sentry Saas (sentry.io)

Version

2.33.0

Steps to Reproduce

Integration problem with langchain for Vertex AI models

I noticed that all my calls to Vertex AI models are missing token usage. Looking at your implementation, token counting falls back to tiktoken, which only provides OpenAI encoders. The integration also reads langchain's ChatResponse.llm_output attribute to get token_usage, and that attribute appears to be empty for all Vertex AI responses.
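For reference, here is a minimal sketch of how the usage extraction could fall back to per-generation metadata when llm_output is empty. The function name `extract_token_usage` and the plain-dict shapes are hypothetical; the `usage_metadata` field with `input_tokens`/`output_tokens`/`total_tokens` keys reflects how newer langchain-core message schemas report usage for Gemini/Vertex AI responses, and may differ across versions.

```python
# Hedged sketch: provider-agnostic token usage extraction.
# `usage_metadata` on generations is an assumption based on newer
# langchain-core message schemas; exact field names may vary by version.

def extract_token_usage(llm_output, generations):
    """Prefer llm_output["token_usage"] (OpenAI-style), then fall back to
    per-generation usage_metadata (where Vertex AI responses report usage)."""
    usage = (llm_output or {}).get("token_usage")
    if usage:
        return {
            "input": usage.get("prompt_tokens", 0),
            "output": usage.get("completion_tokens", 0),
            "total": usage.get("total_tokens", 0),
        }
    for gen in generations:
        meta = gen.get("usage_metadata")
        if meta:
            return {
                "input": meta.get("input_tokens", 0),
                "output": meta.get("output_tokens", 0),
                "total": meta.get("total_tokens", 0),
            }
    # Neither source present: report zeros, as the integration does today.
    return {"input": 0, "output": 0, "total": 0}


# Simulated Vertex AI response: llm_output is empty, usage lives on the message.
print(extract_token_usage(
    llm_output={},
    generations=[{"usage_metadata": {"input_tokens": 3, "output_tokens": 7, "total_tokens": 10}}],
))  # → {'input': 3, 'output': 7, 'total': 10}
```

With logic like this, the OpenAI path keeps working unchanged while Vertex AI responses stop reporting zeros.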

import json

import sentry_sdk
from google.oauth2.service_account import Credentials
from langchain_google_vertexai import ChatVertexAI


sentry_sdk.init(dsn=YOUR_DSN, traces_sample_rate=1.0, send_default_pii=True)

# Load service-account credentials for Vertex AI
with open(YOUR_PATH_TO_CRED) as f:
    creds = json.load(f)
vertex_ai_credentials = Credentials.from_service_account_info(creds)
model = ChatVertexAI(
    model_name="gemini-2.0-flash-001",
    credentials=vertex_ai_credentials,
)

with sentry_sdk.start_transaction(op="llm_call", name="test_vertex_ai") as transaction:
    model.invoke("hello world")

Expected Result

Token usage should be available in the chat span

Actual Result

The token usage values recorded on the span (input, output, and total) are all 0.
