Need the option to mask the input and output of the LLM API in Datadog LLM observability #11179
The input and output of the LLM API contain sensitive information, so I don't want to send them to Datadog. I would like an option to send data to Datadog with the input and output masked.

Previously I raised the same question in issue #10517, where I was told that the APM integration dashboards should be used for the four supported providers (OpenAI/Bedrock/LangChain/Anthropic). However, I would like to monitor Gemini, and according to #10971 Gemini will not be supported in the APM integrations because LLM Observability will support it instead. Therefore I need an option to mask the input and output of the LLM API in Datadog LLM Observability. If it's okay, I would like to create a PR for this.

Comments

@Yun-Kim:
Hi @Gekko0114! Are you using VertexAI to use Gemini? If so, we have an integration coming in the next few weeks which will include controls to omit the input and output. We are also looking for people to try the integration as we develop it; would you be interested in partnering with us on this?

@Gekko0114:
Hi @Yun-Kim, yes, I am using VertexAI. Sounds great, no problem! What should I do?

@Kyle-Verhoog:
Awesome, @Gekko0114, are you in our public Slack? If not, could you join so we can follow up with you once we have a build ready for you to try?

@Kyle-Verhoog:
@Gekko0114 you should be able to find me under "Kyle Verhoog"; send me a message!

@yj-ang:
Hi @Kyle-Verhoog, not sure if this is off topic. I tried setting the env, and I want to collect the metrics but hide the input completely, or have an option to mask sensitive information such as phone numbers and email addresses. I wonder if we could get the flexibility to control that.

@Kyle-Verhoog:
@yj-ang this sounds like a bug 🤔. I will do some investigation and get back to you.
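To illustrate the kind of masking option being requested, here is a minimal sketch of a user-supplied scrubbing hook that redacts emails and phone-number-like strings from a span's input and output before they would be submitted to an observability backend. This is purely hypothetical; none of these names or hooks come from ddtrace or the Datadog SDK.

```python
import re

# Illustrative patterns only; real PII detection needs more care.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def mask_text(text: str, placeholder: str = "[REDACTED]") -> str:
    """Replace emails and phone-number-like substrings with a placeholder."""
    text = EMAIL_RE.sub(placeholder, text)
    text = PHONE_RE.sub(placeholder, text)
    return text

def mask_io(record: dict) -> dict:
    """Mask the 'input' and 'output' fields of a span-like record
    before it is sent anywhere (hypothetical hook, not a ddtrace API)."""
    masked = dict(record)
    for key in ("input", "output"):
        if isinstance(masked.get(key), str):
            masked[key] = mask_text(masked[key])
    return masked

# Example:
# mask_io({"input": "Call me at +1 555-123-4567", "output": "ok"})
# leaves "output" untouched and redacts the phone number in "input".
```

A built-in option like the one requested could either drop the input/output entirely (as mentioned for the upcoming VertexAI integration) or accept a callback like `mask_io` so users control exactly what gets redacted.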