
Need the option to mask the input and output of the LLM API in Datadog LLM observability #11179

Open
Gekko0114 opened this issue Oct 25, 2024 · 9 comments
Labels: MLObs ML Observability (LLMObs)

Comments

@Gekko0114

The input and output of the LLM API contain sensitive information, so I don't want to send them to Datadog. I would like an option to send data to Datadog with the input and output masked.

I previously raised the same question in issue #10517, where I was told to use the APM integration dashboards for the four supported providers (OpenAI/Bedrock/LangChain/Anthropic). However, I would like to monitor Gemini, and according to #10971, Gemini will not be supported in the APM integrations because LLM Observability will support it instead.

Therefore I need an option to mask the input and output of the LLM API in Datadog LLM Observability. If that's okay, I would like to create a PR for this.
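To make the request concrete, here is a minimal sketch of the kind of opt-in masking being asked for. Note that `DD_LLMOBS_MASK_IO`, `mask_io`, and the span-event dict shape are all hypothetical illustrations, not part of ddtrace's actual API:

```python
import os

# Hypothetical opt-in flag; DD_LLMOBS_MASK_IO is NOT a real ddtrace setting,
# it only illustrates the requested behavior.
MASK_IO = os.environ.get("DD_LLMOBS_MASK_IO", "false").lower() in ("1", "true")

def mask_io(span_event: dict, enabled: bool = MASK_IO) -> dict:
    """Replace the input/output fields of an (assumed) span-event dict with a
    placeholder so the raw prompt/completion text never leaves the process."""
    if enabled:
        meta = span_event.get("meta", {})
        for key in ("input", "output"):
            if key in meta:
                meta[key] = "[MASKED]"
    return span_event
```

The point of the sketch is that masking would happen before the payload is sent, so metrics (tokens, latency, model name) remain intact while the text itself is dropped.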

@Gekko0114 (Author)

Hi @Yun-Kim, gentle ping.

@Gekko0114 (Author)

Hi, it would be great if you could give me some feedback.

@Kyle-Verhoog (Member)

Hi @Gekko0114! Are you using VertexAI to use Gemini? If so, we have an integration coming in the next few weeks which will include controls to omit the input and output. Also, we're looking for people to try the integration as we develop it; would you be interested in partnering with us on this?

@Gekko0114 (Author)

> Are you using VertexAI to use Gemini? If so, we have an integration coming in the next few weeks which will include controls to omit the input and output.

Yes, I am using VertexAI. Sounds great!

> Also, we're looking for people to try the integration as we develop it, would you be interested in partnering with us for this?

No problem! What should I do?

@quinna-h quinna-h added the MLObs ML Observability (LLMObs) label Nov 1, 2024
@Kyle-Verhoog (Member)

Awesome! @Gekko0114, are you in our public Slack? If not, could you join? We can follow up with you once we have a build ready for you to try.

@Gekko0114 (Author) commented Nov 5, 2024

Hi @Kyle-Verhoog,
Thanks, I've just joined the Slack, but I couldn't find you because the workspace has a lot of people named Kyle. Which channel should I join? And could you let me know your account name?

@yj-ang commented Nov 5, 2024

Not sure if this is off topic. I tried setting the env var DD_LANGCHAIN_SPAN_PROMPT_COMPLETION_SAMPLE_RATE="0.0" as stated in the LangChain integration docs, but all the content (input and output) was still visible under the LLM Obs Traces dashboard.

I want to collect the metrics but hide the input completely, or have an option to mask sensitive information like phone numbers and emails. I wonder if we could get the flexibility to control that.
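For what it's worth, the kind of masking described here (hide phone numbers and emails but keep the metrics) can be prototyped outside the tracer with a plain regex pass over the prompt/completion text before it is handed to any instrumentation. This is only a sketch: `redact` and its patterns are hypothetical helpers, not part of ddtrace, and real PII detection needs more robust patterns than these.

```python
import re

# Illustrative patterns only; production PII detection needs more care.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(text: str, placeholder: str = "[REDACTED]") -> str:
    """Replace email addresses and phone-number-like runs with a placeholder."""
    text = EMAIL_RE.sub(placeholder, text)
    text = PHONE_RE.sub(placeholder, text)
    return text
```

Applying something like this to the prompt before the LLM call (and to the completion before logging) hides the sensitive strings while leaving token counts and latency unaffected, though a first-class option in the library would obviously be preferable.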

@Kyle-Verhoog (Member)

@Gekko0114 you should be able to find me under "Kyle Verhoog", send me a message!

@Kyle-Verhoog (Member)

@yj-ang this sounds like a bug 🤔. I will do some investigation and get back to you.
