Conversation

@andreibondarev (Collaborator)

No description provided.

@andreibondarev (Collaborator, Author) commented Oct 24, 2024

@bborn Okay, I re-packaged it into a small, simple PR. I'm thinking of adding instructions to the README, something like:

```ruby
# Execute the Assistant.
assistant.add_message_and_run! content: "What is 2+2?"
actual_output = assistant.messages.last.content

# Compare its output to the expected output.
Langchain::Evals::CosineSimilarity.new(llm).score(actual_output: actual_output, expected_output: "4")
```

(^^^ Need a better example, of course.)

@sergiobayona (Collaborator)

@andreibondarev is this only missing the README update?

@andreibondarev andreibondarev requested a review from Copilot April 17, 2025 18:21

Copilot AI left a comment


Pull Request Overview

This pull request introduces the Langchain::Evals::CosineSimilarity class to compute a similarity score between two output strings by comparing their embedding vectors. The implementation uses an underlying LLM to embed the texts and leverages a cosine similarity utility, with accompanying specifications to ensure correct behavior.

  • Adds a new CosineSimilarity class that computes similarity using embeddings.
  • Provides an RSpec test to verify that a score of 1.0 is returned when embeddings are identical.
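For context, a class like this typically embeds both strings and applies the standard cosine-similarity formula, dot(a, b) / (|a| · |b|). A minimal sketch of that idea (the class name, `#embed` call, and response shape here are assumptions for illustration, not the PR's actual code):

```ruby
# Sketch: score two strings by the cosine similarity of their embedding
# vectors. The llm collaborator is assumed to respond to #embed(text:) and
# return an object exposing an #embedding array (hypothetical interface).
class CosineSimilarityEval
  def initialize(llm)
    @llm = llm
  end

  def score(actual_output:, expected_output:)
    a = @llm.embed(text: actual_output).embedding
    b = @llm.embed(text: expected_output).embedding
    cosine(a, b)
  end

  private

  # Cosine similarity: dot(a, b) / (|a| * |b|)
  def cosine(a, b)
    dot = a.zip(b).sum { |x, y| x * y }
    mag = ->(v) { Math.sqrt(v.sum { |x| x * x }) }
    dot / (mag.call(a) * mag.call(b))
  end
end

# Usage with a stubbed embedder returning a fixed unit vector:
FakeResponse = Struct.new(:embedding)
fake_llm = Object.new
def fake_llm.embed(text:) = FakeResponse.new([1.0, 0.0, 0.0])

puts CosineSimilarityEval.new(fake_llm).score(actual_output: "4", expected_output: "4")
# identical embeddings => 1.0
```

Injecting the LLM this way keeps the metric independent of any particular embedding provider, which is also what makes the spec below easy to stub.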

Reviewed Changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

| File | Description |
| --- | --- |
| spec/langchain/evals/cosine_similarity_spec.rb | New test file verifying the cosine similarity computation. |
| lib/langchain/evals/cosine_similarity.rb | Implements the cosine similarity calculation using LLM embeddings. |

Comments suppressed due to low confidence (1)

spec/langchain/evals/cosine_similarity_spec.rb:11

  • Consider adding tests with varied embedding values to verify that the cosine similarity calculation works correctly across different scenarios.

```ruby
allow(subject.llm).to receive(:embed).and_return(double("Langchain::LLM::OpenAIResponse", embedding: [1, 0, 0]))
```
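The extra cases the reviewer suggests would pin down the expected math for non-identical embeddings. A plain-Ruby illustration of those expectations (an inline cosine helper as an assumption, not the PR's spec code):

```ruby
# Cosine similarity of two vectors: dot(a, b) / (|a| * |b|)
def cosine(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  mag = ->(v) { Math.sqrt(v.sum { |x| x * x }) }
  dot / (mag.call(a) * mag.call(b))
end

puts cosine([1, 0, 0], [1, 0, 0])   # identical vectors  => 1.0
puts cosine([1, 0, 0], [0, 1, 0])   # orthogonal vectors => 0.0
puts cosine([1, 0, 0], [-1, 0, 0])  # opposite vectors   => -1.0
```

In spec terms, each case would stub `#embed` to return the two vectors in turn and assert the corresponding score.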



Successfully merging this pull request may close these issues:

  • Add SemanticSimilarity metric into the Langchain::Evals::Ragas class namespace

4 participants