
Annotations #847


Draft: simonw wants to merge 12 commits into main

Conversation

simonw (Owner) commented Mar 23, 2025

TODO:

  • Documentation for execute() plugin authors (not yet implemented)
  • Handle model .execute() methods that yield Chunk in addition to str (see the sketch after this list)
  • Spin up a dummy model with annotations that I can start writing tests against
  • Implement Response.chunks() method described here for Python API, with docs
  • llm prompt should display annotations correctly as they are yielded by the new .execute() method
  • Redo the work to add annotation display to llm logs - previous prototype is here.
  • llm-openai-plugin to use annotations for web_search tool
  • llm-anthropic plugin to use annotations for Claude citations
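Roughly the shape I have in mind for an execute() method that yields a mix of plain strings and Chunk objects - not the final API; the Chunk constructor, its annotation field and the annotations-demo model ID are all placeholders:

import llm


class Chunk:
    # Placeholder for the Chunk class this PR introduces - the real
    # constructor and fields are still to be settled.
    def __init__(self, text, annotation=None):
        self.text = text
        self.annotation = annotation  # e.g. a citation or web_search result

    def __str__(self):
        # Lets plain-text consumers treat a Chunk exactly like a str
        return self.text


class AnnotationsModel(llm.Model):
    model_id = "annotations-demo"  # placeholder model ID

    def execute(self, prompt, stream, response, conversation):
        # Plain strings keep streaming exactly as before...
        yield "Here is an answer with a source: "
        # ...but a chunk of output can now carry an annotation
        yield Chunk(
            "the answer is 42.",
            annotation={"type": "citation", "url": "https://example.com/source"},
        )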

simonw (Owner, Author) commented Mar 23, 2025

I hope I got the type handling right in 2ce2510 - which covers what happens when you do for chunk in model.prompt(...) and the execute() method yields either str or Chunk objects. Since Chunk has a __str__() method, I hope existing code continues to work OK.
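To make that concrete, here is a minimal consumer-side sketch - annotations-demo and the chunk.annotation attribute are placeholders, but str(chunk) works for both types thanks to __str__():

import llm

model = llm.get_model("annotations-demo")  # placeholder model ID

for chunk in model.prompt("Tell me something with citations"):
    # chunk may be a plain str or a Chunk - str() works for both
    print(str(chunk), end="")

    # Annotation-aware code can opt in to the extra data
    annotation = getattr(chunk, "annotation", None)  # placeholder attribute
    if annotation is not None:
        print(f"\n[annotation: {annotation}]")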

simonw (Owner, Author) commented Mar 23, 2025

Spin up a dummy model with annotations that I can start writing tests against

Idea: I could have this model class live in the llm source code but NOT be registered as a model. Then the tests could register it with a temporary plugin but I could also build an llm-debug plugin that, when installed, registers the model for real as well. This would make it easier for me to test things interactively with the llm CLI.
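Sketch of what the test side could look like - this assumes llm's pluggy manager stays importable from llm.plugins, that llm/examples.py exists, and that the example model gets a model_id along the lines of annotations-demo (all placeholders for now):

import pytest
import llm
from llm.plugins import pm
from llm.examples import AnnotationsModel  # lives in llm but is not registered


class TemporaryAnnotationsPlugin:
    # In-test plugin that registers the example model via the normal hook
    @llm.hookimpl
    def register_models(self, register):
        register(AnnotationsModel())


@pytest.fixture
def annotations_model():
    plugin = TemporaryAnnotationsPlugin()
    pm.register(plugin, name="temporary-annotations-plugin")
    try:
        yield llm.get_model("annotations-demo")  # placeholder model ID
    finally:
        pm.unregister(name="temporary-annotations-plugin")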

simonw (Owner, Author) commented Mar 23, 2025

I have that llm-debug plugin now:

import llm
from llm.examples import Markov, AnnotationsModel, AnnotationsModelAsync

@llm.hookimpl
def register_models(register):
    register(Markov())
    register(AnnotationsModel(), AnnotationsModelAsync())

Can't package and ship it until I have at least an alpha with llm/examples.py in it.
