Feature: llamacpp support + strict collection naming (optional) #224
Hi! I'm testing a few models and providers for my projects and making some contributions to the claude-context repo.
Maybe these features will be useful for your projects too.
Thanks for your work!
Below is a description prepared by Claude Code:
🎯 Summary
Add configurable collection naming strategy to prevent data conflicts when switching between different embedding providers and models.
💡 Motivation
Currently, when users switch between embedding providers (e.g., from Ollama to LlamaCpp), all providers share the same collection name (`hybrid_code_chunks_<hash>`). This causes data conflicts: embeddings from different providers and models end up in the same collection.
✨ What's New
Core Features
- **Strict Collection Naming Mode** (opt-in via `EMBEDDING_STRICT_COLLECTION_NAMES=true`): collections are named `hybrid_<provider>_<model>_<path_hash>_<unique_hash>`, e.g. `hybrid_ollama_nomic_embed_text_abc12345_def67890`
- **Custom Collection Names** (via `MILVUS_COLLECTION_NAME`)
- **New `getModel()` method** on the `Embedding` class
Backward Compatibility
- The legacy naming scheme (`hybrid_code_chunks_<hash>`) remains the default
🎁 Benefits
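A minimal sketch of how the two naming modes described above could be composed. The helper names, hash algorithm, and sanitization rules here are illustrative assumptions, not the PR's actual implementation; only the resulting formats (`hybrid_code_chunks_<hash>` and `hybrid_<provider>_<model>_<path_hash>_<unique_hash>`) come from the description:

```typescript
import { createHash } from "crypto";

// Hypothetical helper: Milvus collection names are restricted to
// letters, digits, and underscores, so normalize each part.
function sanitize(part: string): string {
  return part.toLowerCase().replace(/[^a-z0-9]+/g, "_");
}

// Hypothetical helper: short, stable hash of an input string.
function shortHash(input: string): string {
  return createHash("md5").update(input).digest("hex").slice(0, 8);
}

function collectionName(
  provider: string,
  model: string,
  codebasePath: string,
  strict: boolean
): string {
  const pathHash = shortHash(codebasePath);
  if (!strict) {
    // Legacy default: one collection shared by every provider/model.
    return `hybrid_code_chunks_${pathHash}`;
  }
  // Strict mode: provider and model are baked into the name, so
  // switching providers can no longer collide with existing data.
  const uniqueHash = shortHash(`${provider}:${model}:${codebasePath}`);
  return `hybrid_${sanitize(provider)}_${sanitize(model)}_${pathHash}_${uniqueHash}`;
}
```

For example, `collectionName("ollama", "nomic-embed-text", "/repo", true)` yields a name of the form `hybrid_ollama_nomic_embed_text_<path_hash>_<unique_hash>`, while passing `strict: false` falls back to the legacy `hybrid_code_chunks_<hash>` scheme.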
For Users
For Project