Contextual Generate model #17913
Conversation
Resolved review threads (outdated):
llama-index-integrations/llms/llama-index-llms-contextual/tests/test.py
llama-index-integrations/llms/llama-index-llms-contextual/llama_index/llms/contextual/base.py
@logan-markewich can you approve the workflows?
Pinging on this @logan-markewich
Signed-off-by: Sean Smith <[email protected]>
""" | ||
raise NotImplementedError("stream methods not implemented in Contextual") | ||
|
||
def _generate( |
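For context on what `_generate` typically does against an OpenAI-compatible service: it boils down to a chat-completions POST. Below is a minimal stdlib-only sketch of assembling such a request; the endpoint URL, model name, and function name here are illustrative assumptions, not Contextual's documented API.

```python
import json
import urllib.request

def build_generate_request(prompt: str, model: str, api_key: str,
                           base_url: str = "https://api.example.com/v1"):
    """Assemble (but do not send) an OpenAI-style chat-completions request.

    All names and the base_url default are hypothetical placeholders.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_generate_request("hello", model="ctx-model", api_key="sk-test")
print(req.full_url)
print(json.loads(req.data)["model"])
```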
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
This seems fairly OpenAI-compatible -- any reason to not just use the base OpenAI client and change the model args?
Also, what you have here doesn't override any async methods, so those will still call the normal OpenAI client.
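To illustrate the async concern: overriding only the sync methods leaves the async path falling through to the parent class. A minimal stdlib-only demonstration with hypothetical class names (not the actual llama-index API):

```python
import asyncio

class BaseOpenAILLM:
    """Stand-in for the base OpenAI-backed LLM class."""

    def complete(self, prompt: str) -> str:
        return f"openai-sync:{prompt}"

    async def acomplete(self, prompt: str) -> str:
        return f"openai-async:{prompt}"

class ContextualLLM(BaseOpenAILLM):
    """Overrides the sync method only -- acomplete is inherited unchanged."""

    def complete(self, prompt: str) -> str:
        return f"contextual-sync:{prompt}"

llm = ContextualLLM()
sync_result = llm.complete("hi")                  # uses the override
async_result = asyncio.run(llm.acomplete("hi"))   # silently uses the base class
print(sync_result)   # -> contextual-sync:hi
print(async_result)  # -> openai-async:hi
```

Async callers get the base-class behavior with no error raised, which is why the missing async overrides are easy to miss in testing.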
Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Fixes # (issue)
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.
Suggested Checklist:
Ran make format; make lint to appease the lint gods