no-jira: add remote offline batch inference with vllm example #848
base: main
Conversation
@kryanbeane: This pull request explicitly references no jira issue.
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the openshift-eng/jira-lifecycle-plugin repository.
[APPROVALNOTIFIER] This PR is NOT APPROVED. This pull-request has been approved by: The full list of commands accepted by this bot can be found here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment.
Codecov Report: All modified and coverable lines are covered by tests ✅
Additional details and impacted files:
@@            Coverage Diff             @@
##             main     #848      +/-   ##
==========================================
- Coverage   92.55%   92.47%   -0.08%
==========================================
  Files          24       24
  Lines        1410     1396      -14
==========================================
- Hits         1305     1291      -14
  Misses        105      105
☔ View full report in Codecov by Sentry.
demo-notebooks/additional-demos/batch-inference/remote_offline_bi.ipynb
Force-pushed from 51f189e to 1390ee8.
/hold
What changes have been made
Added an example for remote offline batch inference using Ray Data and vLLM; a minimal sketch of the pattern is shown below.
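For orientation, here is a minimal sketch of the Ray Data + vLLM batch-inference pattern the notebook demonstrates. It is not the notebook's exact code: the model name, prompts, and resource settings are illustrative assumptions, and the cluster connection (e.g. one created with the CodeFlare SDK) is implied via the standard `ray.init()` entry point.

```python
# Sketch only: offline batch inference by mapping a vLLM engine over a
# Ray Dataset. Model, prompts, and resource numbers are assumptions.
import ray
from vllm import LLM, SamplingParams


class VLLMPredictor:
    def __init__(self):
        # One vLLM engine per Ray actor; the model choice is hypothetical.
        self.llm = LLM(model="facebook/opt-125m")
        self.sampling_params = SamplingParams(temperature=0.8, max_tokens=64)

    def __call__(self, batch):
        # batch arrives as a dict of numpy arrays; generate one completion
        # per prompt and attach the results to the batch.
        outputs = self.llm.generate(
            [str(p) for p in batch["prompt"]], self.sampling_params
        )
        batch["generated_text"] = [o.outputs[0].text for o in outputs]
        return batch


# Connects to a remote Ray cluster when RAY_ADDRESS is set (for example a
# cluster brought up remotely); otherwise starts a local one.
ray.init()

ds = ray.data.from_items(
    [
        {"prompt": "What is batch inference?"},
        {"prompt": "Explain Ray Data in one sentence."},
    ]
)

# Run the predictor as a pool of GPU-backed actors over the dataset.
results = ds.map_batches(
    VLLMPredictor,
    concurrency=1,  # number of vLLM actors in the pool
    num_gpus=1,     # GPUs reserved per actor
    batch_size=32,
)
results.show()
```

The actor-class form of `map_batches` is what makes this efficient: the vLLM engine is loaded once per actor in `__init__` and reused across batches, rather than being re-created for every batch.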