Support for vLLM in 26.04 #1433

@akoumpa

Description

Is your feature request related to a problem? Please describe.
Add vLLM to our Docker container to streamline the train-to-inference workflow.
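A minimal sketch of the container change, assuming vLLM is pip-installable on top of the existing image; the base image name and tag below are placeholders, not the actual 26.04 container:

```dockerfile
# Hypothetical base image; the real 26.04 base is not specified in this issue.
FROM nvcr.io/nvidia/nemo:base

# Install vLLM alongside the training stack (unpinned here for illustration;
# CI would likely pin a tested version).
RUN pip install --no-cache-dir vllm
```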

Describe the solution you'd like
We want to ship vLLM in the Docker container and add tests to our GitHub Actions (GHA) CI.
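One possible shape for the CI hook: a job that runs a vLLM import smoke test inside the container. The workflow name, runner label, image tag, and test command below are all placeholders:

```yaml
name: vllm-smoke-test

on: [pull_request]

jobs:
  vllm-smoke:
    # GPU-capable self-hosted runner assumed; label is a placeholder.
    runs-on: self-hosted
    container:
      image: our-registry/nemo:ci  # placeholder image tag
    steps:
      - uses: actions/checkout@v4
      # Cheapest possible check: vLLM is present and importable in the image.
      - name: vLLM import check
        run: python -c "import vllm; print(vllm.__version__)"
```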


Additional context
I'm adding @thomasdhc for the automation; assignees on the code side will be updated later.

Metadata

Labels: enhancement (New feature or request)
Assignees: none
Projects: none
Milestone: none
Development: no branches or pull requests