Your current environment
On Top of Tree:
```text
(vllm) [robertgshaw2-redhat@nma-h200-isolated-0-preserve vllm]$ VLLM_USE_PRECOMPILED=1 uv pip install -e .
 Updated https://github.com/triton-lang/triton.git (c3c476f357f1e9768ea4e45aa5c17528449ab9ef)
  × No solution found when resolving dependencies:
  ╰─▶ Because there is no version of apache-tvm-ffi==0.1.0b15 and flashinfer-python==0.4.1 depends on
      apache-tvm-ffi==0.1.0b15, we can conclude that flashinfer-python==0.4.1 cannot be used.
      And because vllm==0.11.1rc3.dev57+g6454afec9.precompiled depends on flashinfer-python==0.4.1, we can
      conclude that vllm==0.11.1rc3.dev57+g6454afec9.precompiled cannot be used.
      And because only vllm==0.11.1rc3.dev57+g6454afec9.precompiled is available and you require vllm, we can
      conclude that your requirements are unsatisfiable.

      hint: `apache-tvm-ffi` was requested with a pre-release marker (e.g., apache-tvm-ffi==0.1.0b15), but
      pre-releases weren't enabled (try: `--prerelease=allow`)
```
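A minimal workaround sketch, following the resolver's own hint: enable pre-release resolution so uv can select the beta `apache-tvm-ffi` build that `flashinfer-python==0.4.1` pins. This is only the flag suggested in the hint above, not a confirmed fix for vLLM's requirement pins:

```bash
# Workaround sketch based on the resolver hint: allow pre-releases so
# apache-tvm-ffi==0.1.0b15 (a beta release) can be resolved for flashinfer-python==0.4.1.
VLLM_USE_PRECOMPILED=1 uv pip install -e . --prerelease=allow
```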
How you are installing vllm

No response
Before submitting a new issue...
- Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.