Hi, how does Orpheus interact with the project in order to synthesize LLM responses and send them back in real time over the websocket?
I'm not very technical, so I've been having a hard time understanding how Orpheus ties into the other components of the project, and how I would host it on RunPod or another service, since my GPU is too weak.
Is it through an API request?
As far as I can see, the current setup deploys Orpheus in LM Studio and uses its API to call Orpheus and receive the audio tokens in real time, but I can't deploy LM Studio on RunPod since it doesn't have a GUI.
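To illustrate the flow I'm describing: LM Studio exposes an OpenAI-compatible HTTP API, and (as far as I understand) the project streams token chunks from it as server-sent events. Below is a minimal sketch of what building such a streaming request and parsing one streamed line might look like. The base URL, port, and model name are assumptions on my part, not something confirmed by the project; headless OpenAI-compatible servers expose the same style of endpoint, which is why I'm asking whether swapping in one of those is the intended path.

```python
import json

# Assumption: LM Studio's local server default. A headless OpenAI-compatible
# server hosted on RunPod would expose the same style of endpoint.
BASE_URL = "http://localhost:1234/v1"

def build_stream_request(prompt: str, model: str = "orpheus-3b") -> dict:
    """Build the JSON body for a streaming completion request.
    The model name here is a placeholder -- use whatever name the server reports."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": True,    # ask the server to send tokens as they are generated
        "max_tokens": 1200,
    }

def parse_sse_line(line: str):
    """Extract the token text from one server-sent-event line, or None
    for non-data lines and the [DONE] end-of-stream sentinel."""
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    chunk = json.loads(payload)
    return chunk["choices"][0]["text"]
```

Each parsed chunk would presumably be the audio-token text that the project decodes into audio and forwards over the websocket, but I'd appreciate confirmation of that.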