
Blog post on LLM deployment on Imperial HPC #51

Open
jamesturner246 opened this issue Aug 23, 2024 · 0 comments
Assignees
Labels
post suggestion: for a post that could be published in the future and will need discussion

Comments


jamesturner246 commented Aug 23, 2024

In one code surgery, there was a question about deploying an LLM for inference on HPC, and the process is not trivial.

A technical blog post describing the process would be welcome.

jamesturner246 self-assigned this Aug 23, 2024
jamesturner246 added the post suggestion label Aug 23, 2024
Projects
None yet
Development

No branches or pull requests

1 participant