[Feature Request] GPU Memory/Utilization #1194
Replies: 6 comments 1 reply
-
This would be an awesome addition, but I worry about the extra baggage it'd attach to the image for users not interested or without GPUs. I'll have to do some investigation to see how large the nvidia-smi package would be, and if it can be added to an Alpine Linux image.
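For what it's worth, when the host has the NVIDIA Container Toolkit installed, the runtime injects nvidia-smi and the driver libraries into the container at start-up, so nothing necessarily has to be baked into the image itself. A minimal sketch of what the compose side could look like (service name and image tag are illustrative only); note the injected binaries are glibc-linked, so they may not run out of the box on a musl-based Alpine image:

```yaml
# Sketch: request the host GPU for the homepage container via the NVIDIA
# Container Toolkit. The runtime mounts nvidia-smi and the driver libraries
# into the container, so the image does not need to ship them.
# Assumptions: the host has nvidia-container-toolkit installed; the service
# name and image tag below are illustrative only.
services:
  homepage:
    image: ghcr.io/gethomepage/homepage:latest
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```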
-
This addition would be great. I think you could make another image with nvidia-smi and keep the original without it; that way each user can select which image to use. But in the end it is up to you, @benphelps, and whichever option is easier for you to maintain for this great Docker application. Thank you for your hard work.
-
I would like the Intel gpu top as well.
-
This discussion has been automatically closed due to lack of community interest.
-
I'm currently working on a small Docker image that exposes the nvidia-smi info as JSON: https://github.com/Opa-/nvidia-smi-rest

```yaml
version: '3'
services:
  nvidia-smi-rest:
    image: opaopa/nvidia-smi-rest:latest
    container_name: nvidia-smi-rest
    ports:
      - 7777:7777
    deploy:
      resources:
        reservations:
          devices:
            - driver: "nvidia"
              count: 1
              capabilities: [gpu]
    labels:
      - homepage.group=fsociety
      - homepage.name=Nvidia
      - homepage.weight=7
      - homepage.icon=/icons/nvidia.svg
```

I think this would be a preferable approach rather than having to include nvidia-smi in the gethomepage container. All that's left to do is decide what the widget would expose and code it (if this discussion gets enough upvotes :) )
-
This discussion has been automatically locked since there has not been any recent activity after it was closed. Please open a new discussion for related concerns. See our contributing guidelines for more details.
-
Would like to have a widget that displays local GPU resource utilization, similar to the output of the nvidia-smi command.
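For context, homepage's existing resources widget is configured in widgets.yaml as shown below; this request is essentially asking for a GPU field next to cpu and memory. The gpu option in the sketch is hypothetical and does not exist here:

```yaml
# widgets.yaml -- the existing resources widget, plus a hypothetical `gpu`
# option illustrating the request (not a real option in this version).
- resources:
    cpu: true
    memory: true
    disk: /
    gpu: true   # hypothetical: would surface utilization / VRAM like nvidia-smi
```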