Issues: aidatatools/ollama-benchmark
#19  ConnectError: [WinError 10049] in check_models.pull_models(models_file_path)
     Label: Investigation (Investigate user's questions)
     Opened Jan 30, 2025 by mattnmeng
#17  Can't detect Intel Arc A770 on Ubuntu 22.04
     Label: Investigation (Investigate user's questions)
     Opened Sep 11, 2024 by Xyz00777
#15  LLM Benchmark crashes on llava:13b when used on an Nvidia GPU
     Label: Investigation (Investigate user's questions)
     Opened Aug 19, 2024 by synchronic1
#11  Adding CPU/GPU distribution to the logs and reports
     Label: enhancement (New feature or request)
     Opened Jun 11, 2024 by dan-and
#8   Running on non-GPU laptops
     Label: Investigation (Investigate user's questions)
     Opened May 23, 2024 by twelsh37