Reduce openvino InferenceNumThreads to 1 #1824

Open
wants to merge 1 commit into base: main

Conversation

@ThagonDuarte (Contributor) commented Apr 3, 2025

Why? What?

Most of the VisionTop cycles that took too long were apparently caused by ObjectDetectionTop maxing out the CPU and creating contention. This PR adds a property to the openvino core, InferenceNumThreads, which suggests that the engine use only 1 thread during inference. This does not completely eliminate the DONKs, but it reduces them dramatically. However, it also increases the inference time of the pose detection model to over 450 ms. Limiting inference to 2 threads instead did not significantly increase the inference time (~230 ms -> ~255 ms), but it also did not reduce the DONKing nearly as much.

Fixes #1822
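
For context, a minimal sketch of what capping the thread count looks like with the OpenVINO C++ API (ov::inference_num_threads corresponds to the InferenceNumThreads property); the model path and the standalone main() are illustrative assumptions, not the actual integration in this repository:

```cpp
#include <memory>
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;

    // Suggest that the CPU plugin use only a single inference thread,
    // trading higher inference latency for less contention with other cyclers.
    core.set_property("CPU", ov::inference_num_threads(1));

    // "model.xml" is a placeholder, not a file shipped with this PR.
    std::shared_ptr<ov::Model> model = core.read_model("model.xml");
    ov::CompiledModel compiled = core.compile_model(model, "CPU");

    ov::InferRequest request = compiled.create_infer_request();
    // ... fill input tensors and call request.infer() as usual ...
}
```

The property can also be passed directly when compiling, e.g. core.compile_model(model, "CPU", ov::inference_num_threads(2)) to try the 2-thread variant mentioned above.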

ToDo / Known Issues

Ideas for Next Iterations (Not This PR)


How to Test

@github-project-automation github-project-automation bot moved this to Request for Review in Development Apr 3, 2025
@pejotejo pejotejo moved this from Request for Review to In Progress in Development Apr 9, 2025
Projects: Development, Status: In Progress

Successfully merging this pull request may close these issues:
Vision and Control cycles regularly take too long