Performance benchmarking using airspeed velocity #277
Comments
OOOO I hadn't heard of that. I have a question about it. Looking at the documentation, I hope my question is helpful.
Nice comparison. Since the package is primarily targeted at this kind of performance tracking, you don't need to do anything beyond its setup. In the examples I have seen, people include the benchmarks alongside the package.
Cool. This sounds nice.
It may be fair to say that anyone writing discrete event simulations in pure Python does not have performance as their top priority: Python's dynamic typing and garbage collection preclude the top tier of performance among DES tools.
But performance still matters, and tracking and benchmarking add observability to performance issues.
I've been looking into airspeed velocity (asv). It runs benchmarks across commits, so you can see where performance has improved or regressed over the project's history. Writing benchmarks feels much like writing unit tests, except that they measure run time and memory usage rather than correctness.
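For context, asv is driven by a JSON config file at the repository root. A rough sketch of what that might look like for this project (key names are from asv's documentation; the values here are just illustrative placeholders, not actual Ciw settings):

```json
{
    "version": 1,
    "project": "Ciw",
    "repo": ".",
    "branches": ["master"],
    "environment_type": "virtualenv",
    "benchmark_dir": "benchmarks"
}
```

With that in place, `asv run` builds the project at each selected commit in an isolated environment and records results, and `asv publish` renders them as an HTML dashboard.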
I suggest trying this out with Ciw.
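To make the suggestion concrete, here is a sketch of what a benchmark module in a `benchmarks/` directory might look like. asv discovers benchmarks by name prefix: `time_*` methods are timed, `peakmem_*` methods track peak memory, and `setup` runs untimed before each measurement. I've used a pure-Python M/M/1 waiting-time loop as a stand-in workload; a real suite would instead build a Ciw network and run its simulation (the exact Ciw calls would need checking against its docs, so I've kept them out of the runnable sketch):

```python
# Sketch of an asv benchmark module, e.g. benchmarks/benchmarks.py.
# asv conventions: time_* methods are wall-clock timed, peakmem_*
# methods record peak memory, and setup() runs untimed beforehand.
import random


class MM1Suite:
    """Benchmark a toy single-server (M/M/1) queue simulation.

    A real Ciw benchmark would construct a network and simulate it
    here instead of running this stand-in loop.
    """

    def setup(self):
        # Fixed seed so every commit benchmarks the same workload.
        self.rng = random.Random(0)
        self.n_customers = 10_000

    def _simulate(self):
        # Lindley recursion: W_{n+1} = max(0, W_n + S_n - A_{n+1}).
        wait = 0.0
        total_wait = 0.0
        for _ in range(self.n_customers):
            interarrival = self.rng.expovariate(1.0)  # arrival rate 1
            service = self.rng.expovariate(2.0)       # service rate 2
            wait = max(0.0, wait + service - interarrival)
            total_wait += wait
        return total_wait / self.n_customers

    def time_mm1_waits(self):
        # asv measures how long this call takes.
        self._simulate()

    def peakmem_mm1_waits(self):
        # asv records peak memory usage during this call.
        self._simulate()
```

Because the workload is seeded, run-to-run variation comes from the machine rather than the simulation, which makes regressions across commits easier to spot.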