
Performance benchmarking using airspeed velocity

Open galenseilis opened this issue 10 months ago • 3 comments

It may be fair to say that if one is writing discrete event simulations in pure Python, performance is not their 'top' priority. Python's dynamic typing and garbage collection preclude it from being in the top tier of performance among DES tools.

But I think performance still matters, and tracking and benchmarking can provide some observability into performance issues.

I've been looking into airspeed velocity. It supports running benchmarks across commits so you can see how things have improved or worsened. It is kind of like writing unit tests, except that they measure run times and memory usage.

  • https://asv.readthedocs.io/en/latest/

I suggest trying this out with Ciw.
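
To give a rough idea of what this looks like, here is a minimal sketch of an asv benchmark file for Ciw (the file path, network parameters and run length are purely illustrative; asv picks up `time_*` methods as timing benchmarks and `peakmem_*` methods as peak-memory benchmarks):

```python
# benchmarks/benchmarks.py -- hypothetical asv benchmark suite for Ciw
import ciw


class TimeMM1Simulation:
    """Benchmark a small M/M/1 queue simulated with Ciw."""

    def setup(self):
        # Single-server network with exponential arrivals and services
        # (rates chosen purely for illustration).
        self.network = ciw.create_network(
            arrival_distributions=[ciw.dists.Exponential(rate=1.0)],
            service_distributions=[ciw.dists.Exponential(rate=2.0)],
            number_of_servers=[1],
        )

    def time_simulate_until_max_time(self):
        # asv times this method and tracks the result across commits.
        ciw.seed(0)
        Q = ciw.Simulation(self.network)
        Q.simulate_until_max_time(200)

    def peakmem_simulate_until_max_time(self):
        # asv records the peak memory used while running this method.
        ciw.seed(0)
        Q = ciw.Simulation(self.network)
        Q.simulate_until_max_time(200)
```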

galenseilis avatar Apr 09 '25 04:04 galenseilis

OOOO I hadn't heard of asv before. I'm using pytest-benchmark on another project.

I have a question about asv:

Looking briefly at the documentation for asv, it looks like writing the benchmarks is a bit more work than for pytest-benchmark (which integrates nicely with pytest), but you mention "running benchmarks across commits", which is something that pytest-benchmark does not support. Could you point me at the docs for that specific feature (I had a lazy look but couldn't find it), and do you happen to have any more insight into comparing pytest-benchmark with asv?

While I hope my question is helpful for Ciw, I'm also somewhat shamefully asking for my other projects :)

drvinceknight avatar Apr 09 '25 07:04 drvinceknight

Nice comparison to pytest-benchmark. The main difference is that asv is about tracking a Python project's performance benchmarks across git commits. If someone doesn't want that, then asv is not a good choice, as that is primarily what it does.

Since the package is primarily targeted at this kind of performance tracking, you don't need to do anything beyond its setup. asv is configurable, but AFAIK none of the configuration turns off the per-commit tracking; it is the core feature of the package. That is also why there isn't one specific place in the docs for it.
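
Roughly, the workflow is something like the following sketch (commands from memory; the tag, branch names and commit range are only illustrative):

```
asv quickstart              # generate asv.conf.json and a benchmarks/ directory
asv run v1.0..master        # benchmark commits in the given git range
asv publish                 # build the static HTML report from the results
asv preview                 # serve the report locally to browse the history
asv continuous master HEAD  # compare two commits and flag regressions
```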

In the examples I have seen, people include asv as part of their continuous integration procedures.

galenseilis avatar Apr 09 '25 13:04 galenseilis

Cool. This sounds nice.

drvinceknight avatar Apr 11 '25 09:04 drvinceknight