A dedicated repo for monoidal construction and reduction
Recent activity by @Anton-Latukha prompted me to update the "acc" package and its benchmark suite, which focuses on monoidal construction and reduction using various data structures. See https://github.com/nikita-volkov/acc#benchmark-results
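For readers unfamiliar with the term, here is a minimal base-only sketch of what "monoidal construction and reduction" means: building a structure purely through `(<>)`/`mconcat` and then collapsing it through `Foldable`. The point of "acc" is that its `Acc` type serves this same `Monoid`/`Foldable` interface with cheap concatenation; the list used below is just a stand-in for illustration.

```haskell
import Data.Monoid (Sum (..))

-- Pieces to be assembled. With "acc" these would be Acc values,
-- whose (<>) avoids the repeated copying that list (++) incurs.
chunks :: [[Int]]
chunks = [[1, 2], [3], [4, 5, 6]]

-- Monoidal construction: combine the pieces with the Monoid instance.
constructed :: [Int]
constructed = mconcat chunks

-- Monoidal reduction: map each element into a monoid and fold it away.
total :: Int
total = getSum (foldMap Sum constructed)

main :: IO ()
main = print (constructed, total)  -- ([1,2,3,4,5,6],21)
```

The benchmark suite compares exactly this construct-then-reduce pattern across candidate data structures.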
I believe a better place for such a suite is in this org. I can create the repo and link to it from "sequences" if everyone's on board.
Oh, and BTW, it would be super-duper helpful if someone could audit the suite! :) I don't want to mess up by making biased claims.
May I ask a stupid question: how does "sequences" compare to this proposed repository for monoidal construction & reduction, given that sequences are monoidal structures too?
Is there something deeply specific to the term "monoidal construction & reduction" that does not relate to lazy/strict modes of operation?
Even if the repo were a patchwork monster running several benchmarking backends, or a monorepo hosting several projects until/if they unify, it would still allow performance data to be produced, delivered & managed in a uniform way.
Because I think for stupid Haskellers like me it would be useful to see real numbers on both the most atomic & the most integrative tests: lazy & strict construction & operation costs, normalization costs & conversion costs, and construction into normalization into full consumption, to benchmark the classic parser case. It's one thing to read books on this; another to learn how to poke laziness & strictness & measure performance with tools; another to constantly keep poking things; and yet another to mostly spare everyone & the ecosystem the hassle of doing all that by showing factual recommendations that get regular updates.
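To give one base-only example of the kind of "atomic" lazy-vs-strict distinction I mean (just a sketch; a real suite would measure these with a benchmarking harness):

```haskell
import Data.List (foldl')

-- Lazy left fold: builds a chain of (+) thunks and only collapses
-- them when the result is demanded, costing extra memory.
lazySum :: [Int] -> Int
lazySum = foldl (+) 0

-- Strict left fold: forces the accumulator at every step, so the
-- reduction runs in constant space.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0

main :: IO ()
main = print (lazySum [1 .. 1000], strictSum [1 .. 1000])  -- (500500,500500)
```

Both compute the same number, but their space behavior differs, and that difference is exactly what uniform benchmarks should surface per data type.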
One of the main things in public projects is the network effect: once something central like HLS becomes useful, it is hard to stop its progress. I'm just kinda "scared" of the benchmark effort being scattered across a number of projects, with none of them gathering a team of experts & maintainers around it or becoming well-known & referenced enough to reach the network-effect boundary. Especially since Haskell is complex in data-type performance characteristics, which multiplies with Haskellers being uninformed about how to choose data types.