JATothrim
I cloned and did a test run of `./polycube_generator 13 -o cubes13.dat`, and it completed in 44 seconds. For comparison: the Rust `cargo run -r enumerate 13` is still going after 4 min and...
@snowmanam2 The most interesting thing is that this is an embarrassingly parallel solution. I have an 8-core machine with 16 threads, and this scaled to ~1200% CPU usage...
I pulled today and looked at the code. I noticed the progress bar is still gone? I hoped it would make a comeback. :wink: `polycube_generator 14 -i cubes13.dat -o...
I tested using zlib deflate settings `level=2` and `memLevel=9`. This increased the N=14 output to 5.2 GB: ``` Starting file reader in PCube mode with compression. Starting file writer in...
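For anyone who wants to experiment with these knobs outside the generator: the `level`/`memLevel` pair maps onto zlib's `deflateInit2` in C, and Python's `zlib.compressobj` exposes the same two parameters. A minimal sketch, with a made-up sample payload standing in for the real PCube stream:

```python
import zlib

# Stand-in data; the real input would be the packed polycube stream.
payload = bytes(range(256)) * 4096

def deflate(data, level, mem_level):
    """Compress with explicit zlib level/memLevel settings."""
    co = zlib.compressobj(level=level, memLevel=mem_level)
    return co.compress(data) + co.flush()

fast = deflate(payload, level=2, mem_level=9)  # the settings tested above
best = deflate(payload, level=9, mem_level=9)  # max compression, for comparison

print(len(payload), len(fast), len(best))
```

`level` trades ratio for speed; `memLevel` trades internal state memory for ratio/speed, so `level=2, memLevel=9` is the "fast but memory-hungry" corner.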
This thread is getting rather long. I don't see a reason to continue this discussion here, since the code doesn't actually live here. I made https://github.com/snowmanam2/SnowmanPolycubeGenerator/issues/1#issue-1888583765 to continue this discussion...
The C++ `cubes` currently outputs its data *only* in the following format as cache files: ``` struct Header { uint32_t magic = MAGIC; // should be "PCUB" = 0x42554350 uint32_t N; //...
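Since the header snippet is truncated after `N`, here is a minimal sketch that only reads the two fields shown: the `"PCUB"` magic (which is `0x42554350` when the four ASCII bytes are viewed as a little-endian `uint32_t`) and `N`. Anything past those two fields is unknown here, so the sketch stops there:

```python
import struct

MAGIC = 0x42554350  # little-endian uint32 view of the ASCII bytes "PCUB"

def read_header_prefix(blob: bytes) -> int:
    """Unpack the first two header fields from the snippet: magic and N."""
    magic, n = struct.unpack_from("<II", blob, 0)
    if magic != MAGIC:
        raise ValueError("not a PCube cache file")
    return n

# Sanity check: the magic constant really is the bytes "PCUB" in little-endian order.
assert struct.pack("<I", MAGIC) == b"PCUB"

print(read_header_prefix(b"PCUB" + struct.pack("<I", 13)))  # → 13
```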
Just seeing what the cubes output looks like made my day. :+1: You should publish this as a repository so that the main repository could track it as a git submodule.
> This is just the union of all the cubes in the base polycube shifted +/-1 along each axis? > Is this the "expansion list"? I'm answering quite late and...
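As I read the quoted description, the candidate set is every lattice cell reachable by a ±1 shift along one axis from some cell of the base polycube, minus the cells already occupied. A small sketch of that interpretation (the function name is mine, not from the generator):

```python
def expansion_list(polycube):
    """All empty cells face-adjacent to the polycube: the +/-1 shifts along each axis."""
    cells = set(polycube)
    candidates = set()
    for (x, y, z) in cells:
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            candidates.add((x + dx, y + dy, z + dz))
    return candidates - cells  # keep only cells not already occupied

# A straight domino (two cubes along x) has 10 face-adjacent empty cells.
domino = [(0, 0, 0), (1, 0, 0)]
print(len(expansion_list(domino)))  # → 10
```

Each cell in this set, when added to the base polycube, yields one candidate (N+1)-cube, which is why the set is useful as an expansion list.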
This is more a thought experiment than a proposal at this point... I made the original post after I tried to make a version of Hashy that de-duplicates a portion of Cube...
> I think we could also look at how much overhead the unordered set adds. Since we are only adding elements to the set and never removing any (except clear),...
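To put a rough number on "how much overhead the unordered set adds", one way is to compare the container's own footprint against the raw payload it deduplicates. This Python sketch is only an analogue of the C++ `unordered_set` (the constants differ between runtimes), but the shape of the measurement carries over:

```python
import sys

# Store many 8-byte keys, the way a dedup set of packed polycubes might.
keys = [i.to_bytes(8, "little") for i in range(100_000)]
dedup = set(keys)

raw_bytes = 8 * len(keys)            # payload size of the keys themselves
table_bytes = sys.getsizeof(dedup)   # the hash table (bucket array), not the elements
per_key = table_bytes / len(keys)    # table bytes per stored key

print(f"{table_bytes} table bytes, ~{per_key:.1f} bytes/key on top of {raw_bytes} payload bytes")
```

Note this counts only the bucket array; each stored key object carries its own allocation on top of that, so the true overhead is higher. The add-only (plus `clear`) usage pattern mentioned above is what makes simpler structures, such as an open-addressed table without deletion support, worth considering.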