JamiePringle
This is related to the issue in #691, but with more specifics. Saving a variable-length (ragged) array with zarr.save fails, but creating the file without the convenience function works....
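For context, a minimal sketch of the contrast being described, assuming zarr 2.x and numcodecs; the store names and example data are illustrative:

```python
import numpy as np
import numcodecs
import zarr

# A ragged (variable-length) array stored as a NumPy object array.
ragged = np.empty(3, dtype=object)
ragged[0] = np.array([1.0, 2.0])
ragged[1] = np.array([3.0])
ragged[2] = np.array([4.0, 5.0, 6.0])

# The convenience function gives no way to pass an object_codec,
# so this fails for object-dtype data:
# zarr.save('ragged_convenience.zarr', ragged)

# Creating the dataset explicitly and supplying the codec works:
root = zarr.open_group('ragged_explicit.zarr', mode='w')
root.create_dataset(
    'ragged',
    data=ragged,
    dtype=object,
    object_codec=numcodecs.VLenArray('<f8'),  # variable-length float64 vectors
)
print(root['ragged'][:])
```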
I am writing Python code to convert the temporary *.npy output files to netCDF with minimal memory overhead, so that the conversion succeeds even if the output file is...
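A sketch of the kind of low-memory conversion being described, assuming 2-D floating-point output; the file names, variable name, and block size are placeholders:

```python
import numpy as np
from netCDF4 import Dataset

npy_path = 'trajectory_chunk.npy'   # hypothetical temporary output file
nc_path = 'trajectory_chunk.nc'

# Memory-map the .npy file so only the slices being copied are read into RAM.
data = np.load(npy_path, mmap_mode='r')  # shape assumed (n_particles, n_obs)

with Dataset(nc_path, 'w', format='NETCDF4') as nc:
    nc.createDimension('trajectory', data.shape[0])
    nc.createDimension('obs', data.shape[1])
    var = nc.createVariable('lon', data.dtype, ('trajectory', 'obs'), zlib=True)

    # Copy in modest blocks of trajectories to keep peak memory small.
    block = 100_000
    for start in range(0, data.shape[0], block):
        stop = min(start + block, data.shape[0])
        var[start:stop, :] = data[start:stop, :]
```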
OceanParcels continues to be very useful; thanks! I am now running global problems with Mercator currents tracking coastal dispersal. I run on a large computer in parallel; this data is...
On netCDF 4.8.1, when I try to use ncdump to print a zarr file with a variable-length (jagged) array, it crashes with a segmentation fault. On 4.9.0 on a...
This is my first attempt at reading zarr from netCDF, and I have found this simple, reproducible error on both my Apple Silicon and Ubuntu 20.04 machines. It exists in both...
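For reference, a sketch of the kind of round trip being attempted, assuming the NCZarr URL syntax of netCDF-C 4.8+ (`#mode=zarr,file` for a plain zarr directory store); the store path and variable name are illustrative:

```python
import os
import numpy as np
import zarr
from netCDF4 import Dataset

# Write a small plain-zarr directory store with the zarr-python library.
root = zarr.open_group('example_store.zarr', mode='w')
root.create_dataset('lon', data=np.arange(4.0))

# Try to read the same store back through the netCDF library.
url = 'file://' + os.path.abspath('example_store.zarr') + '#mode=zarr,file'
with Dataset(url, 'r') as nc:
    print(nc.variables)
```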
### Zarr version
2.8.1
### Numcodecs version
0.9.1
### Python Version
3.9
### Operating System
Linux
### Installation
conda
### Description
If I make a Numpy object array, and save...
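This is presumably the same missing-codec problem: ragged object arrays need an explicit variable-length codec from numcodecs, which the convenience save path does not supply. A minimal sketch of that codec in isolation (the data are illustrative):

```python
import numpy as np
from numcodecs import VLenArray

# The codec that has to be supplied by hand for ragged float data.
codec = VLenArray('<f8')

ragged = np.empty(2, dtype=object)
ragged[0] = np.array([1.0, 2.0, 3.0])
ragged[1] = np.array([4.0])

encoded = codec.encode(ragged)   # encodes to a flat byte buffer
decoded = codec.decode(encoded)  # back to an object array of float64 vectors
print([list(v) for v in decoded])
```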
I am making some large runs, with a total of 664,798,647 particles, of which roughly 1/7th are active at any one time. (Curious why? Check out https://github.com/JamiePringle/EZfate ). Every 12...
Dear Zarr community -- I am a physical oceanographer doing fairly large data work in an interdisciplinary setting. Many of the folks with whom I want to share my data...
In discussion #1963, I have posted a test case for a large, parallel model run. As part of this, I have been experimenting with various partitioning functions for...
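As one concrete example of the kind of partitioning function being experimented with, here is a sketch that bins particles by longitude. It assumes the interface of Parcels' default partition function (an (N, 2) array of (lon, lat) start positions in, a length-N array of integer MPI ranks out); the function name is illustrative:

```python
import numpy as np

def partition_by_longitude(coords, mpi_size=1):
    """Assign each particle to an MPI rank by binning its starting longitude.

    Assumes ``coords`` is an (N, 2) array of (lon, lat) start positions and
    the return value is a length-N array of integer ranks in [0, mpi_size).
    """
    lon = np.asarray(coords)[:, 0]
    edges = np.linspace(lon.min(), lon.max(), mpi_size + 1)
    ranks = np.clip(np.digitize(lon, edges) - 1, 0, mpi_size - 1)
    return ranks

# Hypothetical usage, passed when constructing the ParticleSet:
# pset = parcels.ParticleSet(fieldset, pclass, lon=lon, lat=lat,
#                            partition_function=partition_by_longitude)
```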
### Parcels version
3.1.2
### Description
I am building a test case for very large runs, and have found an error in the implementation of `partition_function`. Since a better partition...