Error with too many particles in the simulation
I'm using PyBroMo to obtain simulated photon timestamps at concentrations up to a few nanomolar. When the number of particles inside the simulation box is greater than about 150, I get the following error when running the simulation:
File "C:\aroot\work\hdf5-1.8.15-patch1\src\H5A.c", line 259, in H5Acreate2
unable to create attribute
File "C:\aroot\work\hdf5-1.8.15-patch1\src\H5Aint.c", line 275, in H5A_create
unable to create attribute in object header
File "C:\aroot\work\hdf5-1.8.15-patch1\src\H5Oattribute.c", line 347, in H5O_attr_create
unable to create new attribute in header
File "C:\aroot\work\hdf5-1.8.15-patch1\src\H5Omessage.c", line 224, in H5O_msg_append_real
unable to create new message
File "C:\aroot\work\hdf5-1.8.15-patch1\src\H5Omessage.c", line 1945, in H5O_msg_alloc
unable to allocate space for message
File "C:\aroot\work\hdf5-1.8.15-patch1\src\H5Oalloc.c", line 1142, in H5O_alloc
object header message is too large
End of HDF5 error back trace
Can't set attribute 'particles' in node:
/parameters (Group) 'Simulation parameters'.
@lampo808, thanks for the report!
Can you provide a minimal script/notebook to reproduce the error? Also, information about your operating system and the versions of Python, NumPy, PyTables, and PyBroMo is typically useful. Thanks!
Yes, sorry for not having done that already.
I'm attaching a zipped sample notebook that I wrote to test the simulation. With the current parameters it reproduces the error.
As for the versions, I'm running a Win10 machine with the following Python and library versions:
Python 3.5.1 :: Anaconda 4.0.0 (64-bit)
NumPy version: 1.10.4
PyTables version: 3.2.2
PyBroMo version: 0.6+16.g8bcb267
@lampo808, thanks. I'm currently travelling but I will investigate this ASAP.
This is an issue with the serialization of the Particles object into an HDF5 attribute (done automagically by PyTables). For some reason it works for a small number of particles but fails with a larger one.
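As a rough illustration of the mechanism (not PyBroMo's actual data layout): PyTables pickles a Python object stored as an attribute, and HDF5 keeps attributes in the object header, whose messages are limited to 64 KB, so the serialized payload can outgrow the header as the particle count increases. A minimal sketch, assuming a hypothetical Particle namedtuple (the real Particles object carries additional data, so its pickled size is larger than this suggests):

```python
import pickle
from collections import namedtuple

# Hypothetical stand-in for per-particle data; field names are assumptions.
Particle = namedtuple('Particle', 'D x0 y0 z0')

# Maximum size of an HDF5 object-header message (hence of a compact attribute).
HDF5_HEADER_MSG_LIMIT = 64 * 1024

def pickled_size(num_particles):
    """Size in bytes of the pickled particle list."""
    particles = [Particle(1.2e-11, i * 1e-9, 0.0, 0.0)
                 for i in range(num_particles)]
    return len(pickle.dumps(particles))

for n in (50, 150, 500):
    size = pickled_size(n)
    status = 'over' if size > HDF5_HEADER_MSG_LIMIT else 'under'
    print(f'{n:4d} particles -> {size:6d} bytes ({status} the 64 KB limit)')
```

The pickled size grows linearly with the particle count, which is consistent with the error appearing only above some particle-number threshold.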
I think the solution would be teaching Particles to dump/load itself to/from JSON; the JSON can then be saved as a plain-text attribute. HDF5 places limits on attribute size, so it may be even better to move this data to an HDF5 node instead of an attribute.
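A minimal sketch of the JSON idea, using a hypothetical Particle/Particles layout (the field names and class shape are assumptions, not PyBroMo's actual API):

```python
import json
from collections import namedtuple

# Hypothetical particle record; field names are assumptions.
Particle = namedtuple('Particle', 'D x0 y0 z0')

class Particles:
    """Toy container sketching JSON dump/load for HDF5 storage."""

    def __init__(self, particles):
        self.particles = list(particles)

    def to_json(self):
        # Plain text: storable as an HDF5 string attribute or, to
        # sidestep attribute-size limits, in a dedicated HDF5 node.
        return json.dumps([p._asdict() for p in self.particles])

    @classmethod
    def from_json(cls, text):
        return cls(Particle(**d) for d in json.loads(text))

ps = Particles(Particle(1.2e-11, 0.0, 0.0, i * 1e-9) for i in range(200))
roundtrip = Particles.from_json(ps.to_json())
assert roundtrip.particles == ps.particles
```

The round trip is lossless for plain numeric fields, and the resulting text has no pickling or class-version dependency, which also makes the stored parameters readable from other tools.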
@lampo808, I pushed a quick fix to master for this issue. It seems to work but there may be some lurking regression since I now save less Particles' data to HDF5 (the additional data was redundant, but maybe some script or notebook is relying on it).
For the next major version, the Particles object needs to be simplified to store (as an attribute) only the list of particles. The generation of random positions (Particles._generate()) can become a public classmethod that returns a Particles object. The simulation box and random state can be handled by ParticlesSimulation, as they are now. This will decouple the Box and Particles objects, which simplifies the logic quite a bit.
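The refactor proposed above might look roughly like this (a sketch with invented names and signatures, not the actual PyBroMo code):

```python
import random

class Particles:
    """Toy sketch: stores only the particle positions and D."""

    def __init__(self, positions, D):
        self.positions = positions  # list of (x, y, z) tuples
        self.D = D                  # diffusion coefficient

    @classmethod
    def generate(cls, num_particles, box_limits, D, rng):
        # Box limits and the RNG are passed in by the caller (e.g. the
        # simulation object), so Particles no longer depends on Box.
        low, high = box_limits
        positions = [tuple(rng.uniform(low, high) for _ in range(3))
                     for _ in range(num_particles)]
        return cls(positions, D)

rng = random.Random(1)
particles = Particles.generate(150, (-4e-6, 4e-6), D=1.2e-11, rng=rng)
assert len(particles.positions) == 150
```

With this shape, the classmethod is the only place randomness enters Particles, and the returned object holds exactly the data that needs to be persisted.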