Giridhar Pemmasani
You can use a class method as a computation. However, you need to send the object (`self` in the above case) as the first argument to the computation (the remote Python interpreter doesn't know about objects at...
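The idea above can be sketched in plain Python (the `Model` class and its names are made up for illustration; with dispy you would similarly pass the instance as the first `submit()` argument): on the remote side the computation arrives as a plain function, so the instance must be supplied explicitly.

```python
class Model:
    """Hypothetical class whose method we want to run as a computation."""
    def __init__(self, factor):
        self.factor = factor

    def compute(self, x):
        return self.factor * x

# Remotely, `compute` is just a function; pass the object explicitly
# as the first argument, exactly as `self` would be bound locally:
m = Model(3)
result = Model.compute(m, 7)   # equivalent to m.compute(7)
```

With dispy the analogous call would be submitting the object along with the other arguments, so the remote interpreter can call the function with it as `self`.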
Can you give the fragment used to create the cluster? If you use `nodes=["192.168.2.24"]`, then the other node shouldn't be used. If you first don't give any nodes and later add '192.168.2.24',...
I have fixed these issues but am not ready to commit yet, as it turns out setting up SSL across dispy, dispynode, and dispyscheduler had too many problems (although I have...
I believe the current implementation shouldn't cause duplicate UIDs, as finished jobs are kept in `done_jobs` until they are no longer used anywhere (i.e., all user callbacks have been called). There...
If you modified `kill_pid` with the above step, then you should return 0 (to be safe, check that the process was indeed killed). If this is done correctly, there is no...
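A minimal POSIX-only sketch of a `kill_pid`-style helper along these lines (this is an illustration, not dispy's actual implementation): send SIGTERM, then confirm the process actually exited before returning 0.

```python
import os
import signal
import subprocess
import time

def kill_pid(pid, timeout=5.0):
    """Hypothetical helper: terminate `pid`, verify it exited, and
    return 0 on success, -1 on failure (POSIX only)."""
    try:
        os.kill(pid, signal.SIGTERM)
    except ProcessLookupError:
        return 0  # process already gone
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # Non-blocking status check; assumes pid is a child process.
            wpid, _ = os.waitpid(pid, os.WNOHANG)
        except ChildProcessError:
            return 0  # already reaped elsewhere
        if wpid == pid:
            return 0  # confirmed dead (and reaped)
        time.sleep(0.1)
    return -1  # still alive; the caller could escalate to SIGKILL

# Usage: kill a long-running child and check the return value.
proc = subprocess.Popen(['sleep', '60'])
rc = kill_pid(proc.pid)
```

Returning 0 only after the wait confirms the kill, which is the "to be safe" check mentioned above.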
What OS are you using on the nodes? Killing a process should kill its children too (unless signal handlers are installed). `terminate_job` makes sure the process is killed (i.e., `kill_pid` returns...
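One common POSIX technique for making sure a job's children die with it (a sketch of the general idea, not dispy's actual mechanism) is to start the job in its own session, so the job and anything it spawns share a process group that can be signalled as a whole:

```python
import os
import signal
import subprocess
import time

# Hypothetical job that forks a child: a shell with a background sleep.
proc = subprocess.Popen(
    ['sh', '-c', 'sleep 60 & exec sleep 60'],
    start_new_session=True,   # job becomes its own process-group leader
)
time.sleep(0.2)               # give the shell time to fork its child
os.killpg(proc.pid, signal.SIGTERM)  # signal the entire process group
proc.wait()
```

If a job installs a handler for SIGTERM, it can of course ignore the signal, which is why verifying the kill (as discussed above) still matters.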
It is already fixed. Try the latest release, 4.8.5.
Yes, it was a problem with starting in daemon mode (background process). See the [fix](https://github.com/pgiri/dispy/commit/285fbc15c4b47217aa32a548d859e71c0bd16af9).
If you submit just 4 jobs, by the time the second node is found and initialized, it is likely that all jobs have already been sent to the first node. Either delay submission...
Committed an alternate fix. I haven't tested it, but hopefully it works. Please confirm.