add ActiveModel::Datastore.save_all feature
Motivation
I want a bulk insert feature to improve performance.
Resolution
Implement ActiveModel::Datastore.save_all(entries).
Inserting 500 records is about 200x faster than before.
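A minimal sketch of the bulk-save idea: slice the entries into batches and persist each slice in a single Datastore call instead of one call per entity. The `BulkSave` module name and the `persist` block are stand-ins for illustration, not the gem's actual API.

```ruby
# Hypothetical sketch: batch entries so each Datastore commit carries
# many entities. The persist block stands in for the real client call.
module BulkSave
  MAX_BATCH = 500 # Datastore caps a single commit at 500 entities

  # Yields each batch to the caller-supplied block and returns the
  # number of batches sent.
  def self.save_all(entries, batch_size: MAX_BATCH, &persist)
    entries.each_slice(batch_size).map { |batch| persist.call(batch) }.size
  end
end
```

With 1200 entries and the default batch size, the block is called three times (500, 500, 200), which is where the speedup over 1200 individual saves comes from.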
Remark
I'm not good at English, so please feel free to update the comments and documentation :)
Oh, and I'll fix the code style issues reported by RuboCop.
@bmclean please review it.
Hi @shao1555. I like the concept. Hopefully I will have time to review it in more detail tomorrow.
@shao1555 This is ready for your feedback.
Hi @bmclean, any particular reason why this was not merged yet? I am looking for the same functionality :-) I'll fork it and try to clean it up.
Hi @NielsKSchjoedt. This was a proof of concept that was never completed. We left it here in case somebody wanted to pick it up again.
Also, one thing we have learned since this feature was started is that a batch size of 500 won't always work.
See entries.each_slice(500).map
While 500 is the max number of entities that can be passed to Datastore, sometimes we hit the request size limit first. We should set the default to 250 and allow a batch size to be passed in (up to 500).
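The batching rule described above could be sketched like this. The names `batches_for`, `DEFAULT_BATCH`, and `DATASTORE_MAX_BATCH` are illustrative, not part of the gem:

```ruby
# Sketch of the suggested rule: default to 250 entities per commit,
# accept a caller-supplied batch size, but never exceed Datastore's
# hard limit of 500 entities per call.
DATASTORE_MAX_BATCH = 500
DEFAULT_BATCH = 250

def batches_for(entries, batch_size: DEFAULT_BATCH)
  size = [batch_size, DATASTORE_MAX_BATCH].min # clamp to the hard limit
  entries.each_slice(size).to_a
end
```

A caller asking for a batch size above 500 would be clamped down to the limit rather than failing at the API.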
@bmclean great. If I am to finish it, I will add both a save_all and a save_all! method, where the latter will throw an exception if any objects are invalid. Agree? As for the return value of the methods, I can think of several ideas. Off the top of my head I am wondering about something like {User#1 => true, User#2 => false} - a hash where the keys are the ActiveModel objects and the values are true or false depending on whether they were successfully persisted. What are your thoughts?
@NielsKSchjoedt Sure, save_all and save_all! methods would be fine. Yes, perhaps the returned keys should be the ActiveModel objects; that way validation errors can be extracted if needed. I like your hash idea with the booleans, that would work. Users that want a count of successful saves can run something like .select { |k, v| v == true }.size on the result.
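To make the proposed return value concrete, here is a toy illustration. The `User` struct and this `save_all` are stand-ins; in the real gem the boolean would reflect whether the entity was actually persisted, not just whether it is valid:

```ruby
# Toy model: valid? stands in for "was persisted successfully".
User = Struct.new(:name, :valid) do
  def valid?
    valid
  end
end

# Proposed shape: a hash of model object => true/false.
def save_all(users)
  users.each_with_object({}) { |u, h| h[u] = u.valid? }
end

users  = [User.new("a", true), User.new("b", false)]
result = save_all(users)
result.select { |_k, v| v == true }.size # count of successful saves
```

Because the keys are the model objects themselves, a caller can also pull `errors` off any key whose value is false.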