activerecord - Rails: Working with a large set


I have rake scripts that operate on collections of hundreds of thousands of items.

Often, the server runs out of memory and the script crashes. I assume this is because the code looks like this:

Asset.where(:archived => false).each { |asset| asset.action! }

As far as I can tell, Rails fetches the entire result set into memory and then iterates through each instance.

My server doesn't seem happy loading 300,000 instances of Asset at once, so to reduce memory requirements I've had to resort to this:

collection = Asset.where(:archived => false) # ActiveRecord::Relation
while collection.count > 0
  collection.limit(1000).each { |asset| asset.action! }
end

Unfortunately, this doesn't seem clean. It gets worse when the action doesn't remove items from the set, and I have to keep track of offsets too. Does anyone have suggestions for a better way of partitioning the data, or for holding onto the relation and only loading rows as necessary?

The find_each method is designed for exactly these situations. It will fetch the records in batches and yield them one at a time:

Asset.where(:archived => false).find_each(:batch_size => 500) do |asset|
  asset.action!
end

By default, the batch size is 1000.
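Under the hood, find_each avoids loading everything at once by paging on the primary key: it fetches batch_size rows ordered by id, remembers the last id it saw, and asks for rows with a larger id on the next query. Here is a minimal plain-Ruby sketch of that keyset idea, with no database involved; the Item struct and RECORDS array are made up for illustration:

```ruby
# Illustrative stand-ins for a database table of 25 rows.
Item = Struct.new(:id, :archived)
RECORDS = (1..25).map { |i| Item.new(i, i.even?) }

# Sketch of keyset (primary-key) batching, the strategy find_each uses.
def find_each_sketch(batch_size: 10)
  last_id = 0
  loop do
    # Equivalent of: SELECT ... WHERE id > last_id ORDER BY id LIMIT batch_size
    batch = RECORDS.select { |r| r.id > last_id }
                   .sort_by(&:id)
                   .first(batch_size)
    break if batch.empty?
    batch.each { |item| yield item }
    last_id = batch.last.id   # resume after the last row we processed
  end
end

ids = []
find_each_sketch(batch_size: 10) { |item| ids << item.id }
# ids now holds 1..25 in order, fetched in three batches (10, 10, 5)
```

Because each batch is anchored to the last id rather than an OFFSET, rows processed (or removed) earlier never shift the window, which is why find_each sidesteps the offset bookkeeping from the question.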

