If you need to cache a large amount of data for subsequent retrieval in a recipe, how would you do it? Would you put it in a lookup table (limit 100K rows) or in a list of hashes?
Is there a limit on the number of rows in a list?
I am wondering how to cache a million or more records.
Agreed, it is not optimal, but some use cases require it.
There are some cases where you might need a large dataset of records, say from Workday, to compare against what is currently in a legacy DB, and making repeated calls across the network to the database is slow. Or maybe there is a requirement for a snapshot of a continually changing source to compare against another large dataset. So it may not be the common case, but you might need to build an ETL solution that requires temporary storage of a large dataset. It is not *always* indicative of bad design.
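To make the snapshot-comparison use case concrete, here is a minimal sketch (plain Python, nothing platform-specific) of diffing a cached source snapshot against a legacy-DB extract by primary key, so you only touch the network once per side. The field names, the `id` key, and the sample records are all illustrative assumptions:

```python
# Hypothetical sketch: compare a cached Workday snapshot to a legacy-DB
# extract in memory, instead of one network call per record.
# The "id" key and the record shapes are assumptions for illustration.

def diff_records(source, target, key="id"):
    """Return (to_insert, to_update, to_delete) for syncing target to source."""
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    to_insert = [r for k, r in src.items() if k not in tgt]          # new in source
    to_update = [r for k, r in src.items() if k in tgt and r != tgt[k]]  # changed
    to_delete = [r for k, r in tgt.items() if k not in src]          # gone from source
    return to_insert, to_update, to_delete

# Example: one new record, one changed record, one stale record.
workday = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bob"}]
legacy = [{"id": 2, "name": "Rob"}, {"id": 3, "name": "Cy"}]
ins, upd, dele = diff_records(workday, legacy)
```

The point of the sketch is just that a keyed in-memory diff is one pass over each dataset, which is why having the whole snapshot cached locally matters.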
The limit on the size of a collection is 50K records, BTW.
How about storing these records on one or more Pub/Sub topics?
Then have your secondary recipe(s) read from those topics to process them.
Along with that, we also use SQL Server to handle these larger volumes.
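The Pub/Sub idea above amounts to chunking the dataset into batches that fit downstream limits (like the 50K collection cap mentioned earlier) and publishing each batch as one message for the secondary recipes to consume. A minimal sketch, where `publish` is a stand-in for whatever connector or client you actually use and the topic name is made up:

```python
# Hypothetical sketch: split a large dataset into batches under a size cap
# and hand each batch to a publish callable. publish() and the topic name
# are placeholders, not a real connector API.

def batches(records, size=50_000):
    """Yield consecutive slices of records, each at most `size` long."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def publish_all(records, publish, topic="large-dataset", size=50_000):
    """Publish records to `topic` in batches; return the number of messages sent."""
    count = 0
    for batch in batches(records, size):
        publish(topic, batch)  # one message per batch
        count += 1
    return count

# Usage with a dummy publisher that just records batch sizes.
sent = []
n = publish_all(list(range(120_000)), lambda t, b: sent.append(len(b)))
```

With 120,000 records and a 50K cap this produces three messages (50K, 50K, 20K), which keeps each downstream read within the collection limit.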