03-17-2022 03:23 PM
If you need to cache a large amount of data for subsequent retrieval in a recipe, how would you do it? Would you put it in a lookup table (limit 100K rows) or in a list of hashes?
Is there a limit on the number of rows in a list?
I am wondering how to cache a million or more records.
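For context on the scale: at the 100K-row lookup-table limit, a million records would need at least ten tables before you even consider keys or indexing. A quick back-of-the-envelope sketch (plain Python, nothing Workato-specific; the numbers are the ones above):

```python
import math

# Figures from the question above: lookup tables cap at 100K rows,
# and the goal is to cache roughly a million records.
LOOKUP_TABLE_ROW_LIMIT = 100_000
TOTAL_RECORDS = 1_000_000

# Minimum number of lookup tables just to hold the data.
print(math.ceil(TOTAL_RECORDS / LOOKUP_TABLE_ROW_LIMIT))  # -> 10

def partition(records, chunk_size=LOOKUP_TABLE_ROW_LIMIT):
    """Yield successive chunks small enough to fit one lookup table."""
    for i in range(0, len(records), chunk_size):
        yield records[i:i + chunk_size]
```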
03-17-2022 03:51 PM
I see that this is not possible in Workato.
03-17-2022 04:37 PM
Agreed, it is not optimal, but some use cases require it.
There are cases where you might need a large dataset of records, say from Workday, to compare against what is currently in a legacy DB, and repeated calls across the network to the database are slow; or there may be a requirement to take a snapshot of a continually changing source and compare it to another large dataset. It may not be the common case, but you may need an ETL-style solution that requires temporary storage of a large dataset. It is not *always* indicative of bad design.
The limit on the size of a collection is 50K records, BTW.
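As an aside, the stage-once-and-diff-locally pattern described above looks roughly like this outside of Workato. A minimal sketch using Python's built-in SQLite as the staging store; the record shape (`employee_id`) and both input feeds are assumptions for illustration only:

```python
import json
import sqlite3

def stage_snapshot(records):
    """Load a point-in-time snapshot of the source into a staging table."""
    # In-memory SQLite stands in for whatever temporary store is available
    # (lookup tables, a real SQL server, etc.). Hypothetical record shape.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE workday_snapshot (employee_id TEXT PRIMARY KEY, payload TEXT)"
    )
    conn.executemany(
        "INSERT INTO workday_snapshot VALUES (?, ?)",
        ((r["employee_id"], json.dumps(r, sort_keys=True)) for r in records),
    )
    conn.commit()
    return conn

def diff_against_legacy(conn, legacy_rows):
    """Compare the staged snapshot to legacy rows with no repeated network calls."""
    staged = dict(
        conn.execute("SELECT employee_id, payload FROM workday_snapshot")
    )
    legacy = {r["employee_id"]: json.dumps(r, sort_keys=True) for r in legacy_rows}
    added = staged.keys() - legacy.keys()
    removed = legacy.keys() - staged.keys()
    changed = {k for k in staged.keys() & legacy.keys() if staged[k] != legacy[k]}
    return added, removed, changed
```

The point is that the expensive source is read once into the staging table, and all the comparison work then happens locally against the snapshot.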
03-17-2022 04:43 PM
How about storing these records onto one or more Pub/Sub topics?
Then have your secondary recipe(s) read from those topics to process the records.
Alongside that, we also use a SQL server to handle these larger volumes.
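To make that shape concrete, here is an in-process Python analogue of the publish/consume split. The real Workato Pub/Sub connector and the SQL staging step are configured in recipes, so the batch size, record shape, and processing stub here are all illustrative assumptions:

```python
import json
import queue
import threading

# In-process stand-in for a Pub/Sub topic: one producer publishes record
# batches, a secondary consumer pulls and processes them independently.
topic = queue.Queue()
BATCH_SIZE = 500
SENTINEL = None  # signals the consumer to stop

def publish(records):
    """Producer recipe: push records onto the topic in batches."""
    for i in range(0, len(records), BATCH_SIZE):
        topic.put(json.dumps(records[i:i + BATCH_SIZE]))
    topic.put(SENTINEL)

def consume():
    """Secondary recipe: read batches off the topic and process them."""
    while True:
        message = topic.get()
        if message is SENTINEL:
            break
        batch = json.loads(message)
        # ... e.g. upsert the batch into a SQL staging table here ...
        print(f"processed batch of {len(batch)} records")

worker = threading.Thread(target=consume)
worker.start()
publish([{"id": n} for n in range(2_000)])  # hypothetical records
worker.join()
```

The benefit of the split is that the producer never holds the full dataset in one recipe run; the consumers drain the topic at their own pace and land each batch in SQL.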