
Cache millions of records

kolson
Deputy Chef I

If you need to cache a large amount of data for subsequent retrieval in a recipe, how would you do it? Would you put it in a lookup table (limit 100K rows) or in a list of hashes?

Is there a limit on the number of rows in a list?

I am wondering how to cache a million or more records.

10 REPLIES

kolson
Deputy Chef I

I see that this is not possible in Workato.

anthony-oconnor
Deputy Chef I
I'm not sure what the row limit is on collections, but you might want to look at those: they let you query them with SQL, and they support more columns than lookup tables. They don't persist outside the running recipe, though, whereas lookup tables do.
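To make the pattern concrete, here's a minimal sketch using Python's sqlite3 as a stand-in for the collection's SQL layer. This is illustrative only, not Workato's actual collection action (there you'd supply the SQL in the recipe step), and the table and columns are made up:

```python
import sqlite3

# Stand-in for a recipe-scoped collection: load rows into an in-memory
# table, then query them with plain SQL. Like a collection, this table
# disappears when the process (recipe run) ends.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id TEXT, email TEXT, status TEXT)")

rows = [("1", "a@example.com", "active"), ("2", "b@example.com", "inactive")]
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)", rows)

# The SQL querying is what distinguishes collections from lookup tables.
active = conn.execute(
    "SELECT id, email FROM employees WHERE status = 'active'"
).fetchall()
print(active)
```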

I know this isn't helpful, but I'd probably just look at redesigning something so you don't have to cache 1M+ rows in your middleware.



kolson
Deputy Chef I

Agreed, it's not optimal, but some use cases require it.

There are cases where you might need a large dataset of records, say from Workday, to compare against what currently sits in a legacy DB, and making repeated calls across the network to the database is slow. Or there may be a requirement to snapshot a continually changing source and compare it to another large dataset. It may not be the common case, but sometimes you need an ETL-style solution that requires temporary storage of a large dataset. It is not *always* indicative of bad design.

The limit on the size of a collection is 50K records, BTW.
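For what it's worth, the comparison step itself is cheap once both sides are pulled in bulk. A rough sketch of the idea, with hypothetical field names (employee_id, email), is to index both datasets by key and diff in memory instead of issuing one DB call per record:

```python
# Pull both datasets once, index by key, and diff in memory.
def diff_records(workday_rows, legacy_rows, key="employee_id"):
    workday = {r[key]: r for r in workday_rows}
    legacy = {r[key]: r for r in legacy_rows}

    added = [workday[k] for k in workday.keys() - legacy.keys()]
    removed = [legacy[k] for k in legacy.keys() - workday.keys()]
    changed = [
        workday[k]
        for k in workday.keys() & legacy.keys()
        if workday[k] != legacy[k]
    ]
    return added, removed, changed

added, removed, changed = diff_records(
    [{"employee_id": 1, "email": "a@x.com"}, {"employee_id": 2, "email": "b@x.com"}],
    [{"employee_id": 2, "email": "old@x.com"}, {"employee_id": 3, "email": "c@x.com"}],
)
print(len(added), len(removed), len(changed))
```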

steven-marissen
Executive Chef I

How about storing these records on one or more Pub/Sub topics?

And then have your secondary recipe(s) read from those to process them.


Along with that, we also use a SQL server to deal with these larger quantities.
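The fan-out looks roughly like this. A minimal sketch, where publish() is a hypothetical stand-in for whatever Pub/Sub client you use and the topic name is made up:

```python
# Split a large dataset into fixed-size batches and publish each batch as
# one message, so secondary recipes can consume and process independently.
def publish(topic, message):
    # Hypothetical publisher; swap in your actual Pub/Sub client call.
    print(f"publish to {topic}: {len(message['records'])} records")

def fan_out(records, topic, batch_size=500):
    for start in range(0, len(records), batch_size):
        batch = records[start:start + batch_size]
        publish(topic, {"offset": start, "records": batch})

fan_out([{"id": i} for i in range(1200)], "employee-sync")
```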