03-17-2022 03:23 PM
If you need to cache a large amount of data for subsequent retrieval in a recipe, how would you do it? Would you put it in a lookup table (limit 100K rows) or in a list of hashes?
Is there a limit on the number of rows in a list?
I am wondering how to cache a million or more records.
03-21-2023 12:13 PM
Hey @patrick-steil,
We at Workato have recently released two features that look like a good fit for this ETL use case.
First, Workato FileStorage is a persistent storage system within Workato that can hold large volumes of data (an effectively limitless number of rows, on the order of millions), and this data can be fetched and used across recipes.
Second, the new SQL Transformations utility connector can run queries against CSV data stored in FileStorage and transform it, for example by comparing it with other data sets such as live application data extracts or other CSV files.
By combining these features, you can store any volume of data as CSV within Workato and run queries on it. More details on the two features are available here - FileStorage, SQL Transformations.
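The Workato recipe steps themselves are configured in the builder rather than written as code, but the underlying pattern — cache records as CSV, then run SQL over them — can be sketched in plain Python with SQLite. This is only an illustrative analogy, not the Workato API; the table and column names are invented for the example.

```python
import csv
import io
import sqlite3

# Hypothetical sample data standing in for records cached as CSV
# (in Workato this would live in FileStorage).
orders_csv = io.StringIO(
    "order_id,customer,amount\n"
    "1,alice,120\n"
    "2,bob,75\n"
    "3,alice,40\n"
)

# Load the CSV rows into an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL)")
reader = csv.DictReader(orders_csv)
conn.executemany(
    "INSERT INTO orders VALUES (:order_id, :customer, :amount)",
    list(reader),
)

# Run a SQL transformation over the cached data, e.g. totals per customer
# (the kind of query SQL Transformations would run against the stored CSV).
totals = conn.execute(
    "SELECT customer, SUM(amount) AS total FROM orders "
    "GROUP BY customer ORDER BY customer"
).fetchall()
print(totals)  # [('alice', 160.0), ('bob', 75.0)]
```

The same idea scales to the million-row case in the original question: the CSV acts as the durable cache, and SQL does the retrieval and aggregation instead of looping over a list of hashes in recipe logic.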
Both features are premium and currently in beta, so reach out to Workato customer support for more details.
Cheers!
Meghan