
Parsing a CSV and hitting the 50K row limit

matt-kruzicki
Deputy Chef II

Is there a way to create multiple CSV files to get around the 50K-record limit on the Parse CSV action? I know there's a datapill that identifies the list size, but I'm not sure how to set this up so it grabs only the first 50K records, parses them into a first file, and then grabs the next batch of records and parses them into a second file.

2 ACCEPTED SOLUTIONS

gary1
Executive Chef III

My understanding is that 50K is not a hard limit; the real limit is based on memory usage. I regularly use the CSV actions on docs with 200K+ records without issue. You could also try the SQL Collection, which is pretty much bulletproof when it comes to size limits.

If you really want to break it up into 50k chunks, I can provide some quick instructions after you confirm.


Prajwal
Deputy Chef III

Hi @matt-kruzicki,

I've found a solution for chunking large files (over 50K records). I used Google Drive to store the large CSV file and a simple Python script to do the splitting. You can refer to the recipe linked below for more details and clarification:
Recipe Link: Chunking the csv file which has more than 50K records
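For reference, here's a minimal sketch of the kind of splitting script I mean. The file name and chunk size are just example values, and it assumes the source file has a header row:

import csv

CHUNK_SIZE = 50_000            # rows per output file (example value)
SOURCE = "large_file.csv"      # placeholder path for the file pulled from Google Drive

def write_chunk(part, header, rows):
    # each chunk becomes its own CSV with the header repeated
    with open(f"chunk_{part}.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)

with open(SOURCE, newline="", encoding="utf-8") as src:
    reader = csv.reader(src)
    header = next(reader)      # keep the header row for every chunk
    rows, part = [], 1
    for row in reader:
        rows.append(row)
        if len(rows) == CHUNK_SIZE:
            write_chunk(part, header, rows)
            rows, part = [], part + 1
    if rows:                   # flush the final partial chunk
        write_chunk(part, header, rows)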

Let me know if you find this helpful.

Prajwal Prasanna


7 REPLIES

matt-kruzicki
Deputy Chef II

Thanks for the response! If we can get past the 50K limit, that'd be the preferred option, but currently the recipe throws an error on the Parse CSV action when it's over 50K. Ultimately what I need is to take a tab-separated text file and create a CSV file so that I can use that CSV in a later action (which only allows CSV and XLSX file types). You may have a better option for this than the Parse and Compose CSV actions, but that's what I'm attempting, and it fails when over 50K rows.
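For illustration, the conversion I'm after boils down to this (a rough Python sketch; the file paths are placeholders):

import csv

# tab-separated input, comma-separated output
with open("input.txt", newline="", encoding="utf-8") as src, \
     open("output.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src, delimiter="\t")
    writer = csv.writer(dst)
    for row in reader:
        writer.writerow(row)   # streams row by row, so memory use stays flat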

gary1
Executive Chef III

Got it. I suggest trying the Collection action first. It's the simplest solution.

If the Collection doesn't work, then you can try to process the CSV in chunks.

sampath
Deputy Chef III

I think the CSV parse connector has a hard limit: when you try to parse more than 50,000 lines, it will throw an error.

In my experience, @matt-kruzicki, you have to upload your large file to Google Drive or Workato file storage and then read it in chunks (sketched below). You can then parse those chunks and proceed with your target logic.

That should be the best option. Ignoring warnings or potential errors is not best practice.
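A minimal sketch of the read-by-chunks idea, assuming a local copy of the file, a header row, and an example chunk size:

import csv

def iter_chunks(path, chunk_size=50_000):
    # yield lists of up to chunk_size parsed rows, so the whole
    # file never has to sit in memory at once
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        chunk = []
        for row in reader:
            chunk.append(dict(zip(header, row)))
            if len(chunk) == chunk_size:
                yield chunk
                chunk = []
        if chunk:
            yield chunk

for chunk in iter_chunks("large_file.csv"):   # placeholder path
    print(f"processing {len(chunk)} rows")    # stand-in for your target logic

Each yielded chunk stays under the parse limit, so the downstream steps can run once per chunk.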