<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Working w/ Bulk Data in NetSuite - Workato Pros Discussion Board</title>
    <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/35#M35</link>
    <description>&lt;P&gt;&lt;STRONG&gt;[Nov 26, 2020] Jason Jho (CTO at Anvyl) replied: &lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;If the data processing logic doesn't have to live in the body of your parent recipe, you could potentially leverage Bulk triggers instead. For instance, if you have a New/Updated Records in Bulk, a job will be created for each batch of 200 records. This way you don't have to maintain lists and pagination logic.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;</description>
    <pubDate>Fri, 15 Jan 2021 11:55:48 GMT</pubDate>
    <dc:creator>jessica-lie</dc:creator>
    <dc:date>2021-01-15T11:55:48Z</dc:date>
    <item>
      <title>Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/32#M32</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 25, 2020] Joe Blanchett (Product Manager, Finance Systems at Seat Geek) posted: &lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;We have a few recipes where we need to work with &amp;gt; 200 rows of data in NetSuite. The search records call maxes out at 200 results on the first page, so the only way I've come up with to get &lt;STRONG&gt;all &lt;/STRONG&gt;results involves multiple steps that iterate through each page and accumulate the results into a Workato variable list along the way. As you can see below, this method involves 10 steps just to get all the data that's needed. Has anyone come up with a more elegant way to get high volumes of data from NetSuite?&lt;/P&gt;&lt;P&gt;[screenshot of the 10-step recipe]&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 11:50:33 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/32#M32</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T11:50:33Z</dc:date>
    </item>
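Joe's accumulate-as-you-page approach can be sketched as a plain loop. A minimal Python sketch, assuming a hypothetical search_records(page) helper that stands in for the NetSuite "search records" call and its 200-row pages (not a Workato or NetSuite API):

```python
def fetch_all_records(search_records, page_size=200):
    """Accumulate every page of a paged search into one list.

    `search_records` is a hypothetical helper that takes a 1-based page
    number and returns (records, total_pages) - a stand-in for the
    NetSuite search call, which returns at most 200 rows per page.
    """
    all_records = []
    page = 1
    while True:
        records, total_pages = search_records(page)
        all_records.extend(records)
        if page >= total_pages:  # last page reached
            break
        page += 1
    return all_records
```

This is the same shape as the 10-step recipe: one call per page, with the variable list replaced by a plain Python list.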
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/33#M33</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 25, 2020] Gordon Hu from WeGalvanize replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;Throwing my two cents in here:&lt;BR /&gt;&lt;BR /&gt;1. If the total number of pages is big, this recipe will run for a long time. It may also run into the limit on the number of items a Workato variable list supports.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;2. If you have a third-party place to “park” your data (e.g., something as simple as a Workato lookup table, a Box CSV, DynamoDB, or Redshift), here is what we usually do:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Create a “master recipe” to make the initial call and upload the first batch. If there is more than one page, call “callable_1”, passing the parameters (e.g., total number of pages, next page, etc.)&lt;/LI&gt;&lt;LI&gt;Callable_1: similar to the master recipe; makes the next page’s call and uploads the next batch. If the current page is &amp;lt; total pages: call callable_2&lt;/LI&gt;&lt;LI&gt;Callable_2: same logic as Callable_1. If the current page is &amp;lt; total pages: call callable_1&lt;/LI&gt;&lt;/UL&gt;&lt;BR /&gt;&lt;P&gt;Then, tweak Callable_1 and Callable_2 so that they know “when to stop”.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;There are a few more small things to note:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;This strategy can be used for any API calls that involve pagination.&lt;/LI&gt;&lt;LI&gt;The place where you park the data may not support a large number of rows (e.g., a lookup table)&lt;/LI&gt;&lt;LI&gt;The place where you “park” the data may have compliance issues (e.g., can you store customer data on Box?)&lt;/LI&gt;&lt;LI&gt;Once the data is parked, can Workato “get it back”? Does Workato have the ability to digest the large number of rows?&lt;/LI&gt;&lt;/UL&gt;</description>
      <pubDate>Fri, 15 Jan 2021 11:52:02 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/33#M33</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T11:52:02Z</dc:date>
    </item>
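Gordon's master/callable_1/callable_2 ping-pong can be sketched with plain Python functions standing in for the Workato callable recipes; fetch_page and upload_batch below are hypothetical stand-ins for the paginated API call and the "park the data" step, not real connector actions:

```python
# Each function plays the role of one recipe. The page number and total
# page count are the parameters passed between the callables, and the
# page == total_pages check is the "when to stop" condition.

def master(fetch_page, upload_batch):
    records, total_pages = fetch_page(1)
    upload_batch(records)
    if total_pages > 1:
        callable_1(2, total_pages, fetch_page, upload_batch)

def callable_1(page, total_pages, fetch_page, upload_batch):
    records, _ = fetch_page(page)
    upload_batch(records)
    if page != total_pages:  # "when to stop"
        callable_2(page + 1, total_pages, fetch_page, upload_batch)

def callable_2(page, total_pages, fetch_page, upload_batch):
    records, _ = fetch_page(page)
    upload_batch(records)
    if page != total_pages:
        callable_1(page + 1, total_pages, fetch_page, upload_batch)
```

The two callables alternate because a Workato recipe cannot call itself; in a language with ordinary recursion, one function would suffice.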
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/34#M34</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 25, 2020] Joe Blanchett (Product Manager, Finance Systems at Seat Geek) replied: &lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Thanks for the feedback, Gordon!&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;I totally agree with your points below. I haven’t run into the variable list limit yet - and to be honest, I’m not sure what it is. To avoid a 90-minute timeout on the Workato side, I’ve typically used (1) a recipe that just gets all of the data and (2) a recipe that processes each individual record. We’re currently processing up to 8K records daily. As we scale up past ~20K records we will use a Postgres DB to “park” the data like you suggested.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 11:53:05 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/34#M34</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T11:53:05Z</dc:date>
    </item>
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/35#M35</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 26, 2020] Jason Jho (CTO at Anvyl) replied: &lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;If the data processing logic doesn't have to live in the body of your parent recipe, you could potentially leverage Bulk triggers instead. For instance, if you have a New/Updated Records in Bulk, a job will be created for each batch of 200 records. This way you don't have to maintain lists and pagination logic.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;Hope this helps!&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 11:55:48 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/35#M35</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T11:55:48Z</dc:date>
    </item>
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/36#M36</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 26, 2020] Brian Flood (Vice President, IT, Business Systems, &amp;amp; Data at Fastly) replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;We've had to do essentially the same thing for results from BigQuery and Okta (&lt;A href="https://app.workato.com/recipes/1148428?st=3318f9" target="_blank" rel="noreferrer noopener"&gt;https://app.workato.com/recipes/1148428?st=3318f9&lt;/A&gt; - this one gets ugly because of the way they give you the next page).&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;I feel like this is an area where I disagree philosophically with Workato about how this should be handled. I can't think of a use case where a user would only want a single page of results, which leads me to think the pagination should just be built into the connector, rather than making users figure out how to page through the results from the various APIs. In some cases, you wouldn't even know there were more pages: with Okta, for example, the next page might be passed as a parameter in the response header, which the OOB connector does not give you access to, and there's no indicator to tell you that your results are partial. It might just seem odd that you have exactly 1000 records.&lt;/P&gt;&lt;P&gt;Is there a use case for not wanting every page of results (outside of testing)?&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 11:56:58 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/36#M36</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T11:56:58Z</dc:date>
    </item>
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/37#M37</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 27, 2020] Mike Flynn (Principal Software Engineer at Rapid7) replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Joe - I agree with Gordon; we follow a similar strategy for getting large searches back from NetSuite (like generating all 1099s at the end of the year). We created some utility recipes to help with getting all the results back from a saved search. There are 3 recipes: the main recipe that initiates the search, the pub/sub recipe that gets all the results from all the pages, and the recipe that processes the results. Here's basically how it works:&lt;/P&gt;&lt;P&gt;Main recipe:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Calls the saved search to get the search ID and the number of pages.&lt;/LI&gt;&lt;LI&gt;Loops over the number of pages in the search (does not get the pages from NetSuite yet): (a) posts to a Pub/Sub topic with the search ID and page number; (b) adds an entry to a lookup table with the search ID, page number, and a status of "PENDING".&lt;/LI&gt;&lt;LI&gt;Uses Recipe Ops to start the recipe that will monitor and process all the results from the search.&lt;/LI&gt;&lt;LI&gt;Main recipe job ends.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Pub/Sub recipe:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Listens for Pub/Sub posts for searches; when a new post is made to the topic, it does the following.&lt;/LI&gt;&lt;LI&gt;Executes the search and gets all the results for the page.&lt;/LI&gt;&lt;LI&gt;Stores the results in a database or a lookup table.&lt;/LI&gt;&lt;LI&gt;Updates the entry in the lookup table to a status of "COMPLETED".&lt;/LI&gt;&lt;LI&gt;Pub/Sub recipe job ends.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Processing recipe:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;Started by the main recipe; checks every 5 minutes whether all the pages of the search have been fetched (searches the lookup table to see if all pages have a status of "COMPLETED"). When all pages are "COMPLETED", it does the following.&lt;/LI&gt;&lt;LI&gt;Does whatever action is needed to process the rows (perform calculations, kick off new searches, update NetSuite or other systems, etc.).&lt;/LI&gt;&lt;LI&gt;Deletes all the rows from the lookup table for this search.&lt;/LI&gt;&lt;/OL&gt;&lt;BR /&gt;&lt;P&gt;This strategy allows us to split the search into multiple threads, so that the pub/sub recipe is fetching 5 pages of results at a time. When using this for generating 1099s, the pub/sub recipe is fetching all vendors, all vendor bills, and all vendor payments (so executing thousands of searches and multiple pages of results for each search). There is one main recipe - the Slack command that starts the 1099 process. There is a processing recipe for vendors, vendor bills, vendor payments, and the relationships between payments and bills (since it is a many-to-many relationship). The processing recipe will initiate new pub/sub searches (first get all the vendors, then for each vendor get all their bills, then for each bill get the payments made for that bill, and finally do the calculation for what is 1099-able). Like Gordon mentioned, it is handy for the processing recipe to know when to start and when to kick off the next level of processing.&lt;/P&gt;&lt;P&gt;Here are some screenshots of examples:&lt;/P&gt;&lt;P&gt;1. Main recipe - for each vendor, start the search for vendor payments [screenshot]&lt;/P&gt;&lt;P&gt;2. Pub/sub example recipe - executes the search for vendor payments, updates MySQL (where we are temporarily storing data), starts the process for more searches [screenshot]&lt;/P&gt;&lt;P&gt;3. Processing recipe - the very final processing recipe that generates the final report; there are other processing recipes for handling the results for vendors/payments/bills [screenshot]&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 12:00:00 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/37#M37</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T12:00:00Z</dc:date>
    </item>
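The lookup-table bookkeeping at the heart of Mike's pub/sub design can be sketched in a few lines of Python. This is an illustrative model only - the function names and the dict standing in for the Workato lookup table are assumptions, not Workato APIs: the main flow marks every page "PENDING", workers flip pages to "COMPLETED", and the processing step polls until nothing is pending.

```python
def enqueue_pages(table, search_id, total_pages):
    """Main recipe: record every page of a search as PENDING."""
    for page in range(1, total_pages + 1):
        table[(search_id, page)] = "PENDING"

def complete_page(table, search_id, page):
    """Pub/Sub worker: mark one fetched page COMPLETED."""
    table[(search_id, page)] = "COMPLETED"

def all_pages_completed(table, search_id):
    """Processing recipe's poll: is every page of this search done?"""
    return all(status == "COMPLETED"
               for (sid, _), status in table.items() if sid == search_id)
```

Because each worker only touches its own (search_id, page) row, several pages can be fetched in parallel without the status check racing.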
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/38#M38</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 27, 2020] Ryan Koh (Customer Success at Workato) replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Lots of good discussion points here. I would love to find out what some of the business scenarios are where you require the full data set from your systems as opposed to more real-time data. And how often do you run these types of workflows?&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;And when you have the full data set, what do you do with that data? How do you detect what is new and what has changed - or is that even necessary?&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 12:01:11 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/38#M38</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T12:01:11Z</dc:date>
    </item>
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/39#M39</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 27, 2020] Joe Blanchett (Product Manager, Finance Systems at Seat Geek) replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Thanks, everybody, for the great info. We will definitely be implementing some of these strategies at SeatGeek.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;@Mike - way to utilize the entire Workato suite! I like the idea of leveraging lookup tables and pub/sub. I have not had the opportunity to use Recipe Ops yet.&lt;/P&gt;&lt;P&gt;@Brian - I totally agree. It would be great if pagination were a built-in option in the connector itself.&lt;/P&gt;&lt;P&gt;@Ryan - we have at least 2 use cases that come to mind: (1) payment processor settlements and (2) customer syncs from NetSuite to Adaptive.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;For 1, we drop all transactions that our payment processor settles to us (we process payments on behalf of our enterprise clients) into a custom NetSuite table, calculate our fees, then create bill/bill payment transactions to pay out the remaining funds to our clients. We get up to 8K transactions per batch and need to ensure a batch is fully processed and never left in a partially processed state, for a few reasons - one of which is that our cash release process is based on comparing incoming and outgoing cash at the batch level. For 2, we have &amp;gt; 200 customers and sync the customer table nightly to Adaptive Insights.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 12:02:06 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/39#M39</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T12:02:06Z</dc:date>
    </item>
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/40#M40</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 30, 2020] Brian Flood (Vice President, IT, Business Systems, &amp;amp; Data at Fastly) replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;There are several use cases we have where a large bulk upload makes more sense than real-time. One is daily aggregated customer usage statistics. Once a day we update every single customer account in Salesforce with aggregated usage data from our data warehouse. The underlying tables contain billions of events, and updating Salesforce in real time would quickly overwhelm their APIs and hit our call limits; we also don't really care about intraday usage, so more than once a day is not useful. There are also use cases where either the source or the destination does not support real-time and only supports something like a daily FTP feed. We have a couple of those with our equity management tool (Shareworks) and our LMS (Bridge). We also use our data warehouse in BigQuery to calculate all types of things we want to push to various systems, as well as using it as lightweight master data management. BigQuery just does not lend itself to real-time processing, so some sort of batch update is always needed whenever it is involved, which is often for us.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 12:02:50 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/40#M40</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T12:02:50Z</dc:date>
    </item>
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/41#M41</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Nov 30, 2020] Gordon Hu from WeGalvanize replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;I found a “workaround” to overcome the real-time API limit.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;Original: we needed multiple Workato real-time recipes that each poll every 5 minutes.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;Workaround - to slow down calls:&lt;/P&gt;&lt;P&gt;1. Define a master schedule recipe that calls the every-5-minute, every-15-minute, hourly, daily, and weekly callable recipes.&lt;/P&gt;&lt;P&gt;2. Make callable recipes according to these schedules.&lt;/P&gt;&lt;P&gt;3. Then make another callable recipe to extract data between now and now-&amp;lt;period&amp;gt;&lt;/P&gt;&lt;P&gt;We use this approach for Okta events to monitor login failures, new hire activations, user offboardings, etc. We’ve also used it to pull data for aggregated statistics (e.g., Gong, Salesforce).&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;Pros: one recipe design that can be reused across many systems.&lt;/P&gt;&lt;P&gt;Cons: now I need to make sure the master schedule recipes are correct and attach the callable recipes to the right schedule. However, once attached, there is pretty much no debugging; you only need to tweak the last callable recipe.&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 12:03:57 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/41#M41</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T12:03:57Z</dc:date>
    </item>
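The "extract data between now and now minus period" callable in Gordon's step 3 boils down to computing a time window from the schedule's period. A minimal Python sketch, where extract_window is a hypothetical name and the period values mirror the 5-minute/15-minute/hourly/daily/weekly schedules:

```python
from datetime import datetime, timedelta, timezone

def extract_window(period_minutes, now=None):
    """Return the (start, end) timestamps covering the last `period_minutes`.

    Each scheduled caller passes its own period (5, 15, 60, 1440, 10080),
    so one shared callable serves every schedule.
    """
    end = now or datetime.now(timezone.utc)
    start = end - timedelta(minutes=period_minutes)
    return start, end
```

The window endpoints would then be passed as filter parameters to whatever API the recipe polls (Okta events, Gong, Salesforce, etc.).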
    <item>
      <title>Re: Working w/ Bulk Data in NetSuite</title>
      <link>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/42#M42</link>
      <description>&lt;P&gt;&lt;STRONG&gt;[Dec 2, 2020] Amy Jorde (Senior Integration Engineer at Malwarebytes) replied:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;If you have an OPA, you can also connect to the NetSuite JDBC driver to pull higher volumes of data. That said, this might not be completely practical for a regular integration use case without parking the data in a data warehouse, as it pulls the object IDs rather than the field values for related objects (like subsidiary, location, and custom list fields), so some work would be required to piece the details back together.&lt;/P&gt;&lt;BR /&gt;&lt;P&gt;We send the raw NetSuite data into Snowflake, and then I piece it back together using SQL for use cases with this type of requirement (like sending transaction data to Anaplan, for example).&lt;/P&gt;</description>
      <pubDate>Fri, 15 Jan 2021 12:05:43 GMT</pubDate>
      <guid>https://systematic.workato.com/t5/workato-pros-discussion-board/working-w-bulk-data-in-netsuite/m-p/42#M42</guid>
      <dc:creator>jessica-lie</dc:creator>
      <dc:date>2021-01-15T12:05:43Z</dc:date>
    </item>
  </channel>
</rss>

