Cost and design strategy choices

plmrskopecek
Deputy Chef I

Howdy,

I'm fairly new to Workato but have been in development for decades. While I see a lot of potential value in the solution as an integration, orchestration, and governance platform, I struggle to frame some of the design options in a way that lets me move forward with building solutions. The core issue is that design choices directly affect cost. Workato imposes limits through recipe counts and task execution, so I end up consolidating logic into master/sub recipes, which starts to break down some of the value of Workato's governance facilitation. Tasks are also counted even for accessing a variable (lookup table), at a notably high per-task cost.

One of our needs is an on-prem agent so that actions which require our IP address can run. However, the orchestration/execution logic cannot operate on the agent; all the agent can do is execute a script/exe, which amounts to not really using Workato at all.

Since I'm newer to Workato, I thought I'd reach out here to see if others have tips, tricks, or other insights. Just to note, I did cover these challenges with our account rep, but the response was directed toward spending more rather than toward efficient, reasonable design options.



Thanks for the link. There are some decent tips in there, though the thread still seems focused on batch design for improvements.

So I guess one question with that: one of the design styles in the video is to use file storage as individual appends followed by a single full read. Are there any strategies/tips around individual execution versus batching upstream? For example, if for every event I kicked off a recipe that appended to the file, it would cost me 1 task (roughly $0.01). However, if I append to an Azure Blob instead, it costs 0 Workato tasks and roughly $0.00000228 per operation per Azure's pricing (https://azure.microsoft.com/en-us/pricing/details/storage/blobs/).
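To make that gap concrete, here's a rough back-of-the-envelope comparison using the per-unit figures quoted above. The event volume is hypothetical, the ~$0.01 task price is an approximation from our contract, and the Azure figure is the one from their pricing page, so treat the output as illustrative rather than authoritative:

    # Rough cost comparison: per-event recipe appends vs. Azure Blob appends.
    # Figures are the ones quoted above; actual pricing depends on your
    # Workato contract and Azure region/tier.
    EVENTS_PER_MONTH = 100_000          # hypothetical event volume
    COST_PER_WORKATO_TASK = 0.01        # ~$0.01 per task (approximate)
    COST_PER_BLOB_APPEND = 0.00000228   # per-operation figure from Azure pricing page

    per_event_recipe_cost = EVENTS_PER_MONTH * COST_PER_WORKATO_TASK
    blob_append_cost = EVENTS_PER_MONTH * COST_PER_BLOB_APPEND

    print(f"Per-event recipe appends: ${per_event_recipe_cost:,.2f}/month")
    print(f"Azure Blob appends:       ${blob_append_cost:,.4f}/month")

At 100,000 events a month that's roughly $1,000 in tasks versus well under a dollar in Blob operations, which is exactly the kind of gap that drives these design decisions.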

This is one of those design challenge points. If I connect a recipe to any system that is non-batch (e.g., a record in a CRM system is updated and meets the right criteria), each execution takes a task, which makes sense. However, when connecting it for some form of near-real-time, actionable flow, it can burn through tasks very fast, even before getting into the design optimizations discussed in the video.

Any recommendations on that front? (and thank you again)

Hm, I'm not sure I have a great answer. I haven't had to use the large file/streaming features for anything, but your concern sounds fair. The only thing I'll toss out on that front is the cost of the infrastructure and maintenance involved in building and running your own Azure-based solution outside of Workato, versus the cost/benefit of using Workato and tasks. That's one of the main benefits: having integrations/automations that "just run" without having to patch servers, migrate operating systems, etc. Of course, if you're comparing against a different cloud iPaaS, then that's a different story.

As for the real-time integration, I think it just depends on the business requirements. If the need truly is real-time, then you're right, it could add up quickly if you're not careful. But many times I've found that users WANT real-time because it sounds more exciting/useful, yet when you dig in, the NEED really isn't real-time. Then again, I'm in the higher-education space, so I'm sure other industries vary. But if the requirements can be challenged somewhat, then any amount of batching will go a long way.
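Just to illustrate the batching point with a quick sketch (hypothetical numbers, not anything Workato-specific): if a recipe run consumes a roughly fixed number of tasks regardless of how many records it processes, task consumption drops roughly in proportion to the batch size.

    import math

    # Hypothetical illustration of how batch size affects monthly task usage,
    # assuming a fixed number of tasks per recipe run (varies by recipe design).
    EVENTS_PER_MONTH = 100_000
    TASKS_PER_RUN = 5            # assumed tasks consumed by one recipe execution

    for batch_size in (1, 50, 500):
        runs = math.ceil(EVENTS_PER_MONTH / batch_size)
        print(f"batch_size={batch_size:>4}: {runs * TASKS_PER_RUN:>9,} tasks/month")

Even moving from per-event processing to modest batches cuts the task count by orders of magnitude, which is why challenging the "real-time" requirement is usually the first lever to pull.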

Hope that helps a little