Process inputs in batches
How to perform hundreds of executions of the same workflow.
In most cases, you will need to scale your task execution at some point.
This can be done either with batch processing or with integrations and automations.
Let’s start by processing inputs in batches.
Choose the "Upload Dataset" input option:

Starting batch input processing
Let's generate Google Ads for multiple products at once.
In our case, it’s a list of 5 products from Producthunt:

Formatting products data from Producthunt
We’ll use the full description field as input.
This AI model delivers the best results when the name of the product is mentioned in the description.
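If you assemble the dataset programmatically, a minimal Python sketch along these lines can produce the CSV. The product list, file name, and column header below are illustrative assumptions; match the header to your workflow’s input field:

```python
import csv

# Hypothetical products; the names and descriptions are placeholders.
products = [
    {"name": "Acme Notes", "description": "A note-taking app that organizes your ideas for you."},
    {"name": "PixelForge", "description": "Generate marketing images in seconds."},
]

with open("products.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Product Description"])
    writer.writeheader()
    for item in products:
        # The model works best when the product name appears in the description,
        # so prepend it if it is missing.
        description = item["description"]
        if item["name"].lower() not in description.lower():
            description = f"{item['name']}: {description}"
        writer.writerow({"Product Description": description})
```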
Drag and drop the file into the upload field:

Uploading a set of inputs
There is a limit on the number of dataset entries that can be processed at once.
Currently, the maximum is 200.
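If your dataset has more than 200 entries, split it into upload-sized chunks first. A minimal sketch, assuming a plain CSV file with a header row (the file paths are illustrative):

```python
import csv

MAX_BATCH_SIZE = 200  # current per-upload limit mentioned above

def split_dataset(path, chunk_size=MAX_BATCH_SIZE):
    """Split a large CSV into upload-sized chunks, repeating the header row in each."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    header, entries = rows[0], rows[1:]
    chunk_paths = []
    for i in range(0, len(entries), chunk_size):
        chunk_path = f"{path.rsplit('.', 1)[0]}_part{i // chunk_size + 1}.csv"
        with open(chunk_path, "w", newline="", encoding="utf-8") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(entries[i:i + chunk_size])
        chunk_paths.append(chunk_path)
    return chunk_paths

# Example: produces products_part1.csv, products_part2.csv, ...
print(split_dataset("products.csv"))
```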
Click the “Upload Dataset” button to start matching fields.
Select the fields from your CSV that match the mandatory input fields of this workflow.
In this case, it’s Product Description:

Matching data fields
The workflow must be checked and published before it can be used for batch processing.
Wait a bit for all the entries to be processed.

Processing in batch
Thanks to parallel processing, a batch usually takes about the same time as a single entry.
You get 9 variations for every input entry in this case (3 Google Ads headline variations × 3 Google Ads body variations):

Results of batch processing
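The 9 variations per entry are simply the cross product of the generated headlines and bodies, as this illustrative sketch shows (the strings are placeholders):

```python
from itertools import product

# Placeholder outputs for a single input entry: 3 headlines and 3 bodies.
headlines = ["Headline A", "Headline B", "Headline C"]
bodies = ["Body A", "Body B", "Body C"]

ads = [{"headline": h, "body": b} for h, b in product(headlines, bodies)]
print(len(ads))  # 9 variations per entry, so 5 entries yield 45 ads in total
```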
Find out how many of them succeeded and how many failed.
After that, you can adjust your workflow and inputs to improve your success rate.
Or just keep everything as is if you’re happy with the results.
There is rarely a 100% success rate when it comes to AI.
The goal is to find the best balance between credits spent and output quality.
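If you export the batch results to a CSV, a quick sketch like this can compute the success rate. The file name and the "status" column are assumptions about your export format; adapt them to what your tool actually produces:

```python
import csv

def success_rate(results_path, status_field="status"):
    """Return the fraction of entries marked successful in an exported results CSV."""
    with open(results_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    successes = sum(1 for row in rows if row.get(status_field, "").lower() == "success")
    return successes / len(rows) if rows else 0.0

print(f"{success_rate('batch_results.csv'):.0%} of entries succeeded")
```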
The next way to scale your AI ops is to integrate AI workflows with the tools you use every day.