Process inputs in batches

How to perform hundreds of executions of the same workflow.

At some point, you'll likely need to scale your task execution.

You can do this either with batch processing or with integrations and automations.

Let's start by processing inputs in batches.

1. Go back to the Input tab

Choose the "Upload Dataset" input option:

2. Prepare a CSV with the required input data.

Let's generate Google Ads for multiple products at once.

In our case, it's a list of 5 products from Product Hunt:

We’ll use the full description field as input.

This AI model delivers the best results when the name of the product is mentioned in the description.
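If you're assembling the dataset programmatically, a minimal sketch with Python's standard `csv` module might look like this. The file name, column names, and product rows here are placeholders, not the actual Product Hunt data; note how each description mentions the product name, as recommended above.

```python
import csv

# Placeholder rows; in the walkthrough, the real data is a list of
# 5 products from Product Hunt.
products = [
    ("Acme Notes", "Acme Notes is a note-taking app that syncs across devices."),
    ("PixelForge", "PixelForge is an AI image editor for product photos."),
]

with open("products.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["Product Name", "Product Description"])  # header row
    writer.writerows(products)
```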

3. Upload the CSV

Drag and drop the file into the upload field:

There is a limit on the number of dataset entries that can be processed at once.

Currently, the maximum is 200.
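If your dataset is larger than the limit, you can split it into multiple uploads. Here is a hedged sketch of one way to do that in Python; the 200-entry constant reflects the current limit stated above, and the chunk file naming is just an illustrative choice.

```python
import csv

MAX_ENTRIES = 200  # current per-upload limit

def split_csv(path, max_entries=MAX_ENTRIES):
    """Split a CSV into chunk files with at most max_entries data rows each."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)   # keep the header for every chunk
        rows = list(reader)

    chunk_paths = []
    stem = path.rsplit(".", 1)[0]
    for i in range(0, len(rows), max_entries):
        chunk_path = f"{stem}_part{i // max_entries + 1}.csv"
        with open(chunk_path, "w", newline="", encoding="utf-8") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(rows[i:i + max_entries])
        chunk_paths.append(chunk_path)
    return chunk_paths
```

Each chunk can then be uploaded as its own dataset.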

Click the “Upload Dataset” button to start matching fields.

4. Match data fields

Select the fields from your CSV that match the mandatory input fields of this workflow.

In this case, it’s Product Description:
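The field-matching screen does this check for you, but if you prepare files in advance it can help to verify them locally first. A small sketch, assuming "Product Description" is the only mandatory field (your workflow may require others):

```python
import csv

# Assumed mandatory input field for this workflow.
REQUIRED_FIELDS = {"Product Description"}

def check_required_fields(csv_path):
    """Raise if the CSV header row is missing any mandatory input field."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        headers = set(next(csv.reader(f)))  # first row = column names
    missing = REQUIRED_FIELDS - headers
    if missing:
        raise ValueError(f"CSV is missing required fields: {sorted(missing)}")
    return headers
```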

5. Launch it

The workflow must be tested and published before you can run batch processing.

Wait a bit for all the entries to be processed.

Thanks to parallel processing, a batch usually takes about the same time as a single entry.

6. Check the results

In this case, you get 9 variations for every input entry (3 Google Ads Headline variations × 3 Google Ads Body variations):
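The count comes from pairing every headline with every body. A quick sketch with placeholder variation texts:

```python
from itertools import product

# Placeholder variations; the workflow generates 3 of each per entry.
headlines = ["Headline A", "Headline B", "Headline C"]
bodies = ["Body A", "Body B", "Body C"]

# Every headline is paired with every body: 3 * 3 = 9 ad variations.
ads = list(product(headlines, bodies))
print(len(ads))  # 9
```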

Check how many of them succeeded and how many failed.

After that, you can adjust your workflow and inputs to improve your success rate.

Or just keep it as is if you’re happy.

There is rarely a 100% success rate when it comes to AI.

The goal is to find the best balance between credits spent and output quality.

The next way to scale your AI ops is to integrate AI workflows with the tools you use every day.
