Split List divides a large list into smaller, manageable batches. It is ideal for processing large datasets in chunks to avoid timeouts, API rate limits, or overwhelming downstream systems.


When to Use It

  • Process large campaign lists in smaller batches
  • Break down hundreds of URLs into manageable groups
  • Split client lists to avoid API rate limits
  • Create controlled processing for large datasets

Inputs

Field           Type    Required  Description
List to Split   List    Yes       The large list you want to break into batches
Batch Size      Number  Yes       Number of items per batch (1-100)

Outputs

Output    Description
Batches   List of smaller lists, each containing up to the specified number of items

Credit Cost

Free to use - no credits required.


How It Works

Takes your large list and creates multiple smaller lists based on your batch size:

Example:

Input List: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
Batch Size: 3
Output Batches: [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]

The last batch may contain fewer items if the total doesn’t divide evenly.
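
The same chunking behavior can be sketched in a few lines of Python (a minimal illustration of the splitting logic, not the node's actual implementation):

def split_list(items, batch_size):
    # Build consecutive batches of at most batch_size items.
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

batches = split_list([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], 3)
print(batches)  # [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]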


Real-World Examples

Large Campaign Processing:

Google Ads Get Report (500 campaigns) → Split List (batch size: 50) → Loop Over List
"Process 500 campaigns in 10 batches of 50 each"

URL Batch Processing:

Extract URLs from Sitemap → Split List (batch size: 25) → Loop Over List → Web Scrape
"Scrape 200 URLs in batches of 25 to avoid timeouts"

Email Campaign Batches:

Sheets Read Data (client emails) → Split List (batch size: 10) → Loop Over List → Send Email
"Send personalized emails in batches of 10 to manage sending limits"

API Rate Limit Management:

Generate List (1000 keywords) → Split List (batch size: 100) → Loop Over List → Add delay between batches
"Process 1000 keywords while respecting API limits"

Batch Processing Strategy

With Loop Over List:

  1. Split your large list into batches
  2. Connect Split List to Loop Over List
  3. Inside the loop, process each batch as a complete unit
  4. Each iteration handles one batch (multiple items)

Example Workflow:

Large URL List (200 items) → Split List (batch size: 20) → Loop Over List (10 iterations)
"Each iteration processes one complete batch of 20 URLs"
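
A rough Python equivalent of this workflow, where scrape_url_batch is a stand-in for whatever you run inside the loop:

def scrape_url_batch(batch):
    # Placeholder for the work done inside one loop iteration.
    print(f"Scraping {len(batch)} URLs as one unit")

urls = [f"https://example.com/page/{n}" for n in range(200)]
batch_size = 20

batches = [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]
print(len(batches))  # 10 iterations

for batch in batches:
    # Each iteration receives one complete batch, not a single item.
    scrape_url_batch(batch)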

Tips

Choosing Batch Size:

  • Start with smaller batches (10-25) for testing
  • Increase based on system performance and API limits
  • Consider processing time and memory usage

API Limits:

  • Check rate limits for your data sources
  • Choose a batch size that keeps your total requests well under hourly/daily limits
  • Add delays between batches if needed

Error Handling:

  • If one batch fails, the other batches can still be processed
  • Smaller batches make debugging easier
  • Consider retry logic for failed batches (see the sketch below)
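
A minimal Python sketch of per-batch retries; process_batch and the retry count of 3 are illustrative assumptions, not part of the Split List node itself:

def process_batch(batch):
    # Placeholder for whatever the batch runs through; may raise on failure.
    ...

def process_with_retries(batches, max_retries=3):
    failed = []
    for batch in batches:
        for attempt in range(1, max_retries + 1):
            try:
                process_batch(batch)
                break  # Batch succeeded; move on to the next one.
            except Exception:
                if attempt == max_retries:
                    failed.append(batch)  # Give up on this batch, keep going.
    return failed  # Other batches still complete even if some failed.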

Performance Balance:

  • Larger batches = fewer loop iterations but more data per iteration
  • Smaller batches = more control but more overhead
  • Test to find the optimal size for your use case

Memory Management:

  • Large batches use more memory
  • Important for data-heavy operations like image processing
  • Monitor workflow performance with different batch sizes

FAQ