Split List
Break large lists into smaller batches for controlled processing and better workflow management.
Split List divides a large list into smaller, manageable batches. It is ideal for processing large datasets in chunks so you avoid timeouts, API rate limits, and overloading downstream systems.
When to Use It
- Process large campaign lists in smaller batches
- Break down hundreds of URLs into manageable groups
- Split client lists to avoid API rate limits
- Create controlled processing for large datasets
Inputs
| Field | Type | Required | Description |
|---|---|---|---|
| List to Split | List | Yes | The large list you want to break into batches |
| Batch Size | Number | Yes | Number of items per batch (1-100) |
Outputs
| Output | Description |
|---|---|
| Batches | A list of smaller lists, each containing up to the specified number of items |
Credit Cost
Free to use - no credits required.
How It Works
Split List takes your large list and creates multiple smaller lists based on your batch size.
Example: a list of 10 items with a batch size of 3 produces batches of 3, 3, 3, and 1 items.
The last batch may contain fewer items if the total doesn’t divide evenly.
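If it helps to picture the logic, here is a minimal Python sketch of the same batching behavior (the node itself is configured visually, so this is purely illustrative):

```python
def split_list(items, batch_size):
    """Split a list into consecutive batches of at most batch_size items."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# 10 items with a batch size of 3 -> [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10]]
print(split_list(list(range(1, 11)), 3))
```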
Real-World Examples
- Large Campaign Processing: split a large campaign list into batches so each group is processed as a complete unit
- URL Batch Processing: break hundreds of URLs into manageable groups and work through them one batch at a time
- Email Campaign Batches: send or prepare an email campaign in controlled batches instead of all at once
- API Rate Limit Management: split client lists so each batch stays comfortably within your API rate limits
Batch Processing Strategy
With Loop Over List:
- Split your large list into batches
- Connect Split List to Loop Over List
- Inside the loop, process each batch as a complete unit
- Each iteration handles one batch (multiple items)
Example Workflow:
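As a rough sketch of the pattern in Python (the contact data, the batch size of 25, and the processing step are illustrative assumptions, not product defaults):

```python
contacts = [f"user{i}@example.com" for i in range(1, 101)]  # 100 items

# Step 1: Split List with a batch size of 25 -> four batches of 25
batches = [contacts[i:i + 25] for i in range(0, len(contacts), 25)]

# Step 2: Loop Over List -> each iteration receives one whole batch
for batch in batches:
    # Step 3: inside the loop, work with the entire batch as a unit
    print(f"Processing a batch of {len(batch)} contacts")
```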
Tips
Choosing Batch Size:
- Start with smaller batches (10-25) for testing
- Increase based on system performance and API limits
- Consider processing time and memory usage
API Limits:
- Check rate limits for your data sources
- Size batches so the requests they generate stay well under your hourly/daily limits
- Add delays between batches if needed
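If you need breathing room between batches, the idea can be sketched in Python like this (the two-second pause and the `handler` callable are assumptions; tune them to your provider's limits):

```python
import time

def process_with_delay(batches, handler, pause_seconds=2):
    """Process each batch, pausing between batches to stay under rate limits."""
    for batch in batches:
        handler(batch)             # call your API or processing step for this batch
        time.sleep(pause_seconds)  # wait before starting the next batch
```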
Error Handling:
- If one batch fails, other batches can still process
- Smaller batches make debugging easier
- Consider retry logic for failed batches
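A rough Python sketch of per-batch retry logic (the retry count and the `handler` callable are illustrative assumptions):

```python
def process_with_retries(batches, handler, max_retries=3):
    """Retry each failing batch a few times; other batches keep processing."""
    failed = []
    for batch in batches:
        for attempt in range(1, max_retries + 1):
            try:
                handler(batch)
                break  # batch succeeded, move to the next one
            except Exception as err:
                if attempt == max_retries:
                    failed.append((batch, err))  # record the failure and move on
    return failed  # batches that never succeeded, for later inspection
```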
Performance Balance:
- Larger batches = fewer loop iterations but more data per iteration
- Smaller batches = more control but more overhead
- Test to find the optimal size for your use case
Memory Management:
- Large batches use more memory
- Important for data-heavy operations like image processing
- Monitor workflow performance with different batch sizes
FAQ
What happens if my list doesn't divide evenly by the batch size?
The last batch will contain the remaining items. For example, 10 items with batch size 3 creates batches of [3, 3, 3, 1]. This is normal and expected behavior.
How do I process the batches after splitting?
Connect Split List to Loop Over List. Each loop iteration will process one complete batch (not individual items). Inside the loop, you can work with the entire batch at once.
What's the optimal batch size for my workflow?
Start with 10-25 items and test performance. Increase for faster processing, decrease if you hit memory limits or API restrictions. The optimal size depends on your specific workflow complexity.
Can I split an already small list?
Yes, but if your list is smaller than the batch size, you’ll get one batch containing all items. For example, 5 items with batch size 10 creates one batch of 5 items.
How is this different from Loop Over List alone?
Loop Over List processes items one by one. Split List + Loop Over List processes items in groups. Use splitting when you need to handle multiple items together in each iteration.