
If your images live in AWS S3, you can process millions of them cost-effectively with Roboflow Batch Processing, which is 25× cheaper than the Hosted API, while maintaining full control over automation and storage.
This guide walks through how to use Roboflow Batch Processing to run inference on large image datasets stored in an S3 bucket: generating signed URLs, creating a JSONL reference file, and triggering a Roboflow Workflow directly from your AWS environment.
To follow this guide, you will need:
- Images stored in AWS S3, and
- A Roboflow Workflow which contains the models and logic that you want to run on your data. Don't have a Workflow yet? Check out our Workflows documentation to get started.
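The steps below also assume the Roboflow Inference CLI is installed and the AWS CLI is configured with credentials that can read your bucket. A quick way to check both (the package name and bucket path here are assumptions; verify them against the Roboflow docs and your own environment):

pip install inference-cli         # provides the `inference` command used in the steps below
aws sts get-caller-identity       # confirms your AWS CLI credentials are configured
aws s3 ls s3://my-bucket/images/  # confirms you can read the bucket and prefix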
Using Roboflow Batch Processing on Images Stored in AWS S3
Step 1: Generate S3 Signed URLs
Generate short-lived signed URLs for your image files; each URL allows Roboflow to access your data securely without exposing your bucket. Roboflow provides a helper script that uses the AWS CLI to do this:
curl -fsSL https://raw.githubusercontent.com/roboflow/roboflow-python/main/scripts/generateS3SignedUrls.sh | bash -s -- s3://my-bucket/images signed_urls.jsonl 21600 8
This produces a signed_urls.jsonl file with one line per image:
{"name": "car__image1.jpg", "url": "https://s3.amazonaws.com/..."}
Step 2: Create a Batch from S3
Upload this reference file to Roboflow Batch Processing:
inference rf-cloud data-staging create-batch-of-images \
--batch-id my-s3-batch \
--references signed_urls.jsonl \
--data-source reference-file
..."}
You’ll receive a confirmation once ingestion begins. You can track progress via:
inference rf-cloud data-staging show-batch-details --batch-id my-s3-batch
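If ingestion stalls or fails, a common culprit is a malformed reference file. Assuming jq is installed, you can confirm that every line parses as JSON and count how many images will be ingested:

jq -c . signed_urls.jsonl > /dev/null && echo "all lines valid"
wc -l < signed_urls.jsonl   # number of image references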
Step 3: Run a Workflow
Once ingestion completes, kick off your processing job using your Roboflow Workflow ID:
inference rf-cloud batch-processing process-images-with-workflow \
--workflow-id my-workflow-id \
--batch-id my-s3-batch \
--notifications-url https://my-webhook-url
When the job finishes, you’ll receive a webhook event that the batch has completed successfully.
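The --notifications-url must be reachable from the public internet for the webhook to arrive. Before launching a long job, it can be worth sending a test POST to your endpoint; the payload below is just a placeholder, not the actual Roboflow event schema:

curl -X POST -H "Content-Type: application/json" \
  -d '{"test": "ping"}' \
  https://my-webhook-url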
Step 4: Export Results
Download predictions (JSONL or CSV) from your completed batch:
inference rf-cloud data-staging export-batch --target-dir ./results --batch-id my-s3-batch
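Once the export completes, you can spot-check the output locally. The commands below assume a JSONL export landed in ./results; exact file names depend on your batch and chosen format:

ls ./results
head -n 1 ./results/*.jsonl   # preview the first prediction record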
Results of Using Roboflow Batch Processing on Images Stored in AWS S3
With this setup, you can:
- Process 100k+ images in a single batch job
- Automate ingestion and processing directly from S3
- Receive webhook notifications for full pipeline automation
To learn more about Batch Processing, refer to the Roboflow Batch Processing documentation.