How to Use Roboflow Batch Processing on Images Stored in Google Cloud Storage (GCS)
Published Oct 20, 2025 • 2 min read

Roboflow Batch Processing enables you to analyze and transform large image datasets without running servers or GPUs yourself. Connecting it to your Google Cloud Storage (GCS) bucket unlocks fully automated, pay-per-credit image processing at scale.

If your organization stores image data in GCS, you can integrate GCS directly with Roboflow Batch Processing to automate large-scale inference jobs. This post shows how to generate signed URLs using the gcloud CLI and feed them to Roboflow via JSONL reference files.

To follow this guide, you will need:

  1. Images stored in Google Cloud Storage; and
  2. A Roboflow Workflow which contains the models and logic that you want to run on your data. Don't have a Workflow yet? Check out our Workflows documentation to get started.

Using Roboflow Batch Processing on Images Stored in Google Cloud Storage (GCS)

Step 1: Generate GCS Signed URLs

Use this script to create a signed-URL reference file for all image assets in your bucket:

curl -fsSL https://raw.githubusercontent.com/roboflow/roboflow-python/main/scripts/listgcs.sh | bash -s -- gs://my-bucket/images output.jsonl 21600 8

Each line of output.jsonl will contain an entry like:

{"name": "plant__leaf1.png", "url": "https://storage.googleapis.com/..."}..."}

Step 2: Create a Batch from GCS

Use Roboflow’s CLI to stage your GCS images for batch processing. Passing --data-source reference-file tells the CLI to ingest images from the signed URLs in your JSONL reference file rather than uploading local files:

inference rf-cloud data-staging create-batch-of-images \
  --batch-id gcs-batch-001 \
  --references output.jsonl \
  --data-source reference-file


Track ingestion status:

inference rf-cloud data-staging show-batch-details --batch-id gcs-batch-001
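
If you want the shell to wait until ingestion is done instead of re-running the status command by hand, a small polling loop works. This is a sketch, not part of the CLI, and the "completed" string it matches is an assumption; adjust the pattern to whatever status text show-batch-details actually prints in your environment:

# Poll until the batch details report a completed ingestion. The grep
# pattern is a placeholder for the real status text.
while ! inference rf-cloud data-staging show-batch-details \
    --batch-id gcs-batch-001 | grep -qi "completed"; do
  sleep 30
done
echo "Batch gcs-batch-001 ingested."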


Step 3: Kick Off a Batch Job

Once ingestion completes, trigger your workflow for processing:

inference rf-cloud batch-processing process-images-with-workflow \
  --workflow-id your-workflow-id \
  --batch-id gcs-batch-001 \
  --notifications-url https://your-webhook-url


Roboflow will handle compute orchestration, and your webhook will receive job completion events automatically.
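
If you would rather poll than host a webhook endpoint, the CLI can also report on jobs directly. The subcommand below is assumed from the CLI's naming pattern, so verify it against inference rf-cloud batch-processing --help, and replace your-job-id with the ID printed when the job was created:

# Check job status by ID (hypothetical subcommand; confirm with --help).
inference rf-cloud batch-processing show-job-details --job-id your-job-id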

Step 4: Export Predictions

Once the job finishes, export the processed data:

inference rf-cloud data-staging export-batch --target-dir ./gcs-results --batch-id gcs-batch-001
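
What lands in the target directory depends on your Workflow's outputs, so treat this as a generic inspection sketch rather than a guaranteed layout. It assumes jq is installed and that at least some of the output files are JSONL:

# List the exported files, then pretty-print the first record of the
# first JSONL file found (if any).
ls -R ./gcs-results
first=$(find ./gcs-results -name '*.jsonl' | head -n 1)
[ -n "$first" ] && head -n 1 "$first" | jq .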

Results of Using Roboflow Batch Processing on Images Stored in Google Cloud Storage

By connecting Roboflow Batch Processing to your GCS bucket, you can:

  • Run large-scale image analysis jobs securely with signed URLs
  • Automate inference pipelines without manual polling
  • Cut infrastructure overhead while maintaining accuracy and throughput

To learn more about Batch Processing, refer to the Roboflow Batch Processing documentation.

Cite this Post

Use the following entry to cite this post in your research:

Contributing Writer. (Oct 20, 2025). How to Use Roboflow Batch Processing on Images Stored in Google Cloud Storage (GCS). Roboflow Blog: https://blog.roboflow.com/batch-processing-for-google-cloud-storage-images/


Written by

Contributing Writer