How to Use Roboflow Batch Processing on Images Stored in Azure Blob Storage
Published Oct 20, 2025 • 2 min read

Roboflow Batch Processing allows you to analyze and transform massive image datasets without managing servers or GPUs yourself. Connecting it to your Azure Blob Storage gives you secure, scalable, pay-per-credit image processing on the data your organization already stores in the cloud.

If your organization stores image data in Azure Blob Storage, you can integrate it directly with Roboflow Batch Processing to automate large-scale computer vision inference jobs.

This guide shows how to generate time-limited shared access signature (SAS) URLs using the Azure CLI and feed them to Roboflow via a JSONL reference file.

To follow this guide, you will need:

  1. Images stored in Azure Blob Storage; and
  2. A Roboflow Workflow which contains the models and logic that you want to run on your data. Don't have a Workflow yet? Check out our Workflows documentation to get started.

Using Roboflow Batch Processing on Images in Azure Blob Storage

Step 1: Generate Azure Blob SAS URLs

Use the following command to download and run Roboflow's helper script, which generates time-limited SAS URLs for every image in your Azure container and writes them to a .jsonl file:

curl -fsSL https://raw.githubusercontent.com/roboflow/roboflow-python/main/scripts/generateAzureSasUrls.sh \
  | bash -s -- https://myaccount.blob.core.windows.net/mycontainer output.jsonl 6 8

Each line of output.jsonl will contain an entry like:

{"name": "inspection__sample1.png", "url": "https://myaccount.blob.core.windows.net/...<SAS_TOKEN>"}

Because access is granted through time-limited SAS tokens, your files stay private while remaining accessible to Roboflow for batch inference.
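
If you prefer not to pipe a remote script, here is a minimal sketch that builds the same JSONL file with the Azure CLI directly. The account name, container, key, and six-hour expiry are placeholders, and the double-underscore name flattening is just an assumption to keep entry names unique; adjust everything to your setup.

# Minimal sketch, assuming the Azure CLI is installed and you have a storage account key.
ACCOUNT="myaccount"                 # placeholder: your storage account
CONTAINER="mycontainer"             # placeholder: your container
KEY="<storage-account-key>"         # placeholder: account key
EXPIRY=$(date -u -d "+6 hours" '+%Y-%m-%dT%H:%MZ')   # GNU date syntax; adjust on macOS

: > output.jsonl
az storage blob list \
  --account-name "$ACCOUNT" --account-key "$KEY" \
  --container-name "$CONTAINER" --query "[].name" -o tsv |
while read -r BLOB; do
  # Generate a read-only SAS token for each blob, valid until $EXPIRY
  SAS=$(az storage blob generate-sas \
    --account-name "$ACCOUNT" --account-key "$KEY" \
    --container-name "$CONTAINER" --name "$BLOB" \
    --permissions r --expiry "$EXPIRY" --https-only -o tsv)
  # Assumption: flatten path separators so each entry has a unique flat name
  NAME=${BLOB//\//__}
  printf '{"name": "%s", "url": "https://%s.blob.core.windows.net/%s/%s?%s"}\n' \
    "$NAME" "$ACCOUNT" "$CONTAINER" "$BLOB" "$SAS" >> output.jsonl
done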

Step 2: Create a Batch from Azure Blob Storage

Once you have your JSONL reference file, stage your Azure-hosted images for processing with the Roboflow CLI:

inference rf-cloud data-staging create-batch-of-images \
  --batch-id azure-batch-001 \
  --references output.jsonl \
  --data-source reference-file

Check batch ingestion progress:

inference rf-cloud data-staging show-batch-details --batch-id azure-batch-001
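
Ingestion of large batches can take a little while. A minimal sketch of a polling loop that simply re-runs the command above until you see the batch reported as complete (the 30-second interval is arbitrary):

# Re-check ingestion status every 30 seconds; stop with Ctrl+C once the batch is ready.
while true; do
  inference rf-cloud data-staging show-batch-details --batch-id azure-batch-001
  sleep 30
done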

Step 3: Kick Off a Batch Job

After ingestion completes, trigger your Roboflow Workflow to process the batch:

inference rf-cloud batch-processing process-images-with-workflow \
  --workflow-id your-workflow-id \
  --batch-id azure-batch-001 \
  --notifications-url https://your-webhook-url

Roboflow handles all compute orchestration automatically, and job completion events will be sent to your webhook endpoint.
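
Before launching a long-running job, it can be worth confirming that the endpoint you pass to --notifications-url is reachable and returns a 2xx response. A quick sanity check with curl, using a placeholder URL and a dummy payload (this is not the actual Roboflow notification schema):

# Send a test POST to the placeholder webhook URL and print the response headers.
curl -i -X POST https://your-webhook-url \
  -H "Content-Type: application/json" \
  -d '{"test": "batch-processing-webhook"}'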

Step 4: Export Predictions

Once processing is complete, export your inference results to a local directory, from which you can copy them back to Azure if needed:

inference rf-cloud data-staging export-batch \
  --target-dir ./azure-results \
  --batch-id azure-batch-001
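
If you want the predictions back in Azure rather than on local disk, one option is to upload the exported directory with the Azure CLI. A minimal sketch, assuming a destination container named batch-results already exists in your storage account (the account name and key are placeholders):

# Upload everything under ./azure-results to the batch-results container.
az storage blob upload-batch \
  --account-name "myaccount" --account-key "<storage-account-key>" \
  --destination batch-results \
  --source ./azure-results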

Results of Using Roboflow Batch Processing on Images Stored in Azure Blob Storage

By connecting Roboflow Batch Processing to your Azure Blob Storage, you can:

  • Run large-scale image analysis securely using SAS-signed URLs
  • Automate inference pipelines without manual polling or scripts
  • Eliminate infrastructure overhead while maintaining high throughput

To learn more about Batch Processing, refer to the Roboflow Batch Processing documentation.

Cite this Post

Use the following entry to cite this post in your research:

Contributing Writer. (Oct 20, 2025). How to Use Roboflow Batch Processing on Images Stored in Azure Blob Storage. Roboflow Blog: https://blog.roboflow.com/batch-processing-on-images-in-azure-blob-storage/


Written by

Contributing Writer