Use Roboflow Trained Models to Annotate Data

One of the most time-consuming parts of the computer vision workflow is curating a high-quality dataset. When we launched Roboflow Annotate last month, we aimed to streamline this process with an easy-to-use image annotation tool built right into your computer vision pipeline.


But we think of that best-in-class tool as merely the starting point. Today, we're proud to announce Label Assist: a new feature that enables you to use your Roboflow Train models to accelerate your annotation work.

Simply select a model and Label Assist will use its predictions as the starting point for labeling. Instead of starting from scratch, you just correct the model's predictions. This means your labeling time is focused only on the areas where your model is underperforming; there's no need to spend time teaching the model things it already knows. (See tips for labeling images here.)

But that's not all: we're also adding Label Assist with public models (starting with the 80 classes from Microsoft COCO) for free on the Starter Plan. We plan to quickly expand the catalog of public models you can use with Label Assist via models trained on Roboflow's Public Datasets. (If there are datasets you'd like to contribute to this community effort, please let us know.)

To use our API for model-assisted labeling, check out this video:
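To give a sense of the model-assisted labeling flow, here is a minimal sketch of turning a model's predictions into draft annotations for human review. The input format (center x/y plus width/height, a class name, and a confidence score) mirrors the JSON returned by Roboflow's hosted inference API, but treat the exact field names and the confidence threshold as assumptions to adapt to your setup.

```python
def predictions_to_annotations(predictions, min_confidence=0.5):
    """Convert center-format model predictions into corner-format draft boxes.

    Low-confidence predictions are dropped so a human annotator labels those
    regions from scratch instead of correcting a bad guess.
    """
    annotations = []
    for p in predictions:
        if p["confidence"] < min_confidence:
            continue  # skip uncertain boxes; these areas need human labeling
        annotations.append({
            "label": p["class"],
            "x_min": p["x"] - p["width"] / 2,
            "y_min": p["y"] - p["height"] / 2,
            "x_max": p["x"] + p["width"] / 2,
            "y_max": p["y"] + p["height"] / 2,
        })
    return annotations


# Example predictions, shaped like the hosted inference API's JSON output
sample = [
    {"x": 50.0, "y": 40.0, "width": 20.0, "height": 10.0,
     "class": "helmet", "confidence": 0.92},
    {"x": 10.0, "y": 10.0, "width": 4.0, "height": 4.0,
     "class": "helmet", "confidence": 0.31},  # below threshold, dropped
]
boxes = predictions_to_annotations(sample)
print(boxes)  # one draft box: helmet from (40.0, 35.0) to (60.0, 45.0)
```

The annotator then only adjusts or deletes these draft boxes and adds any the model missed, which is where the time savings come from.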

We have a lot more in store for Roboflow Annotate in the coming months. We'll continue to accelerate your workflow and make computer vision as simple as possible, so you can focus on solving the problems unique to your business rather than reinventing the wheel building infrastructure.

Fully Automated Dataset Labeling

Roboflow also offers Auto Label, a fully hosted automated labeling tool. The foundation models behind Auto Label are powerful because they can recognize a nearly limitless set of classes, but they typically require iterating through various text prompts to find the right combination for different objects.
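The prompt iteration described above can be sketched as a simple loop: try several candidate prompts against a zero-shot detector and keep the one whose detections are most confident. In this sketch, `detect` is a stand-in stub for a real foundation-model call (its hard-coded scores are invented for illustration); only the selection loop is the point.

```python
def detect(prompt, image):
    """Stub for a zero-shot detector call; returns per-box confidences.

    A real implementation would send the prompt and image to a foundation
    model. The scores below are invented purely to illustrate the loop.
    """
    fake_scores = {"forklift": [0.90, 0.85], "truck": [0.40], "lift truck": [0.70]}
    return fake_scores.get(prompt, [])


def best_prompt(prompts, image):
    """Return the candidate prompt with the highest mean detection confidence."""
    scored = {}
    for prompt in prompts:
        confidences = detect(prompt, image)
        scored[prompt] = sum(confidences) / len(confidences) if confidences else 0.0
    return max(scored, key=scored.get)


print(best_prompt(["forklift", "truck", "lift truck"], image=None))  # forklift
```

In practice you would also eyeball the resulting boxes, since a confident detector can still be confidently wrong about which objects a prompt matches.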