About the series
As you know, we at Humans in the Loop have a great love and appreciation for a well-designed annotation tool. After the great feedback on the reviews we published of the best platforms on the market here and here, we decided that it’s time for a deep dive into some of our all-time favorites!
This article is the third in our series of 10 reviews, which will be published each week. The first two, on Supervise.ly and Trainingdata.io, can be found here and here. We will be adding links to the other articles as they are released.
The whole series is based on the premise of transparency and honesty and none of these reviews are sponsored. They are just our way to give props to the best teams out there working on making annotation easier for AI teams, and to share some of the know-how that we have been accumulating over the past few years as a professional annotation company.
As in previous reviews, our parameters are:
- project management
If you have additional questions or want to get in touch with us to beta test or feature your tool in an upcoming article, feel free to email us at firstname.lastname@example.org!
Superannotate was founded by a team of computer vision PhDs, engineers, and developers, and started out as a simple tool for semantic segmentation. The real boost came after it was accepted into Berkeley’s SkyDeck accelerator program.
Given the team’s academic background, they have recently partnered with OpenCV to release their entire desktop app for free in order to support research. For commercial projects, the first 100 images are free, after which the paid tiers are Starter (up to 10,000 images), Pro (unlimited images), and Enterprise (unlimited, custom).
Annotate.online supports both pixel-wise and vector annotations with a full range of tools, including polygons, bounding boxes, brushes, cuboids, polylines, ellipses, points, and keypoints. The platform allows keypoint templates to be set up for annotators to follow (e.g. the same sequence of keypoints that has to be marked on human figures in each image).
The same type of template can be created for objects of different classes using the ‘Workflow’ feature. It lets the annotator’s active class change automatically as they move forward through a predetermined sequence of steps in the annotation process.
Having successfully delivered annotation projects themselves, Superannotate has built the platform with tons of small tricks that make all the difference from an annotator’s standpoint. For example, when a user draws an object close to the edge of the image, the tool automatically snaps its coordinate to 0, avoiding the thin annotated stretches along the image edge that appear in other tools. Another such trick is the “eyedropper”, which easily copies properties from one object to another.
Besides the local device and Google storage, the tool supports uploading and downloading images via an S3 bucket and a command-line interface. Exports come in the form of JSON files with coordinates, plus fused semantic masks and a blue map of the separate instances. Classes can be added manually or as a JSON file.
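To give a feel for working with a JSON export like this, here is a minimal sketch of tallying annotated instances per class. Note that the field names and structure below (`instances`, `className`, `points`) are assumptions for illustration only; the actual export schema may differ.

```python
import json

# Hypothetical export structure -- the real schema may differ.
sample_export = """
{
  "metadata": {"name": "image_001.jpg", "width": 1280, "height": 720},
  "instances": [
    {"type": "polygon", "className": "car", "points": [100, 200, 150, 220, 120, 260]},
    {"type": "bbox", "className": "person", "points": {"x1": 10, "y1": 20, "x2": 60, "y2": 180}},
    {"type": "polygon", "className": "car", "points": [400, 300, 450, 320, 420, 360]}
  ]
}
"""

def count_instances_per_class(export_json: str) -> dict:
    """Parse an annotation export and count instances per class name."""
    data = json.loads(export_json)
    counts: dict = {}
    for instance in data.get("instances", []):
        cls = instance.get("className", "unknown")
        counts[cls] = counts.get(cls, 0) + 1
    return counts

print(count_instances_per_class(sample_export))  # {'car': 2, 'person': 1}
```

A quick sanity check like this is useful after any export, regardless of the tool, before feeding annotations into a training pipeline.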
In terms of team management, the ‘Analytics’ feature is quite handy because it provides information not only about the project as a whole and the target achieved, but also about individual annotators’ efficiency. It keeps track of their rate of work, i.e. the number of instances and images completed per hour, which encourages both quality and accountability.
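The rate metric described above is simple to reproduce outside the platform as well. Assuming a hypothetical log of completion timestamps per annotator (the platform computes this internally; the log format here is invented for illustration), instances per hour could be derived as:

```python
from datetime import datetime

# Hypothetical work log: (annotator, completion timestamp) pairs.
work_log = [
    ("alice", "2021-03-01T09:00:00"),
    ("alice", "2021-03-01T09:20:00"),
    ("alice", "2021-03-01T11:00:00"),
    ("bob", "2021-03-01T09:30:00"),
    ("bob", "2021-03-01T10:30:00"),
]

def instances_per_hour(log, annotator):
    """Completed instances divided by elapsed hours between first and last entry."""
    times = sorted(
        datetime.fromisoformat(ts) for name, ts in log if name == annotator
    )
    if len(times) < 2:
        return float(len(times))  # not enough data points to compute a rate
    elapsed_hours = (times[-1] - times[0]).total_seconds() / 3600
    return len(times) / elapsed_hours

print(instances_per_hour(work_log, "alice"))  # 1.5
```

Tracking the metric per annotator rather than per project is what makes it useful for spotting both struggling team members and unusually fast (possibly careless) ones.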
The labeling platform allows a supervisor to annotate a few images in a project and ‘pin’ them, thus creating an instruction for the other annotators on the team. A team member can use the ‘image filter’ feature to find a pinned image and use it as a ‘gold standard’ of what the annotation needs to look like. Filters can also be used to select images by name, assignee, annotation status, or prediction status.
Images can be assigned to team members for annotation and for QA, and supervisors can reject images and return them for further corrections with comments. If the supervisor and the annotator don’t reach a consensus, the case can be escalated to an admin – a great feature for settling team debates about edge cases!
An automation feature that we expect to be released in the near future is active learning within the tool itself. As Superannotate prepares new developments in the upcoming months, we will be following their updates!
Hope this was helpful! If you are working on an AI project and are currently reviewing which tool might be the most appropriate for it, get in touch with us and we would be happy to have a call and advise you on the best way to build your pipeline.