Selective Sampling is an approach in active learning where the algorithm chooses which data points to label from an unlabeled stream or pool, rather than labeling all data indiscriminately. In the common stream-based setting, the model inspects each unlabeled instance as it arrives and decides whether that instance is informative enough to be worth querying its label. The goal is to train the model more efficiently: by labeling only the most useful examples, the learner focuses on informative samples and reduces labeling cost. Selective sampling strategies query labels for the points the model is most uncertain about, or those expected to most reduce its prediction error, which yields higher accuracy with fewer labeled examples than random sampling.
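As a minimal sketch of the stream-based setting described above, the toy example below trains a 1-D online logistic model and queries the (assumed) labeling oracle only when the predicted probability falls near 0.5, i.e., when the model is uncertain. The function names, the uncertainty threshold, and the synthetic stream are all illustrative assumptions, not a reference implementation.

```python
import math
import random

def selective_sampling(stream, oracle, threshold=0.2, lr=0.5):
    """Stream-based selective sampling with a 1-D online logistic model.

    Queries the oracle only when the predicted probability is within
    `threshold` of 0.5 (the model is uncertain about the instance).
    Returns the learned parameters and the number of labels queried.
    """
    w, b = 0.0, 0.0
    queries = 0
    for x in stream:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted P(y=1 | x)
        if abs(p - 0.5) < threshold:              # uncertain -> query its label
            y = oracle(x)
            queries += 1
            grad = p - y                          # SGD step on logistic loss
            w -= lr * grad * x
            b -= lr * grad
    return (w, b), queries

random.seed(0)
stream = [random.uniform(-3, 3) for _ in range(500)]
oracle = lambda x: 1 if x > 0 else 0  # hypothetical ground-truth labeler
(w, b), queries = selective_sampling(stream, oracle)
print(f"queried {queries} of {len(stream)} labels")
```

As the model becomes confident away from the decision boundary, only instances near the boundary trigger a query, so far fewer than 500 labels are requested while the learned weight still separates the two classes.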