In machine learning, a batch refers to a subset of the training dataset processed together in one iteration. Using batches balances computational efficiency and gradient estimation stability. Training methods include:

- **Batch Gradient Descent:** updates weights using the entire dataset in one pass.
- **Mini-batch Gradient Descent:** uses a small, fixed-size batch for each update.
- **Stochastic Gradient Descent:** updates weights after processing a single training example.

Mini-batch processing is commonly used in deep learning to leverage parallel computation while ensuring smooth optimization convergence.
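The relationship between the three variants can be sketched with a minimal mini-batch gradient descent loop for linear regression. This is an illustrative example, not a production implementation; the names (`batch_size`, `lr`) and the synthetic data are assumptions. Setting `batch_size` to 1 recovers stochastic gradient descent, and setting it to `len(X)` recovers full-batch gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus a little noise
X = rng.normal(size=(1000, 1))
y = 3.0 * X[:, 0] + 2.0 + 0.1 * rng.normal(size=1000)

w, b = 0.0, 0.0
lr = 0.1
batch_size = 32  # mini-batch: between 1 (SGD) and len(X) (full batch)

for epoch in range(20):
    idx = rng.permutation(len(X))  # shuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch, 0], y[batch]
        err = (w * xb + b) - yb
        # Gradients of mean squared error with respect to w and b,
        # estimated from the current mini-batch only
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # should approach the true parameters 3 and 2
```

Each update uses only `batch_size` examples, so the gradient is a noisy but cheap estimate of the full-dataset gradient; averaging over 32 examples smooths that noise enough for stable convergence while keeping each step fast.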