A fully connected layer (FC layer), also known as a dense layer, is a neural network layer in which each neuron is connected to every neuron in the previous layer. It takes a vector input (flattening any spatial structure from previous layers) and produces a vector output: each output is a weighted sum of all inputs plus a bias, followed by a non-linear activation (except typically in the last layer). FC layers usually appear at the end of CNN architectures to integrate all the extracted features and make a final prediction (e.g., class scores), although purely fully connected networks (multi-layer perceptrons) can also serve as the whole model, for example on tabular data. Because every input connects to every output, the number of weights scales with input size times output size, which is manageable for moderate-sized inputs but becomes impractical for large images unless the features are pooled and flattened first. FC layers are good at capturing global interactions between features but do not exploit spatial structure, which is why CNNs use convolutional layers for most of the work and FC layers for the final decision.
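As a concrete illustration, here is a minimal PyTorch sketch (with hypothetical layer sizes) of a fully connected head that could sit on top of flattened CNN features. Each `nn.Linear` computes a weighted sum of all inputs plus a bias, followed by a ReLU, and the parameter count grows with in_features × out_features.

```python
import torch
import torch.nn as nn

# Hypothetical sizes: suppose the CNN outputs 128 feature maps of 4x4 after pooling.
flattened_features = 128 * 4 * 4   # 2048 inputs to the FC head
hidden_units = 256
num_classes = 10

fc_head = nn.Sequential(
    nn.Flatten(),                                 # (N, 128, 4, 4) -> (N, 2048)
    nn.Linear(flattened_features, hidden_units),  # weighted sum of all inputs + bias
    nn.ReLU(),                                    # non-linear activation
    nn.Linear(hidden_units, num_classes),         # final layer: raw class scores (logits)
)

x = torch.randn(8, 128, 4, 4)                     # a batch of 8 feature maps
logits = fc_head(x)                               # shape: (8, 10)

# Each Linear layer holds in_features * out_features weights plus out_features biases,
# which is why FC layers dominate the parameter count unless the input is pooled first.
n_params = sum(p.numel() for p in fc_head.parameters())
print(logits.shape, n_params)                     # torch.Size([8, 10]) 527114
```

Note how almost all of the ~527k parameters come from the first `nn.Linear`, which is exactly the cost the convolution-then-pool pattern is designed to keep in check.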