19 Oct 2023
"Saturating" refers to a situation where the activation function of a neuron reaches its maximum or minimum output value, and any further changes in the input have little to no effect on the output and the neuron becomes unresponsive. This can hinder the learning process in a neural network.
That's one of the reasons Rectified Linear Units (ReLUs) have become popular in neural networks. ReLUs do not saturate for positive inputs: they output the input as-is when it is positive, and zero otherwise.
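A minimal sketch of the contrast (the function names here are my own, not from any particular library): the sigmoid's gradient shrinks toward zero for large inputs, while the ReLU's gradient stays 1 for any positive input.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of sigmoid: s * (1 - s); tends to 0 as |x| grows.
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    # Derivative of ReLU: 1 for positive inputs, 0 otherwise.
    return 1.0 if x > 0 else 0.0

for x in [0.0, 5.0, 20.0]:
    print(f"x={x:5.1f}  sigmoid'={sigmoid_grad(x):.10f}  relu'={relu_grad(x):.1f}")
```

At x = 20 the sigmoid gradient is on the order of 1e-9, so almost no learning signal flows back through that neuron, whereas the ReLU gradient is still exactly 1.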