Backpropagation (easy)
A neural network has 3 layers. The forward pass completes successfully, but during backpropagation the gradient for the first layer is exactly zero for all weights. The loss is non-zero and the last layer's gradient is correct. What is the most likely cause?
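A minimal sketch of one scenario that reproduces this exact symptom (the architecture and values below are illustrative, not taken from the question): a "dead" ReLU directly after the first layer. If every pre-activation entering that ReLU is negative, the ReLU's output and its local derivative are both zero, so the chain rule zeroes every gradient upstream of it, i.e. all of layer 1, while the last layer still receives a valid, non-zero error signal.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

net = nn.Sequential(
    nn.Linear(4, 8),  # layer 1
    nn.ReLU(),        # made "dead" below
    nn.Linear(8, 8),  # layer 2
    nn.ReLU(),
    nn.Linear(8, 1),  # layer 3
)

with torch.no_grad():
    net[0].bias.fill_(-100.0)  # drive all layer-1 pre-activations negative
    net[2].bias.fill_(0.5)     # keep layer 3's input non-zero

x = torch.randn(16, 4)
y = torch.randn(16, 1)
loss = nn.functional.mse_loss(net(x), y)
loss.backward()

print(f"loss = {loss.item():.3f}")                                       # non-zero
print("layer 1 grad all zero:", bool((net[0].weight.grad == 0).all()))   # True
print("layer 3 grad all zero:", bool((net[4].weight.grad == 0).all()))   # False
```

Note that a saturated sigmoid or tanh would make the first layer's gradient vanishingly small but not exactly zero; an exact zero across all weights points to a hard-zero derivative such as ReLU's on its negative side.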