Neurons and Perceptrons
A neural network layer computes z = Wx + b, where W is a 64×128 weight matrix, x is a 128-dimensional input, and b is a 64-dimensional bias. A new engineer adds an extra learnable bias vector of shape (64,) after the activation and retrains the model, but is surprised to find no improvement. What is the most likely reason?
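A likely explanation is that the extra bias is redundant: if the activation output feeds into another affine layer, any constant added after the activation can be absorbed into that next layer's bias, so the model gains no representational power. The NumPy sketch below illustrates this; the second layer (W2, b2, mapping 64 → 10) and the ReLU activation are assumptions for illustration, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shapes from the question: W is 64x128, x is 128-dim, b is 64-dim.
W1 = rng.standard_normal((64, 128))
b1 = rng.standard_normal(64)
x = rng.standard_normal(128)

def relu(z):
    return np.maximum(z, 0.0)

# Hypothetical next layer (assumed, 64 -> 10) consuming the activation.
W2 = rng.standard_normal((10, 64))
b2 = rng.standard_normal(10)

# The engineer's extra bias, added AFTER the activation.
b_extra = rng.standard_normal(64)

# Network WITH the extra bias:
h = relu(W1 @ x + b1) + b_extra
y_with = W2 @ h + b2

# Equivalent network WITHOUT it: fold b_extra into the next layer's bias.
b2_folded = b2 + W2 @ b_extra
y_without = W2 @ relu(W1 @ x + b1) + b2_folded

# Identical outputs for any x, so the extra parameters add no capacity.
assert np.allclose(y_with, y_without)
```

Because the two parameterizations produce identical outputs for every input, gradient descent can reach the same loss either way, and the added parameters yield no improvement.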