Normalization
In a deep neural network, training is affected by a phenomenon called internal covariate shift: the distribution of inputs to each layer keeps changing because the parameters of the preceding layers are continually updated during training.
The input layer may also contain features that dominate training simply because their numerical values are much larger than those of other features:
For example, imagine feature one taking values between 1 and 5, while feature two takes values between 100 and 10,000.
During training, because of this difference in scale, feature two dominates the weight updates and effectively becomes the only feature contributing to the model's output.
This biases the network, since the outcome of training then reflects the large-scale feature rather than the information carried by all of the features.
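To make this concrete, here is a minimal sketch, assuming synthetic data drawn from the ranges above and a simple linear combination with equal weights (both invented for illustration), showing how the large-scale feature dominates the output:

```python
import numpy as np

# Toy data (assumed for illustration): feature one in [1, 5],
# feature two in [100, 10000], matching the ranges in the text.
rng = np.random.default_rng(0)
x1 = rng.uniform(1, 5, size=1000)          # small-scale feature
x2 = rng.uniform(100, 10_000, size=1000)   # large-scale feature
X = np.stack([x1, x2], axis=1)

# With equal weights, the linear combination is driven almost
# entirely by feature two because of its much larger magnitude.
w = np.array([1.0, 1.0])
z = X @ w
print(np.corrcoef(x1, z)[0, 1])  # near 0: feature one barely matters
print(np.corrcoef(x2, z)[0, 1])  # near 1: feature two dominates
```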
To address these issues, a technique known as normalization was introduced: inputs are rescaled to a comparable range so that no feature dominates purely because of its magnitude.
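As a sketch of one common form of normalization, z-score standardization, the same toy data can be rescaled so that every feature has zero mean and unit variance; the data ranges here are assumptions carried over from the example above:

```python
import numpy as np

# Toy data (assumed for illustration), same ranges as before.
rng = np.random.default_rng(0)
X = np.stack([rng.uniform(1, 5, 1000),
              rng.uniform(100, 10_000, 1000)], axis=1)

# Standardize each feature: subtract its mean, divide by its
# standard deviation, so both features end up on the same scale.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_norm.mean(axis=0))  # approximately [0, 0]
print(X_norm.std(axis=0))   # approximately [1, 1]
```

After this rescaling, a linear combination of the two features is no longer dominated by the one that happened to have larger raw values, and gradient updates during training treat the features on an even footing.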