Batch Normalization
Today let's step inside a deep neural network to understand what batch normalization actually is, what problems can arise if it is not used, and the advantages of using it in our model.

In machine learning we routinely apply feature scaling, such as standardization or normalization, so that all features share a uniform range and the model has no built-in bias toward a particular feature or set of features. Similarly, in a neural network we normalize the input by centring its mean to zero and scaling its variance to unity, a preprocessing step also known as "whitening".

Okay, so far we have normalized our inputs to zero mean and unit variance. Sounds good. But what about the values deep inside the network? Will they follow the same distribution as the input values? Let's find out! Here we are dealing extensively with deep neural networks, which have a lot of hidden layers p...
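Before going deeper, here is a minimal sketch of the input normalization described above, mean-centring each feature to zero and scaling its variance to one. The data, shapes, and the small `eps` guard are illustrative assumptions, not part of any particular framework's API:

```python
import numpy as np

# Synthetic data: 1000 samples, 4 features, deliberately off-centre and
# spread out so the normalization has something to fix (invented for
# illustration).
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=3.0, size=(1000, 4))

mean = X.mean(axis=0)        # per-feature mean
std = X.std(axis=0)          # per-feature standard deviation
eps = 1e-8                   # guard against division by zero
X_norm = (X - mean) / (std + eps)

# After normalization each feature has (approximately) zero mean and
# unit variance.
print(np.allclose(X_norm.mean(axis=0), 0, atol=1e-7))
print(np.allclose(X_norm.std(axis=0), 1, atol=1e-3))
```

In practice the mean and standard deviation are computed on the training set only and then reused to transform validation and test data, so that no information leaks across the split.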