
Showing posts from August, 2022

Batch Normalization

Today let's step inside a deep neural network to understand what Batch Normalization actually is, what problems we can face if it is not used, and the advantages of using it in our model. In machine learning we constantly use feature scaling techniques such as standardization or normalization so that all features have a uniform range and the model has no built-in bias towards a particular feature or set of features. Similarly, in a neural network we normalize our input by centring the mean to zero and scaling the variance to unity, which is also known as "whitening". Okay, so now we have normalized our inputs to zero mean and unit variance. Sounds good. But what about the values deep inside the network?? Will they follow the same distribution as the input values?? Let's find out..!!! Here we are dealing with deep neural networks, which have a lot of hidden layers p...
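To make the "whitening" idea concrete, here is a minimal NumPy sketch of batch normalization over a mini-batch. The names `gamma`, `beta`, and `eps` are my own illustrative choices, not from any particular framework: each feature is centred to zero mean, scaled to unit variance, and then given a learnable scale and shift.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch x of shape (batch_size, features)."""
    mu = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                    # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)  # whitened: zero mean, unit variance
    return gamma * x_hat + beta            # learnable scale and shift

# Example: a batch of 4 samples with 3 features on very different scales
x = np.array([[1.0, 200.0, -3.0],
              [2.0, 180.0, -1.0],
              [3.0, 220.0,  0.0],
              [4.0, 240.0,  2.0]])
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
print(out.mean(axis=0))  # ~0 for every feature
print(out.std(axis=0))   # ~1 for every feature
```

With `gamma = 1` and `beta = 0` this is pure whitening; during training the network can learn other values of `gamma` and `beta` to recover whatever scale and shift each layer actually needs.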

Why is an activation function needed in Neural Networks???

Today allow me to share with you the relationship between a spider and an activation function. An activation function is used in a neural network to introduce non-linearity. Without a proper activation function, a neural network is simply a large linear model. Consider the case where you have data which is linearly separable. Constructing a model to separate the yellow and orange data points is very simple; a simple logistic regression will do this job perfectly. Now consider the case where your dataset looks like the one shown below. What do you think now?? Will a simple linear model be able to carve out a hyperplane in such a way as to classify both the yellow and orange points? I don't think so. And it is a fact that a simple linear model may not be able to find complex patterns in data. So now we go to the big brother of the linear model - the neural network. A neural network has this super ability to both carve out linear...
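As a quick sanity check of the claim that a network without activations is just one big linear model, here is a small NumPy sketch (the weights `W1`, `W2` are random placeholders, not from any trained model): two stacked linear layers with no activation in between are exactly equivalent to a single linear layer, while inserting a ReLU breaks that equivalence.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 5)), rng.normal(size=5)  # layer 1 weights, bias
W2, b2 = rng.normal(size=(5, 3)), rng.normal(size=3)  # layer 2 weights, bias
x = rng.normal(size=(10, 4))                          # a batch of inputs

# Two linear layers with no activation in between...
h = x @ W1 + b1
y_stacked = h @ W2 + b2

# ...collapse into one linear layer with W = W1 @ W2, b = b1 @ W2 + b2
y_single = x @ (W1 @ W2) + (b1 @ W2 + b2)
print(np.allclose(y_stacked, y_single))  # True: still a linear model

# A ReLU between the layers introduces non-linearity, so no single
# linear layer can reproduce this output for all inputs.
y_relu = np.maximum(h, 0) @ W2 + b2
```

This is why depth alone buys nothing: however many linear layers you stack, the composition is one linear map, and only the non-linear activation lets the network carve out curved decision boundaries.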