Welcome to CS With James
In this tutorial I will discuss what overfitting is and how to use a Dropout layer to prevent it.
Overfitting happens while training the network.
The weights of the neurons fit the training dataset too closely, so the accuracy on the training dataset is above 95%, but on the testing dataset or in a real-world application the accuracy is much lower than the training accuracy. This is called overfitting.
You can see here that the training loss is 0.3529 but the validation loss is 1.3425. This is a sign of overfitting; you want to see the two losses decrease at a similar rate.
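This kind of check is easy to automate. Here is a minimal sketch; the function name `looks_overfit` and the ratio threshold of 2.0 are my own choices for illustration, not part of any framework.

```python
def looks_overfit(train_loss, val_loss, ratio=2.0):
    """Flag overfitting when validation loss is much higher than training loss."""
    return val_loss > train_loss * ratio

# The numbers from the tutorial: train 0.3529 vs. validation 1.3425.
print(looks_overfit(0.3529, 1.3425))  # large gap, so this prints True
print(looks_overfit(1.0765, 0.9930))  # losses are close, so this prints False
```

A single ratio is a crude signal; in practice you would watch both loss curves over all epochs, as described below.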
To prevent overfitting, you can use a Dropout layer.
Dropout randomly turns off a percentage of the neurons, which prevents your network from overfitting to the training dataset. It is a very simple idea, but it works great.
This Dropout layer will randomly turn off 25% of the neurons to prevent overfitting.
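To make the idea concrete, here is a sketch in plain NumPy of what a dropout layer does internally during training. The 25% rate matches the layer in the tutorial; the function name `dropout` is my own. Note that standard implementations (such as Keras's `Dropout(0.25)`) also rescale the surviving activations, as shown here, so the expected magnitude of the output stays the same.

```python
import numpy as np

def dropout(activations, rate=0.25, rng=None):
    """Randomly zero out `rate` of the activations and rescale the rest."""
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) >= rate  # keep ~75% of the neurons
    # "Inverted" dropout: divide by the keep probability so the expected
    # value of each activation is unchanged during training.
    return activations * mask / (1.0 - rate)

x = np.ones(8)
print(dropout(x, rate=0.25, rng=np.random.default_rng(0)))
```

Each training step draws a fresh random mask, so the network cannot rely on any single neuron; at test time dropout is disabled and all neurons are used.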
You can see here that after 50 epochs the training loss is 1.0765 and the validation loss is 0.9930. These numbers are quite similar, which is a sign that the network is not overfitting to the data, and this is what you want to see.
When you train your network, check whether the training loss and validation loss decrease at a similar rate. If the training loss decreases much faster than the validation loss, your model is overfitting to the training dataset.
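The "similar rate" check above can be sketched in a few lines. The loss values and the factor-of-two comparison here are made up for illustration; real training histories come from your framework's logs.

```python
def decrease_rate(losses):
    """Average per-epoch drop in loss from the first epoch to the last."""
    return (losses[0] - losses[-1]) / (len(losses) - 1)

train_losses = [1.8, 1.2, 0.7, 0.4]  # falls quickly
val_losses   = [1.8, 1.6, 1.5, 1.5]  # barely falls: a warning sign

if decrease_rate(train_losses) > 2 * decrease_rate(val_losses):
    print("training loss is falling much faster -- likely overfitting")
```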
To reach higher accuracy with dropout, you have to iterate more, which means setting a higher number of epochs for training, so training will take more time. However, the model with the dropout layer will perform better once training is finished.