Convolutional Horses and Humans:
The ImageDataGenerator() pulls data from the directory or subdirectories in which it is stored. It takes a rescale argument, which normalizes the images by multiplying each pixel value by 1/255 (one divided by the highest possible pixel value), so all the values fall between 0 and 1. To flow data from the directory into the model, you create a generator object that calls the flow_from_directory() method and pass in the directory, target size, batch size, and class mode. The target size resizes each image so they are all consistent as they pass through the neural network. When setting the class mode argument, you should think about the number of classes in your data; for example, if you have two classes, you would set the class mode to binary. The only things that change in this code between the training and testing datasets are the name of the generator and the directory that you pass in.
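A minimal sketch of the pipeline described above. The directory name, image count, and target size here are stand-ins (the code builds a tiny random two-class dataset on disk just so the example runs); in the actual tutorial the horse and human images already exist.

```python
import os

import numpy as np
from PIL import Image
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Build a tiny stand-in dataset (two class subdirectories, random images)
# so the pipeline below is runnable end to end.
for label in ('horses', 'humans'):
    d = os.path.join('demo-data', 'train', label)
    os.makedirs(d, exist_ok=True)
    for i in range(2):
        Image.fromarray(
            np.random.randint(0, 256, (300, 300, 3), dtype=np.uint8)
        ).save(os.path.join(d, f'{i}.png'))

train_datagen = ImageDataGenerator(rescale=1./255)  # scale pixels into [0, 1]

train_generator = train_datagen.flow_from_directory(
    'demo-data/train',        # one subdirectory per class
    target_size=(300, 300),   # resize every image to a consistent shape
    batch_size=4,
    class_mode='binary')      # two classes -> 0/1 labels

images, labels = next(train_generator)
print(images.shape, images.min(), images.max())
```

The validation generator is built the same way, with its own name and directory, which is the only part that changes.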
For my model, I used three convolutional layers, one of 16 filters, one of 32, and one of 64, and three pooling layers. With each convolutional layer, the output shape decreased by two pixels in each dimension, and with each pooling layer, the size of the image was cut in half. For my output layer, I chose the sigmoid activation function, which works well for binary classifications like this one: it squashes the output to a value between 0 and 1, which can be read as one class or the other. Because of this, the sigmoid function allows the model to operate with only one neuron in the final layer. When compiling the model, I used binary crossentropy as my loss, RMSprop with a learning rate of 0.001 as my optimizer, and accuracy as my metric.
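A sketch of how that architecture might look in Keras. The 16/32/64 filter counts, the pooling layers, the single sigmoid output neuron, the loss, and the optimizer come from the description above; the 300x300 RGB input size, the 3x3 kernels, and the 512-unit dense layer are my assumptions to make the sketch complete.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(300, 300, 3)),                    # assumed input size
    tf.keras.layers.Conv2D(16, (3, 3), activation='relu'),  # 300x300 -> 298x298
    tf.keras.layers.MaxPooling2D(2, 2),                     # halves each dimension
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D(2, 2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(512, activation='relu'),          # assumed hidden layer
    tf.keras.layers.Dense(1, activation='sigmoid')          # one neuron, output in (0, 1)
])

model.compile(loss='binary_crossentropy',
              optimizer=tf.keras.optimizers.RMSprop(learning_rate=0.001),
              metrics=['accuracy'])
```

Each 3x3 convolution trims one pixel from every edge (hence the two-pixel shrink per dimension), and each 2x2 max pool halves the height and width.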
Regression:
When looking at the change in MSE across the epochs, it appears that the model actually became overfit much earlier, around 100 epochs: that is when the validation MSE begins to increase while the training MSE continues to fall. Throughout training, you can see slight fluctuations in both the training and validation MSE; however, these fluctuations are more severe for the validation data, reflecting what was shown in the last five steps.
Overfit and Underfit: