This week featured a lot of AI training – and boy did it take a while.
I had amassed a few different instances of AlexNet and VGG16. The goal this time was to run fine-tuning tests on them to see which would work best for my purposes. While the beginning of the week focused a bit on AlexNet, I ended up spending most of my time with VGG16 because of just how much time it took to train. VGG16 is named for its 16 weight layers, compared to AlexNet's 8. Each run took 8-10 hours to train and generally did not get above 50% accuracy. I believe this is mostly because the pretrained weights I received were bad, but either way, this option was ruled out.