Week 4: AlexNet benefit, or AlexNet loss

Mar 18, 2019

The 4th week was when things began to transition from learning to doing. Armed with the practical knowledge I had gained over the past three weeks, I was ready to examine a fully developed neural network – AlexNet – which I would use as a baseline and modify to better fit my needs.

AlexNet is a fairly simple NN by today’s standards, with only 8 learned layers compared to the 150 or even 200 that more modern networks have. This was great for me though, since it meant I could quickly run, and possibly even train, the network if I had to. As a starting point I downloaded an implementation from https://github.com/felzek/AlexNet-A-Practical-Implementation. In other words, I downloaded their model and their weights (which were generated by training the model; imagine taking all of a person’s memories and copying them over to a fresh new brain).

Here I began to run into some trouble, though. My goal was, in the end, to modify the network. But since this was my first time, I needed to keep using the weights I was given – otherwise I would not be able to test anything without retraining from scratch. The problem was that I didn’t know which parts of the network I could safely modify while keeping the pretrained weights usable. So I began looking into other variations of AlexNet and comparing them, to see what I could do.

That about sums up what has happened as of late. In terms of getting a finished product, I’ve roughly hit the midpoint (with the rest being mostly presentation preparation and such). It’s good to know that I can finally get started on a product.

2 Replies to “Week 4: AlexNet benefit, or AlexNet loss”

  1. Alex Y. says:

    Can you get more specific about AlexNet and what it does? Why did you choose this NN specifically over the other ones?

  2. Aarushi N. says:

    Have you found a variation that works for it now? If not, do you have any backups that you could use instead?
