Project Title: Image Processing Through Water-Related Weather and Obstacles
BASIS Advisor: Ms. Jefferson
Internship Location: BYTON
Onsite Mentor: Dr. Xinhua Xiao
Here’s a scenario: You’re on a drive. It’s cloudy outside, and you see droplets begin to accumulate on your car window. Based on the unusual sight impairments that follow, it’s easy to infer that it is raining. After you process this, you turn on your windshield wipers. Still, it’s hard to see, and it takes you one or two seconds longer than usual to brake for a traffic light. As a result, you drive carefully and more slowly on your way home. To humans, perceptions and reactions such as these are almost automatic, but they pose a major challenge for those in the machine learning field readying the self-driving car for the world. Water presents many problems for self-driving cars; for instance, water reflects light differently, which can throw off most of our current algorithms, many of which rely heavily on lidar (light detection and ranging). In more extreme cases, it is nearly impossible to gauge the depth of a body of water during very heavy rain. Byton is a company looking to tackle this problem, and I will be working with them to research ways to improve our AI (artificial intelligence) algorithms to better account for water-related factors. I will be modifying and testing the algorithms to suit this purpose, as well as choosing and labeling good test data. Along the way, I hope to better understand an array of different algorithms and their development process.
My Posts
Week 11: I ran out of puns
As promised last week, let’s discuss results. AlexNet ended up with a very surprising plateau at 88%. Whether it will be able to identify wetness beyond still images is a good question, but one that I can’t really test due to the space limitations of my computer (I cannot download video data) and the fact […]
Week 10: Trained to (something close to) perfection
This week was spent modifying the AlexNet built for fine-tuning and setting up the text files that list the image directories. I had to write a small Python script for this. There was still a major issue here, however: a hidden .DS_Store file messed with Python’s reading of the folder. I […]
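The script itself didn’t make it into this excerpt, but a minimal sketch of the idea might look like the following. The paths, the image extensions, and the “path label” line format are my assumptions; the key point is filtering out hidden entries such as .DS_Store before writing the list.

```python
import os

# Hypothetical paths; the actual directory layout isn't shown in the post.
DATA_DIR = "data/rainy"
LIST_FILE = "train.txt"

# Collect image files, skipping hidden entries such as .DS_Store,
# which os.listdir() would otherwise include on macOS.
image_files = [
    f for f in sorted(os.listdir(DATA_DIR))
    if not f.startswith(".") and f.lower().endswith((".jpg", ".jpeg", ".png"))
]

# Write one "path label" line per image (label 1 = rainy, as an example).
with open(LIST_FILE, "w") as out:
    for f in image_files:
        out.write(f"{os.path.join(DATA_DIR, f)} 1\n")
```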
Week 9: The BAIR necessities
This week I completely abandoned the prospect of VGG16 and shifted focus entirely to AlexNet. Perhaps most importantly, by the end of the week I managed to get enough data from the BAIR dataset sorted into rainy and non-rainy sets. It was still likely that I would have to grab other images, since some are […]
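The post doesn’t say how the rainy/non-rainy labels were decided, so here is just the mechanical sorting step as a sketch, assuming a hand-made list of rainy filenames; the folder and file names are hypothetical.

```python
import os
import shutil

# Hypothetical layout: raw images in one folder, plus a hand-curated list
# of which filenames were judged rainy.
SRC_DIR = "bair_raw"
RAINY_LIST = "rainy_filenames.txt"

with open(RAINY_LIST) as f:
    rainy = set(line.strip() for line in f if line.strip())

for cls in ("rainy", "nonrainy"):
    os.makedirs(cls, exist_ok=True)

for name in os.listdir(SRC_DIR):
    if name.startswith("."):
        continue  # skip hidden files like .DS_Store
    dest = "rainy" if name in rainy else "nonrainy"
    shutil.copy(os.path.join(SRC_DIR, name), os.path.join(dest, name))
```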
Week 8: The Epochalypse
This week featured a lot of AI training – and boy, did it take a while. I had amassed a few different instances of AlexNet and VGG16. The goal this time was to run fine-tuning tests on them to see which one would work best for my purposes. While the beginning […]
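For the comparison itself, a sketch of the pattern might look like this: train each candidate briefly and keep its best validation accuracy. The stand-in random data and the two toy candidate builders are mine, not the post’s actual AlexNet/VGG16 variants.

```python
import numpy as np
import tensorflow as tf

# Stand-in data: random tensors in place of the real rainy/non-rainy
# images, just so the comparison loop below runs as written.
x = np.random.rand(64, 64, 64, 3).astype("float32")
y = np.random.randint(0, 2, size=(64,))

def small_cnn(filters):
    # Hypothetical candidate builder; the real candidates were different
    # AlexNet/VGG16 instances.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(64, 64, 3)),
        tf.keras.layers.Conv2D(filters, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])

results = {}
for name, filters in [("narrow", 8), ("wide", 32)]:
    model = small_cnn(filters)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    hist = model.fit(x, y, validation_split=0.25, epochs=3, verbose=0)
    results[name] = max(hist.history["val_accuracy"])

print(results)  # pick the candidate with the best validation accuracy
```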
Week 7: Back in tune with the task at hand
With the Python problems out of the way, it was time to resume learning about and working toward modification. My advisor and I agreed that one more semi-exercise was necessary before we created the final model: getting a fine-tuning model up and running. In short, fine-tuning is the process of taking a […]
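To make the idea concrete, here is a minimal fine-tuning sketch in Keras. It uses VGG16 from keras.applications rather than the post’s AlexNet (AlexNet isn’t bundled with Keras), and the two-class rainy/non-rainy head is my assumption based on the project goal.

```python
import tensorflow as tf

# Load a network pretrained on ImageNet, without its classification head.
base = tf.keras.applications.VGG16(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the pretrained convolutional layers

# Attach a small new head for the two classes of interest.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),  # rainy / non-rainy
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(...) would then train only the new head on the labeled images.
```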
Week 6: Resolving the ePYdemic plaguing my computer
This week I spent most of my time resolving the Python issue that plagued my computer, followed by a robotics competition I had to leave for on Friday. Unfortunately, I don’t remember enough of the details to give a step-by-step guide, but that probably doesn’t matter anyway, as situations differ and my problem […]
Week 5: Trouble is (brew)ing
The first half of my week was spent fiddling with my AlexNet model. I played around with two open-source versions on GitHub, one by felzek (https://github.com/felzek/AlexNet-A-Practical-Implementation) and one by huanzhang12 (https://github.com/huanzhang12/tensorflow-alexnet-model), experimenting with the number of nodes in each layer, the layer types, and so on. I was still learning about different layer types, so this […]
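A sketch of what that kind of experimentation can look like: a builder function with the knobs exposed, so variants differ only in the parameters you pass. The specific knobs (filter counts, dense widths, pooling type) and sizes are my illustration, not the post’s actual experiments.

```python
import tensorflow as tf

def build_variant(conv_filters, dense_units, pool="max"):
    # Hypothetical knobs: per-layer filter counts, dense-layer widths,
    # and the pooling layer type.
    pool_layer = (tf.keras.layers.MaxPooling2D if pool == "max"
                  else tf.keras.layers.AveragePooling2D)
    layers = [tf.keras.Input(shape=(64, 64, 3))]
    for f in conv_filters:
        layers.append(tf.keras.layers.Conv2D(f, 3, activation="relu"))
        layers.append(pool_layer())
    layers.append(tf.keras.layers.Flatten())
    for u in dense_units:
        layers.append(tf.keras.layers.Dense(u, activation="relu"))
    layers.append(tf.keras.layers.Dense(2, activation="softmax"))
    return tf.keras.Sequential(layers)

# Two variants differing only in width, for side-by-side comparison.
small = build_variant(conv_filters=[16, 32], dense_units=[64])
large = build_variant(conv_filters=[64, 128], dense_units=[256])
```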
Week 4: AlexNet benefit, or AlexNet loss
The 4th week was when things began to transition from learning to doing. Armed with the more practical knowledge I had gained over the past three weeks, I was ready to examine a fully developed neural network – AlexNet – which I would use as a starting point and modify to better fit my needs. AlexNet […]
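For reference, here is a simplified Keras rendition of AlexNet’s layer structure, to show the kind of network being examined. It follows the well-known 2012 architecture but omits details such as local response normalization and grouped convolutions, and is not the post’s own code.

```python
import tensorflow as tf

# Simplified AlexNet: five convolutional layers followed by three dense
# layers, with max pooling and dropout as in the original design.
alexnet = tf.keras.Sequential([
    tf.keras.Input(shape=(227, 227, 3)),
    tf.keras.layers.Conv2D(96, 11, strides=4, activation="relu"),
    tf.keras.layers.MaxPooling2D(3, strides=2),
    tf.keras.layers.Conv2D(256, 5, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(3, strides=2),
    tf.keras.layers.Conv2D(384, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(384, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(256, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(3, strides=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(4096, activation="relu"),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1000, activation="softmax"),
])
alexnet.summary()
```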
Week 3: ReLUctant to begin
The 3rd week was a little uneventful, as it was largely reading without any visible data coming out of it. For the week, I continued learning about the different parts of a machine learning algorithm. The first day was spent tidying up my overfitting and underfitting algorithms from last week. The other two […]
Week 2: Fit for the job?
This week was largely focused on a common problem for those working with AI: overfitting and underfitting data. Here’s a rundown, using a direct parallel from math. Let’s say we have 200 points on a scatter plot, and we need to make a function that will hit every one of these points. If these points were […]
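The scatter-plot analogy is easy to reproduce in miniature with polynomial fits. This sketch is mine, not the post’s code: the underlying sine curve, noise level, and polynomial degrees are illustrative choices.

```python
import numpy as np

# Noisy samples from a simple curve. A polynomial flexible enough to hit
# every training point memorizes the noise; a line misses the shape.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = np.sin(np.pi * x) + rng.normal(scale=0.2, size=x.shape)

x_tr, y_tr = x[::2], y[::2]    # 15 training points
x_te, y_te = x[1::2], y[1::2]  # 15 held-out points

for degree in (1, 3, 14):      # underfit, reasonable, overfit
    coeffs = np.polyfit(x_tr, y_tr, degree)
    tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    print(f"degree {degree:2d}: train MSE {tr:.3f}, held-out MSE {te:.3f}")
# The degree-14 fit drives training error to ~0 yet typically does far
# worse on the held-out points - the overfitting the post describes.
```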
Week 1: Just go with the (Tensor)Flow
I was thrown a bit of a curveball before this project began, as my external advisor had visa issues and could not return until Friday. As this was meant to be the learning week, I was given a couple of resources to look at – namely, a Keras tutorial and a set of MIT lectures […]
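For a sense of what such a tutorial covers, here is a first-model example in the style of the standard Keras MNIST quickstart; it is representative of the material, not the specific tutorial from the post.

```python
import tensorflow as tf

# Load the MNIST digits and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A tiny fully connected classifier: flatten, one hidden layer, softmax.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
```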