Week 3: Continuing Learning Machine Learning

Mar 04, 2019

To Fellow Readers:

Thank you for stopping by for Week 3 of my Senior Project blog.

This week, I continued the Machine Learning course that I have been taking for the past few weeks. After completing the lessons on Regression last week, I moved on to the Classification topic. Classification will be used a lot in the first part of my project, particularly for music and image classification. I will need classification to place songs into different categories of emotion, such as happy, sad, and calm.

This section covered many classification algorithms; here I will give a broad overview of one of the main ones. The k-nearest neighbors algorithm classifies an object based on how closely it resembles its neighbors. “k” is the number of neighbors that the object is compared to. If the majority of an object’s k nearest neighbors are of Class A, then the object is classified as “Class A”.

Here is an example graph of k-nearest neighbors:

The new object, with “k” set to 1, would be classified as “Class 1”, as its single nearest neighbor is of Class 1 (square). However, when “k” is 3, two triangles are added to the set of nearest neighbors. Since two triangles outnumber one square, the object would now be classified as Class 2 (triangle). The important part is finding the right value of k to properly classify the object.
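To make this concrete, here is a minimal sketch of k-nearest neighbors in Python. It mirrors the square/triangle example above; the point coordinates and class labels are made up for illustration, and a real project would use a library like scikit-learn instead.

```python
from collections import Counter
import math

def knn_classify(train, new_point, k):
    """Classify new_point by majority vote among its k nearest training points.

    train: list of ((x, y), label) pairs -- illustrative 2-D points.
    """
    # Sort the training points by Euclidean distance to the new point.
    by_distance = sorted(train, key=lambda item: math.dist(item[0], new_point))
    # Take the labels of the k closest neighbors and vote.
    k_labels = [label for _, label in by_distance[:k]]
    return Counter(k_labels).most_common(1)[0][0]

# Toy data: one nearby square (Class 1), two triangles (Class 2) a bit
# farther away, and one distant square.
points = [
    ((1.0, 1.0), "Class 1"),
    ((3.0, 3.0), "Class 2"),
    ((3.5, 2.5), "Class 2"),
    ((6.0, 6.0), "Class 1"),
]
new = (1.5, 1.5)

print(knn_classify(points, new, k=1))  # Class 1 -- nearest neighbor is a square
print(knn_classify(points, new, k=3))  # Class 2 -- two triangles outvote one square
```

Changing k from 1 to 3 flips the prediction, which is exactly why choosing k carefully matters.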

Thank you so much again for stopping by my blog; I hope you enjoyed it. Hope to see you again next week!

-Ray Lin ._.

2 Replies to “Week 3: Continuing Learning Machine Learning”

  1. Swarit Joshipura says:

    Hey Ray! Thanks for including the diagrams — they help in understanding the concepts you are explaining.

  2. Anjali S. says:

    Hi Ray,

    I agree with the comment above; the diagrams help in understanding the concepts. Also, will there be any categories for songs with overlapping emotions?
