by syntaxing on 3/23/17, 3:00 PM
One thing I discovered recently that surprised me (while taking the Udacity SDC) is how effective and resilient these "older" ML algorithms can be. Neural networks were always my go-to method for most of the classification or regression problems in my small side projects. But I've now learned that with the minimal dataset I have (<5K samples), linear regression, SVMs, or decision trees are the way to go. I got higher accuracy, and it's about 10x faster in terms of computational time!
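(A minimal sketch of this point, not the commenter's actual experiment: on a small dataset, a kernel SVM often matches or beats a small neural network while fitting much faster. The dataset and model settings here are illustrative assumptions.)

```python
# Compare a kernel SVM against a small MLP on a small dataset (~1.8K samples),
# reporting test accuracy and wall-clock fit time for each.
import time
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, clf in [("SVM (RBF)", SVC(gamma="scale")),
                  ("MLP", MLPClassifier(hidden_layer_sizes=(64,),
                                        max_iter=500, random_state=0))]:
    t0 = time.time()
    clf.fit(X_tr, y_tr)
    print(f"{name}: accuracy={clf.score(X_te, y_te):.3f}, "
          f"fit time={time.time() - t0:.2f}s")
```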
by nafizh on 3/23/17, 4:18 PM
Aaah, I was hoping for an explanation of the kernel trick. I think that is the hardest concept in support vector machines.
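(The core of the kernel trick fits in a few lines; this is my own toy illustration, not from the article. The degree-2 polynomial kernel k(x, z) = (x · z)² equals an ordinary dot product in an explicit, larger feature space, without ever constructing that space.)

```python
import numpy as np

def phi(v):
    # Explicit degree-2 feature map for 2-D input: (x1^2, sqrt(2)*x1*x2, x2^2)
    x1, x2 = v
    return np.array([x1 * x1, np.sqrt(2) * x1 * x2, x2 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

kernel = (x @ z) ** 2       # cheap: one dot product in the 2-D input space
explicit = phi(x) @ phi(z)  # the same value via the explicit 3-D feature map
print(kernel, explicit)     # both equal 1.0
```

The SVM only ever needs dot products between training points, so swapping the dot product for a kernel implicitly trains in the bigger feature space at the cost of the small one.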
by shas3 on 3/23/17, 8:17 PM
Very cool! However, I think the author should have spent a few more words and figures distinguishing support vector machines from standard perceptrons. Maximum margin classification and the definition of 'support vectors,' in my experience, help demystify the algorithm.
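(A small sketch of that distinction, my example rather than the article's: the fitted max-margin boundary is determined only by the "support vectors", the training points nearest the separating hyperplane, and sklearn exposes them directly.)

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],   # class 0
              [3.0, 3.0], [3.0, 4.0], [4.0, 3.0]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# Large C approximates a hard margin on this separable toy set.
clf = SVC(kernel="linear", C=1e6).fit(X, y)
print(clf.support_vectors_)
# Only the points closest to the other class become support vectors; interior
# points like (0, 0) get zero weight. A perceptron would accept any separating
# line, while the SVM picks the unique maximum-margin one.
```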
by lallysingh on 3/23/17, 2:55 PM
This is great! Any follow-ups describing kernels?
by rs86 on 3/23/17, 7:33 PM
Amazingly well written. Short and to the point, humbly sharing something cool!
by curiousgal on 3/23/17, 4:47 PM
Many aspects of Machine Learning boil down to optimization problems.
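(The linear SVM is a nice concrete case of this: minimize hinge loss plus an L2 penalty. A few lines of subgradient descent solve a toy instance; the data and hyperparameters below are made up for illustration.)

```python
import numpy as np

# Two well-separated Gaussian blobs with labels -1 / +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(200):
    margins = y * (X @ w + b)
    active = margins < 1  # points inside or violating the margin
    # Subgradient of (lam/2)*||w||^2 + mean(max(0, 1 - y*(w.x + b)))
    grad_w = lam * w - (y[active][:, None] * X[active]).sum(axis=0) / len(X)
    grad_b = -y[active].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"training accuracy: {(np.sign(X @ w + b) == y).mean():.2f}")
```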
by LeanderK on 3/23/17, 7:32 PM
Well, that really was a quick look. Any reading recommendations about kernel functions? How do they work, and why are they fast?
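(A rough sketch of the "why fast" part, my own example: evaluating a polynomial kernel costs one dot product in the input space, while the explicit feature space it corresponds to can be astronomically large.)

```python
import math
import numpy as np

d, degree = 100, 5
x, z = np.random.default_rng(1).normal(size=(2, d))

k = (x @ z + 1) ** degree  # O(d) work: one dot product, then a power
# Number of monomials of degree <= 5 in 100 variables, i.e. the dimension
# of the explicit feature space this kernel implicitly uses.
n_features = math.comb(d + degree, degree)
print(f"kernel computed from {d} dims; explicit map would need "
      f"{n_features:,} features")
```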
by rmchugh on 3/23/17, 8:23 PM
Best name for a blog ever?