What makes it the de facto loss function for multi-class classification

There is much to learn and grasp to ace that important Data Science/Machine Learning interview. This article is part of a series that tries to make the preparation process easier and less daunting by introducing structure, using visual explanations, and keeping things relevant.

Starry night in forest
Photo from Pixabay

Motivation


Why Deep Learning is so powerful yet so simple in its core

Photo by Kristine Tumanyan on Unsplash


And How to Still Achieve Success

Kids thinking about math
Image by marker_photography from Pixabay.com

It All Started From My Personal Struggle


Came across this piece talking about the current #machinelearning status by Mark Saroufim. Some views are a bit controversial but good points nonetheless.


I’ll summarize the key points below: 🧵👇

  1. ML Researchers — Supposed to be risk-taking and less commercially oriented so that ground-breaking progress can be made. Instead, ML researchers have found ways to avoid taking risks while still getting good pay, through FAANG, media, YouTube, etc., and SOTA chasing.
  2. Math is overrated in Deep Learning. Matrix multiplication is mostly what you need, and autograd removes the real need for manual gradient calculation. Be real now.
  3. The empiricism tendency of…
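
The autograd point above is easy to demonstrate. Here is a minimal reverse-mode autodiff sketch of my own (micrograd-style, not from the piece being summarized): each operation records its inputs and local derivatives, and the chain rule is applied mechanically in reverse, so no gradient ever has to be derived by hand.

```python
class Value:
    """A scalar node that records the operations producing it."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # nodes feeding into this one
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

x, y = Value(2.0), Value(3.0)
z = x * y + x          # z = xy + x
z.backward()
print(x.grad, y.grad)  # dz/dx = y + 1 = 4.0, dz/dy = x = 2.0
```

Real frameworks (PyTorch, JAX) do exactly this at tensor granularity, which is why "matrix multiplication plus autograd" covers so much of day-to-day deep learning.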


A Thought Experiment

Photo by Joshua Aragon on Unsplash

What is UAT?

What exactly is the Universal Approximation Theorem? Put in layman’s terms, UAT means that given a neural network with one hidden layer and enough neurons, it can approximate (or closely simulate) any continuous function within a given input range. In other words, a one-hidden-layer neural network is the ultimate flexible function approximator. Maybe a little too flexible.
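To see UAT in action, here is a toy NumPy sketch of my own (an illustration, not a proof): a single hidden layer of tanh units, trained by plain gradient descent, approximating sin(x) on [-π, π]. The width of 32 units and the learning rate are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = np.sin(X)                           # the continuous target function

H = 32                                  # hidden-layer width
W1 = rng.normal(0, 1.0, (1, H))
b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    # forward pass: one hidden tanh layer, linear output
    hidden = np.tanh(X @ W1 + b1)       # (200, H)
    pred = hidden @ W2 + b2             # (200, 1)
    err = pred - Y
    loss = np.mean(err ** 2)
    if step == 0:
        loss0 = loss                    # remember the starting error

    # backward pass: hand-derived gradients of the mean-squared error
    d_pred = 2 * err / len(X)
    dW2 = hidden.T @ d_pred
    db2 = d_pred.sum(axis=0)
    d_hidden = (d_pred @ W2.T) * (1 - hidden ** 2)
    dW1 = X.T @ d_hidden
    db1 = d_hidden.sum(axis=0)

    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final MSE: {loss:.4f}")
```

With enough hidden units the same recipe works for any continuous target on a bounded interval, which is exactly what the theorem promises, although it says nothing about how hard the training will be.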

A Lesson Learned

Because of this flexibility, the Universal Approximation Theorem used to push AI researchers to focus mostly on shallow neural networks, which in some ways hindered the progress of deep learning. This is interesting. Come to think of it, a ‘shallow and wide’…


Why All Data Scientists Not From Computer Science Background Should Take the Course

MIT 6.00.1x and 2x
Photo by Irvan Smith on Unsplash

…my Journey into Data Science So Far


Data Scientists rejoice! Always have a hard time putting your fine LaTeX equations into PowerPoint? Worry no more. Jeremy Howard from fast.ai to the rescue. Check out this great piece…


A good article from Kaspersky about phishing email detection using Machine Learning. The approaches introduced are clearly laid out and easy to understand. It’s rare to find such a solid piece on ML/AI in cybersecurity…

Michael Li

Playing with numbers and shapes. Pushing features and pixels. Traversing multiple dimensions I sail on. Do I carry a towel you asked? The answer is 42 I smiled.
