2 min read · Jan 27, 2021
Came across this piece by Mark Saroufim on the current state of #machinelearning. Some of the views are a bit controversial, but they make good points nonetheless.
I’ll summarize the key points below: 🧵👇
- ML Researchers are supposed to take risks and be less commercially oriented so that ground-breaking progress can be made. Instead, many have found ways to avoid risk while still getting well paid: FAANG jobs, media, YouTube, and SOTA chasing.
- Math is overrated in deep learning. Matrix multiplication is mostly what you need, and autograd removes the real need for manual gradient calculation. Be real now.
- The empiricist tendency of DL has formed a feedback loop: whoever has more computing power can run more experiments in parallel and wins, thus gaining more resources to run even more experiments. See Google Brain, OpenAI, DeepMind.
- Transformers are everywhere and extremely popular.
- Graduate Student Descent. Again, more people running experiments leads to faster breakthroughs. Cargo-culting configs, loss functions, and frameworks makes this possible, but the results are less well thought out.
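The autograd point above can be made concrete with a toy reverse-mode autodiff sketch (the `Value` class and its names are illustrative, not any real library's API): you define the forward operations once, and the chain rule produces every gradient automatically, with no manual calculus.

```python
class Value:
    """A scalar that tracks its computation graph for backpropagation."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents          # Values this node was computed from
        self._local_grads = local_grads  # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other),
                     (other.data, self.data))

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += g * v.grad

# Gradient of f(x, y) = x*y + x, with no hand-derived formulas.
x, y = Value(3.0), Value(4.0)
f = x * y + x
f.backward()
print(x.grad, y.grad)  # df/dx = y + 1 = 5.0, df/dy = x = 3.0
```

Real frameworks (PyTorch, JAX) do the same thing over tensors instead of scalars, which is why "matmul plus autograd" covers so much of day-to-day deep learning work.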