Came across this piece by Mark Saroufim on the current state of #machinelearning. Some of his views are a bit controversial, but they are good points nonetheless.

  1. Math is overrated in Deep Learning. Matrix multiplication is mostly what you need, and autograd removes the real need for manual gradient calculation (a PyTorch sketch follows this list). Be real now.
  2. The empiricism of DL forms a feedback loop: whoever has the computing power to run many experiments in parallel wins, and the winners gain more resources to run even more experiments. Google Brain, OpenAI, DeepMind.
  3. Transformers are everywhere: one architecture now dominates NLP and is spreading into vision and beyond.
  4. Graduate Student Descent. Again, more people running experiments leads to faster breakthroughs. Cargo-culting configs, loss functions, and frameworks makes this scale possible, but leaves the work less well thought out.
  5. Good innovation in ML:
    — Keras, fast.ai: user-centric and layered by design; software engineering for ML is a good direction (see the Keras sketch after this list).
    — Julia: differentiable programming from the ground up (Swift for TensorFlow may be another).
    — HuggingFace: hugely popular NLP models, made trivially easy to use (see the pipeline sketch after this list).
    — Hasktorch: Haskell-based, elegant, less well known, but actively developed.
    — Unity ML-Agents: great for RL.
    — AlphaFold2 is groundbreaking
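
On point 1, a minimal PyTorch sketch of what "matmul plus autograd" means in practice; the shapes and variable names here are mine, chosen purely for illustration:

```python
import torch

# Weights we want gradients for; requires_grad tells autograd to track them.
W = torch.randn(3, 2, requires_grad=True)
x = torch.randn(4, 3)  # a batch of 4 inputs (illustrative shapes)
y = torch.randn(4, 2)  # matching targets

loss = ((x @ W - y) ** 2).mean()  # forward pass: just a matmul and an MSE loss
loss.backward()                   # autograd derives the gradient for us

print(W.grad.shape)  # torch.Size([3, 2]); no manual calculus anywhere
```

There is no hand-written backward pass: the "math" of training reduces to writing the forward computation.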
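
On the Keras/fast.ai point, a small sketch of the layered, user-centric style (layer sizes are arbitrary; assumes TensorFlow is installed):

```python
import tensorflow as tf

# Three lines of intent: an input spec, a hidden layer, an output layer.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(x_train, y_train) is all a user then writes; anyone needing more
# control can drop down a layer to custom layers or custom training loops.
```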
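
And the HuggingFace point in one call: the `pipeline` API hides tokenization, model download, and inference, which is a big part of why it caught on (assumes the `transformers` package is installed; the default model is fetched on first use):

```python
from transformers import pipeline

# One line gets a ready-to-use sentiment classifier.
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers are everywhere and so popular"))
# Prints a label and confidence, e.g. [{'label': 'POSITIVE', 'score': ...}]
```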

