Yesterday at ML Conference, which took place this year for the first time, I gave a talk on cool bits of calculus and linear algebra that are useful and fun to know if you’re writing code for deep learning and/or machine learning.
Originally, the title was something like “What every interested ML/DL developer should know about matrices and calculus”, but I didn’t like the schoolmasterly tone that had, since what I really wanted to convey was the fun and fascination of it …
So, without further ado, here are the slides and the raw presentation on GitHub.
Thanks for reading!