


What is too much, though? Backpropagation uses derivatives, and some filters in computer vision use multivariate calculus. If you want a thorough understanding, calculus is necessary. That said, Andrew Ng was quite good at avoiding calculus in his Machine Learning MOOC, and for applied machine learning I'd guess calculus is not that important.
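To make the "backpropagation uses derivatives" point concrete, here's a minimal sketch (not from any library, just an illustration) of one sigmoid neuron trained with squared-error loss, where the backward pass is nothing but the chain rule:

```python
# Illustrative sketch: the derivatives backpropagation needs, for one
# sigmoid neuron with squared-error loss. Names and values are made up.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def backprop_step(w, b, x, y, lr=0.1):
    # Forward pass
    z = w * x + b
    a = sigmoid(z)
    loss = 0.5 * (a - y) ** 2
    # Backward pass: chain rule, dL/dw = dL/da * da/dz * dz/dw
    dL_da = a - y
    da_dz = a * (1.0 - a)        # derivative of sigmoid at z
    dL_dw = dL_da * da_dz * x    # dz/dw = x
    dL_db = dL_da * da_dz        # dz/db = 1
    # Gradient-descent update
    return w - lr * dL_dw, b - lr * dL_db, loss

w, b = 0.5, 0.0
for _ in range(1000):
    w, b, loss = backprop_step(w, b, x=1.0, y=1.0)
```

That's all single-variable calculus; a real network just repeats the same chain-rule step layer by layer, with the scalars replaced by vectors and matrices.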

A great place to study math is www.khanacademy.org, which has courses on calculus, probability/statistics, and linear algebra.


Strang's complaint is that there's too little linear algebra. That's true, but it doesn't change the fact that you're not going to get out of using some partial derivatives in neural-net land (and many other AI subfields).
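A quick sketch of how the two meet (my own toy example, not from the thread): the loss of even a one-weight-vector "network", L(w) = (w·x − y)², is linear algebra on the inside, but training it still means taking partial derivatives ∂L/∂wᵢ. Here they're computed analytically via the chain rule and checked against a finite-difference approximation:

```python
# Toy example: partial derivatives of L(w) = (w . x - y)^2.
# Analytic gradient via chain rule: dL/dw_i = 2 * (w . x - y) * x_i,
# verified against a central finite-difference estimate.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def loss(w, x, y):
    return (dot(w, x) - y) ** 2

def grad(w, x, y):
    err = dot(w, x) - y
    return [2 * err * xi for xi in x]

w, x, y = [0.3, -0.2], [1.0, 2.0], 0.5   # made-up values
analytic = grad(w, x, y)

eps = 1e-6
numeric = []
for i in range(len(w)):
    wp = list(w); wp[i] += eps
    wm = list(w); wm[i] -= eps
    numeric.append((loss(wp, x, y) - loss(wm, x, y)) / (2 * eps))
```

The matrix/vector bookkeeping is Strang's territory; the ∂L/∂wᵢ is the calculus you can't avoid.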



