I was in one of those early cohorts that used Octave. One of the things the course had to deal with was that, at the time (I don't know about now), Octave didn't ship with an optimization function suitable for the coursework, so we ended up using an implementation of `fmincg` that the course staff provided along with the homework. If you're following along with the lectures, you might need to track down that file; it's probably still available somewhere.
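For anyone curious what `fmincg` was actually doing in those assignments: it minimized a cost function (typically regularized logistic regression) given a routine returning the cost and its gradient. A rough sketch of the same idea in Python, using SciPy's conjugate-gradient minimizer instead of the course's `fmincg` (the toy data and cost function here are my own illustration, not the course's code):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_and_grad(theta, X, y):
    # Standard (unregularized) logistic-regression cost and gradient,
    # the kind of function the homework handed to fmincg.
    m = len(y)
    h = sigmoid(X @ theta)
    eps = 1e-12  # guard against log(0)
    cost = -(y @ np.log(h + eps) + (1 - y) @ np.log(1 - h + eps)) / m
    grad = X.T @ (h - y) / m
    return cost, grad

# Tiny separable toy set: first column is the intercept term.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

# method='CG' plays the role fmincg played in the Octave homework.
res = minimize(cost_and_grad, np.zeros(2), args=(X, y),
               jac=True, method='CG', options={'maxiter': 400})
preds = (sigmoid(X @ res.x) >= 0.5).astype(float)
print(preds)
```

On this toy set the learned boundary separates the two classes, which is all the homework optimizer needed to do at a larger scale.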
Using Octave for a beginning ML class felt like the worst of both worlds: you got the awkward, ugly language of MATLAB without any of the upsides of MATLAB-the-product, because it didn't have the GUI environment or the huge pile of toolbox functions. None of that is meant as criticism of Octave as a project; it's fine for what it is. It just ended up being more of a stumbling block for beginners than a booster in that specific context.
I did that with Octave too. I didn't mind the language much, but it wasn't great. I had significant experience with both coding and simple models when doing it, so I wasn't a beginner; I can see it being an additional hurdle for some people. What are they using now? Python?
I believe Andrew Ng's new course is all Python now, yeah. Amusingly enough, another class I took (Linear Algebra: Foundations to Frontiers) kinda made the opposite move: when I took it, it was all Python, but shortly after they transitioned to full-powered MATLAB with limited student licenses. Guess it makes sense given that LAFF was primarily about the math.