W7 Machine Learning (part 2)

On the Previous Week

Last week, when I spoke about Classification, I felt I didn’t manage to make it clear enough what I meant by the existence of a function “g” that we were trying to approximate with our function “f”. What I was trying to refer to (I couldn’t find the right wording at the time) was the “Manifold Hypothesis”. The idea is that the function “g” is “out there”, and we want to find it; we also assume that it is continuous. I found a video that I think makes it quite clear what this “manifold” is, and I would like you to watch it:

https://www.youtube.com/watch?v=BePQBWPnYuE

Gradient Descent (external resources)

I’d recommend watching the following videos on Neural Networks by 3Blue1Brown. I think they will be SUPER USEFUL to cement the ideas discussed this week, and also to prepare you for next week’s topics.

What is a Neural Network: https://www.youtube.com/watch?v=aircAruvnKk&list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi&index=1

Gradient Descent: https://www.youtube.com/watch?v=IHZwWFHWa-w&list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi&index=2

I honestly suggest you watch these both before and after my videos this week. Watching them before will help you build some “expectations” about the kinds of things you’ll learn (and it will help you A LOT with this week’s topic). Watching them after will serve as a very good example of how the things you learnt are applied.
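To give you a concrete feel for what those videos describe, here is a minimal sketch of gradient descent in one dimension. It is not code from this week’s notebook, just an illustration: I picked the function f(x) = (x - 3)², whose minimum we already know is at x = 3, so you can check that the procedure actually finds it.

```python
# Minimal sketch of gradient descent: repeatedly step in the direction
# opposite to the gradient until we settle near a minimum.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a 1-D function given its derivative `grad`."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move against the slope
    return x

# Derivative of f(x) = (x - 3)^2 is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # very close to 3.0
```

The learning rate `lr` here plays exactly the role discussed in the 3Blue1Brown video: too small and the steps crawl, too large and they overshoot.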

Dataset Splits / Hyperparameters

If you have difficulty understanding this video, you might want to watch an alternative explanation: Andrej Karpathy’s class on the topic. It is a little more mathy, but it is really good, and everything covered in my video is also present there:

https://www.youtube.com/watch?v=8inugqhkfve&list=plkt2usq6rbvctenovbg1tpcc7oqi31alc&index=3&t=0s

(in fact, if you have interest in learning about ML, these classes are a great resource)
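As a small illustration of the dataset splits discussed in the video, here is a sketch of cutting a dataset into train/validation/test parts. The 70/15/15 proportions are just an example I chose, not a fixed rule from the video:

```python
# Sketch of a train/validation/test split on a plain Python list.
import random

def split_dataset(data, train_frac=0.70, val_frac=0.15, seed=0):
    """Shuffle the data, then cut it into train/validation/test parts."""
    rng = random.Random(seed)   # fixed seed so the split is reproducible
    shuffled = data[:]
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * train_frac)
    n_val = int(len(shuffled) * val_frac)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test

train, val, test = split_dataset(list(range(100)))
print(len(train), len(val), len(test))  # 70 15 15
```

The key point is that hyperparameters (like a learning rate) are tuned by looking at the validation set, while the test set is touched only once, at the very end.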

Regression

You’ll find a jupyter notebook called “W7 Regression.ipynb” in the downloads folder. This is what I used during my videos.

Performing Regression with Gradient Descent
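To complement the notebook, here is a minimal sketch of fitting a line y = w·x + b by gradient descent on the mean squared error. The synthetic data (generated from y = 2x + 1) is my own toy example; the notebook’s data may differ:

```python
# Sketch of linear regression trained with gradient descent.

def fit_line(xs, ys, lr=0.01, steps=5000):
    """Fit y = w*x + b by descending the gradient of mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]  # exactly y = 2x + 1
w, b = fit_line(xs, ys)
print(w, b)  # close to 2.0 and 1.0
```

This is the same loop as the 1-D gradient descent from earlier, just with two parameters descending at once.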

Logistic Regression

Now I would like you to read the part on Logistic Regression in the jupyter notebook W7 Regression.ipynb, which is in the downloads folder.

I found that it didn’t really make sense to explain this in a video, because I feel it is very much the same idea. That section is heavily inspired by some other materials that are also linked there. If you have trouble understanding it, you may want to take a look at those materials.
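To show how similar it really is, here is a sketch of logistic regression on one feature, trained by gradient descent on the log loss. The tiny dataset (labels flipping around x = 0) is made up for illustration and is not from the notebook:

```python
# Sketch of logistic regression: same gradient-descent loop as linear
# regression, but the prediction is squashed through a sigmoid.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, steps=2000):
    """Fit P(y=1 | x) = sigmoid(w*x + b) by gradient descent on log loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of mean log loss is (1/n) * sum((p - y) * x) for w,
        # and (1/n) * sum(p - y) for b, where p = sigmoid(w*x + b).
        grad_w = sum((sigmoid(w * x + b) - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum((sigmoid(w * x + b) - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic(xs, ys)
print(sigmoid(w * (-2.0) + b), sigmoid(w * 2.0 + b))  # low, then high
```

Compare this with the linear regression loop: only the prediction (sigmoid instead of identity) and the loss (log loss instead of squared error) changed, which is exactly why I felt it was “very much the same”.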

One-vs-All Classification