Lesson 2 Multivariate linear regression

So, I have joined the machine learning course on Coursera, taught by none other than Andrew Ng! The first module of the course was just a repetition of linear regression and linear algebra, so I skimmed through it quickly to refresh my memory. In module two, things start to get interesting again. We start off by installing the course software, Octave (or MATLAB), and then quickly dive into multivariate linear regression, which is formulated as:

yi = β0 + β1xi1 + β2xi2 + … + βpxip + ϵi   for i = 1, 2, …, n.
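As a quick sanity check of the formula, here is a small sketch (not from the course; the coefficient and feature values are made up) that evaluates the prediction for a single observation in NumPy:

```python
import numpy as np

# Illustrative values only: β0 (intercept), β1, β2 and one observation's features.
beta = np.array([1.0, 2.0, -0.5])   # β0, β1, β2
x = np.array([3.0, 4.0])            # xi1, xi2

# yi = β0 + β1*xi1 + β2*xi2  (ignoring the residual term ϵi)
y_hat = beta[0] + beta[1:] @ x
print(y_hat)                        # 1 + 2*3 - 0.5*4 = 5.0
```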

 

Or in matrix form,

Y=Xβ+ϵ

Where the response Y, parameter vector β and residual vector ϵ are made up of:

Y = (y1, …, yn)ᵀ, β = (β0, β1, …, βp)ᵀ, ϵ = (ϵ1, …, ϵn)ᵀ,

and X is the n × (p + 1) design matrix whose i-th row is (1, xi1, …, xip).

The matrix form is so much neater to handle!
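One payoff of the matrix form is that the least-squares fit has a closed-form solution, β̂ = (XᵀX)⁻¹XᵀY (the normal equation). A minimal sketch with made-up data (the names and numbers are illustrative, not from the course):

```python
import numpy as np

# Hypothetical noise-free data generated from known parameters,
# so the fit should recover them exactly.
rng = np.random.default_rng(0)
n, p = 100, 2
features = rng.normal(size=(n, p))
X = np.column_stack([np.ones(n), features])  # column of ones gives the β0 term
true_beta = np.array([1.0, 2.0, -0.5])       # β0, β1, β2
Y = X @ true_beta

# lstsq solves min ||Xβ - Y||², the numerically stable way to apply
# the normal equation β̂ = (XᵀX)⁻¹XᵀY.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)
```

With real, noisy data `beta_hat` would only approximate `true_beta`, but the one-liner solve is exactly why the matrix form is so convenient.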

The next lesson will cover gradient descent for multiple variables, which I don't recall at the moment, but the memory usually comes back after seeing some examples.
