
Simple Linear Regression | Machine Learning Beginners

Hello learners,

In the previous post we learned about the four flavors of Machine Learning. Now it is time to look at a basic machine learning concept: Regression. Regression is an example of Supervised Learning.

In statistical modeling, regression analysis is a set of statistical processes for estimating the relationships between a dependent variable (often called the ‘outcome variable’) and one or more independent variables (often called ‘predictors’).

First, we have to understand what simple linear regression means. “Simple” means there is only one input (independent variable). “Linear” means the output (dependent variable) depends linearly on that input variable. When we plot this type of function, the plot is a straight line. An example of linear regression is predicting a house’s selling price based on historical data.

In mathematics, a straight line connects the input and output data linearly. So in simple words, in simple linear regression the straight line is our model. Every straight line is a valid candidate model, but we have to choose the one that minimizes our error. Each candidate line has a different slope and intercept, and we have to find the slope and intercept that give the smallest error and the most accurate predictions.
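To make the idea of a straight-line model concrete, here is a minimal Python sketch. The slope, intercept, and house sizes below are made-up numbers used only for illustration; they are not taken from any real dataset.

```python
# A simple linear regression model is just a straight line:
# predicted_price = slope * size + intercept

slope = 150.0        # hypothetical price increase per square metre
intercept = 50000.0  # hypothetical base price

def predict(size):
    """Predict a house's selling price from its size using the straight-line model."""
    return slope * size + intercept

# Predictions for a few hypothetical house sizes (in square metres)
for size in [50, 80, 120]:
    print(f"{size} sq.m -> predicted price {predict(size):.0f}")
```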

For understanding, we will use the simple example of house price prediction.

We can find the error by calculating the difference between the predicted value and the actual value. In the figure above, the line shows the predicted values and the blue dots are the actual values.

Total Error = Σ (pᵢ – eᵢ) for i = 1 to m

where m is the total number of inputs,

pᵢ is the predicted output value for the i-th input, and

eᵢ is the actual output value for the i-th input.

Now, in the case above, some of the individual errors are positive and some are negative. So there is a possibility that positive errors cancel out negative errors (the problem of nullification).
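Here is a small sketch of the nullification problem, reusing the hypothetical line from above with some made-up actual prices: the signed errors cancel, so the total error comes out as zero even though every individual prediction is wrong.

```python
# Hypothetical house sizes, made-up actual prices, and predictions from the line above
sizes = [50, 80, 120]
actual = [60500, 57000, 70000]
predicted = [150.0 * s + 50000.0 for s in sizes]   # 57500, 62000, 68000

# Signed individual errors: some positive, some negative
errors = [p - a for p, a in zip(predicted, actual)]
print(errors)        # [-3000.0, 5000.0, -2000.0]

# They cancel out, so the total error hides how wrong the predictions are
print(sum(errors))   # 0.0
```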

An alternative is to take the absolute value of each individual error and then sum them to get the total error. The problem with this approach is that the absolute value function is not differentiable at zero.

Another alternative is to use the squared error: the square of any number is non-negative, and the square function is differentiable everywhere. So finally we use squared errors to calculate the total error.

Total Error = Σ (pᵢ – eᵢ)² for i = 1 to m
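As a final sketch, the code below computes the total squared error for two hypothetical candidate lines on the same made-up data and keeps the one with the smaller total. In practice, a library routine such as NumPy’s polyfit or scikit-learn’s LinearRegression finds the best slope and intercept directly, but the idea is the same: pick the line that minimizes the total squared error.

```python
sizes = [50, 80, 120]
actual = [60500, 57000, 70000]   # same made-up prices as before

def sum_squared_error(slope, intercept):
    """Total squared error of the line price = slope * size + intercept on the data."""
    return sum((slope * s + intercept - a) ** 2 for s, a in zip(sizes, actual))

# Two hypothetical candidate lines; the one with the smaller squared error is the better model
candidates = [(150.0, 50000.0), (120.0, 52000.0)]
for slope, intercept in candidates:
    print(f"slope={slope}, intercept={intercept}, "
          f"squared error={sum_squared_error(slope, intercept):,.0f}")

best_slope, best_intercept = min(candidates, key=lambda c: sum_squared_error(*c))
print("best line:", best_slope, best_intercept)
```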

Hope you all now understand the basics of Regression.
