What is regression in machine learning !!!

When we want to predict a continuous value, we use a regression model. Regression models further branch into linear and non-linear types.

First of all, let’s cover the basic concept.

History of linear regression.

The concept of linear regression was introduced around 1886 by Francis Galton, who was studying the relationship between parents and children. He was particularly interested in the relationship between the heights of fathers and their sons. His data consisted of 928 adult children. He represented the height of the parents with a single statistic, the “mid-parent”: the mean of the father’s height and 1.08 times his wife’s height.

Galton’s breakthrough was noticing that a son’s height tended to be closer to the overall average height of all people than his father’s height was.

Let’s understand this with an example.

Suppose a father named Rahul has a son named Micky. If Rahul’s height is over 6 ft, his son may well be about the same height, or a little shorter, or somewhere near it. Galton used the term “regression” because a tall father’s son’s height tends to regress, or drift, towards the mean (average) height of everyone else.

In the image above we have the data points, and to compute the regression we draw a line that passes as close as possible to every dot. When we use the “Least Squares Method”, we only measure closeness in the UP-AND-DOWN (vertical) direction. We can perform this operation on many fathers and their sons, and then predict how tall a man’s son is likely to be BEFORE he even has a son.

What is the Least Squares Method?

It is a method for finding the best fit for a set of points/data by minimizing the sum of the squared offsets, or residuals. A residual is the difference between an observation and its fitted value. In the image, the residuals are marked by the ORANGE LINES: the difference between the true point in blue and your fitted model line.

Our goal in linear regression is to minimize the vertical distance between all the data points and our line.

To measure this overall distance, we have several options, such as the sum of squared errors, the sum of absolute errors, and so on.
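As a minimal sketch of the idea (the father/son heights below are made up just for illustration), here is how the vertical residuals and their sum of squared errors can be computed with NumPy, and how np.polyfit finds the line that minimizes that sum:

```python
import numpy as np

# Hypothetical data: fathers' heights (x) and sons' heights (y) in inches
x = np.array([65.0, 67.0, 70.0, 72.0, 75.0])
y = np.array([66.5, 68.0, 69.5, 70.5, 72.0])

def sum_squared_errors(b0, b1, x, y):
    """Sum of squared vertical distances between the points and the line y = b0 + b1*x."""
    fitted = b0 + b1 * x      # fitted values on the candidate line
    residuals = y - fitted    # vertical (up-and-down) offsets
    return np.sum(residuals ** 2)

# A rough guess for the line versus the least-squares line found by np.polyfit
print(sum_squared_errors(0.0, 1.0, x, y))          # SSE for the guess y = x
b1_best, b0_best = np.polyfit(x, y, deg=1)         # returns [slope, intercept]
print(b0_best, b1_best, sum_squared_errors(b0_best, b1_best, x, y))
```

The least-squares line always gives a smaller (or equal) sum of squared errors than any other straight line you could draw through the same points.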

Now that we know the history and some of the calculation terms, let’s dive deeper into it.

Simple Linear Regression: Y = b0 + b1 * X1 (check the image below)

Y is the dependent variable, e.g. how salary changes based on your experience, or how much weight a person will lose based on the time he or she spends exercising.

b0 is the constant (intercept) term. Check the image below: it is the point where the line crosses the vertical axis.

X1 is the independent variable; simple linear regression has only one independent variable.

b1 is the coefficient of the independent variable: it tells you how a unit change in X1 affects Y.

b1 is also the slope of the line. The steeper the line, the more money you earn or the more weight you lose; the smaller the coefficient, the smaller the effect.

This is how your salary or weight depends on your experience or exercise. (I drew the image below by hand.)

So,

Salary/Weight = b0 + b1 * Experience/Exercises

The line shown is the one that best fits this data.

Here, if your experience is ZERO, you would still receive 5,000 rupees (the intercept, b0).

This is how simple linear regression works.
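To make the salary example concrete, here is a minimal sketch in Python (the experience/salary numbers are made up to roughly match a 5,000-rupee starting salary) that estimates b0 and b1 with the closed-form least-squares formulas:

```python
import numpy as np

# Hypothetical data: years of experience and salary in rupees
experience = np.array([0, 1, 2, 3, 4, 5], dtype=float)
salary = np.array([5000, 7100, 8900, 11200, 12800, 15100], dtype=float)

# Closed-form least-squares estimates for simple linear regression:
#   b1 = covariance(x, y) / variance(x),   b0 = mean(y) - b1 * mean(x)
x_mean, y_mean = experience.mean(), salary.mean()
b1 = np.sum((experience - x_mean) * (salary - y_mean)) / np.sum((experience - x_mean) ** 2)
b0 = y_mean - b1 * x_mean

print(f"Intercept b0 ~ {b0:.0f} rupees (salary at zero experience)")
print(f"Slope b1 ~ {b1:.0f} rupees per extra year of experience")

# Predict the salary for someone with 6 years of experience
print(f"Predicted salary at 6 years: {b0 + b1 * 6:.0f} rupees")
```

With these made-up numbers the intercept comes out close to 5,000 rupees, which is exactly what the “zero experience” point in the example above represents.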
