Okay. So, for multiple linear regression: simple linear regression is for a single response variable y and a single independent variable x, while multiple linear regression has more than one independent variable. So I will load in the data as usual, the same as the previous one, and I will remove all the previous variables first. Okay, so I have our data here. For multiple linear regression, the model can be lm(data$y ~ data$x + data$x2). So, this is a multiple linear regression.
So for multiple linear regression, we have two variables as the input and then y as the response variable. So we can do something like this: model2 <- lm(data$y ~ data$x + data$x2). Then we can look into the model, or we can go into the summary; usually we fit the model, look at the model object, then go into the summary. Okay, so for this model, even without looking at the PowerPoint slide, we know the equation is y = m1*x1 + m2*x2 + c, where c is the intercept. So here m1 is -0.2664343, then we add m2 (the second coefficient shown in the output) times x2, and then we add the y-intercept, which is 2.517425. So, we are very familiar with this linear regression equation, which is taught in college or maybe in secondary school.
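As a rough sketch of what is being typed in R here (the data frame name data and the column names y, x and x2 are assumptions, and the values below are made up just to make the snippet runnable; they are not the data from the session):

```r
# Hypothetical data frame standing in for the one loaded in the session
data <- data.frame(
  y  = c(2.1, 2.5, 2.4, 2.6, 2.3, 2.7),
  x  = c(1.0, 1.5, 2.0, 2.5, 3.0, 3.5),
  x2 = c(0.3, 0.1, 0.4, 0.2, 0.5, 0.3)
)

# Multiple linear regression: y against both predictors, y = m1*x + m2*x2 + c
model2 <- lm(data$y ~ data$x + data$x2)
# Equivalent, more idiomatic form: lm(y ~ x + x2, data = data)

model2            # fitted coefficients
summary(model2)   # full table with estimates, p-values, R-squared
```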
So, these are the p-values here. When we read the PowerPoint slide we can further interpret the intercept and the other terms. Each of these p-values tests that term's relationship with the response variable, which is y here, since the response variable is y. So the p-value of the intercept, with respect to our y here, is 7.978e-13, which is less than 0.05. Then for x, the p-value with respect to the y variable is about 0.07, which is more than 0.05, and for x2 it is about 0.81 with respect to the y variable, and 0.81 is obviously more than 0.05. So when we look into the result, you can see something like this, and from it we can infer whether these variables are important.
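If you want to pull these p-values out of the model instead of reading them off the printout, one sketch (assuming the model2 fit from the snippet above) is:

```r
# Coefficient table from the summary: estimate, std. error, t value, p-value
coef_table <- summary(model2)$coefficients

# The "Pr(>|t|)" column holds the p-value for each term
# (intercept, x and x2)
p_values <- coef_table[, "Pr(>|t|)"]
print(p_values)
```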
We test each term against the y variable using the same concept here. The null hypothesis is that the coefficient associated with a variable is equal to zero; the alternative hypothesis is that the coefficient is not equal to zero. So if the p-value is more than 0.05, we fail to reject the null hypothesis. For these hypotheses we can look at the summary output and also at the PowerPoint slide here. So the intercept's p-value is smaller than 0.05, so there is significance with the y variable. Obviously, when the p-value is less than 0.05, there is significance.
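A small sketch of that decision rule applied to the p-values (again assuming p_values from the earlier snippet; the 0.05 cut-off is the one used throughout this session):

```r
# Null hypothesis: coefficient = 0; alternative: coefficient != 0.
# Call a term significant (reject H0) when its p-value is below 0.05.
alpha <- 0.05
decision <- ifelse(p_values < alpha,
                   "significant (reject H0)",
                   "not significant (fail to reject H0)")
print(data.frame(p_value = p_values, decision = decision))
```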
So, when we look into our result here for multiple linear regression: when the p-value is less than 0.05, there is a significant relationship between the intercept and the y variable; when the p-value is more than 0.05, x has no significant relationship with the y variable; and the same here, more than 0.05, so x2 has no significant relationship with the y variable. So with this we can use the p-values to check whether these variables have any relationship with the response variable. When we see p-values like this in the R output, they usually refer to the test against the response variable. Then for linear regression you can also look into the R-squared. The higher the R-squared, the better, I would say the better the model, because the residual sum of squares is smaller.
You can search for the formula on the web or find it in my book, where the full formula and its implementation are given. So in the summary we have the residual standard error, then we can read the multiple R-squared and the adjusted R-squared, and we will usually take the adjusted R-squared. There is also the F-statistic with its degrees of freedom and the p-value. So the adjusted R-squared here is -0.00299. The higher the R-squared, the better the model. Okay, so with that we have covered both simple linear regression and multiple linear regression.
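To read those values programmatically instead of off the printed summary, one sketch (still assuming the model2 fit above) is:

```r
s <- summary(model2)

# Multiple R-squared and adjusted R-squared; the adjusted value penalises
# extra predictors, which is why we usually report it for multiple regression
s$r.squared
s$adj.r.squared

# Residual standard error and F-statistic (with degrees of freedom) also live here
s$sigma
s$fstatistic
```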