
[Machine Learning] Linear Regression

by TaeGyeong Lee 2023. 4. 25.

This post is a chapter-by-chapter summary of the textbook Statistical Learning with R.

 

Linear Regression

RSS (Residual Sum of Squares)

We choose the coefficient estimates that make the RSS as small as possible.
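
In the book's notation, for the simple model Y ≈ β0 + β1·X fit to n observations, the residual sum of squares is

RSS = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 = \sum_{i=1}^{n} \left( y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i \right)^2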

LSM (Least Square Method)

The least squares method yields the intercept and slope (coefficient) estimates that minimize the RSS.
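
Minimizing the RSS over the intercept and slope gives the least squares estimates

\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}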

SE (Standard Error)

Confidence Interval
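
For simple linear regression, the standard error of the slope estimate and the resulting approximate 95% confidence interval are

SE(\hat{\beta}_1)^2 = \frac{\sigma^2}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad \hat{\beta}_1 \pm 2 \cdot SE(\hat{\beta}_1)

where σ² = Var(ε) is estimated in practice by the residual standard error.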

 

Hypothesis Test

Test the null hypothesis H0: β1 = 0 (no relationship between X and Y) against the alternative H1: β1 ≠ 0 (some relationship).

P-value

smaller p-value -> stronger evidence against H0, i.e. the predictor is more likely to matter

T-Statistic

Larger |t-statistic| -> smaller p-value -> reject H0
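
The t-statistic counts how many standard errors the slope estimate lies away from zero,

t = \frac{\hat{\beta}_1 - 0}{SE(\hat{\beta}_1)}

and under H0 it follows a t-distribution with n − 2 degrees of freedom, from which the p-value is computed.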

 

Assessing the Model

RSE (Residual Standard Error) -> an estimate of the standard deviation of the error term ε

R² Statistic -> the proportion of variance in Y explained by the model, always between 0 and 1
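
For simple linear regression with n observations, the two measures are

RSE = \sqrt{\frac{RSS}{n - 2}}, \qquad R^2 = \frac{TSS - RSS}{TSS} = 1 - \frac{RSS}{TSS}, \qquad TSS = \sum_{i=1}^{n} (y_i - \bar{y})^2

where TSS is the total sum of squares.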

 

Multiple Linear Regression

Check H0: β1 = β2 = ... = βp = 0 (no predictor is related to the response).

F-Statistic -> if F is much larger than 1, reject H0 (when H0 is true, F is expected to be close to 1)
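
For n observations and p predictors, the F-statistic is

F = \frac{(TSS - RSS)/p}{RSS/(n - p - 1)}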

Which predictors are useful? (variable selection)

1. Brute force: try all 2^p possible subsets of predictors. 2. Greedy approach: forward selection (start from the null model and add one predictor at a time) or backward selection (start from the full model and remove one predictor at a time); a sketch of the forward version follows below.
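
A minimal sketch of the greedy forward-selection idea, using plain NumPy least squares. The function and variable names (forward_selection, max_vars, the synthetic data) are my own illustration, not from the textbook.

```python
import numpy as np

def forward_selection(X, y, max_vars=None):
    """Greedy forward selection: start from the null model and, at each
    step, add the predictor that most reduces the residual sum of squares."""
    n, p = X.shape
    max_vars = p if max_vars is None else max_vars
    selected = []                 # indices of chosen predictors, in order added
    remaining = list(range(p))
    path = []                     # (predictors chosen so far, RSS) after each step

    def rss_of(cols):
        # Fit y ~ intercept + X[:, cols] by least squares and return the RSS.
        design = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
        coef, *_ = np.linalg.lstsq(design, y, rcond=None)
        resid = y - design @ coef
        return float(resid @ resid)

    while remaining and len(selected) < max_vars:
        # Try adding each remaining predictor and keep the one with lowest RSS.
        best = min(remaining, key=lambda c: rss_of(selected + [c]))
        selected.append(best)
        remaining.remove(best)
        path.append((list(selected), rss_of(selected)))
    return path

# Tiny usage example with synthetic data (3 predictors, only 2 informative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.5, size=100)
for cols, rss in forward_selection(X, y):
    print(cols, round(rss, 2))
```

The RSS always shrinks as variables are added, so the final model size would be chosen with a separate criterion (e.g. a validation set) rather than raw RSS.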

The R² statistic almost always increases as more predictors are added, so it cannot by itself tell whether a new predictor is worth including.

Interaction effects: relax the additive assumption by adding a product term such as X1 × X2 to the model.

Nonlinear effects: capture non-linear relationships with polynomial regression, e.g. by adding X² as an extra predictor (see the combined sketch below).
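
A minimal sketch showing that interaction and polynomial terms are simply extra columns in the design matrix, again fit with plain NumPy least squares; the variables x1, x2 and the synthetic data are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)   # first predictor (e.g. TV advertising budget)
x2 = rng.normal(size=n)   # second predictor (e.g. radio advertising budget)
y = 3 + 2 * x1 + 1.5 * x2 + 0.8 * x1 * x2 + 0.5 * x1**2 + rng.normal(scale=0.3, size=n)

# Plain additive model: y ~ 1 + x1 + x2
X_additive = np.column_stack([np.ones(n), x1, x2])

# Interaction effect: add the product term x1 * x2
X_interact = np.column_stack([np.ones(n), x1, x2, x1 * x2])

# Nonlinear (polynomial) effect: also add a quadratic term x1^2
X_poly = np.column_stack([np.ones(n), x1, x2, x1 * x2, x1**2])

for name, X in [("additive", X_additive), ("interaction", X_interact), ("poly", X_poly)]:
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    print(f"{name:11s} RSS = {rss:8.2f}")
```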