This post summarizes the textbook Statistical Learning with R chapter by chapter.
Linear Regression
RSS (Residual Sum of Squares)
we choose the coefficients that minimize the RSS
LSM (Least Squares Method)
gives the intercept and slope (coefficients) that minimize the RSS
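In simple regression the RSS-minimizing intercept and slope have a closed form. A minimal Python sketch (the textbook works in R; the function name `least_squares` is mine):

```python
# Closed-form simple least squares:
#   beta1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   beta0 = ybar - beta1 * xbar
def least_squares(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    beta1 = sxy / sxx            # slope minimizing RSS
    beta0 = ybar - beta1 * xbar  # intercept minimizing RSS
    return beta0, beta1
```

For points on the line y = 2x + 1, `least_squares([1, 2, 3, 4], [3, 5, 7, 9])` recovers (1.0, 2.0).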
SE (Standard Error)
Confidence Interval
Hypothesis Test
test the null hypothesis H0 against the alternative H1
P-value
smaller p-value -> stronger evidence against H0
T-Statistic
Larger |t-stat| -> smaller p-value -> reject H0
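For simple regression the t-statistic is t = beta1 / SE(beta1), with SE(beta1)^2 = RSE^2 / sum((x - xbar)^2). A sketch under those formulas (function name and interface are mine, not the book's):

```python
def t_statistic(x, y, beta0, beta1):
    # t = beta1 / SE(beta1), where SE(beta1)^2 = RSE^2 / sum((x - xbar)^2)
    n = len(x)
    xbar = sum(x) / n
    rss = sum((yi - (beta0 + beta1 * xi)) ** 2 for xi, yi in zip(x, y))
    rse = (rss / (n - 2)) ** 0.5  # residual standard error
    se_beta1 = rse / sum((xi - xbar) ** 2 for xi in x) ** 0.5
    return beta1 / se_beta1
```

A |t| far from 0 maps to a small p-value, so H0: beta1 = 0 gets rejected.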
Assessing Model
RSE (Residual Standard Error) -> estimate of the standard deviation of the error term e
R² Statistic -> proportion of variance explained, between 0 and 1
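Both fit measures come straight from the residuals: RSE = sqrt(RSS / (n - p - 1)) and R² = 1 - RSS/TSS. A sketch assuming fitted values `yhat` and p predictors (names are illustrative):

```python
def rse_and_r2(y, yhat, p):
    n = len(y)
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))  # residual sum of squares
    ybar = sum(y) / n
    tss = sum((yi - ybar) ** 2 for yi in y)               # total sum of squares
    rse = (rss / (n - p - 1)) ** 0.5  # estimates the sd of the error term
    r2 = 1 - rss / tss                # fraction of variance explained, in [0, 1]
    return rse, r2
```

A perfect fit gives RSE = 0 and R² = 1; R² shrinks toward 0 as the residuals grow relative to the total variation.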
Multiple Linear Regression
H0 check
F-Statistic -> F much larger than 1 -> reject H0
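The F-statistic compares the fitted model with the intercept-only model: F = ((TSS - RSS)/p) / (RSS/(n - p - 1)), which hovers near 1 under H0. A sketch with an assumed `yhat`/`p` interface (my naming):

```python
def f_statistic(y, yhat, p):
    # F = ((TSS - RSS) / p) / (RSS / (n - p - 1)); near 1 under H0,
    # much larger than 1 when at least one predictor matters.
    n = len(y)
    ybar = sum(y) / n
    tss = sum((yi - ybar) ** 2 for yi in y)
    rss = sum((yi - fi) ** 2 for yi, fi in zip(y, yhat))
    return ((tss - rss) / p) / (rss / (n - p - 1))
```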
Which predictors are useful?
1. brute force (all subsets), 2. greedy approach (forward [null model -> full model], backward [full model -> null model])
R² normally increases as the number of predictors grows
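The greedy forward path can be sketched generically. `rss_of` is a hypothetical callback that fits a model on a predictor subset and returns its RSS (any fitting routine can back it):

```python
def forward_selection(predictors, rss_of):
    # Start from the null model; at each step greedily add the
    # predictor whose inclusion yields the smallest RSS.
    selected, remaining = [], list(predictors)
    path = [tuple(selected)]
    while remaining:
        best = min(remaining, key=lambda p: rss_of(selected + [p]))
        selected.append(best)
        remaining.remove(best)
        path.append(tuple(selected))
    return path
```

The returned path goes from the empty model to the full model; because RSS (like R²) only improves as predictors are added, a separate criterion is still needed to pick the stopping point.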
interaction effects
nonlinear effects
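Both effects are handled by augmenting the features rather than changing the fitting method: an interaction adds the product x1*x2, a nonlinear effect adds e.g. x1². A minimal sketch (function name is illustrative):

```python
def expand_features(rows):
    # Each row (x1, x2) gains an interaction term x1*x2 and a
    # quadratic term x1**2; an ordinary linear fit on the expanded
    # rows then captures interaction and simple nonlinear effects.
    return [(x1, x2, x1 * x2, x1 ** 2) for x1, x2 in rows]
```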