This post is a chapter-by-chapter summary of the textbook Statistical Learning with R.
statistical learning : a vast set of tools for understanding data
Supervised Learning vs Unsupervised Learning
- supervised : both inputs and outputs are given ex) regression (wage), classification (stock market direction)
- unsupervised : only inputs are given ex) dimension reduction, clustering
Prediction vs Inference
- prediction : predict Y from X, treating the estimated f as a black box
- inference : understand the relationship between X and Y, i.e. how each predictor actually shapes f
Reducible vs Irreducible errors
- reducible : can be reduced by improving the estimate $\hat{f}$
- irreducible : comes from the noise term $\epsilon$, so no estimate of f can remove it (see the decomposition below)
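For reference, the split into the two error types comes from the chapter's decomposition of the expected squared prediction error (with $X$ and $\hat{f}$ held fixed):

$$
E\left[(Y - \hat{Y})^2\right] = \underbrace{\left[f(X) - \hat{f}(X)\right]^2}_{\text{reducible}} + \underbrace{\mathrm{Var}(\epsilon)}_{\text{irreducible}}
$$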
MSE (Mean Squared Error)
- $\mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{f}(x_i)\right)^2$, the average squared gap between observed responses and predictions
- measures how well the estimate $\hat{f}$ fits the data
- we want to choose $\hat{f}$ so that the reducible error is minimized (see the small example below)
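As a quick illustration (my own minimal sketch, not from the book; the arrays are hypothetical), MSE in code:

```python
import numpy as np

# hypothetical observed responses and model predictions
y = np.array([3.1, 2.4, 5.0, 4.2])
y_hat = np.array([2.9, 2.7, 4.6, 4.4])

# MSE: average of the squared residuals
mse = np.mean((y - y_hat) ** 2)
print(mse)  # small value = predictions close to the observed responses
```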
Parametric vs Non-parametric methods
- Parametric : assumes a functional form for f ex) linear, so only a small set of parameters has to be estimated
- Non-parametric : no assumption about the form of f, but typically needs many more observations (see the sketch below)
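A rough sketch of the contrast, assuming scikit-learn is available (the simulated data and the choice of K are my own illustration, not from the textbook's labs):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=100)  # true f is nonlinear

# Parametric: assume f is linear, so only intercept and slope are estimated
linear = LinearRegression().fit(X, y)

# Non-parametric: no assumed form; predict with the average of the 5 nearest neighbors
knn = KNeighborsRegressor(n_neighbors=5).fit(X, y)

x_new = np.array([[2.5]])
print(linear.predict(x_new), knn.predict(x_new))
```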
Flexibility vs Interpretability
- trade-off : more flexible models are harder to interpret
- too flexible -> risk of overfitting
- highly interpretable (restrictive) models -> risk of underfitting
Test MSE vs Training MSE
- training MSE keeps decreasing as flexibility grows, while test MSE is typically U-shaped
- so the model with the lowest training MSE can give a very different test outcome
- too flexible -> overfitting: low training MSE but high test MSE (see the sketch below this list)
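A minimal simulation sketch of the U-shape, assuming scikit-learn (the data-generating function and the grid of K values are illustrative assumptions): training MSE keeps falling as K shrinks (more flexibility), while test MSE eventually turns back up.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.5, size=300)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=1)

for k in [1, 5, 20, 50, 100]:  # smaller K = more flexible fit
    knn = KNeighborsRegressor(n_neighbors=k).fit(X_tr, y_tr)
    mse_tr = np.mean((y_tr - knn.predict(X_tr)) ** 2)
    mse_te = np.mean((y_te - knn.predict(X_te)) ** 2)
    print(f"K={k:3d}  training MSE={mse_tr:.3f}  test MSE={mse_te:.3f}")
```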
Bias-Variance trade-off
- Bias : error from approximating the true f with a simpler model; less flexible methods (larger K in KNN) have higher bias
- Variance : how much $\hat{f}$ would change if fit to a different training set; more flexible methods (smaller K in KNN) have higher variance (see the decomposition below)
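The trade-off is captured by the chapter's decomposition of the expected test MSE at a point $x_0$: a good method must achieve low variance and low bias at the same time, and the irreducible error $\mathrm{Var}(\epsilon)$ sets the floor.

$$
E\left[\left(y_0 - \hat{f}(x_0)\right)^2\right] = \mathrm{Var}\!\left(\hat{f}(x_0)\right) + \left[\mathrm{Bias}\!\left(\hat{f}(x_0)\right)\right]^2 + \mathrm{Var}(\epsilon)
$$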