# Machine Learning Course – Prof. 문일철 (Il-Chul Moon)

Lecture materials

Lecture videos

### Week 01. Introduction

1. Motivation

2. MLE (Maximum Likelihood Estimation)

3. MAP (Maximum a Posteriori Estimation) – Bayes

4. Probability & Distribution

## Supervised Learning

### Week 02. Fundamentals of ML

1. Rule-Based ML

2. Decision Tree (how to create a D.T.)
– Entropy & Information Gain
– Noise & inconsistencies

3. Linear Regression

http://archive.ics.uci.edu/ml/datasets/Housing
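The entropy and information-gain criterion listed under Decision Tree above can be sketched in a few lines; a minimal version (the toy labels and the perfect split are made up for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) = -sum p * log2(p) over class proportions."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, split_groups):
    """IG = H(parent) - weighted sum of child entropies after a split."""
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in split_groups)
    return entropy(labels) - remainder

# Toy example: an attribute that splits a 4-yes / 4-no parent perfectly
parent = ["yes"] * 4 + ["no"] * 4
children = [["yes"] * 4, ["no"] * 4]
print(information_gain(parent, children))  # 1.0: a perfect split removes all entropy
```

A decision-tree learner such as ID3 greedily picks, at each node, the attribute with the highest information gain.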

### Week 03. Naive Bayes Classifier

1. Optimal Classification

When P(X|Y) becomes a combination (a compound, joint distribution over many features), it is hard to estimate –> hence the Naive assumption.

2. Conditional Independence
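The conditional-independence factorization P(X|Y) = ∏ P(x_i|Y) can be sketched as a counting-based classifier; a minimal version (the weather-style toy data and the Laplace smoothing are illustrative choices):

```python
import math
from collections import Counter, defaultdict

def train_nb(X, y):
    """Naive Bayes: estimate P(y) and each P(x_i | y) by counting,
    relying on conditional independence: P(X|Y) = prod_i P(x_i|Y)."""
    priors = Counter(y)
    cond = defaultdict(Counter)            # (feature index, class) -> value counts
    for xs, label in zip(X, y):
        for i, v in enumerate(xs):
            cond[(i, label)][v] += 1
    n = len(y)

    def predict(xs):
        def log_joint(c):
            lp = math.log(priors[c] / n)   # log prior
            for i, v in enumerate(xs):     # sum of per-feature log likelihoods
                n_vals = len(set(row[i] for row in X))
                lp += math.log((cond[(i, c)][v] + 1) / (priors[c] + n_vals))  # Laplace smoothing
            return lp
        return max(priors, key=log_joint)
    return predict

# Toy data: (outlook, wind) -> play
X = [("sunny", "weak"), ("sunny", "strong"), ("rain", "weak"), ("rain", "strong")]
y = ["yes", "no", "yes", "no"]
predict = train_nb(X, y)
print(predict(("sunny", "weak")))  # "yes"
```

Only per-feature conditionals are estimated, so the number of parameters grows linearly in the number of features instead of exponentially as for the full joint.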

### Week 04. Logistic Regression

1. Decision Boundary

2. Parameter Approximation of Logistic Regression
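The logistic-regression likelihood has no closed-form maximizer, so the parameters are approximated iteratively; a minimal gradient-ascent sketch (the learning rate, step count, and 1-D toy data are arbitrary choices):

```python
import math

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Gradient ascent on the log-likelihood; theta[0] is the bias term."""
    theta = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        for xs, t in zip(X, y):
            z = theta[0] + sum(w * v for w, v in zip(theta[1:], xs))
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid
            err = t - p                     # gradient of the per-example log-likelihood
            theta[0] += lr * err
            for i, v in enumerate(xs):
                theta[i + 1] += lr * err * v
    return theta

# 1-D toy data: x < 0 -> class 0, x > 0 -> class 1
X = [(-2.0,), (-1.0,), (1.0,), (2.0,)]
y = [0, 0, 1, 1]
theta = fit_logistic(X, y)
p = 1.0 / (1.0 + math.exp(-(theta[0] + theta[1] * 2.0)))
print(round(p, 2))  # close to 1 for x = 2
```

Stochastic updates are used here for brevity; batch gradient ascent or Newton's method (IRLS) are the variants usually presented in lectures.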

### Week 05. SVM (Support Vector Machine)

1. Soft Margin Penalization

2. Kernel Trick

### Week 06. Training, Testing, and Regularization

1. Overfitting & Underfitting

2. Bias & Variance

3. Occam’s razor

4. Cross Validation

5. Performance Metrics

6. Regularization

## Graphical Model

### Week 07. Bayesian Network

8. Potential Function and Clique Graph

## Unsupervised Learning

### Week 08. K-Means Clustering & Gaussian Mixture Model

### Week 09. Hidden Markov Model

The three canonical HMM problems:
– Evaluation
– Decoding
– Learning

4. Viterbi Decoding Algorithm

5. Baum-Welch Algorithm
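Viterbi decoding finds the most likely hidden-state path by dynamic programming over max-probabilities; a minimal sketch (the weather/activity HMM and its probabilities are toy values for illustration):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden state sequence for `obs`."""
    # V[t][s] = (best probability of any path ending in s at time t, predecessor)
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p][0] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = (prob, prev)
    # Backtrack from the best final state
    path = [max(states, key=lambda s: V[-1][s][0])]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return path[::-1]

# Toy HMM: hidden weather states, observed activities
states = ("Rainy", "Sunny")
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(viterbi(("walk", "shop", "clean"), states, start_p, trans_p, emit_p))
# ['Sunny', 'Rainy', 'Rainy']
```

Replacing `max` with a sum over predecessors gives the forward algorithm (evaluation); Baum-Welch reuses those forward/backward quantities for learning.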

### Week 10. Sampling Based Inference

Sampling methods
– Forward Sampling
– Rejection Sampling
– Importance Sampling

Sampling-based inference
– Metropolis-Hastings
– Gibbs sampling
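A minimal Metropolis-Hastings sketch, assuming a symmetric Gaussian random-walk proposal and a standard-normal target (both choices, and the step size, are illustrative):

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Symmetric-proposal MH: accept x' with probability min(1, p(x')/p(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)          # symmetric proposal q(x'|x)
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal                             # accept the move
        samples.append(x)                            # on reject, repeat current x
    return samples

# Target: standard normal; an unnormalized log-density suffices for MH
log_std_normal = lambda x: -0.5 * x * x
samples = metropolis_hastings(log_std_normal, x0=0.0, n_samples=20000)
burned = samples[5000:]                              # discard burn-in
mean = sum(burned) / len(burned)
print(round(mean, 1))  # sample mean should be near 0
```

Because the proposal is symmetric, the Hastings correction q(x|x')/q(x'|x) cancels; Gibbs sampling is the special case where each coordinate is resampled from its exact conditional and every move is accepted.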
