Lecture 1: Introduction
1. Course overview
- course structure
- evaluation method
2. Introduction to Pattern Recognition
- main objectives
- classification / clustering
- applications

Lecture 2-1: Probability and Statistics
• Probability Theory
- Parameter Estimation
- Minimum Expectation
- Bayes Rule
- The Gaussian Distribution
- Exponential Family
• Probabilistic Decision Theory
- Reject option
- Minimizing risk
- Unbalanced class priors
- Combining models
|
|
Lecture 2-2: Probability and Statistics
• Probability Theory
- Parameter Estimation
- Minimum Expectation
- Bayes Rule
- The Gaussian Distribution
- Exponential Family
• Probabilistic Decision Theory
- Reject option
- Minimizing risk
- Unbalanced class priors
- Combining models
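
As a concrete illustration of the Bayes rule and the resulting decision rule, the following sketch computes class posteriors for a two-class problem; the priors and likelihoods are invented numbers for illustration only, not course material.

```python
def posterior(prior, likelihood, evidence):
    """Bayes' rule: P(C|x) = p(x|C) P(C) / p(x)."""
    return likelihood * prior / evidence

# Hypothetical priors and class-conditional likelihoods for an observation x.
p_c1, p_c2 = 0.6, 0.4
lik_c1, lik_c2 = 0.2, 0.5

# Evidence p(x) by the law of total probability.
evidence = lik_c1 * p_c1 + lik_c2 * p_c2

post_c1 = posterior(p_c1, lik_c1, evidence)
post_c2 = posterior(p_c2, lik_c2, evidence)

# The Bayes decision rule picks the class with the larger posterior.
decision = 1 if post_c1 >= post_c2 else 2
```

Here the larger likelihood of class 2 outweighs its smaller prior, so the rule selects class 2.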

Lecture 3-1: Bayesian Decision Theory & Cross Validation
• Probability Theory
- Bayesian Decision Rule
- Maximum a Posteriori decision rule
- Maximum Likelihood decision rule
- Reject option
• Risk Minimization
- Minimizing risk
- Unbalanced class priors
- Combining models
• Cross Validation
- Comparison of CV and Bootstrapping
|
|
Lecture 3-2: Bayesian Decision Theory & Cross Validation
• Probability Theory
- Bayesian Decision Rule
- Maximum a Posteriori decision rule
- Maximum Likelihood decision rule
- Reject option
• Risk Minimization
- Minimizing risk
- Unbalanced class priors
- Combining models
• Cross Validation
- Comparison of CV and Bootstrapping
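
To make the CV-versus-bootstrapping comparison concrete, here is a minimal standard-library sketch of both resampling schemes (the function names are illustrative, not course code): k-fold CV partitions the data so every point is validated exactly once, while the bootstrap draws with replacement and leaves an "out-of-bag" set for validation.

```python
import random

def kfold_indices(n, k, seed=0):
    """k-fold CV: shuffle indices and split them into k disjoint folds;
    each fold serves once as the validation set, the rest as training."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    return [(sorted(set(idx) - set(f)), sorted(f)) for f in folds]

def bootstrap_indices(n, seed=0):
    """Bootstrap: draw n indices with replacement; points never drawn
    (the out-of-bag set) can serve as a validation set."""
    rng = random.Random(seed)
    sample = [rng.randrange(n) for _ in range(n)]
    oob = sorted(set(range(n)) - set(sample))
    return sample, oob
```

Note the key contrast: CV validation sets partition the data exactly, whereas a bootstrap sample repeats some points and, on average, omits about 36.8% of them.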

Lecture 4: Normal Random Variable and Its Discriminant Function Designs
Normal Random Variable
- Properties
- Quadratic Discriminant Function Designs
Gaussian Mixture Model
- GMM Expression
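
Written out in standard textbook notation (a sketch; the course's exact notation may differ), the GMM expression is a weighted sum of Gaussian densities:

```latex
p(\mathbf{x}) = \sum_{k=1}^{K} \pi_k \,
    \mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}_k, \boldsymbol{\Sigma}_k),
\qquad \pi_k \ge 0, \quad \sum_{k=1}^{K} \pi_k = 1,
\qquad \text{where} \quad
\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma}) =
    \frac{1}{(2\pi)^{d/2} |\boldsymbol{\Sigma}|^{1/2}}
    \exp\!\Big(-\tfrac{1}{2} (\mathbf{x}-\boldsymbol{\mu})^{\mathsf T}
    \boldsymbol{\Sigma}^{-1} (\mathbf{x}-\boldsymbol{\mu})\Big).
```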

Lecture 5: Principal Component Analysis
Principal Component Analysis
- finds an orthonormal basis for the data
- sorts dimensions in order of importance
- discards low-significance dimensions
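
The PCA steps above (orthonormal basis, importance ordering) can be sketched for the 2-D case without any libraries, using the closed-form eigendecomposition of the 2×2 covariance matrix; `pca_2d` is an illustrative name, not course code.

```python
import math

def pca_2d(points):
    """PCA for 2-D data: build the covariance matrix, solve its
    characteristic equation in closed form, and return (eigenvalue,
    unit eigenvector) pairs sorted by decreasing importance."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] via the quadratic formula.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc

    def eigvec(lam):
        # Solve (S - lam I) v = 0; handle the axis-aligned case sxy == 0.
        if abs(sxy) > 1e-12:
            v = (lam - syy, sxy)
        else:
            v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
        norm = math.hypot(*v)
        return (v[0] / norm, v[1] / norm)

    return (lam1, eigvec(lam1)), (lam2, eigvec(lam2))
```

For points lying on the line y = x, all variance falls on the first component and the second eigenvalue is zero, which is exactly the "discard low-significance dimensions" step.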

Lecture 6: Support Vector Machines
The VC Dimension
- Classifier Margin
- Margin Estimation
- The Dual Problem
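
For reference, in the standard hard-margin formulation (textbook notation; the course's exact formulation may differ), the margin and the dual problem listed above are:

```latex
\min_{\mathbf{w}, b} \ \tfrac{1}{2} \|\mathbf{w}\|^2
\quad \text{s.t.} \quad y_i (\mathbf{w}^{\mathsf T} \mathbf{x}_i + b) \ge 1,
\qquad \text{margin} = \frac{2}{\|\mathbf{w}\|},
```

with the corresponding dual over Lagrange multipliers $\alpha_i$:

```latex
\max_{\boldsymbol{\alpha}} \ \sum_{i} \alpha_i
  - \tfrac{1}{2} \sum_{i} \sum_{j} \alpha_i \alpha_j y_i y_j \,
    \mathbf{x}_i^{\mathsf T} \mathbf{x}_j
\quad \text{s.t.} \quad \alpha_i \ge 0, \quad \sum_i \alpha_i y_i = 0.
```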

Lecture 7-1: Unsupervised Clustering
Partitional Clustering
- Centroid-based clustering
- K-means and K-medoids
- Gaussian mixture model
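
A minimal standard-library sketch of the K-means (Lloyd's) iteration for 2-D points, alternating nearest-centroid assignment and centroid recomputation; `kmeans` is an illustrative name, not course code.

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Lloyd's algorithm: repeat (1) assign each point to its nearest
    centroid, (2) move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[j].append(p)
        for i, c in enumerate(clusters):
            if c:  # keep the old centroid if a cluster goes empty
                centroids[i] = tuple(sum(v) / len(c) for v in zip(*c))
    return centroids
```

K-medoids follows the same alternating scheme but restricts each center to an actual data point, which makes it more robust to outliers.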
|
|
Lecture 7-2: Unsupervised Clustering
Partitional Clustering
- Centroid-based clustering
- K-means and K-medoids
- Gaussian mixture model

Lecture 8: Unsupervised Clustering (2)
Partitional Clustering
- Centroid-based clustering
- K-means and K-medoids
- Gaussian mixture model

Lecture 9: Perceptron, Logistic Regression, Multi-Layer Perceptron
Perceptron
- canonical representation
- optimization problem
- gradient descent search
Logistic Regression
- maximum likelihood learning
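
The maximum-likelihood learning of logistic regression can be sketched as batch gradient descent on the negative log-likelihood, which also illustrates the gradient descent search listed under the perceptron; the function name, learning rate, and toy data below are illustrative assumptions.

```python
import math

def logistic_regression_gd(X, y, lr=0.5, epochs=200):
    """Fit logistic regression by batch gradient descent.
    X: list of feature lists, y: labels in {0, 1}."""
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid
            err = p - yi                    # dNLL/dz for this sample
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b
```

On a tiny separable 1-D set such as x = 0, 1, 2, 3 with labels 0, 0, 1, 1, the learned decision boundary -b/w settles between the two classes.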

Lecture 10: Handwritten Digit (MNIST) Recognition Using Deep Neural Networks
MNIST handwritten digit database
Neural Networks
Autoencoder
Softmax Regression
Convolutional Neural Networks for MNIST
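
Softmax regression's output layer maps the network's 10 logits (one per MNIST digit class) to a probability distribution; a minimal, numerically stable sketch:

```python
import math

def softmax(logits):
    """Numerically stable softmax: subtract the max logit before
    exponentiating so exp() cannot overflow."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]
```

The subtracted maximum cancels in the ratio, so the result is mathematically unchanged while avoiding overflow for large logits.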

Lecture 11: Dynamic Time Warping for Dynamic Pattern Recognition
Dynamic Time Warping
Isolated Word Recognition
- metric distance
- isolated word recognition with DTW
DTW Applications
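
A minimal dynamic-programming sketch of the DTW distance used for isolated-word matching; the function name and the absolute-difference local metric are illustrative choices, and real word recognition would compare feature-vector sequences instead of scalars.

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Classic DTW: fill a cost table where each cell extends the
    cheapest of the three allowed warping moves (diagonal match,
    insertion, deletion), then read off the total alignment cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because the warp can stretch one sequence against the other, a sequence aligns at zero cost with a time-stretched copy of itself, which is why DTW suits speaking-rate variation in isolated word recognition.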
|