CS60050: Machine Learning

Course name: Machine Learning
Offered by: Computer Science & Engineering
Credits: 3
L-T-P: 3-0-0
Semester: Spring
Previous Year Grade Distribution: EX 32, A 46, B 47, C 31, D 20, P 7, F 1


Syllabus

Syllabus mentioned in ERP

The concept learning task. General-to-specific ordering of hypotheses. Version spaces. Inductive bias. Decision Tree Learning. Rule Learning: Propositional and First-Order, Over-fitting, Cross-Validation. Experimental Evaluation of Learning Algorithms. Instance-Based Learning: k-Nearest neighbor algorithm, Radial basis functions. Case-based learning. Computational Learning Theory: probably approximately correct (PAC) learning. Sample complexity. Computational complexity of training. Vapnik-Chervonenkis dimension. Artificial Neural Networks: Linear threshold units, Perceptrons, Multilayer networks and backpropagation, recurrent networks. Probabilistic Machine Learning: Maximum Likelihood Estimation, MAP, Bayes Classifiers, Naive Bayes. Bayes optimal classifiers. Minimum description length principle. Bayesian Networks, Inference in Bayesian Networks, Bayes Net Structure Learning. Unlabelled data: EM, preventing overfitting, co-training. Gaussian Mixture Models, K-means and Hierarchical Clustering, Clustering and Unsupervised Learning, Hidden Markov Models, Reinforcement Learning. Support Vector Machines. Ensemble learning: boosting, bagging.
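
As a quick illustration of one syllabus topic, below is a minimal sketch of the k-Nearest neighbor algorithm in Python with NumPy. The function knn_predict and the toy data are this page's own example, not course material.

import numpy as np

def knn_predict(X_train, y_train, x_query, k=3):
    # Euclidean distance from the query point to every training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over the neighbours' labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy usage with two 2-D classes (illustrative data only)
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([0.95, 1.0])))  # prints 1

Choosing an odd k avoids ties in a two-class majority vote.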


Concepts taught in class

Student Opinion

Study smart

Classroom resources

Additional Resources

Mid-sem

Mid-sem_solutions

Time Table

Day        Room
Monday     -
Tuesday    -
Wednesday  NC244
Thursday   NC244
Friday     NC244