This course runs for 16 weeks. Each week has 3 units, and each unit is 45 minutes long.

Time: Thursday from 1:30pm to 4:00pm.

Location: 6A411

News

2012/12/3 Website is up.

Course Description

The course is designed to provide a broad introduction to the theory, algorithms, and applications of machine learning. Topics include: supervised learning (generative/discriminative learning, support vector machines, logistic regression), unsupervised learning (clustering, feature selection, nonparametric Bayesian methods), learning theory (bias/variance tradeoff, model selection, VC theory), probabilistic graphical models (HMMs, structure learning), and applications to data mining, text processing, etc. Students are expected to have a basic computer science background, programming skills, and linear algebra. Prior knowledge of probability, statistics, and algorithms is beneficial.

Textbook

  1. Pattern Recognition and Machine Learning. Christopher Bishop, Springer, 2006.
  2. Machine Learning. Tom Mitchell, McGraw-Hill, 1997.
  3. Additional readings will be provided after each lecture.

Prerequisites

Grading

  1. Homework (4 assignments): 40%
  2. Projects: 40%
  3. Final presentation: 10%
  4. Attendance/presentations: 10%

Homework & Projects

There will be four basic homework assignments and four projects:

Homeworks

Hw1: Perceptron implementation (see the sketch after this list)
Hw2: PLSI (probabilistic latent semantic indexing) implementation
Hw3: SMO (sequential minimal optimization) implementation
Hw4: Dirichlet process mixtures implementation
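As an illustration of what the basic homework assignments involve, the following is a minimal sketch of a perceptron trainer for Hw1, assuming binary labels in {-1, +1} and dense NumPy feature arrays; the actual assignment may specify different data formats or variants (e.g., the averaged perceptron).

    import numpy as np

    def train_perceptron(X, y, epochs=10):
        """Classic mistake-driven perceptron. X: (n, d) array, y: labels in {-1, +1}."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for x_i, y_i in zip(X, y):
                # Update the weights only when the current point is misclassified.
                if y_i * (np.dot(w, x_i) + b) <= 0:
                    w += y_i * x_i
                    b += y_i
        return w, b

    def predict(w, b, X):
        # Predicted labels in {-1, +1} (0 only if a point lies exactly on the boundary).
        return np.sign(X @ w + b)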

Projects

Prj1: Follow-back prediction: Predict whether a user will follow back after receiving a new following link from another user.
Prj2: Friendship prediction: Predict whether two users are friends, given that at least one voice call or text message was sent from one to the other.
Prj3: Review rating prediction: Predict the rating scores of online hotel reviews.
Prj4: Algorithm analysis for topic models: Implement and compare approximate inference algorithms for LDA, including variational inference (Blei et al., 2003), collapsed Gibbs sampling (Griffiths et al., 2004), and (optionally) collapsed variational inference (Teh et al., 2006). (See the Gibbs sampling sketch after this list.)
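For Prj4, the following is a minimal sketch of collapsed Gibbs sampling for LDA in the style of Griffiths et al. (2004), assuming documents are given as lists of integer word ids; the number of topics K, the hyperparameters alpha and beta, and the iteration count are illustrative choices, not values prescribed by the project.

    import numpy as np

    def lda_gibbs(docs, vocab_size, K=10, alpha=0.1, beta=0.01, iters=200, seed=0):
        """docs: list of lists of word ids. Returns (theta, phi) estimates."""
        rng = np.random.default_rng(seed)
        D = len(docs)
        # Count tables: doc-topic counts, topic-word counts, topic totals,
        # and the current topic assignment of every token.
        n_dk = np.zeros((D, K))
        n_kw = np.zeros((K, vocab_size))
        n_k = np.zeros(K)
        z = [rng.integers(K, size=len(doc)) for doc in docs]
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
        for _ in range(iters):
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    k = z[d][i]
                    # Remove the current assignment from the counts.
                    n_dk[d, k] -= 1; n_kw[k, w] -= 1; n_k[k] -= 1
                    # Collapsed conditional p(z_i = k | z_-i, w), up to a constant.
                    p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + vocab_size * beta)
                    k = rng.choice(K, p=p / p.sum())
                    z[d][i] = k
                    n_dk[d, k] += 1; n_kw[k, w] += 1; n_k[k] += 1
        # Posterior mean estimates of the doc-topic and topic-word distributions.
        theta = (n_dk + alpha) / (n_dk.sum(axis=1, keepdims=True) + K * alpha)
        phi = (n_kw + beta) / (n_kw.sum(axis=1, keepdims=True) + vocab_size * beta)
        return theta, phi

The variational and collapsed variational alternatives listed for Prj4 replace the per-token sampling step with deterministic updates of per-token topic distributions, which is the main difference the project asks you to compare.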

Students are required to complete three of the basic homework assignments plus one project. The final project may be completed by a team of two students. You may also choose the topic of the final project on your own, and you must submit a project proposal and a final project report.

Additional References

Books

Links