CS2750  Machine Learning (ISSP 2170)


Time:  Tuesday, Thursday 1:00-2:15pm, 
Location: Sennott Square, Room 5313


Instructor:  Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos at cs pitt edu
office hours: Monday 2:30-4:30pm


TA:  Charmgil Hong
Computer Science Department
5406 Sennott Square
phone:
e-mail: charmgil at cs pitt edu
office hours: Tu 2:30-4:00pm, W 1:30-3:00pm


Announcements !!!!!



Links

Course description
Lectures
Homeworks
Term projects
Matlab



Abstract

The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, design of adaptive web agents, and others.

This introductory machine learning course will give an overview of many models and algorithms used in modern machine learning, including linear models, multi-layer neural networks, support vector machines, density estimation methods, Bayesian belief networks, mixture models, clustering, ensemble methods, and reinforcement learning. The course will give the student the basic ideas and intuition behind these methods, as well as a more formal understanding of how and why they work. Students will have an opportunity to experiment with machine learning techniques and apply them to a selected problem in the context of a term project.
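To give a small taste of these methods, the fragment below is a minimal, purely illustrative Matlab sketch (synthetic data, not course material) that fits a linear regression model, one of the topics listed above, by least squares:

    % Illustrative sketch only: least-squares linear regression on synthetic data.
    n = 50;                          % number of training examples
    x = linspace(0, 1, n)';          % inputs
    y = 2*x + 1 + 0.1*randn(n, 1);   % noisy targets from an assumed true line y = 2x + 1
    X = [ones(n, 1), x];             % design matrix with a bias (intercept) column
    w = X \ y;                       % least-squares estimate of the weights
    plot(x, y, 'o', x, X*w, '-');    % training data and the fitted line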

Course syllabus

Prerequisites

Knowledge of matrices and linear algebra (CS 0280), probability (CS 1151), statistics (CS 1000), programming (CS 1501) or equivalent, or the permission of the instructor.



Textbook:

Other very useful books:

Lectures
 
 
Lectures  Topic(s)  Assignments
January 7 Introduction to Machine Learning.

Readings: Bishop: Chapter 1

January 9 Machine Learning.

Readings: Bishop: Chapter 1

January 14 Matlab Tutorial


January 16 Density estimation

Readings: Bishop. Chapter 2.1.

Homework assignment 1 ( Data )
January 21 Density estimation

Readings: Bishop. Chapter 2.1-3.

January 23 Density estimation

Readings: Bishop. Chapter 2.

Homework assignment 2 ( Data )
January 28 Nonparametric density estimation
Linear regression

Readings: Bishop. Chapters 2.5, and 3.1.

January 30 Linear regression

Readings: Bishop. Chapter 3.1

Homework assignment 3 ( Data )
February 4 Classification learning: logistic regression, QDA, LDA.

Readings: Bishop. Chapters 4.2-3.

February 6 Classification learning II.

Readings: Bishop. Chapter 4

Homework assignment 4 ( Data )
February 11 Classification learning: Fisher Linear Discriminant. Support Vector Machines.

Readings: Bishop: Chapter 4.1.2-4, Chapter 7.

February 13 Support vector machines for regression. Multilayer neural networks.

Readings: Bishop. Chapter 5

Homework assignment 5 ( Data )
February 18 Non-parametric classification methods

Readings: Bishop.

February 20 Multiclass classification. Decision trees.

Readings: Bishop. Chapter 5

Homework assignment 6 ( Data )
February 25 Bayesian belief networks

Readings: Bishop: 8.1-2.

February 27 Bayesian belief networks: inference and learning

Readings: Bishop. Chapter 8.4.

Homework assignment 7 ( Data )
March 4 Midterm exam

Readings: all readings on or before February 27, 2014

March 6 Learning BBNs with hidden variables and missing values. Expectation maximization.

Readings: Bishop: 8

March 18 Expectation-maximization. Mixture of Gaussians.

Readings: Bishop: Chapter 9.

March 20 Clustering

Readings: Bishop. Chapter 9.

Homework assignment 8 ( Data )
March 25 Dimensionality reduction. Feature selection

Readings: Bishop: Chapter 12.1, 12.4.

March 27 Mixture of experts

Readings: Bishop: Chapter 14.

Homework assignment 9 ( Data )
April 1 Ensemble methods: bagging and boosting

Readings: Bishop. Chapter

April 3 Concept learning

Readings: Bishop. Chapter

Homework assignment 10 ( Data )
April 8 Reinforcement learning I.

Readings: Kaelbling, Littman, Moore. Reinforcement Learning: a survey

April 10 Reinforcement learning II.

Readings: Kaelbling, Littman, Moore. Reinforcement Learning: a survey




Homeworks

The homework assignments will mostly have the character of small projects and will require you to implement some of the learning algorithms covered in the lectures. Programming assignments will be implemented in Matlab. See rules for the submission of programs.

The assignments (both written and programming parts) are due at the beginning of the class on the day specified on the assignment. In general, no extensions will be granted.

Collaborations: No collaboration on homework assignments, programs, and exams is permitted unless you are specifically instructed to work in groups.
 



Term projects

The term project is due at the end of the semester and accounts for a significant portion of your grade.



Matlab

Matlab is a mathematical tool for numerical computation and manipulation, with excellent graphing capabilities. It provides a great deal of support and capabilities for the things you will need to run machine learning experiments. The CSSD at Pitt offers $5 student licenses for Matlab. To obtain the license, please check the following link to the Matlab CSSD page. In addition, Pitt has a number of Matlab licenses running on both Unix and Windows platforms. See the following web page for the details.
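As an illustration of how short such experiments can be, the following sketch, purely illustrative and using only core Matlab on synthetic data, computes a maximum-likelihood Gaussian density estimate and plots it against a normalized histogram of the sample:

    % Illustrative sketch only: maximum-likelihood Gaussian density estimate.
    data  = 5 + 2*randn(200, 1);                 % synthetic sample (true mean 5, std 2)
    mu    = mean(data);                          % ML estimate of the mean
    sigma = std(data, 1);                        % ML estimate of the standard deviation
    xs  = linspace(min(data), max(data), 100);
    pdf = exp(-(xs - mu).^2 ./ (2*sigma^2)) / (sigma*sqrt(2*pi));
    [counts, centers] = hist(data, 20);          % histogram of the sample
    width = centers(2) - centers(1);
    bar(centers, counts/(numel(data)*width), 1); % counts rescaled to a density
    hold on; plot(xs, pdf, 'r', 'LineWidth', 2); hold off;  % fitted Gaussian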

Matlab tutorial file.

Other Matlab resources on the web:

Online MATLAB  documentation
Online Mathworks documentation including MATLAB toolboxes


Cheating policy: Cheating and any other anti-intellectual behavior, including giving your work to someone else, will be dealt with severely and will result in a failing (F) grade. If you feel you may have violated the rules, speak to us as soon as possible. Please make sure you read, understand, and abide by the Academic Integrity Code for the Faculty and College of Arts and Sciences.

Students With Disabilities:
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services, 216 William Pitt Union, (412) 648-7890, as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.


Course webpages from Spring 2012, Spring 2011, Spring 2010, Spring 2004 and Spring 2003



Last updated by Milos on 01/07/2014