CS 2750 Machine Learning (ISSP 2170)


Time:  Monday, Wednesday 11:00am-12:15pm
Location: Sennott Square, Room 5313


Instructor:  Milos Hauskrecht
Computer Science Department
5329 Sennott Square
phone: x4-8845
e-mail: milos at cs pitt edu
office hours: Mondays 1:45-3:15pm


TA:  Zhipeng (Patrick) Luo
Computer Science Department
6509 Sennott Square
phone:
e-mail: ZHL78 at pitt edu
office hours: Tuesdays 11:00am - 2:00pm


Announcements !!!!!



Links

Course description
Lectures
Homeworks
Term projects
Matlab



Abstract

The goal of the field of machine learning is to build computer systems that learn from experience and are capable of adapting to their environments. Learning techniques and methods developed by researchers in this field have been successfully applied to a variety of learning tasks in a broad range of areas, including text classification, gene discovery, financial forecasting, credit card fraud detection, collaborative filtering, the design of adaptive web agents, and others.

This introductory machine learning course will give an overview of many models and algorithms used in modern machine learning, including linear models, multi-layer neural networks, support vector machines, density estimation methods, Bayesian belief networks, mixture models, clustering, ensemble methods, and reinforcement learning. The course will give students the basic ideas and intuition behind these methods, as well as a more formal understanding of how and why they work. Students will have an opportunity to experiment with machine learning techniques and apply them to a selected problem in the context of a term project.
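To give a flavor of one of the simplest methods on this list, here is a minimal, purely illustrative Matlab sketch (not part of the course materials; every name and number in it is our own assumption) that fits a linear regression model to synthetic data by least squares:

% Minimal sketch: least-squares linear regression on synthetic data (illustrative only).
rng(0);                          % fix the random seed for reproducibility
n = 100;                         % number of training examples
x = linspace(0, 1, n)';          % inputs in [0, 1]
y = 2*x + 1 + 0.1*randn(n, 1);   % noisy targets from a known linear function

X = [ones(n, 1) x];              % design matrix with a bias (intercept) column
w = X \ y;                       % least-squares weights via the backslash operator

yhat = X * w;                    % predictions on the training inputs
mse  = mean((y - yhat).^2);      % mean squared training error
fprintf('w0 = %.3f, w1 = %.3f, training MSE = %.4f\n', w(1), w(2), mse);

The backslash operator solves the least-squares problem directly; the lectures develop this and the other listed models in much more detail, including proper evaluation on held-out data.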

Course syllabus

Prerequisites

Knowledge of matrices and linear algebra, probability (CS 1151), statistics (CS 1000), and programming (CS 1501), or their equivalents, or the permission of the instructor.



Textbook: Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006 (the Bishop readings referenced in the schedule below).

Other ML books:

Lectures
 
 
Each entry below lists the lecture date, topic(s), readings, and assignment (if any).
January 5 Introduction to Machine Learning.

Readings: Bishop: Chapter 1

January 7 Introduction to Machine Learning.

Readings: Bishop: Chapter 1

January 12 Density Estimation I.

Readings: Bishop. Chapter 2.1-3.

January 14 Matlab tutorial

Readings:

Homework assignment 1 ( Data for homework 1)
January 21 Density estimation II

Readings: Bishop. Chapter 2.1-3.

Homework assignment 2 ( Data for homework 2)
January 26 Density estimation III

Readings: Bishop. Chapter 2.

January 28 Linear regression

Readings: Bishop. Chapter 3.1.

Homework assignment 3 ( Data for homework 3)
February 2 Classification learning: Logistic Regression, Generative classification models.

Readings: Bishop. Chapter 4.2-3.

February 4 Classification learning II. Evaluation of classifiers.

Readings: Bishop. Chapter 4.

Homework assignment 4 ( Data for homework 4)
February 9 Fisher Linear Discriminant. Support Vector Machines.

Readings: Bishop: Chapter 4.1.2-4, Chapter 7.

February 11 Support vector machines for regression. Nonparametric/instance based classification methods

Readings: Bishop. Chapter 4.

Homework assignment 5 ( Data for homework 5)
February 16 Multilayer Neural Networks

Readings: Bishop. Chapter 5.

February 18 Multiclass classification. Decision trees.

Readings: Bishop. Chapter .

Homework assignment 6 ( Data for homework 6)
February 23 Bayesian Belief Networks

Readings: Bishop. Chapter .

February 25 Bayesian Belief Networks. Inference and Learning.

Readings: Bishop. Chapter .

Homework assignment 7 ( Data for homework 7)
March 2 Midterm exam

Readings: Everything covered before or on February 25, 2015.

March 4 Expectation maximization algorithm.

Readings: Bishop. Chapter 8.

Homework assignment 8 ( Data for homework 8)
March 16 Expectation maximization algorithm. Mixture of Gaussians.

Readings: Bishop. Chapter 9.

March 18 Clustering.

Readings: Bishop. Chapter 9.

Homework assignment 9 ( Data for homework 9)
March 23 Ensemble methods: Mixture of experts. Bagging.

Readings:

March 25 Ensemble methods: Boosting.

Readings:

Homework assignment 10 ( Data for homework 10)
March 30 Dimensionality reduction. Feature selection.

Readings: Bishop: Chapter 12.1, 12.4.

April 1 Dimensionality reduction II. Reinforcement learning.

Readings:

April 6 Reinforcement learning

Readings:

April 8 Concept learning

Readings:

April 15 Final exam

Readings: all semester

April 20 and 22 Term project presentations

Readings: all semester




Homeworks

The homework assignments will mostly have the character of small projects and will require you to implement some of the learning algorithms covered in the lectures. Programming assignments will be implemented in Matlab. See the rules for the submission of programs.

The assignments (both written and programming parts) are due at the beginning of the class on the day specified on the assignment. In general, no extensions will be granted.

Collaborations: No collaboration on homework assignments, programs, and exams is permitted unless you are specifically instructed to work in groups.
 



Term projects

The term project is due at the end of the semester and accounts for a significant portion of your grade.



Matlab

Matlab is a tool for numerical computation and data manipulation, with excellent graphing capabilities, and it provides a great deal of support for the things you will need to run machine learning experiments. CSSD at Pitt offers $5 student licenses for Matlab; to obtain a license, please check the following link to the Matlab CSSD page. In addition, Pitt has a number of Matlab licenses running on both Unix and Windows platforms. See the following web page for the details.
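As a rough illustration of the kind of experiment Matlab makes easy (this sketch is not part of the course tutorial, and every variable name in it is an assumption), the following code builds a small synthetic two-class data set, labels held-out points with a one-nearest-neighbor rule, and plots the data:

% Minimal sketch of a Matlab machine learning experiment (illustrative only).
rng(1);                                           % reproducible random numbers
n = 200;
X = [randn(n/2, 2) + 1.5; randn(n/2, 2) - 1.5];   % two Gaussian clusters in 2-D
y = [ones(n/2, 1); zeros(n/2, 1)];                % class labels

idx = randperm(n);                                % shuffle, then split 70/30
tr = idx(1:140);  te = idx(141:end);

yhat = zeros(numel(te), 1);                       % 1-NN prediction for each test point
for i = 1:numel(te)
    d = sum(bsxfun(@minus, X(tr, :), X(te(i), :)).^2, 2);   % squared Euclidean distances
    [~, j] = min(d);                              % index of the closest training point
    yhat(i) = y(tr(j));                           % copy its label
end
fprintf('test accuracy: %.2f\n', mean(yhat == y(te(:))));

plot(X(y==1, 1), X(y==1, 2), 'r+', X(y==0, 1), X(y==0, 2), 'bo');   % quick scatter plot

bsxfun is used for the distance computation so the sketch also runs on older Matlab releases that lack implicit expansion.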

Matlab tutorial file.

Other Matlab resources on the web:

Online MATLAB  documentation
Online Mathworks documentation including MATLAB toolboxes


Cheating policy: Cheating and any other anti-intellectual behavior, including giving your work to someone else, will be dealt with severely and will result in a failing (F) grade. If you feel you may have violated the rules, speak to us as soon as possible. Please make sure you read, understand, and abide by the Academic Integrity Code for the Faculty and College of Arts and Sciences.

Students With Disabilities:
If you have a disability for which you are or may be requesting an accommodation, you are encouraged to contact both your instructor and Disability Resources and Services, 216 William Pitt Union, (412) 648-7890, as early as possible in the term. DRS will verify your disability and determine reasonable accommodations for this course.


Course webpages from Spring 2014, Spring 2012, Spring 2011, Spring 2010, Spring 2004 and Spring 2003



Last updated by Milos on 01/04/2015