Welcome to CSI 772

Statistical Learning

Spring, 2016

Instructor: James Gentle

Lectures: Thursdays 4:30pm - 7:10pm, Innovation Hall 133

If you send email to the instructor, please put "CSI 772" in the subject line.


Course Description

"Statistical learning" refers to the analysis of data with the objective of identifying patterns or trends. We distinguish supervised learning, in which we seek to predict an outcome measure or class from a sample of input measures, from unsupervised learning, in which we seek to identify and describe relationships and patterns among a sample of input measures. The emphasis of the course is on supervised learning, but it addresses the elements of both supervised and unsupervised learning. It covers essential material for developing new statistical learning algorithms.
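
As a rough illustration of the distinction (not part of the course materials), the following sketch uses R's built-in iris data: a supervised method predicts a known outcome from input measures, while an unsupervised method looks for structure without using any outcome labels.

    ## Illustrative sketch only, using R's built-in iris data.

    ## Supervised: predict a known outcome (is the flower setosa?) from inputs.
    iris2 <- transform(iris, setosa = as.numeric(Species == "setosa"))
    fit <- glm(setosa ~ Sepal.Length + Sepal.Width, data = iris2,
               family = binomial)
    summary(fit)$coefficients

    ## Unsupervised: look for structure without using any outcome labels.
    cl <- kmeans(iris[, 1:4], centers = 3)
    table(cl$cluster, iris$Species)   # compare clusters to the (unused) labels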

Prerequisites

Calculus-level probability and statistics, such as in CSI 672/STAT 652, and some general knowledge of applied statistics.

Text and other materials

The text is T. Hastie, R. Tibshirani, and J. Friedman (HTF) The Elements of Statistical Learning, second edition, Springer-Verlag, 2009. ISBN 978-0-387-84857-0. The website for the text is http://www-stat.stanford.edu/ElemStatLearn/.

The course organization and content will closely follow that of the text. The text is quite long, however, so some topics will be covered only lightly and some chapters will be skipped entirely. The main chapters we will cover are 1--4, 7, 9, 10, and 12--15. Also, we will discuss some methods that are not covered in the text.

The software used in this course is R, a free, open-source software environment that can be downloaded from the Comprehensive R Archive Network (CRAN). It is also available on various GMU computers in student labs.

No prior experience in R is assumed for this course. A good site for getting started with R, especially for people who are somewhat familiar with SAS or SPSS, is Quick R.
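
As a minimal getting-started sketch (assuming a standard R installation): the ElemStatLearn package is the companion data package for the HTF text; if it is not available from CRAN, the datasets can be obtained from the text's website.

    ## Getting started with R, assuming a standard installation.
    ## ElemStatLearn is the companion data package for the HTF text.
    install.packages("ElemStatLearn")   # one-time installation from CRAN
    library(ElemStatLearn)

    data(prostate)     # one of the datasets used in the text
    str(prostate)      # inspect the variables
    summary(prostate)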

Lectures

Students are expected to attend class and take notes as they see appropriate. Lecture notes and slides used in the lectures will usually not be posted.


Grading

Student work in the course (and the relative weighting of this work in the overall grade) will consist of

  • homework assignments, mostly exercises in the text (15%)
  • project (15%)
  • midterm exam (30%)
  • final exam (40%)

    You are expected to take the final exam during the designated time period.

    Incomplete grades will not be granted except under very special circumstances.

    Homework

    Each homework assignment will be graded on a 100-point scale. Five points will be deducted for each day the homework is late (weekends count!), and homework more than 5 days late will not be accepted. Start each problem on a new sheet of paper and label it clearly. Homework must be submitted on paper; it will not be accepted as computer files (and certainly not as faxes!).

    Project

    Each student must complete a project in the area of statistical learning. The project will involve comparison of classification methods using a dataset from the University of California at Irvine (UCI) Machine Learning Repository.

    Because class time is not sufficient to cover even all of the most common methods of learning, a student may wish to do a project involving methods that are addressed in the text but not covered in class.

    The project will require a written report and, depending on available class time, may involve an oral presentation.
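
    As a purely illustrative sketch (not a prescription for the project), a comparison of two standard classifiers in R might look like the following; an actual project would use a dataset downloaded from the UCI repository and a more careful evaluation than a single train/test split.

        ## Illustrative only: comparing two classifiers on R's built-in iris
        ## data with a single train/test split. A real project would instead
        ## use a UCI dataset and cross-validation.
        library(MASS)    # lda()
        library(class)   # knn()

        set.seed(772)
        train <- sample(nrow(iris), 100)

        lda.fit  <- lda(Species ~ ., data = iris[train, ])
        lda.pred <- predict(lda.fit, iris[-train, ])$class

        knn.pred <- knn(train = iris[train, 1:4], test = iris[-train, 1:4],
                        cl = iris$Species[train], k = 5)

        ## Test-set misclassification rates
        mean(lda.pred != iris$Species[-train])
        mean(knn.pred != iris$Species[-train])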


    Academic honor

    Each student enrolled in this course must assume the responsibilities of an active participant in GMU's scholarly community in which everyone's academic work and behavior are held to the highest standards of honesty. The GMU policy on academic conduct will be followed in this course.

    Make sure that work that is supposed to be yours is indeed your own

    With cut-and-paste capabilities on webpages, it is easy to plagiarize.
    Sometimes plagiarism is even accidental, resulting from legitimate note-taking.

    Some good guidelines are here:
    http://ori.dhhs.gov/education/products/plagiarism/
    See especially the entry "26 Guidelines at a Glance".

    Collaborative work

    Students are free to discuss homework problems or other topics with each other or with anyone else, and are free to use any reference sources. Group work and discussion outside of class are encouraged, but explicit copying of homework solutions is not permitted.

    Students with disabilities

    Certification of a disability that requires accommodations must be made by the Office of Disability Services (ODS). If you are a student with a disability and desire academic accommodations, please contact ODS and inform me during the first two weeks of classes.

    All academic accommodations must be arranged through the ODS.


    Approximate schedule

    The details of the schedule will evolve as the semester progresses.

    Week 1, January 21


    Week 2, January 28


    Week 3, February 4


    Week 4, February 11


    Week 5, February 18


    Week 6, February 25


    Week 7, March 3

    Midterm: mostly material from Chapters 1 through 3 in HTF.
    Closed book, closed notes, and closed computers except for one sheet (front and back) of prewritten notes.

    March 10

    Class does not meet.

    Week 8, March 17


    Week 9, March 24


    Week 10, March 31


    Week 11, April 7

  • General comments on fitting (estimation, classification, prediction, smoothing, etc.)
  • More on trees
  • Recent developments in neural nets

    Week 12, April 14

  • Support vector machines (Chapter 12)
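
    As a small aside (not from the text or the lectures), the sketch below shows one way support vector machines can be fit in R, assuming the e1071 package is installed.

        ## Minimal SVM sketch, assuming the e1071 package is installed
        ## (install.packages("e1071") if needed).
        library(e1071)

        fit  <- svm(Species ~ ., data = iris, kernel = "radial", cost = 1)
        pred <- predict(fit, iris)
        table(predicted = pred, actual = iris$Species)   # training confusion matrix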

    Week 13, April 21


    Week 14, April 28


    Cinco de Mayo (May 5)

    Final Exam.
    4:30pm - 6:30pm
    Closed books, notes, and computers.