A 2-Day Seminar Taught by Kevin Grimm, Ph.D.
Machine learning has emerged as a major field of statistics and data analysis where the goal is to create reliable and flexible predictive models. These methods have gained much attention for analyzing large datasets that may be composed of several hundred variables and many thousands (perhaps millions) of participants. In these situations, machine learning algorithms attempt to identify key variables needed in the predictive model, and several techniques search for nonlinear associations and interactive effects.
While machine learning techniques have been most attractive for large datasets, these same techniques can be useful in smaller datasets for the same reasons: to create simpler and more reliable predictive models, and to search for nonlinear and interactive effects. These techniques are also a natural follow-up to standard hypothesis-driven statistical analyses (e.g., multiple regression), allowing you to search for additional important patterns in the data.
The first day starts with an overview of machine learning and continues with an introduction to the basic techniques. Topics for day 1 include cross-validation, multiple regression, and basic variable selection methods, as well as an overview of the R statistical framework. The second day focuses on advanced variable selection methods for regression analysis. Topics include multivariate adaptive regression splines, lasso regression, classification and regression trees, bagging, and random forests. Throughout the course, participants gain experience with these methods through hands-on exercises.
This seminar will use R for the empirical examples and exercises. To participate in the hands-on exercises, you are strongly encouraged to bring a laptop computer with the most recent versions of R and RStudio installed. RStudio is a front end for R that makes it easier to work with. Both programs are free and available for Windows, Mac, and Linux platforms.
Who should attend?
If you want to learn how to explore your data effectively and have a strong statistical background in regression, this course is for you. You should have a good working knowledge of the principles and practice of multiple regression. Familiarity with the R programming language is also helpful. There are a number of excellent introductory books on R, as well as a collection of online tutorials for those who are unfamiliar with it (e.g., https://www.tutorialspoint.com/r/).
Location, Format, and Materials
The class will meet from 9 am to 5 pm each day, with a 1-hour lunch break, at Temple University Center City, 1515 Market Street, Philadelphia, PA 19103.
Participants receive a bound manual containing detailed lecture notes (with equations and graphics), examples of computer printout, and many other useful features. The manual frees participants from the distracting task of note taking.
Registration and Lodging
The fee of $995 includes all seminar materials. The early registration fee of $895 is available until March 20.
If you cancel your registration at least two weeks before the course is scheduled to begin, you are entitled to a full refund (minus a processing fee of $50).
Lodging Reservation Instructions
A block of guest rooms has been reserved at the Club Quarters Hotel, 1628 Chestnut Street, Philadelphia, PA, at a special rate of $159 per night. This location is about a 5-minute walk from the seminar location. To make reservations, call 203-905-2100 during business hours and identify yourself with group code SH0419. For the guaranteed rate and availability, you must reserve your room no later than Monday, March 19, 2018.
If you make reservations after the cut-off date, ask for the Statistical Horizons room rate (do not use the group code) and the hotel will try to accommodate your request.
Seminar Outline
1. Introduction to Machine Learning
a. Introduction to machine learning
b. Introduction to R
c. Single predictor regression models & cross-validation
d. Multiple regression
e. Best subsets regression & forward selection
2. Advanced Variable Selection
a. Multivariate adaptive regression splines & lasso regression
b. Review of logistic regression & decision theory
c. Classification & regression trees
d. Bagging trees & random forests