Last updated: 2021-05-29 10:31:47

- Lecture 17 – Neural Nets, SGD, Optimization – Rmd
- Lecture 18 – Review of Course – Rmd
- Textbook refs: Neural Nets – ESL Ch. 11

- Lecture 15 – Causal Inference 1: RCTs – Rmd
- JTPA Data: raw, cleaned, cleaning code

- *Optional* Predictions 3 – Sunday Cases Redux – Rmd
- My predictions, Rmd, and data

- Lecture 16 – Causal Inference 2: Targeting, Observational Methods – Rmd
- HW7 and Rmd
- Textbook References:
- RCTs/ATEs: Causal Inference Mixtape Ch. 4
- Observational Causal: Causal Inference Mixtape, Chapters 6, 7, 8, and 10 for Regression Discontinuity, Instrumental Variables, Diff-in-Diff, and Synthetic Controls, respectively
- Targeting/Heterogeneous Treatment Effects: no good textbook treatment at the moment, so I’ve linked a few papers: Tibshirani, Hastie, et al. (and here); CATE meets ML; Optimal Targeting with Heterogeneous TEs

- Lecture 13 – Data Cleaning – Rmd
- Data: cigarettes

- Lecture 14 – Data Cleaning and Bayes – Rmd
- Cleaned Zillow Data

- Homework 6 and Rmd – Cleaning
- Textbook References (two comprehensive R references, both free online):
- R for Data Science – Hadley Wickham: the “Wrangle” section in particular.
- R Cookbook: Help with various data types

- Lecture 11 – Forests and Boosting – Rmd
- Predictions Competition 2 and Rmd
- My predictions, Rmd, and data

- Lecture 12 – Boosting, Ensembles, and Rolling Block CV – Rmd
- Homework 5 and .Rmd
- Textbook References:
- Boosting: ESL Ch. 10
- Model Averaging: ESL Ch. 8
- Ensembles: ESL Ch. 16

- Lecture 9 – ROC and Trees – Rmd
- Lecture 10 – Trees and Forests – Rmd
- Textbook References:
- Trees: ESL Ch. 9.2
- Bagging: ESL Ch. 8.7
- Forests: ESL Ch. 15

- Lecture 7 – Classification – Rmd, and PDF
- Credit Data and codebook

- Homework 4 and Rmd
- Lecture 8 – Classification 2 – Rmd
- Surveys: General Feedback and 1-on-1 Meeting Signup
- Textbook References:
- KNN: ESL Ch. 2.3.2, 13.3
- Binomial Classification: ESL Ch. 4, Ch. 9.2.3, Ch. 13.3
- Sensitivity/Specificity/ROC curve: ESL Ch. 9.2.5
- Multinomial Regression: ESL Ch. 4.4.1 (not the most helpful)

- Lecture 5 – Variable Selection – Rmd and PDF
- For those of you struggling with the homeworks, there is a lot of valuable code in the older lecture Rmd files.
- Semiconductors: Data, code in lecture Rmd
- Comscore: Code in lecture Rmd, Data: domains, sites, total spend
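The lecture Rmd contains the course’s own code; purely as a generic sketch of penalized variable selection (this uses the glmnet package and simulated data — an assumption on my part, not necessarily the lecture’s approach):

```r
# Hypothetical LASSO illustration with glmnet (not the course code).
# Requires install.packages("glmnet").
library(glmnet)

set.seed(1)
n <- 200; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- x[, 1] - 2 * x[, 2] + rnorm(n)   # only the first two columns matter

cvfit <- cv.glmnet(x, y)              # cross-validated lambda path
coef(cvfit, s = "lambda.min")         # coefficients at the CV-chosen penalty
```

Most coefficients at `lambda.min` should be shrunk to exactly zero, which is the sense in which the LASSO performs variable selection.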

- Lecture 6 – Cross Validation – Rmd and PDF
- The last few slides include a lot of useful code. If you can interpret and run the cross-validation code there, you should have a solid grasp of cross-validation in general.
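As a minimal base-R sketch of the idea behind that code (an illustration, not the lecture slides’ code): k-fold cross-validation assigns each observation to a fold, fits on all but one fold, and averages the held-out error.

```r
# Minimal k-fold cross-validation for lm() on simulated data.
set.seed(1)
n <- 100
df <- data.frame(x = rnorm(n))
df$y <- 2 * df$x + rnorm(n)

k <- 5
folds <- sample(rep(1:k, length.out = nrow(df)))  # random fold labels
mse <- numeric(k)
for (i in 1:k) {
  train <- df[folds != i, ]                # fit on k-1 folds
  test  <- df[folds == i, ]                # hold out fold i
  fit   <- lm(y ~ x, data = train)
  pred  <- predict(fit, newdata = test)
  mse[i] <- mean((test$y - pred)^2)        # out-of-sample error on fold i
}
mean(mse)  # cross-validated estimate of out-of-sample MSE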

- Homework 3 and Rmd
- Textbook References
- AIC: ESL Ch. 7.5
- BIC: ESL Ch. 7.7
- Stepwise: ESL Ch. 3.3
- LASSO/Ridge/Shrinkage: ESL Ch. 3.4, 3.6, 3.8
- Cross-Validation: ESL Ch. 7.10
- Bias-Variance Decomp: ESL Ch. 7.2, 7.3

- Lecture 3 – Regression: Linear and Logit – Rmd and PDF
- Homework 2 – Due Wednesday April 14
- Lecture 4 – Deviance, OOS, Bootstrap – Rmd and PDF
- Textbook References:
- Linear Regression: ESL Ch. 3.2
- Logistic Regression: ESL Ch. 4.4
- Deviance: ESL p. 124
- Out-of-Sample: ESL Ch. 7.1 (uses the term ‘generalization error’ or ‘validation error’)
- Bootstrap: ESL Ch. 7.11
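For intuition alongside the ESL reading, a minimal nonparametric bootstrap sketch in base R (an added illustration, not the lecture’s code): resample the data with replacement and recompute the statistic to approximate its sampling variability.

```r
# Bootstrap the standard error of a sample mean.
set.seed(1)
x <- rnorm(50, mean = 3)                 # simulated sample
B <- 1000                                # number of bootstrap draws
boot_means <- replicate(B, mean(sample(x, replace = TRUE)))
sd(boot_means)                           # bootstrap SE of mean(x)
```

The same pattern works for any statistic: swap `mean` for the estimator of interest and read off the spread of the bootstrap distribution.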

- Syllabus.
- Textbooks:
- HW0: An ungraded assignment to make sure everyone is up to speed. Entirely optional.
- HW0 and the .Rmd code it is based on
- HW0 Answers and .Rmd file

Questions? Email me (cdowd@chicagobooth.edu) and I’ll help you out.