Table of Contents
I. Welcome Video
II. Course Overview at a Glance
III. Course Objectives
IV. Course Description
V. Tentative Schedule
VI. Sample Lecture
I. Welcome Video
(Welcome video omitted from the print version.)
II. Course Overview at a Glance
Time & Place | TuTh 10:00am – 11:15am, Wood Building WG73 |
Instructor | Abdus Sattar, PhD |
Office | Wood Building W-G51 |
Teaching Assistant | TBD |
Office Hours | Monday and Wednesday 3:00pm – 4:00pm or by appointment |
E-mail/Phone | Phone: 1-216-368-1501; Email: sattar@case.edu |
Course Web Page | canvas.case.edu |
Textbook (Required) | In All Likelihood: Statistical Modeling and Inference Using Likelihood by Yudi Pawitan |
Prerequisites | |
Disability Help | If you have a disability and need help, please contact me and the Office of Educational Support Services (disability@case.edu, 216-368-5230) as early as possible in the term. |
Academic Integrity | You are expected to maintain the highest integrity in your work for this class. This includes not passing off anyone else’s work as your own, even with their permission. Your homework solutions must be your own work, not from outside sources, consistent with the university rules on academic honesty. I expect you to follow this policy scrupulously. Evidence of academic dishonesty may lead to loss of credit for the assignment, and possibly failure of the course. |
III. Course Objectives
- Gain proficiency in likelihood-based modeling and inference.
- Hone skills in applying contemporary likelihood theory to solve statistical problems.
- Gain competency in computing by solving estimation problems and judging the quality of inferences via simulation studies (see the brief sketch after this list).
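The computing component is easiest to picture with a small example. The sketch below is a minimal Python illustration, not a course assignment; the exponential model, sample size, and number of replications are assumptions chosen only to show what a simulation study of estimator quality looks like: it tracks the average maximum likelihood estimate and the coverage of a Wald confidence interval.

```python
# Minimal illustrative sketch (not a course assignment): a Monte Carlo study of
# the maximum likelihood estimator for an Exponential(rate) sample, checking the
# estimator's bias and the coverage of a 95% Wald confidence interval.
# The model, sample size, and seed are arbitrary choices made for illustration.
import numpy as np

rng = np.random.default_rng(1)
true_rate, n, n_sim = 2.0, 50, 5000
estimates, covered = [], 0

for _ in range(n_sim):
    x = rng.exponential(scale=1.0 / true_rate, size=n)
    rate_mle = 1.0 / x.mean()              # MLE of the exponential rate
    se = rate_mle / np.sqrt(n)             # Wald SE from Fisher information n / rate^2
    covered += (rate_mle - 1.96 * se) <= true_rate <= (rate_mle + 1.96 * se)
    estimates.append(rate_mle)

print(f"mean of MLEs: {np.mean(estimates):.3f}  (true rate {true_rate})")
print(f"95% Wald interval coverage: {covered / n_sim:.3f}")
```

The same pattern of simulate, estimate, and summarize bias and coverage carries over to the likelihood methods covered in the course.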
IV. Course Description
This course introduces contemporary likelihood theory and its applications in solving statistical problems. Topics include maximum likelihood theory; profile, pseudo-, and quasi-likelihood; generalized estimating equations; h-likelihood; and nonparametric smoothing. We will use these likelihood methods for modeling and inference. Although we will rely on statistical theory and mathematics, the course is more about developing a statistical thought process for addressing real-world statistical challenges. We will use computational approaches to understand estimation and to make likelihood-based inferences. There will be a midterm project, which will allow you to demonstrate independent statistical research in your own content area. The course is taught at the doctoral level, and much of the theory is illustrated through applications.
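As a concrete taste of likelihood-based inference of the kind described above, here is a short illustrative Python sketch (the gamma model, simulated data, and grid are assumptions chosen for demonstration, not material from the textbook). It profiles out a nuisance scale parameter in closed form and reads an approximate 95% likelihood interval for the shape parameter off the usual chi-squared(1) cutoff.

```python
# Illustrative sketch of a profile likelihood (assumed Gamma(shape, scale) model).
# For a fixed shape, the scale that maximizes the likelihood is sample_mean / shape,
# so the nuisance scale can be profiled out in closed form.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.gamma(shape=3.0, scale=1.5, size=200)     # simulated data, for illustration

def profile_loglik(shape):
    scale_hat = data.mean() / shape                  # partial MLE of the nuisance scale
    return np.sum(stats.gamma.logpdf(data, a=shape, scale=scale_hat))

grid = np.linspace(1.5, 5.0, 400)
values = np.array([profile_loglik(a) for a in grid])
cutoff = values.max() - stats.chi2.ppf(0.95, df=1) / 2   # likelihood-ratio cutoff
inside = grid[values >= cutoff]

print(f"profile MLE of shape: {grid[values.argmax()]:.3f}")
print(f"approx. 95% likelihood interval for shape: ({inside.min():.3f}, {inside.max():.3f})")
```

The same profiling idea returns in the schedule when we deal with nuisance parameters more generally (Ch 10).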
Course Requirements and Grading
Homework: Homework will be assigned biweekly. There will be approximately 6 – 7 homework assignments. No late homework will be accepted unless you have a university-excused absence.
Midterm Project: There will be a midterm project, which will give you the opportunity to demonstrate your statistical knowledge and computational skills in building models and making statistical inferences. More detailed information will be provided in a separate handout at the 13th lecture.
Final Exam: There will be a final exam. No makeup exam will be given except in the case of a university-excused absence. The final will be a closed-book, closed-note, cumulative exam.
Grading Scale: The course grade will be determined according to the following:
Component | Weight |
---|---|
Homework | 30% |
Project | 40% |
Final Exam | 30% |
V. Tentative Schedule
Week | Day | Lecture # | Topic | Reading* |
---|---|---|---|---|
1 | Tue | 1 | Introduction; Ch 2 (2.1-) | Handouts |
1 | Thu | 2 | Ch 2: Elements of likelihood inference (-2.9) | Ch 2 |
2 | Tue | 3 | Ch 3: More properties of likelihood (3.1-3.5) | Ch 3 |
2 | Thu | 4 | Ch 4: Basic models and simple applications (4.1-4.7) | Ch 4 |
3 | Tue | 5 | Ch 4: Basic models and simple applications (4.8-4.11) | Ch 4 |
3 | Thu | 6 | Ch 4: Continued (extra) | Ch 4 |
4 | Tue | 7 | Ch 5: Frequentist properties (5.1-5.6) | Ch 5 |
4 | Thu | 8 | Ch 5: Frequentist properties (5.7-5.9) | Ch 5 |
5 | Tue | 9 | Ch 8: Score function and Fisher information (8.1-8.7) | Ch 8 |
5 | Thu | 10 | Ch 9: Large-sample results (9.1-9.7) | Ch 9 |
6 | Tue | 11 | Ch 9: Large-sample results (9.8-9.12) | Ch 9 |
6 | Thu | 12 | Ch 10: Dealing with nuisance parameters (10.1-10.3) | Ch 10 |
7 | Tue | 13 | Ch 10: Dealing with nuisance parameters (10.4-10.6) | Ch 10 |
7 | Thu | 14 | Ch 10: Dealing with nuisance parameters (10.7-10.8) | Ch 10 |
8 | Tue | 15 | Ch 12: EM algorithm (12.1-12.4) | Ch 12 |
8 | Thu | 16 | Ch 12: EM algorithm (12.5-12.7) | Ch 12 |
9 | Tue | — | Fall Break | — |
9 | Thu | 17 | Ch 13: Robustness of likelihood specification (13.1-13.5) | Ch 13 |
10 | Tue | 18 | Ch 13: Robustness of likelihood specification (13.6) | Ch 13 |
10 | Thu | 19 | Ch 14: Estimating equations and quasi-likelihood (14.1-14.3) | Ch 14 |
11 | Tue | 20 | Ch 14: Estimating equations and quasi-likelihood (14.4-14.6) | Ch 14 |
11 | Thu | 21 | Ch 16: Likelihood of random parameters (16.1-16.3) | Ch 16 |
12 | Tue | 22 | Ch 17: Random and mixed effects models (17.1-17.3) | Ch 17 |
12 | Thu | 23 | Ch 17: Random and mixed effects models (17.4-17.7) | Ch 17 |
13 | Tue | 24 | Ch 17: Random and mixed effects models (17.8-17.10) | Ch 17 |
13 | Thu | 25 | Ch 18: Nonparametric smoothing (18.1-18.4) | Ch 18 |
14 | Tue | 26 | Ch 18: Nonparametric smoothing (18.5-18.8) | Ch 18 |
14 | Thu | — | Thanksgiving Holiday | — |
15 | Tue | 27 | Ch 18: Nonparametric smoothing (18.9-18.12) | Ch 18 |
15 | Thu | 28 | Review | — |
— | Tue | 29 | Final Exam | — |
*Relevant handouts, articles, etc., will be provided.
VI. Sample Lecture
(Sample lecture video omitted from the print version.)
Materials
(PDF preview omitted from the print version.)