Below is a more comprehensive version of my transcript:
This is a reading course with Professor Dan Simpson, focusing on Bayesian Inference. A reading course is a one-on-one course with the professor: the student works through the material independently, meets with the supervisor to discuss it, and is graded on progress and projects. The material for this class is the Bayesian Data Analysis textbook by Gelman, supplemented with tutorials by Michael Betancourt and research papers. The two projects will be: critiquing the Bayesian LASSO paper and showing where it works and where it doesn't, and a larger one with a topic still to be determined. The second part of the course will have a significant component on Hamiltonian Monte Carlo and other Monte Carlo methods.
This is a reading course with Professor David Duvenaud. It follows the Fields Machine Learning Graduate Course. My final project will be empirically verifying Michael Betancourt's paper, The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling, and building on it to explore the failure modes of a variety of MCMC-related methods.
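To give a rough idea of what that experiment looks like, here is a minimal Python sketch (not the actual project code; the model, data, and step sizes are all made up) comparing a leapfrog trajectory that uses the full-data gradient with one that uses a naively rescaled minibatch gradient:

    # Hypothetical sketch: full-data vs. naive-minibatch gradients inside a
    # leapfrog integrator, for a Gaussian posterior over a location parameter.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.0, size=1000)   # simulated observations

    def grad_U(theta, batch):
        # Gradient of the negative log posterior (flat prior), rescaled so a
        # minibatch gives an unbiased estimate of the full-data gradient.
        n, m = len(data), len(batch)
        return (n / m) * np.sum(theta - batch)

    def leapfrog(theta, p, eps, n_steps, batch):
        # Standard leapfrog integrator; 'batch' controls which data enter grad_U.
        p = p - 0.5 * eps * grad_U(theta, batch)
        for _ in range(n_steps - 1):
            theta = theta + eps * p
            p = p - eps * grad_U(theta, batch)
        theta = theta + eps * p
        p = p - 0.5 * eps * grad_U(theta, batch)
        return theta, p

    theta0, p0 = 0.0, rng.normal()
    mini_batch = rng.choice(data, size=50, replace=False)
    print(leapfrog(theta0, p0, eps=0.01, n_steps=20, batch=data))
    print(leapfrog(theta0, p0, eps=0.01, n_steps=20, batch=mini_batch))

The point of the comparison is that the minibatch trajectory drifts away from the full-data one, which is the kind of breakdown the project is meant to measure.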
Data acquisition trends in the environmental, physical and health sciences are increasingly spatial in character and novel in the sense that modern sophisticated methods are required for analysis. This course will cover different types of random spatial processes and how to incorporate them into mixed effects models for Normal and non-Normal data. Students will be trained in a variety of advanced techniques for analyzing complex spatial data and, upon completion, will be able to undertake a variety of analyses on spatially dependent data, understand which methods are appropriate for various research questions, and interpret and convey results in the light of the original questions posed.
Through case studies and collaboration with researchers in other disciplines, students develop skills in the collaborative practice of Statistics. Focus is on pragmatic solutions to practical issues including study design, dealing with common complications in data analysis, and ethical practice, with particular emphasis on written communication.
I am working in collaboration with a Master's student in the Biology department, analyzing the seasonal adaptation of fish and the accompanying changes in behaviour, measured by triglyceride levels in the liver and muscle protein density.
Statistical aspects of supervised learning: regression with spline bases, regularization methods, parametric and nonparametric classification methods, nearest neighbours, cross-validation and model selection, generalized additive models, trees, model averaging, clustering and nearest neighbour methods for unsupervised learning.
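As a small illustration of the cross-validation and model selection ideas listed above, here is a minimal sketch (simulated data, arbitrary settings) that picks a polynomial degree by 5-fold cross-validation:

    # Hypothetical sketch: choose a polynomial degree by k-fold cross-validation.
    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(-3, 3, size=120)
    y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

    idx = rng.permutation(x.size)
    folds = np.array_split(idx, 5)          # 5 roughly equal held-out folds

    def cv_mse(degree):
        # Average held-out squared error over the folds for a degree-d fit.
        errors = []
        for fold in folds:
            train = np.setdiff1d(idx, fold)
            coeffs = np.polyfit(x[train], y[train], degree)
            pred = np.polyval(coeffs, x[fold])
            errors.append(np.mean((y[fold] - pred) ** 2))
        return np.mean(errors)

    scores = {d: cv_mse(d) for d in range(1, 9)}
    print("degree chosen by 5-fold CV:", min(scores, key=scores.get))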
This course gives an overview of both the foundational ideas and the recent advances in neural net algorithms. Roughly the first 2/3 of the course focuses on supervised learning: training the network to produce a specified behavior when one has many labeled examples of that behavior. The last 1/3 focuses on unsupervised learning and reinforcement learning.
Programming in an interactive statistical environment. Generating random variates and evaluating statistical methods by simulation. Algorithms for linear models, maximum likelihood estimation, and Bayesian inference. Statistical algorithms such as the Kalman filter and the EM algorithm. Graphical display of data.
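For a flavour of the statistical algorithms mentioned, here is a minimal sketch of a one-dimensional Kalman filter for a local-level model; all quantities are simulated and the noise variances are arbitrary:

    # Hypothetical sketch: 1-D Kalman filter for a random-walk state observed
    # with noise (a local-level model).
    import numpy as np

    rng = np.random.default_rng(2)
    T, q, r = 100, 0.1, 1.0                                    # length, state noise, obs noise
    state = np.cumsum(rng.normal(scale=np.sqrt(q), size=T))    # latent random walk
    obs = state + rng.normal(scale=np.sqrt(r), size=T)         # noisy observations

    m, P = 0.0, 10.0                    # prior mean and variance
    filtered = []
    for y in obs:
        P = P + q                       # predict: random walk, variance grows
        K = P / (P + r)                 # Kalman gain
        m = m + K * (y - m)             # update mean with the new observation
        P = (1 - K) * P                 # update variance
        filtered.append(m)

    print("final filtered estimate:", filtered[-1], "true state:", state[-1])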
Discrete and continuous time processes with an emphasis on Markov, Gaussian and renewal processes. Martingales and further limit theorems. A variety of applications taken from some of the following areas are discussed in the context of stochastic modeling: Information Theory, Quantum Mechanics, Statistical Analyses of Stochastic Processes, Population Growth Models, Reliability, Queuing Models, Stochastic Calculus, Simulation (Monte Carlo Methods).
Advanced topics in statistics and data analysis with emphasis on applications. Diagnostics and residuals in linear models, introduction to generalized linear models, graphical methods, additional topics such as random effects models, designed experiments, model selection, analysis of censored data, introduced as needed in the context of case studies.
Statistical theory and its applications at an advanced mathematical level. Topics include probability and distribution theory as it specifically pertains to the statistical analysis of data. Linear models and the geometry of data, least squares and the connection to conditional expectation. The basic concept of inference and the likelihood function.
Material Covered: Statistical models and distributions; fundamentals of inference: estimation, hypothesis testing, and significance levels; likelihood functions and likelihood-based inference; prior distributions and Bayesian inference.
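As a small worked example of the contrast between likelihood-based and Bayesian inference described above, here is a sketch for a binomial proportion; the counts are made up, and the Beta(1, 1) prior is just one convenient choice:

    # Hypothetical sketch: MLE with a Wald interval vs. a conjugate Bayesian
    # analysis for a binomial proportion.
    import numpy as np
    from scipy import stats

    n, y = 50, 18                       # 18 successes in 50 trials (made up)

    # Likelihood-based inference: the MLE and an approximate 95% Wald interval.
    p_hat = y / n
    se = np.sqrt(p_hat * (1 - p_hat) / n)
    print("MLE:", p_hat, "approx 95% CI:", (p_hat - 1.96 * se, p_hat + 1.96 * se))

    # Bayesian inference: a Beta(1, 1) prior gives a Beta(1 + y, 1 + n - y) posterior.
    posterior = stats.beta(1 + y, 1 + n - y)
    print("posterior mean:", posterior.mean(),
          "95% credible interval:", posterior.interval(0.95))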
An overview of probability from a non-measure theoretic point of view. Random variables/vectors; independence, conditional expectation/probability and consequences. Various types of convergence leading to proofs of the major theorems in basic probability. An introduction to simple stochastic processes such as Poisson and branching processes.
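As a small illustration of one of the simple processes mentioned, here is a sketch that simulates a homogeneous Poisson process on [0, T] from its exponential inter-arrival times; the rate and horizon are arbitrary:

    # Hypothetical sketch: homogeneous Poisson process via exponential gaps.
    import numpy as np

    rng = np.random.default_rng(3)
    rate, T = 2.0, 10.0                 # intensity and time horizon (made up)

    arrivals = []
    t = rng.exponential(1 / rate)       # time to the first event
    while t < T:
        arrivals.append(t)
        t += rng.exponential(1 / rate)  # add the next inter-arrival gap

    print("number of events:", len(arrivals), "expected:", rate * T)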