# Statistics

## Program Description

Statistics is concerned with the development and use of mathematical and computational methods for the collection, analysis, and interpretation of data in support of scientific inquiry, informed decision-making, and risk management. It calls on a broad range of tools, from probability theory to computer-intensive techniques. The main areas of research by statisticians in the ISM network include:

• Bayesian inference and Markov chain Monte Carlo methods
• causal inference
• computational statistics
• dependence modeling and multivariate analysis
• directional statistics
• empirical process theory
• extreme-value analysis
• high-dimensional data modeling
• machine learning
• nonparametric statistics
• statistical learning
• survey sampling
• survival analysis
• time series

Statistical research is largely motivated by collaboration with other disciplines. It finds applications in many fields, including biology, environmental science, finance and insurance, health sciences, hydrology, market research, and the social sciences. With the abundance of very large and complex data sets coming, for example, from social media and digital processes, financial transactions, astronomy, genomics, meteorology, or Big Science projects such as the Large Hadron Collider, the statistical treatment and analysis of Big Data has become a major challenge of modern statistics.

## Program Members

The statistics program gives graduate students the opportunity to study in these major areas of modern statistics. The curriculum allows students to become well acquainted with the basic elements of mathematical statistics, decision theory, and applied statistics. Advanced graduate courses can also be offered in more specialized areas.

This program welcomes graduate students with a good background in calculus, mathematical statistics, numerical analysis, and probability (all at the undergraduate level). To get strong training in decision theory and mathematical statistics, students should take the basic course in measure and integration (for PhD students) and at least three courses at the intermediate and advanced levels.

## 2023-24 Course Listings

### Statistical Inference 1

This course is an introduction to statistical inference for parametric models. The following topics will be covered:
1. Distribution of functions of several random variables (distribution function and change-of-variable techniques), sampling distribution of the mean and variance of a sample from a Normal distribution.
2. Distribution of order statistics and sample quantiles.
3. Estimation: unbiasedness, Cramér-Rao lower bound and efficiency, method of moments and maximum likelihood estimation, consistency, limiting distributions, delta-method.
4. Sufficiency, minimal sufficiency, completeness, UMVUE, Rao-Blackwell and Lehmann-Scheffé theorems.
5. Hypothesis testing: likelihood-ratio tests.
6. Elements of Bayesian estimation and hypothesis testing.
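As a toy illustration of topic 3, consider a sketch in Python (the course does not prescribe software; the data here are simulated): for a sample from an Exponential distribution, the method-of-moments and maximum-likelihood estimators of the rate coincide.

```python
import random
import statistics

# Toy illustration of maximum likelihood (topic 3): for an Exponential(rate)
# sample, both the method-of-moments and the maximum-likelihood estimator
# of the rate equal 1 / (sample mean).
random.seed(1)
true_rate = 2.0
sample = [random.expovariate(true_rate) for _ in range(10_000)]

rate_mle = 1.0 / statistics.mean(sample)  # should be near true_rate
```

With 10,000 observations the estimate lands close to the true rate, a concrete instance of the consistency property covered under the same topic.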

Text: Introduction to Mathematical Statistics (6th, 7th, or 8th edition), by R. V. Hogg and A. T. Craig, Prentice Hall, 1994. Recommended reading (for problems, examples, etc.): Statistical Inference (2nd edition), by G. Casella and R. L. Berger, Duxbury, 2002. Evaluation: assignments (4), midterm exam, final exam.

### Regression and Analysis of Variance

Multivariate normal and chi-squared distributions; quadratic forms. Multiple linear regression estimators and their properties. General linear hypothesis tests. Prediction and confidence intervals. Asymptotic properties of least squares estimators. Weighted least squares. Variable selection and regularization. Selected advanced topics in regression. Applications to experimental and observational data.

### Mathematical Statistics I

Distribution theory, stochastic models and multivariate transformations. Families of distributions including location-scale families, exponential families, convolution families, exponential dispersion models and hierarchical models. Concentration inequalities. Characteristic functions. Convergence in probability, almost surely, in Lp and in distribution. Laws of large numbers and Central Limit Theorem. Stochastic simulation.

### Bayesian Theory and Methods

Subjective probability, Bayesian statistical inference and decision making, de Finetti’s representation. Bayesian parametric methods, optimal decisions, conjugate models, methods of prior specification and elicitation, approximation methods. Hierarchical models. Computational approaches to inference, Markov chain Monte Carlo methods, Metropolis-Hastings. Nonparametric Bayesian inference.
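The Metropolis-Hastings idea mentioned above fits in a few lines. Below is a minimal random-walk Metropolis sampler for a standard normal target, purely illustrative (the course does not prescribe a language):

```python
import math
import random

# Random-walk Metropolis (a special case of Metropolis-Hastings):
# draw from a standard normal target known only up to a constant.
random.seed(0)

def log_target(x):
    return -0.5 * x * x  # log N(0,1) density, up to an additive constant

x, draws = 0.0, []
for _ in range(50_000):
    proposal = x + random.gauss(0.0, 1.0)
    # Accept with probability min(1, target(proposal) / target(x)).
    if math.log(random.random()) < log_target(proposal) - log_target(x):
        x = proposal
    draws.append(x)

kept = draws[5_000:]                       # discard burn-in
post_mean = sum(kept) / len(kept)
post_var = sum((d - post_mean) ** 2 for d in kept) / len(kept)
```

The retained draws have sample mean near 0 and variance near 1, matching the target; computing the accept ratio on the log scale avoids numerical underflow.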

### Computationally Intensive Statistics

General introduction to computational methods in statistics; optimization methods; EM algorithm; random number generation and simulations; bootstrap, jackknife, cross-validation, resampling and permutation; Monte Carlo methods: Markov chain Monte Carlo and sequential Monte Carlo; computation in the R language.

### Statistical Inference

Conditional probability and Bayes’ Theorem, discrete and continuous univariate and multivariate distributions, conditional distributions, moments, independence of random variables. Modes of convergence, weak law of large numbers, central limit theorem. Point and interval estimation. Likelihood inference. Bayesian estimation and inference. Hypothesis testing.

### Méthodes de rééchantillonnage

Study of the bootstrap. Estimation of bias and standard error. Confidence intervals and tests. Various applications, including regression and dependent data. Study of the jackknife, cross-validation, and subsampling.
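The bootstrap idea above can be sketched in a few lines of Python (the course does not prescribe software; the data here are simulated): estimate the standard error of a statistic by resampling the data with replacement.

```python
import random
import statistics

# Nonparametric bootstrap: estimate the standard error of the sample
# median by resampling the observed data with replacement.
random.seed(42)
data = [random.gauss(10.0, 2.0) for _ in range(200)]   # simulated "data"

boot_medians = []
for _ in range(2_000):
    resample = random.choices(data, k=len(data))       # draw n with replacement
    boot_medians.append(statistics.median(resample))

se_median = statistics.stdev(boot_medians)             # bootstrap standard error
```

The spread of the resampled medians approximates the sampling variability of the median, a statistic whose standard error has no simple closed form.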

### Méthodes d’analyse biostatistique

Overview of analysis methods commonly used in biostatistics (theory and application). Generalized linear models and estimating equations.

Parametric and semiparametric survival analysis. Introduction to causal inference and semiparametric theory.

### Séries chronologiques (UdeM)

Descriptive techniques. Stationary processes. Best linear prediction. ARMA, ARIMA, and seasonal models. Estimation and forecasting in ARMA models. Elements of spectral analysis. ARCH and GARCH models.

### Inférence statistique I

Conditional expectation. Prediction. Statistical models, exponential families, sufficiency. Estimation methods: maximum likelihood, least squares, etc. Optimality: minimum-variance unbiased estimators, the information inequality. Asymptotic properties of estimators. Confidence intervals and precision. Basic elements of the theory of tests. p-values; power in relation to sample size. Relationship between tests and confidence intervals. Tests for discrete data.

### Analyse statistique multivariée

Study of classical sampling distributions: Hotelling's T²; the Wishart distribution; distributions of eigenvalues and eigenvectors; distributions of correlation coefficients. Multivariate analysis of variance. Tests of independence of several subvectors. Tests of equality of covariance matrices. Special topics.

### Principes de simulation

Random number generation. Simulation of classical distributions. Inversion and rejection methods. Specific algorithms. Simulation of discrete- and continuous-time Markov chains. Numerical solution of ordinary and stochastic differential equations. Euler and Runge-Kutta numerical methods. The Feynman-Kac formula. Discretization. Weak and strong, explicit and implicit approximations. Variance reduction. Analysis of simulated data. Special topics.
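The inversion and rejection methods listed above admit compact illustrations; the following Python sketch is purely illustrative (not course material):

```python
import math
import random

# Inversion method: if U ~ Uniform(0,1), then -ln(U)/rate is
# Exponential(rate), since the inverse CDF is F^{-1}(u) = -ln(1-u)/rate.
random.seed(0)
rate = 3.0
exp_draws = [-math.log(random.random()) / rate for _ in range(100_000)]
mean_exp = sum(exp_draws) / len(exp_draws)     # should be near 1/rate

# Rejection method: sample from the Beta(2,2) density f(x) = 6x(1-x)
# with Uniform(0,1) proposals, using the envelope bound f(x) <= 1.5.
beta_draws = []
while len(beta_draws) < 50_000:
    x, u = random.random(), random.random()
    if u * 1.5 <= 6.0 * x * (1.0 - x):         # accept with prob f(x)/1.5
        beta_draws.append(x)
mean_beta = sum(beta_draws) / len(beta_draws)  # should be near 0.5
```

The envelope constant 1.5 is the maximum of the Beta(2,2) density, attained at x = 1/2, so the acceptance rate is 1/1.5 ≈ 67%.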

### Méthodes d’analyse des données (UQTR)

Theory and application of classical multivariate data analysis methods: principal component analysis, dimensionality reduction, simple and multiple correspondence analysis, discriminant analysis, hierarchical and non-hierarchical clustering, optimal choice of the number of clusters. Introduction to artificial neural networks. Use of statistical software for data processing.

### Time Series

Statistical analysis of time series in the time domain. Moving average and exponential smoothing methods to forecast seasonal and non-seasonal time series, construction of prediction intervals for future observations, Box-Jenkins ARIMA models and their applications to forecasting seasonal and non-seasonal time series. A substantial portion of the course will involve computer analysis of time series using statistical packages (mainly MINITAB). No prior computing experience is required.
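Simple exponential smoothing, one of the forecasting methods listed, can be sketched without any statistical package (the course itself uses MINITAB; this Python sketch, with illustrative data, only shows the recursion):

```python
# Simple exponential smoothing: the smoothed level is a weighted average
# of the newest observation and the previous level; the final level is
# the one-step-ahead forecast.
def exponential_smoothing(series, alpha):
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1.0 - alpha) * level
    return level

series = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]  # illustrative
forecast = exponential_smoothing(series, alpha=0.3)
```

Larger values of the smoothing constant alpha weight recent observations more heavily; alpha = 1 reproduces the naive "last value" forecast.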

### Statistical Learning

This course is an introduction to statistical learning techniques. Some applications to finance and insurance will be illustrated. Topics covered include:

• Cross-validation
• Regression methods (Linear models, Variable selection, Shrinkage)
• Classification methods (K-nearest neighbors, Linear and quadratic discriminants, Logistic regression, Support vector machines)
• Decision trees
• Unsupervised learning (Clustering, Principal component analysis)
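As a toy illustration of one of the listed methods, here is a K-nearest-neighbors classifier with a simple hold-out split in Python (simulated two-class data; cross-validation repeats this idea over several splits):

```python
import random

# Toy K-nearest-neighbors classifier on simulated two-class data,
# evaluated with a single hold-out validation split.
random.seed(0)

def make_point(label):
    center = 0.0 if label == 0 else 2.0
    return ([random.gauss(center, 0.7), random.gauss(center, 0.7)], label)

data = [make_point(i % 2) for i in range(200)]
train, valid = data[:150], data[150:]

def knn_predict(point, train, k=5):
    by_dist = sorted(
        train,
        key=lambda item: (item[0][0] - point[0]) ** 2
                       + (item[0][1] - point[1]) ** 2,
    )
    votes = sum(label for _, label in by_dist[:k])  # count class-1 neighbors
    return 1 if 2 * votes > k else 0                # majority vote

accuracy = sum(knn_predict(x, train) == y for x, y in valid) / len(valid)
```

Because the two simulated classes are well separated, the hold-out accuracy is high; on real data the same split (or k-fold cross-validation) is what guards against overfitting.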

### Reinforcement Learning

This course is an introduction to reinforcement learning techniques. It requires extensive programming in the R language. Topics covered include: the multi-armed bandit problem, Markov decision processes, dynamic programming, Monte Carlo solution methods, temporal-difference methods, multi-period approximation methods, and policy gradients.
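The multi-armed bandit problem can be sketched with an epsilon-greedy agent; the course works in R, so this Python sketch is only an illustration of the idea:

```python
import random

# Epsilon-greedy agent for a 3-armed Bernoulli bandit: explore a random
# arm with probability epsilon, otherwise exploit the best estimate.
random.seed(1)
true_means = [0.2, 0.5, 0.8]       # unknown to the agent
counts = [0, 0, 0]
estimates = [0.0, 0.0, 0.0]
epsilon = 0.1

for _ in range(5_000):
    if random.random() < epsilon:
        arm = random.randrange(3)                 # explore
    else:
        arm = estimates.index(max(estimates))     # exploit
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    counts[arm] += 1
    # Incremental update of the sample-mean reward estimate for this arm.
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

best_arm = estimates.index(max(estimates))
```

After a few thousand pulls the agent identifies the highest-paying arm and allocates most pulls to it, trading off exploration against exploitation.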

### Lévy Processes

This course gives a brief introduction to fluctuation theory for spectrally negative Lévy processes. It covers topics including the Lévy-Khintchine formula, Wiener-Hopf factorization, and exit problems for spectrally negative Lévy processes. Some applications in risk theory will be discussed. The lectures are based on Introductory Lectures on Fluctuations of Lévy Processes with Applications and Gerber-Shiu Risk Theory, both authored by Andreas Kyprianou.

### Design of Experiments

This course is an introduction to basic experimental designs and analysis of linear statistical models related to them. The following topics will be covered:
1. Review of estimation and hypothesis testing in linear models with Normal errors.
2. Analysis of completely randomized design (CRD), randomized complete block design (RCBD), balanced incomplete block design (BIBD), Latin Square design (LSD), Graeco-Latin Square design (GLSD).
3. Factorial experiments: 2-factor and 3-factor designs, confounding, fractional replication.
4. Response-surface models.
Text: Design and Analysis of Experiments, 10th Edition, by Douglas C. Montgomery (John Wiley). Evaluation: Assignments (4), Midterm exam, Final exam.
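The CRD analysis of topic 2 reduces to a one-way ANOVA; here is a hand-computable sketch in Python with illustrative data:

```python
# One-way ANOVA for a completely randomized design (CRD): compute the
# F statistic comparing between-group and within-group variation.
groups = [
    [18.2, 19.1, 17.8, 18.9],   # treatment A (illustrative data)
    [20.3, 21.0, 19.8, 20.6],   # treatment B
    [17.1, 16.8, 17.5, 17.0],   # treatment C
]
all_obs = [y for g in groups for y in g]
grand_mean = sum(all_obs) / len(all_obs)

ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_within = sum((y - sum(g) / len(g)) ** 2 for g in groups for y in g)

df_between = len(groups) - 1              # k - 1 treatments
df_within = len(all_obs) - len(groups)    # N - k residual df
f_stat = (ss_between / df_between) / (ss_within / df_within)
```

A large F statistic (here compared against an F distribution with 2 and 9 degrees of freedom) indicates that the treatment means differ by more than the within-group noise can explain.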

### Generalized Linear Models

Exponential families, link functions. Inference and parameter estimation for generalized linear models; model selection using analysis of deviance. Residuals. Contingency table analysis, logistic regression, multinomial regression, Poisson regression, log-linear models. Multinomial models. Overdispersion and quasi-likelihood. Applications to experimental and observational data.
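As an illustration of the logit link, logistic regression can be fitted by plain gradient ascent on the log-likelihood; standard software uses iteratively reweighted least squares, so this Python sketch on simulated data is only conceptual:

```python
import math
import random

# Logistic regression (a canonical GLM with the logit link) fitted by
# gradient ascent on the log-likelihood, using simulated data.
random.seed(0)
n = 500
x = [random.gauss(0.0, 1.0) for _ in range(n)]
true_b0, true_b1 = -0.5, 1.5
y = [1 if random.random() < 1.0 / (1.0 + math.exp(-(true_b0 + true_b1 * xi)))
     else 0
     for xi in x]

b0, b1 = 0.0, 0.0
lr = 0.1
for _ in range(2_000):
    g0 = g1 = 0.0
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))  # fitted probability
        g0 += yi - p            # score contribution for the intercept
        g1 += (yi - p) * xi     # score contribution for the slope
    b0 += lr * g0 / n
    b1 += lr * g1 / n
```

Because the logistic log-likelihood is concave, the ascent converges to the MLE, which lands near the true coefficients up to sampling error.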

### Mathematical Statistics 2

Sampling theory (including large-sample theory). Likelihood functions and information matrices. Hypothesis testing, estimation theory. Regression and correlation theory.

### Méthodes de statistique bayésienne

Principles of Bayesian analysis; prior and posterior distributions, statistical inference and decision theory. Computational methods; Markov chain Monte Carlo methods. Applications.

### Régression

Review of multiple linear regression (inference, tests, residuals, transformations, and collinearity), generalized least squares, model selection, robust methods, nonlinear regression, generalized linear models.

### Données catégorielles

Contingency tables. Measures of association. Relative risk and odds ratio. Exact and asymptotic tests. Logistic and Poisson regression. Log-linear models. Multi-way contingency tables. Nonparametric methods.
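The relative risk and odds ratio from a 2x2 table, with a Wald interval on the log scale, can be computed directly; the counts below are illustrative, not course data:

```python
import math

# Relative risk and odds ratio from a 2x2 contingency table
# (illustrative counts):
#               exposed   unexposed
# disease          30         10
# no disease       70         90
a, b, c, d = 30, 10, 70, 90

relative_risk = (a / (a + c)) / (b / (b + d))
odds_ratio = (a * d) / (b * c)

# 95% Wald confidence interval, computed on the log-odds-ratio scale.
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

Here the odds ratio exceeds the relative risk, as it always does when the outcome is common; the two agree only for rare outcomes.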

### Inférence statistique

Principles of inference: point estimation, distributions of estimators, hypothesis testing, confidence regions. The Bayesian approach. Resampling methods. Nonparametric estimation. Modern applications of statistics.

### Analyse des données

Principal component analysis. Canonical correlation analysis and multivariate regression. Correspondence analysis. Discrimination. Classification. Factor analysis of operators.

### Statistique mathématique

Functions of random variables, moment generating functions, selected probability inequalities and identities, families of distributions including the exponential family, random vectors, the multivariate normal distribution, conditional expectations, mixtures and hierarchical models. Convergence theorems, simulation methods, order statistics, sufficiency, likelihood. Point and interval estimation: construction of estimators and evaluation criteria, Bayesian methods. Asymptotic normality and asymptotic relative efficiency.