Probability theory is the branch of mathematics concerned with the analysis of random phenomena. The members of the ISM Probability Group are involved in research in a broad range of areas spanning theoretical and applied, continuous and discrete probability. A particular focus is on the development and analysis of probabilistic models for real-world phenomena from physics, biology, statistics and computer science. Some specific topics of interest are: statistical physics in a random environment, branching systems in biology, distances and random energy landscapes, data structure analysis using random trees, genetics and population biology.
Many members of the group are also members of the CRM Probability Lab.
Students interested in graduate study in any of the areas cited above are invited to apply for admission to the program. There are no formal prerequisites other than those required by the departments. The following guidelines should nevertheless be followed, and courses should be selected in consultation with an advisor from the group.
Students in the program are expected to have mastered the subject matter of the undergraduate curriculum in probability theory. All students are required to take the basic courses: Real Analysis, Measure Theory, and Probability Theory. Students are then expected to take a number of more specialized courses.
Probability spaces. Random variables and their expectations. Convergence of random variables in Lp. Independence and conditional expectation. Introduction to Martingales. Limit theorems including Kolmogorov's Strong Law of Large Numbers.
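The strong law of large numbers mentioned above is easy to see numerically. The sketch below (not part of the syllabus, just an illustration using fair coin flips) shows the sample mean settling toward the expectation 1/2 as the number of trials grows:

```python
import random

def sample_mean(n, seed=0):
    """Average of n fair coin flips (Bernoulli(1/2) random variables)."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# Kolmogorov's SLLN: the sample mean converges almost surely to E[X] = 1/2.
for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```

With a fixed seed the run is reproducible; the deviation from 1/2 shrinks on the order of 1/sqrt(n), as the classical fluctuation theory predicts.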
Probability spaces, random variables, independence, mathematical expectation, modes of convergence, laws of large numbers, central limit theorem, conditional expectation and martingales. Introduction to Brownian motion.
Sigma-algebras and random variables. Integration theory: Lebesgue's theorem, Lp spaces, Fubini's theorem. Construction of measures, Radon measures. Independence. Conditioning.
Characteristic functions: elementary properties, inversion formula, uniqueness, convolution and continuity theorems. Weak convergence. Central limit theorem. Additional topic(s) chosen at the discretion of the instructor from: martingale theory; Brownian motion and stochastic calculus.
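The central limit theorem in this course is proved via characteristic functions, but its statement can be checked empirically. A minimal sketch (illustrative only): standardized sums of uniform random variables should match standard normal probabilities.

```python
import math
import random

def standardized_sum(n, rng):
    """(S_n - n*mu) / (sigma*sqrt(n)) for S_n a sum of n Uniform(0,1) draws."""
    mu, sigma = 0.5, math.sqrt(1 / 12)
    s = sum(rng.random() for _ in range(n))
    return (s - n * mu) / (sigma * math.sqrt(n))

rng = random.Random(42)
samples = [standardized_sum(30, rng) for _ in range(20_000)]
# CLT check against the standard normal: P(Z <= 1) = Phi(1) ~ 0.8413.
frac = sum(z <= 1 for z in samples) / len(samples)
print(frac)
```

Even for n = 30, the empirical fraction lands within a percent or so of Phi(1), which is why the normal approximation is so widely usable in practice.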
Brownian motion, stochastic integrals, Itô's formula, stochastic differential equations, representation theorems, Girsanov's theorem. The Black-Scholes formula.
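The Black-Scholes formula that closes this course has a short closed form. As a sketch (standard formula, standard parameter names; not course material), the price of a European call is C = S*Phi(d1) - K*e^(-rT)*Phi(d2):

```python
import math

def norm_cdf(x):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# At-the-money call with one year to maturity.
print(round(black_scholes_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0), 4))
```

The formula itself is the solution of the pricing PDE obtained from Itô's formula applied to geometric Brownian motion, which is exactly the chain of ideas the course develops.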
This seminar course is an introduction to optimal stochastic control. It will focus mainly on solving Bruno de Finetti's optimal dividend problem and its more recently published variants.
Prerequisites: basic knowledge of Brownian motion, Itô's formula, and the compound Poisson process.
If needed, this course will be given in English.
Content: the Erdős–Kac theorem; Poissonian distribution of prime factors, of cycles, and of irreducible factors; elements of sieve theory; distribution of divisors of integers and of invariant sets of permutations; Erdős's multiplication table problem and its generalizations; the Łuczak–Pyber theorem; applications to the irreducibility and the Galois group of random polynomials.
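The Erdős–Kac theorem says that omega(n), the number of distinct prime factors of n, behaves like a normal random variable with mean and variance log log n. A minimal sketch (illustrative, not course material) computes omega by a sieve and compares its average to log log N:

```python
import math

def omega_up_to(N):
    """omega[n] = number of distinct prime factors of n, via a sieve."""
    omega = [0] * (N + 1)
    for p in range(2, N + 1):
        if omega[p] == 0:  # no smaller prime divides p, so p is prime
            for m in range(p, N + 1, p):
                omega[m] += 1
    return omega

N = 100_000
omega = omega_up_to(N)
# Erdos-Kac: (omega(n) - log log n) / sqrt(log log n) is asymptotically N(0, 1),
# so the average of omega(n) over n <= N should be close to log log N.
mean = sum(omega[2:]) / (N - 1)
print(mean, math.log(math.log(N)))
```

The average exceeds log log N by roughly the Mertens constant, a second-order effect that the sieve-theoretic part of the course makes precise.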
Probabilistic graphical models are a framework for representing large systems of random variables with complex interactions. The theory of probabilistic graphical models studies probability distributions on directed and undirected graphs, combining statistical and optimization theory to develop effective computer algorithms. It has been applied widely in machine learning, computer vision, natural language processing, and bioinformatics.
This course introduces probabilistic graphical models from the very basics to some commonly used algorithms in machine learning.
i) Forms of graphical representation: Bayesian networks; Markov random fields; undirected versus directed models
ii) Inference: variable elimination; belief propagation; MAP inference; sampling-based inference; variational inference
iii) Learning: maximum likelihood estimators for Bayesian networks; maximum likelihood estimation with gradient descent; Bayesian learning
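Variable elimination, the first inference algorithm on the list above, can be sketched on a tiny example. The network below (a hypothetical two-state chain A -> B -> C with made-up conditional probability tables) computes the marginal of C by summing out A and then B:

```python
# Hypothetical CPTs for the chain A -> B -> C (all variables binary).
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # P_B_given_A[a][b]
P_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}  # P_C_given_B[b][c]

def marginal_C():
    """Variable elimination in topological order: sum out A, then B."""
    # Eliminate A: psi(b) = sum_a P(a) * P(b|a)
    psi_B = {b: sum(P_A[a] * P_B_given_A[a][b] for a in (0, 1)) for b in (0, 1)}
    # Eliminate B: P(c) = sum_b psi(b) * P(c|b)
    return {c: sum(psi_B[b] * P_C_given_B[b][c] for b in (0, 1)) for c in (0, 1)}

print(marginal_C())
```

On a chain this costs a constant amount of work per variable, versus summing over all joint configurations; choosing a good elimination order on general graphs is where the optimization theory mentioned above enters.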
References and Required Reading:
Koller, Daphne, and Nir Friedman. "Probabilistic Graphical Models: Principles and Techniques." MIT Press, 2009.
Murphy, Kevin P. "Machine Learning: A Probabilistic Perspective." MIT Press, 2012.
Wainwright, Martin J., and Michael I. Jordan. "Graphical Models, Exponential Families, and Variational Inference." Foundations and Trends in Machine Learning, 2008.
Grading will be based on bi-weekly theoretical assignments from the readings and on a final project.
Prerequisites: advanced courses in probability theory and statistical inference.