Aug 30 | Announcement: Our seminars will still be held on Thursdays from 5 to 6 PM in room 4-231, and dinner will follow in the Applied Math Lounge (2-349). I know many of you are interested in giving a talk, so please contact me ASAP with a possible topic (you can change it later on) and the date(s) that you prefer. I would especially appreciate volunteers to give the first talk of the semester, on September 4th. Here are the benefits of being a speaker for the seminar:
Sep 4 | Leonid Chindelevitch | Need for speed, or what can you do in linear time? Abstract: It is a source of never-ending surprise that so much can be done in a very short time! In this talk I will present linear-time algorithms for a variety of problems: "standard" problems (selection, sorting), graph theory (graph traversal, graphs with special structure), bioinformatics/language (edit distance, tree isomorphism), and computational geometry (triangulation, convex hulls). |
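As a taste of the genre (my own sketch, not necessarily one of the talk's examples): counting sort orders n integers drawn from a bounded range in linear time by tallying keys instead of comparing elements.

```python
# Counting sort (illustrative sketch): sorts n integers from {0, ..., max_value}
# in O(n + max_value) time, sidestepping the Omega(n log n) comparison-sort bound.
def counting_sort(values, max_value):
    counts = [0] * (max_value + 1)
    for v in values:                      # one pass to tally each key
        counts[v] += 1
    out = []
    for key, count in enumerate(counts):  # one pass over the bounded key range
        out.extend([key] * count)
    return out

print(counting_sort([4, 1, 3, 1, 0, 4], max_value=4))   # [0, 1, 1, 3, 4, 4]
```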
Sep 11 | David Shirokoff | Random matrix theory: some unexpected connections Abstract: In 1998, Peter Sarnak delivered a lecture at the Mathematics and the Media conference describing connections between Riemann zeta zeros, eigenvalues of random matrices, and models in nuclear physics. I will recount this interesting story, including the main characters and their math. I'll present the connection between zeta zeros and random matrix theory, as well as between random matrix theory and physics. I'll include topics from related works that have found practical applications, such as the Tracy-Widom distribution for the largest eigenvalue of the Gaussian unitary ensemble (GUE). |
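For reference (not part of the abstract), the concrete statement behind the zeta-GUE connection is Montgomery's pair correlation conjecture: the suitably normalized nontrivial zeros of the Riemann zeta function are conjectured to have pair correlation
\[
1 - \left( \frac{\sin \pi u}{\pi u} \right)^{2},
\]
which is precisely the pair correlation of eigenvalues of large random matrices from the GUE.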
Sep 18 | Jose Soto | The biggest numbers in mathematics Abstract: If I give you a piece of paper, what is the biggest number that you can write on it? You are allowed to use English or any well-defined mathematical notation, or, if you want, you can create your own. Of course, you can't write things like "the biggest number expressible on this paper", since this is not well defined. In the talk, I will cover some of the biggest numbers ever used in a mathematical proof, the Skewes number and the Graham number, and the problems that gave rise to those monstrosities. We will also see what you can do with products, exponentiation, tetration, etc., and how to create functions that grow faster than all of them (some of these appear very often in the analysis of algorithms). If time permits, I will also talk about non-computable functions and their relation to some open questions in mathematics. |
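To get a feel for how quickly these towers grow (Graham's number is built from iterated up-arrows), here is a small sketch of Knuth's up-arrow notation; the example values are my own, not from the talk.

```python
# Knuth's up-arrow notation, a ^^...^ b with n arrows (illustrative sketch).
# n = 1 is ordinary exponentiation, n = 2 is tetration, n = 3 is pentation, ...
def up_arrow(a, b, n):
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    # a (n arrows) b = a (n-1 arrows) (a (n arrows) (b-1))
    return up_arrow(a, up_arrow(a, b - 1, n), n - 1)

print(up_arrow(2, 4, 2))   # 2 ^^ 4 = 2^(2^(2^2)) = 65536
print(up_arrow(3, 3, 2))   # 3 ^^ 3 = 3^27 = 7625597484987
# 3 ^^ 4 = 3^7625597484987 already has several trillion digits.
```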
Sep 25 | Dorian Croitoru | The probabilistic method Abstract: The probabilistic method proves the existence of various mathematical objects by literally showing that the probability of their existence is positive. It has lots of applications in combinatorics, number theory, and analysis. I will present some beautiful techniques (such as moment inequalities and the Lovász local lemma) and applications (such as Ramsey numbers and the number of prime divisors of n). |
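As a one-formula illustration (not necessarily the talk's example), Erdős's lower bound for Ramsey numbers runs exactly along these lines: color the edges of the complete graph $K_n$ with two colors uniformly at random; the expected number of monochromatic $k$-vertex cliques is
\[
\binom{n}{k}\, 2^{\,1 - \binom{k}{2}},
\]
so whenever this quantity is below 1, some two-coloring of $K_n$ has no monochromatic $k$-clique, and therefore $R(k,k) > n$.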
Oct 2 | Department Social |
Oct 9 | Alex Levin | A random talk on expanders Abstract: Expanders are graphs that are sparse, yet highly connected. They have appeared in numerous contexts throughout the years. While a random graph will be a good expander with high probability, for many applications it is important to be able to produce infinite families of expanders explicitly. We will survey some recently discovered constructions for doing so, and then show how these ideas were used to find a very space-efficient algorithm for deciding connectivity in an undirected graph. |
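A small numerical sketch of what "sparse, yet highly connected" means (the example graph is my own choice, not the talk's): for a d-regular graph, expansion is governed by the second-largest eigenvalue of the normalized adjacency matrix, and the cycle, a classic non-expander, has that eigenvalue close to 1.

```python
# Measuring expansion via the spectral gap (illustrative sketch).
# For a connected d-regular graph, the normalized adjacency matrix A/d has top
# eigenvalue 1; the smaller the second eigenvalue lambda_2, the better the expander.
import numpy as np

def second_eigenvalue(adj, d):
    """adj: symmetric 0/1 adjacency matrix of a d-regular graph."""
    eigs = np.linalg.eigvalsh(adj / d)   # eigenvalues in ascending order
    return eigs[-2]

# The cycle on n vertices is sparse but a poor expander: lambda_2 = cos(2*pi/n).
n, d = 8, 2
cycle = np.zeros((n, n))
for i in range(n):
    cycle[i, (i + 1) % n] = cycle[(i + 1) % n, i] = 1.0
print(second_eigenvalue(cycle, d))   # ~0.707, i.e. hardly any spectral gap
```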
Oct 16 | Alexey Spiridonov | Failing to apply math Abstract: Mathematics is the art of logical reasoning. We use it on well-posed problems, abstracted away $n$ levels from messy real-life situations. We train to avoid logical fallacies and to present our arguments as easily verifiable proofs. All this would be well and good, but we have not evolved to reason rigorously with formal systems. Our minds, though amazingly plastic, come preloaded with many heuristics that enable us to make fast intuitive decisions, which usually turn out fine. Lose these intuitions, and I bet that mathematical insight would become an incredibly, frustratingly, unachievably rare event. But sometimes these heuristics fail in spectacular ways and can lead you to choices that range from suboptimal to terrible. This talk will remind you of a few such failures. |
Oct 23 | Elette Boyle | Reed-Solomon codes Abstract: Reed-Solomon codes are a type of linear block error-correcting code frequently used in media storage. Their encoding works by oversampling a polynomial over a finite field, where the polynomial is uniquely defined by the sequence of input bits. This oversampling allows the polynomial to be recovered even in the case of data errors or erasures. While the encoding portion of the Reed-Solomon algorithm is reasonably universal, there have been several proposed decoding algorithms—including an exhaustive solution of simultaneous equations from the original 1960 Reed-Solomon paper; a variant of the BCH code decoding method due to Sugiyama, Kasahara, Hirasawa, and Namekawa utilizing the Euclidean algorithm in 1975; and an algebraic list-decoding algorithm due to Guruswami and Sudan in 1999. In this talk, I will introduce linear codes and Reed-Solomon encoding, and will treat a selection of the Reed-Solomon decoding algorithms. |
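A minimal sketch of the "oversample a polynomial" view described in the abstract, worked over the prime field GF(929) for readability; the field, lengths, and surviving positions below are my illustrative choices (practical Reed-Solomon codes typically work over GF(2^8)).

```python
# Reed-Solomon as polynomial oversampling (illustrative sketch, not a production codec).
P = 929            # prime modulus, so arithmetic mod P is a finite field
K, N = 3, 7        # K message symbols define a degree-<K polynomial; N > K evaluations

def poly_eval(coeffs, x):
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

def encode(msg):
    """Treat the K message symbols as coefficients and evaluate at N field points."""
    return [poly_eval(msg, x) for x in range(N)]

def lagrange_eval(points, x):
    """Evaluate the unique degree-<K polynomial through `points` at x, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P   # inverse via Fermat
    return total

msg = [5, 17, 300]
codeword = encode(msg)
# Erase all but K of the N symbols; any K survivors determine the polynomial,
# so the full codeword (and hence the message) can be reconstructed.
survivors = [(x, codeword[x]) for x in (1, 4, 6)]
assert all(lagrange_eval(survivors, x) == codeword[x] for x in range(N))
```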
Oct 30 | Jiawei Chiu | Turing patterns, subdiffusion, Gillespie algorithm |
Nov 6 | Hila Hashemi | An Interesting Physical Nonlinear System |
Nov 13 | Martin Frankland | Geometric feature identification using algebraic topology |
Nov 20 | Lionel Levine | Chip-Firing and A Devil's Staircase |
Dec 4 | Po-Ru Loh | Image Deblurring: How Cleverness (and CPU Cycles) Correct for Clumsy Fingers |