Ph.D. Candidate: Qingsan Zhu
Mathematics Department

Thu 10 Aug 2017, 10:00am–12:00pm
SPECIAL
Auditorium Annex (AUDX) Room 142, 1924 West Mall, UBC

Oral Defense: Critical Branching Random Walks, Branching Capacity and Branching Interlacements

Abstract:
This thesis concerns critical branching random walks, with a focus on the supercritical (d ≥ 5) and critical (d = 4) dimensions.
In this thesis, we extend the potential theory for random walk to critical branching random walk. In the supercritical dimensions, we introduce the branching capacity of every finite subset of Z^d and establish its connections with critical branching random walk through the following three perspectives:
i) the visiting probability of a finite set by a critical branching random walk starting far away;
ii) branching recurrence and branching transience;
iii) the local limit of branching random walk on the torus, conditioned on the total size.
In the critical dimension, we establish parallel results. On the one hand, we give the asymptotics of the probability of visiting a finite set and the convergence of the conditional hitting point. On the other hand, we establish the asymptotics of the range of a branching random walk conditioned on the total size.
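As a concrete illustration of the object studied above, the following sketch simulates one critical branching random walk and records its range (the number of distinct lattice sites visited). The offspring law (0 or 2 children with equal probability, so the mean is 1 and the process is critical) and the size cap are illustrative choices, not taken from the thesis:

```python
import random

def critical_brw_range(d=4, cap=10_000, seed=1):
    """Range (distinct sites visited) of one critical branching random walk on Z^d.

    Offspring law: 0 or 2 children with probability 1/2 each (mean 1, i.e. critical).
    Each child steps from its parent's site to a uniformly chosen lattice neighbour.
    `cap` truncates rare huge trees (critical trees are finite a.s. but heavy-tailed).
    """
    rng = random.Random(seed)
    origin = (0,) * d
    visited = {origin}
    alive = [origin]          # stack of particles not yet branched
    particles = 1
    while alive and particles < cap:
        pos = alive.pop()
        children = 0 if rng.random() < 0.5 else 2
        for _ in range(children):
            site = list(pos)
            axis = rng.randrange(d)
            site[axis] += rng.choice((-1, 1))
            child = tuple(site)
            visited.add(child)
            alive.append(child)
            particles += 1
    return len(visited)

print(critical_brw_range())
```

Averaging such runs over many seeds gives Monte Carlo estimates of quantities like the expected range or the probability of visiting a given finite set.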
Also in this thesis, we analyze a small game, which we call the Majority-Markov game, and give an optimal strategy.

Ph.D. Candidate: Matt Coles
Mathematics Department

Thu 10 Aug 2017, 12:30pm–2:30pm
SPECIAL
Room 207, Anthropology and Sociology Bldg., 6303 NW Marine Drive, UBC

Oral Examination: Behaviour of Solutions to the Nonlinear Schrödinger Equation in the Presence of a Resonance

ABSTRACT:
The present thesis is split into two parts. The first deals with the focusing Nonlinear Schrödinger Equation in one dimension with pure-power nonlinearity near cubic. We consider the spectrum of the linearized operator about the soliton solution. When the nonlinearity is exactly cubic, the linearized operator has resonances at the edges of the essential spectrum. We establish the degenerate bifurcation of these resonances to eigenvalues as the nonlinearity deviates from cubic. The leading-order expression for these eigenvalues is consistent with previous numerical computations.
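For orientation, the equation discussed above can be written in a standard form (sign and normalization conventions vary, and this is not quoted from the thesis):

```latex
% 1-D focusing NLS with pure-power nonlinearity, p near 3:
i\,\partial_t \psi + \partial_x^2 \psi + |\psi|^{p-1}\psi = 0, \qquad \psi(t,x) \in \mathbb{C}.
% In the cubic case p = 3, a soliton solution is
\psi(t,x) = \sqrt{2}\, e^{it} \operatorname{sech}(x).
```

The linearized operator referred to in the abstract arises from writing a solution as this soliton plus a small perturbation and keeping terms linear in the perturbation.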
The second considers the perturbed energy-critical focusing Nonlinear Schrödinger Equation in three dimensions. We construct solitary wave solutions for focusing subcritical perturbations as well as defocusing supercritical perturbations. The construction relies on the resolvent expansion, which is singular due to the presence of a resonance. Specializing to pure-power focusing subcritical perturbations, we demonstrate, via variational arguments, the existence of a ground state soliton, which is then shown to be the previously constructed solution. Finally, we prove a dynamical theorem which characterizes the fate of solutions whose initial data lie below the action of the ground state: such solutions either scatter or blow up in finite time, depending on their initial data.

Ph.D. Candidate: Curt Da Silva
Mathematics, UBC

Mon 21 Aug 2017, 12:30pm–2:30pm
SPECIAL
Room 202, Anthropology and Sociology Bldg., 6303 NW Marine Drive, UBC

Oral Examination: Large-scale optimization algorithms for missing data completion and inverse problems

ABSTRACT: Inverse problems are an important class of problems found in many areas of science and engineering. In these problems, one aims to estimate unknown parameters of a physical system through indirect, multi-experiment measurements. Inverse problems arise in a number of fields, including seismology, medical imaging, and astronomy, among others.
An important aspect of inverse problems is the quality of the acquired data itself. Real-world data acquisition restrictions, such as time and budget constraints, often result in measured data with missing entries. Many inversion algorithms assume that the input data is fully sampled and relatively noise-free, and produce poor results when these assumptions are violated. Given the multidimensional nature of real-world data, we propose a new low-rank optimization method on the smooth manifold of Hierarchical Tucker tensors. Tensors that exhibit this low-rank structure can be recovered by solving this nonconvex program in an efficient manner. We successfully interpolate realistically sized seismic data volumes using this approach.
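The low-rank completion idea can be illustrated in its simplest setting. The following sketch is a matrix (rank-1) analogue, not the Hierarchical Tucker method from the thesis: it recovers a low-rank array from a subset of its entries by alternating least squares, where each factor is updated using only the observed entries. All sizes and sampling rates are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# a rank-1 "data volume" (a matrix stand-in for a low-rank tensor)
u_true = rng.normal(size=8)
v_true = rng.normal(size=6)
X = np.outer(u_true, v_true)

mask = rng.random(X.shape) < 0.7       # ~70% of entries observed
obs = np.where(mask, X, 0.0)

# alternating least squares restricted to the observed entries
u = rng.normal(size=8)
v = rng.normal(size=6)
for _ in range(100):
    for i in range(len(u)):            # fix v, update each u[i]
        m = mask[i]
        if m.any():
            u[i] = obs[i, m] @ v[m] / (v[m] @ v[m])
    for j in range(len(v)):            # fix u, update each v[j]
        m = mask[:, j]
        if m.any():
            v[j] = obs[m, j] @ u[m] / (u[m] @ u[m])

rel_err = np.linalg.norm(np.outer(u, v) - X) / np.linalg.norm(X)
print(rel_err)
```

The tensor case replaces the two factors with a tree of factor matrices (the Hierarchical Tucker format) and optimizes over the corresponding smooth manifold, but the principle of fitting only the observed entries is the same.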
If our low-rank tensor is corrupted with non-Gaussian noise, the resulting optimization program can be formulated as a convex-composite problem. This class of problems involves minimizing a nonsmooth but convex objective composed with a nonlinear smooth mapping. In this thesis, we develop a level set method for solving convex-composite problems and prove that the resulting subproblems converge linearly. We demonstrate that this method is competitive when applied to examples in noisy tensor completion, analysis-based compressed sensing, audio declipping, total-variation deblurring and denoising, and one-bit compressed sensing.
With respect to solving the inverse problem itself, we introduce a new software design framework that manages the cognitive complexity of the various components involved. Our framework is modular by design, which enables us to easily integrate and replace components such as linear solvers, finite difference stencils, preconditioners, and parallelization schemes. As a result, a researcher using this framework can formulate her algorithms with respect to high-level components such as objective functions and Hessian operators. We showcase the ease with which one can prototype such algorithms in a 2D test problem and, with little code modification, apply the same method to large-scale 3D problems.

Courant Institute of Mathematical Sciences, New York University

Tue 29 Aug 2017, 12:30pm–1:30pm
Scientific Computation and Applied & Industrial Mathematics
ESB 4133 (PIMS Lounge)

Nonsmooth, Nonconvex Optimization: Algorithms and Examples

Abstract
In many applications one wishes to minimize an objective function that is not convex and is not differentiable at its minimizers. We discuss two algorithms for minimization of nonsmooth, nonconvex functions. Gradient Sampling is a simple method that, although computationally intensive, has a nice convergence theory. The method is robust, and the convergence theory has recently been extended to constrained problems. BFGS is a well-known method, developed for smooth problems, but which is remarkably effective for nonsmooth problems too. Although our theoretical results in the nonsmooth case are quite limited, we have made some remarkable empirical observations and have had broad success in applications. Limited Memory BFGS is a popular extension for large problems, and it is also applicable to the nonsmooth case, although our experience with it is more mixed. Throughout the talk we illustrate the ideas through examples, some very easy and some very challenging. This is joint work with Jim Burke (U. Washington) and Adrian Lewis (Cornell).
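The Gradient Sampling idea can be sketched in one dimension, where it becomes especially simple: the minimum-norm element of the convex hull of the sampled gradients is just the point of an interval closest to zero. The toy objective, sampling radius, and stopping rule below are illustrative assumptions, not the speakers' implementation:

```python
import random

def f(x):
    return abs(x) + 0.5 * x * x          # nonsmooth at its minimizer x = 0

def grad(x):
    return (1.0 if x > 0 else -1.0) + x  # a subgradient of f

def gradient_sampling(x0, eps=0.1, steps=50, m=10, seed=0):
    """Minimize f by gradient sampling; minimal 1-D sketch."""
    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        # gradients at the current point and at m randomly sampled nearby points
        gs = [grad(x)] + [grad(x + rng.uniform(-eps, eps)) for _ in range(m)]
        lo, hi = min(gs), max(gs)
        # minimum-norm element of the convex hull (an interval in 1-D)
        g = 0.0 if lo <= 0.0 <= hi else (lo if lo > 0.0 else hi)
        if g == 0.0:
            eps *= 0.5                   # approximate stationarity: shrink radius
            continue
        t = 1.0                          # backtracking (Armijo) line search
        while f(x - t * g) > f(x) - 0.5 * t * g * g and t > 1e-12:
            t *= 0.5
        if t <= 1e-12:
            break                        # no sufficient decrease found
        x -= t * g
    return x

print(gradient_sampling(3.0))
```

In higher dimensions the minimum-norm step requires solving a small quadratic program over the convex hull of the sampled gradients, which is the computationally intensive part the abstract alludes to.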


Note for Attendees
Latecomers will not be seated.