The Department of Applied Mathematics is pleased to host this series of colloquium lectures, funded in part by a generous gift from the Boeing Company. This series will bring to campus prominent applied mathematicians from around the world.
Title: Computational Hypergraph Discovery, a framework for connecting the dots
Abstract: Function approximation can be categorized into three levels of complexity.

Type 1: Approximate an unknown function given (possibly noisy) input/output data.

Type 2: Consider a collection of variables and functions indexed by the nodes and hyperedges of a hypergraph (a generalization of a graph in which edges can connect more than two vertices). Assume some of the functions are unknown. Given multiple samples from subsets of the variables of the hypergraph (satisfying the functional dependencies imposed by its structure), approximate all the unobserved variables and unknown functions of the hypergraph.

Type 3: Expanding on Type 2, if the hypergraph structure itself is unknown, use partial observations (of subsets of the variables of the hypergraph) to uncover its structure (the hyperedges and potentially missing vertices) and then approximate its unknown functions and unobserved variables.

Numerical approximation, Supervised Learning, and Operator Learning can all be formulated as Type 1 problems (with functional input/output spaces). Type 2 problems include solving and learning (possibly stochastic) ordinary or partial differential equations, Deep Learning, dimension reduction, reduced-order modeling, system identification, closure modeling, etc. The scope of Type 3 problems extends well beyond that of Type 2 and includes applications involving model/equation/network discovery and reasoning with raw data. While most problems in Computational Science and Engineering (CSE) and Scientific Machine Learning (SciML) can be framed as Type 1 or Type 2 challenges, many problems in science can only be categorized as Type 3 problems. Despite their prevalence, these Type 3 challenges have been largely overlooked due to their inherent complexity.
Although Gaussian Process (GP) methods are sometimes perceived as a well-founded but old technology limited to curve fitting (Type 1 problems), a generalization of these methods holds the key to an interpretable framework for solving Type 2 and Type 3 problems, one that inherits the simple and transparent theoretical and computational guarantees of kernel/optimal recovery methods. This will be the topic of our presentation.
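For readers unfamiliar with the Type 1 baseline the abstract starts from, the following is a minimal sketch of GP curve fitting (not the speaker's hypergraph framework): it conditions a zero-mean GP prior with an assumed RBF kernel and noise level on noisy input/output samples to recover an unknown function.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel matrix between 1-D point sets A and B.
    d2 = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    # Type 1 problem: approximate an unknown function from noisy data.
    # Standard GP regression posterior mean and pointwise variance,
    # obtained by conditioning a zero-mean GP prior on the observations.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train)
    K_ss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)
    return mean, np.diag(cov)

# Noisy samples of an unknown function (here sin), then recover it.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 20)
y = np.sin(X) + 0.05 * rng.standard_normal(20)
mean, var = gp_posterior(X, y, X)
```

The kernel choice and hyperparameters here are illustrative assumptions; the talk's point is that a generalization of this conditioning machinery extends beyond such curve fitting to Type 2 and Type 3 problems.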
YouTube: Watch the talk online here.