Seminar Talk

Presentation from EPFL

When: 10 February 2022, 17:00-18:30 UTC+1.

Abstract: This presentation will consist of two talks.

First speaker: Raphael Reinauer
Title: Persformer: A Transformer Architecture for Topological Machine Learning
Abstract: One of the main challenges of Topological Data Analysis (TDA) is to extract features from persistence diagrams that are directly usable by machine learning algorithms. Indeed, persistence diagrams are intrinsically (multi-)sets of points in R^2 and cannot be viewed as vectors in a straightforward way. In this talk, I will introduce Persformer, the first Transformer neural network architecture that accepts persistence diagrams as input. The Persformer architecture significantly outperforms previous topological neural network architectures on classical synthetic benchmark datasets. Moreover, it satisfies a universal approximation theorem, which allows us to introduce the first interpretability method for topological machine learning; I will explore this method in two examples. This is joint work with Matteo Caorsi and Nicolas Berkouk.
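
To make the input format concrete, below is a minimal sketch (in PyTorch, which is an assumption about tooling; this is not the authors' Persformer code) of a Transformer encoder that consumes a persistence diagram as a variable-size set of (birth, death) points. Omitting positional encodings keeps the attention layers permutation-equivariant, and mean pooling then yields a permutation-invariant vector for a classifier; all layer sizes are illustrative.

import torch
import torch.nn as nn

class DiagramTransformer(nn.Module):
    # Illustrative set-style Transformer for persistence diagrams (hypothetical, not Persformer).
    def __init__(self, d_model=64, nhead=4, num_layers=2, num_classes=2):
        super().__init__()
        self.embed = nn.Linear(2, d_model)  # lift each (birth, death) pair to d_model dimensions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, diagrams):
        # diagrams: (batch, n_points, 2); no positional encoding, so the
        # encoder treats the points as an unordered set
        x = self.encoder(self.embed(diagrams))
        return self.head(x.mean(dim=1))  # mean pooling gives a permutation-invariant summary

model = DiagramTransformer()
example_diagram = torch.rand(1, 30, 2)  # one diagram with 30 (birth, death) points
logits = model(example_diagram)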

Second speaker: Darrick Lee
Title: Signatures, Lipschitz-free Spaces, and Paths of Persistence Diagrams
Abstract: Paths of persistence diagrams describe the temporally evolving topology of dynamic point cloud datasets. As with static diagrams, such objects are difficult to work with in a machine learning setting, and a feature map is often required. The path signature provides a reparametrization-invariant feature map that is both universal and characteristic, allowing us to study both functions and measures on the space of paths of persistence diagrams via kernel methods. We explore the theoretical and computational aspects of using path signatures to study such paths, and apply this framework to a parameter estimation problem for models of collective motion. This is joint work with Chad Giusti.
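
For readers unfamiliar with the path signature, the standard definition (a general fact, not specific to this talk) is the collection of iterated integrals of the path: for a path X: [0,T] -> R^d,

S(X)^{(i_1, ..., i_k)} = \int_{0 < t_1 < ... < t_k < T} dX^{i_1}_{t_1} \cdots dX^{i_k}_{t_k},

taken over all levels k >= 1 and indices i_j in {1, ..., d}. This collection is unchanged when the path is reparametrized in time, which is the invariance referred to above.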
