Abstract: In this talk, we present a geometric framework for learning and processing information from data. First, we introduce the feature geometry, which unifies statistical dependence and feature representations in a function space equipped with geometric structure. We then formulate each learning problem as solving for the optimal feature representation of the associated dependence component. In particular, we will show that deep neural networks provide one specific method for computing such representations. Building on this observation, we will propose more adaptable ways to design neural networks for multivariate learning tasks. We will discuss several learning applications, including (1) handling multimodal data with missing modalities and (2) learning dependence structures from sequential data.
[Based on https://arxiv.org/abs/2309.10140]
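To give a flavor of the "optimal feature representation of a dependence component" idea, here is a minimal sketch (an illustration added here, not code from the talk or the paper) for the discrete case: the maximal-correlation features of a pair (X, Y) can be read off from the SVD of the normalized joint distribution, the classical Hirschfeld-Gebelein-Renyi (HGR) construction that the feature-geometry framework generalizes.

```python
import numpy as np

# Joint pmf of a doubly symmetric binary source: P(X = Y) = 1 - eps.
eps = 0.1
P = np.array([[(1 - eps) / 2, eps / 2],
              [eps / 2, (1 - eps) / 2]])  # P[x, y]

Px = P.sum(axis=1)          # marginal of X
Py = P.sum(axis=0)          # marginal of Y
# Normalized dependence matrix B[x, y] = P(x, y) / sqrt(P(x) P(y))
B = P / np.sqrt(np.outer(Px, Py))
U, s, Vt = np.linalg.svd(B)

# s[0] = 1 is the trivial constant mode; s[1] is the HGR maximal correlation.
# The corresponding singular vectors give the optimal 1-D feature functions:
f_star = U[:, 1] / np.sqrt(Px)   # feature of X
g_star = Vt[1, :] / np.sqrt(Py)  # feature of Y
print(s)  # -> [1.0, 0.8] for eps = 0.1
```

For this symmetric binary example the maximal correlation equals 1 - 2*eps = 0.8; in the framework above, neural networks can be viewed as computing such dominant singular modes for high-dimensional data where the joint distribution is not available in closed form.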
Speaker bio: Xiangxiang Xu is a postdoctoral associate in the Department of EECS at MIT, hosted by Prof. Lizhong Zheng. His research focuses on information theory and statistical learning, with applications in understanding and developing learning algorithms.