Seminar, Namrata Vaswani

Dec 8, 2025 - 11:00 AM

Speaker: Namrata Vaswani, Professor of Electrical and Computer Engineering, Iowa State University

Title: ML for CyMath and (Cy)Math for ML: Alternating GD and Minimization for Secure Federated Low Rank Matrix Learning

Abstract: This talk will consist of two parts. The first, “ML for Math,” describes our CyMath program’s ML-enabled K-12 math tutoring and support; many CyMath tutors are Statistics graduate students (https://cymath.iastate.edu/). The second, “Math for ML,” describes my group’s mathematical ML research on the AltGDmin algorithm and its Byzantine-resilient distributed extension.

ML for CyMath: Strengthening students’ early math skills is critical for the future of statistics and all STEM professions. We discuss ways in which Stat/STEM professionals can help: tutoring and encouraging math practice at home (an ML-enabled math learning application, ALEKS, makes this task easier and more reliable); raising awareness of the need for, and resources supporting, early math skills; and advocating for better K-12 math policies. We argue that some current policies, based on short-term research, should be critically re-examined from a long-term (college STEM) student-success perspective.

(Cy)Math for ML: Modern distributed and federated learning systems are vulnerable to various kinds of adversarial attacks. Byzantine attacks are among the most difficult to defend against: they are model-update poisoning attacks (they poison the algorithm iterates of the attacked nodes), the adversarial nodes are omniscient, and the nodes can collude. We introduce provably Byzantine-resilient algorithms for solving three vertically federated low-rank (LR) matrix learning problems, LR Matrix Completion, LR Column-wise Sensing, and LR Phase Retrieval, all of which involve solving a partly-decoupled optimization problem and dealing with data heterogeneity across nodes. These problems find important applications in recommender system design, multi-task representation learning for few-shot learning, federated sketching, accelerated dynamic MRI, and Fourier ptychography.
Our proposed algorithms, Byz-AltGDmin, are provably Byzantine-resilient modifications of Alternating GD and Minimization (AltGDmin). AltGDmin, introduced in our recent work, is a novel, faster, and more communication-efficient alternative to Alternating Minimization (AltMin) for partly-decoupled optimization problems: problems in which the optimization variables can be split into two subsets such that the optimization over at least one subset, with the other held fixed, decouples. If time permits, we will also show real-data experimental results on the advantage (speed and generality) of AltGDmin-based methods over the existing state of the art in dynamic MRI.
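To illustrate the alternating structure described above, the following is a minimal, hedged sketch of an AltGDmin-style iteration on a toy LR column-wise sensing instance. All problem sizes, variable names, the random initialization, and the step-size choice here are illustrative assumptions, not the algorithm as specified in the speaker's papers (which, for example, use a spectral initialization and carefully chosen step sizes); it only shows the key idea: the columns of one factor are updated by decoupled least-squares solves, while the other factor takes a single gradient step followed by re-orthonormalization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy LR column-wise sensing instance (sizes are illustrative):
# recover X* = U* B* (n x q, rank r) from per-column measurements
# y_k = A_k @ X*[:, k], with a separate m x n sensing matrix A_k per column.
n, q, r, m = 20, 30, 2, 60
Ustar, _ = np.linalg.qr(rng.standard_normal((n, r)))
Bstar = rng.standard_normal((r, q))
Xstar = Ustar @ Bstar
A = rng.standard_normal((q, m, n))      # A[k] is the sensing matrix for column k
Y = np.einsum('kmn,nk->mk', A, Xstar)   # Y[:, k] = A_k @ x_k

# AltGDmin-style loop: alternate (a) decoupled least-squares updates for the
# columns of B (the "minimization" half) and (b) one gradient-descent step on
# U (the "GD" half), then re-orthonormalize U via QR.
U, _ = np.linalg.qr(rng.standard_normal((n, r)))  # random init (assumption)
for _ in range(100):
    # (a) Minimization: each b_k solves a small least-squares problem
    # independently of the others -- the "partly decoupled" structure.
    B = np.stack([np.linalg.lstsq(A[k] @ U, Y[:, k], rcond=None)[0]
                  for k in range(q)], axis=1)
    # (b) One GD step on U for the loss sum_k ||A_k U b_k - y_k||^2.
    grad = np.zeros((n, r))
    for k in range(q):
        resid = A[k] @ U @ B[:, k] - Y[:, k]
        grad += np.outer(A[k].T @ resid, B[:, k])
    # Heuristic step size scaled by an estimate of the gradient's
    # Lipschitz constant, m * sigma_max(B)^2 (an assumption for this toy).
    eta = 0.9 / (m * np.linalg.norm(B, 2) ** 2)
    U, _ = np.linalg.qr(U - eta * grad)

err = np.linalg.norm(U @ B - Xstar) / np.linalg.norm(Xstar)
print(err)
```

Per iteration, the minimization half costs q small least-squares solves that could run in parallel across nodes holding different columns, and only the small n x r factor U needs to be communicated, which is the source of the communication efficiency claimed for AltGDmin over AltMin.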