Event
Weekly OptiML Lab Group Meeting
Short summary
In this seminar, I will present our paper “Fast and Efficient MMD-based Fair PCA via Optimization over Stiefel Manifold” (Lee et al., AAAI 2022).
Abstract
(taken directly from the paper)
This paper defines fair principal component analysis (PCA) as minimizing the maximum mean discrepancy (MMD) between dimensionality-reduced conditional distributions of different protected classes. The incorporation of MMD naturally leads to an exact and tractable mathematical formulation of fairness with good statistical properties. We formulate the problem of fair PCA subject to MMD constraints as a non-convex optimization over the Stiefel manifold and solve it using the Riemannian Exact Penalty Method with Smoothing (REPMS; Liu and Boumal, 2019). Importantly, we provide local optimality guarantees and explicitly show the theoretical effect of each hyperparameter in practical settings, extending previous results. Experimental comparisons based on synthetic and UCI datasets show that our approach outperforms prior work in explained variance, fairness, and runtime.
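As a rough illustration of the fairness criterion above, the following minimal sketch computes an empirical squared MMD (Gaussian kernel) between two protected groups after projection by an orthonormal matrix, i.e. a point on the Stiefel manifold. This is not the paper's REPMS solver; the kernel bandwidth, sample sizes, and data here are placeholder assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian kernel matrix from pairwise squared distances
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # biased empirical estimate of squared MMD between samples X and Y
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))         # placeholder samples, protected group A
Y = rng.normal(size=(120, 5)) + 0.5   # placeholder samples, protected group B

# a point on the Stiefel manifold: 5x2 matrix with orthonormal columns via QR
V, _ = np.linalg.qr(rng.normal(size=(5, 2)))

# fairness term: squared MMD between the dimensionality-reduced
# conditional distributions of the two groups
fairness_gap = mmd2(X @ V, Y @ V)
```

In the paper this quantity enters as a constraint while maximizing explained variance over the manifold; here it is only evaluated at a random feasible point.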
Papers
Paper discussed in the seminar:
- Main: Junghyun Lee, Gwangsu Kim, Matt Olfat, Mark Hasegawa-Johnson, Chang D. Yoo. “Fast and Efficient MMD-based Fair PCA via Optimization over Stiefel Manifold.” In AAAI 2022.