Poisson Learning: Graph Based Semi-Supervised Learning At Very Low Label Rates

Event

Weekly OSI Lab Seminar

Short summary

In this seminar, I will talk about the paper “Poisson Learning: Graph Based Semi-Supervised Learning At Very Low Label Rates” (Calder et al., ICML 2020).

Abstract

(taken directly from the paper)

We propose a new framework, called Poisson learning, for graph based semi-supervised learning at very low label rates. Poisson learning is motivated by the need to address the degeneracy of Laplacian semi-supervised learning in this regime. The method replaces the assignment of label values at training points with the placement of sources and sinks, and solves the resulting Poisson equation on the graph. The outcomes are provably more stable and informative than those of Laplacian learning. Poisson learning is efficient and simple to implement, and we present numerical experiments showing the method is superior to other recent approaches to semi-supervised learning at low label rates on MNIST, FashionMNIST, and Cifar-10. We also propose a graph-cut enhancement of Poisson learning, called Poisson MBO, that gives higher accuracy and can incorporate prior knowledge of relative class sizes.
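
To make the idea concrete, here is a minimal sketch of the core Poisson learning step described above: place zero-mean sources at the labeled nodes and solve the graph Poisson equation with a simple fixed-point iteration. This is my own illustrative code, not the authors' reference implementation; the function name, the dense-matrix setup, and the fixed iteration count are assumptions made for readability.

```python
import numpy as np

def poisson_learning(W, labeled_idx, labels, n_classes, n_iter=500):
    """Illustrative sketch of Poisson learning on a graph.

    W           : (n, n) symmetric weight matrix of the similarity graph
    labeled_idx : indices of the labeled nodes
    labels      : integer class labels of those nodes
    n_classes   : number of classes
    """
    n = W.shape[0]
    d = W.sum(axis=1)  # node degrees

    # One-hot encode the labels and subtract their mean, so the labeled
    # nodes act as zero-mean sources and sinks; unlabeled rows stay zero.
    F = np.zeros((n, n_classes))
    F[labeled_idx, labels] = 1.0
    F[labeled_idx] -= F[labeled_idx].mean(axis=0)

    # Solve the graph Poisson equation L u = F, with L = D - W,
    # using a Jacobi-style iteration: u <- u + D^{-1} (F - L u).
    U = np.zeros((n, n_classes))
    for _ in range(n_iter):
        LU = d[:, None] * U - W @ U
        U = U + (F - LU) / d[:, None]

    # Each node is assigned the class with the largest solution value.
    return U.argmax(axis=1)
```

This sketch omits the Poisson MBO graph-cut enhancement and the class-size priors discussed in the paper; it is only meant to show how labels become sources and sinks of a Poisson equation rather than boundary values of a Laplace equation.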

Papers

Papers discussed in the seminar:

  • Main: Jeff Calder, Brendan Cook, Matthew Thorpe, and Dejan Slepčev. Poisson Learning: Graph Based Semi-Supervised Learning At Very Low Label Rates. In ICML 2020.
  • Xiaojin Zhu, Zoubin Ghahramani, and John Lafferty. Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions. In ICML 2003.
  • Boaz Nadler, Nathan Srebro, and Xueyuan Zhou. Semi-Supervised Learning with the Graph Laplacian: The Limit of Infinite Unlabelled Data. In NIPS 2009.