
Segmentation example: nonlocal diffusion of user marks

G. Gilboa, S. Osher, “Nonlocal linear image regularization and supervised segmentation”, SIAM Multiscale Mod. Simul. (MMS), Vol. 6, No. 2, pp. 595–630, 2007.


A nonlocal quadratic functional of weighted differences is examined. The weights are based on image features and represent the affinity between different pixels in the image. By prescribing different formulas for the weights, one can generalize many local and nonlocal linear denoising algorithms, including the nonlocal means filter and the bilateral filter. In this framework one can easily show that continuous iterations of the generalized filter obey certain global characteristics and converge to a constant solution. The linear operator associated with the Euler-Lagrange equation of the functional is closely related to the graph Laplacian. We can thus interpret the steepest descent for minimizing the functional as a nonlocal diffusion process. This formulation provides a convenient framework for nonlocal variational minimizations, including variational denoising, Bregman iterations and the recently proposed inverse scale space method.
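The link between the quadratic functional, the graph Laplacian, and the diffusion interpretation can be sketched in a few lines. The following is a minimal illustration, not the paper's code: it uses a simple intensity-based Gaussian affinity (the parameter `h` and the 1-D signal are assumptions for demonstration; the paper's weights are patch-based), builds the Laplacian L = D − W, and runs explicit steepest descent u ← u − dt·Lu.

```python
import numpy as np

def affinity_weights(f, h=0.3):
    """Affinity w(x,y) = exp(-|f(x)-f(y)|^2 / h^2).
    A simplified intensity-based stand-in for patch-based weights."""
    d = f[:, None] - f[None, :]
    W = np.exp(-(d ** 2) / (h ** 2))
    np.fill_diagonal(W, 0.0)  # no self-affinity
    return W

def nonlocal_diffusion(u, W, dt=0.05, steps=200):
    """Steepest descent on the quadratic functional:
    u_{k+1} = u_k - dt * L u_k, with graph Laplacian L = D - W."""
    L = np.diag(W.sum(axis=1)) - W
    for _ in range(steps):
        u = u - dt * (L @ u)
    return u
```

Because the row sums of L vanish, each step preserves the mean of u, while the flow smooths within groups of mutually similar pixels and, run long enough, tends toward a constant, consistent with the convergence statement above. The step size dt must be small relative to the largest eigenvalue of L for the explicit scheme to be stable.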

It is also demonstrated how the steepest descent flow can be used for segmentation. Following kernel-based methods in machine learning, the generalized diffusion process propagates the user's sparse initial markings to the entire image. Unlike classical variational segmentation methods, the process is not explicitly based on a curve-length energy and can therefore cope well with highly non-convex shapes and corners, while retaining reasonable robustness to noise.
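The mark-propagation idea can be sketched as follows. This is a simplified illustration under assumed choices (1-D intensity features, Gaussian affinity with parameter `h`, labels ±1), not the paper's implementation: seeds marked by the user are held fixed as Dirichlet data while the nonlocal diffusion fills in the unlabeled pixels, and the sign of the result gives the segmentation.

```python
import numpy as np

def propagate_marks(f, seeds, h=0.3, dt=0.05, steps=500):
    """Supervised segmentation sketch: diffuse user marks nonlocally.
    f     : 1-D feature/intensity vector
    seeds : dict {pixel index: label}, labels in {-1, +1}"""
    d = f[:, None] - f[None, :]
    W = np.exp(-(d ** 2) / (h ** 2))       # intensity affinity (assumed form)
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W         # graph Laplacian
    u = np.zeros_like(f)
    idx = np.array(list(seeds.keys()))
    vals = np.array([seeds[i] for i in idx], dtype=float)
    u[idx] = vals
    for _ in range(steps):
        u = u - dt * (L @ u)
        u[idx] = vals                      # keep user marks fixed
    return np.sign(u)                      # thresholded labels
```

Since no curve-length term is involved, the propagation follows pixel affinities rather than boundary smoothness, which is why highly non-convex regions pose no particular difficulty.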


Matlab Code for non-local diffusion