Mean Shift Segmentation Refactoring

From OTBWiki
Revision as of 16:26, 9 May 2012 by Jonathan guinet (Talk | contribs) (Command line example)



Mean Shift is a non-parametric kernel density mode estimation technique for feature space analysis. It was first introduced by Fukunaga and Hostetler, and later applied to computer vision by Comaniciu and Meer. This filter has interesting properties:

  • no a priori knowledge of the number of clusters is needed,
  • convergence is guaranteed,
  • ...

It has therefore become popular in many vision fields such as segmentation, visual tracking, and any algorithm that requires mode seeking and clustering. The best-known C++ implementation is the EDISON system (Edge Detection and Image SegmentatiON); a presentation of this code was made by C. Christoudias. The OTB 3.12 MeanShift filter relies on this library. A new filter framework will be available in the 3.14 release.
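The core iteration behind all of these implementations can be sketched in a few lines. The following standalone C++ sketch (illustrative names, not OTB or EDISON code) runs the mean shift update with a uniform kernel on a 1-D sample set until the shift falls below a threshold:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// One mean shift step with a uniform (flat) kernel of the given bandwidth:
// the new position is the mean of all samples inside the window around x.
double meanShiftStep(const std::vector<double>& samples, double x, double bandwidth) {
    double sum = 0.0;
    int count = 0;
    for (double s : samples) {
        if (std::abs(s - x) <= bandwidth) { sum += s; ++count; }
    }
    return count > 0 ? sum / count : x;
}

// Iterate until the displacement falls below a threshold: x converges to a
// mode, i.e. a local maximum of the kernel density estimate.
double findMode(const std::vector<double>& samples, double x,
                double bandwidth, double threshold = 1e-6, int maxIter = 100) {
    for (int i = 0; i < maxIter; ++i) {
        double next = meanShiftStep(samples, x, bandwidth);
        if (std::abs(next - x) < threshold) return next;
        x = next;
    }
    return x;
}
```

Starting points in different basins of attraction converge to different modes; in an image filter, the same update is applied jointly in the spatial and spectral domains.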


The need for a new framework

It has been decided to create a new MeanShift class, which will be available in the 3.14 release, for the following reasons:

  • The EDISON-based mean shift does not support threading, so it is not well adapted to the new OBIA scheme.
  • The MeanShift filter is a black box, which can be very opaque for users who want to customize the filter.
  • The EDISON segmentation and labelization scheme is well adapted to image segmentation, denoising, etc. However, some specificities of spatial imagery make it less suitable for this domain:
    • For example, the fusion step (detailed in the EDISON Based implementation sub-section below) may aggregate a small region with an adjacent region whose spectral value is very different. A user may instead want to leave such small regions unfused and, for example, assign them a "Background" or "NaN" label value.

The next section presents the computational and internal differences between the old and refactored classes.

Computation Differences

EDISON Based implementation

EDISON-based MeanShift filtering (a detailed description of the filter can be found in D. Comaniciu's article) has two main steps:

  • Filtering: apply the mean shift core algorithm itself to each image pixel until convergence. After this step, the filtered data is available through the GetOutput() method.
  • Segmentation and Labelization:
    • A label image is first created from the clustered output by aggregating pixels that belong to the same mode. The GetClusteredOutput() method gives the spectral image after mode clustering.
    • A fusion step is then applied: adjacent regions whose spectral value difference is under the spectral range are fused.
    • Finally, regions that are too small (size under the region minimum area threshold) are aggregated with the nearest adjacent spectral region. The GetLabeledClustered() method gives the segmentation output; GetClusteredBoundaries() gives the region boundaries image.
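The fusion step above can be illustrated with a small standalone sketch (illustrative names, not the EDISON API): each region carries a mean spectral value, and a union-find structure merges adjacent regions whose values differ by less than the range radius.

```cpp
#include <cassert>
#include <cmath>
#include <numeric>
#include <utility>
#include <vector>

// Minimal union-find over integer region labels, with path compression.
struct UnionFind {
    std::vector<int> parent;
    explicit UnionFind(int n) : parent(n) { std::iota(parent.begin(), parent.end(), 0); }
    int find(int x) { return parent[x] == x ? x : parent[x] = find(parent[x]); }
    void unite(int a, int b) { parent[find(a)] = find(b); }
};

// Fuse adjacent regions whose mean spectral values differ by less than
// rangeRadius; returns the fused label of every input region.
std::vector<int> fuseRegions(const std::vector<double>& means,
                             const std::vector<std::pair<int,int>>& adjacency,
                             double rangeRadius) {
    UnionFind uf(static_cast<int>(means.size()));
    for (const auto& e : adjacency)
        if (std::abs(means[e.first] - means[e.second]) < rangeRadius)
            uf.unite(e.first, e.second);
    std::vector<int> labels(means.size());
    for (std::size_t i = 0; i < means.size(); ++i)
        labels[i] = uf.find(static_cast<int>(i));
    return labels;
}
```

With means {10, 12, 50}, adjacency (0,1) and (1,2), and a range radius of 5, regions 0 and 1 are fused while region 2 stays separate.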

New framework (WIP)

Current development can be found in the MeanShiftFilter2 class.

The filtering step relies on classic mean shift filtering with the mode optimization used in EDISON, which avoids recomputing a pixel's mode convergence when the convergence path has already been calculated (if two pixels follow the same path, they converge to the same spectral value). Several outputs are provided:

  • Range output: (by calling GetRangeOutput()) spectral value image after filter convergence.
  • Spatial output: (by calling GetSpatialOutput()) pixel displacement map after filter convergence. May be renamed to Displacement output.
  • Metric output: (by calling GetMetricOutput()) mean shift vector image.
  • Iteration output: (by calling GetIterationOutput()) iteration counter map.
  • Label output: (by calling GetLabelOutput()) label image created by assigning one label to each mode.
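The mode optimization mentioned above (reusing already-computed convergence paths) can be sketched as follows. This is a simplified 1-D illustration with hypothetical names, not the MeanShiftFilter2 implementation: every position visited on the way to a mode is cached, and a later start that reaches any cached position stops immediately and reuses the stored mode.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <vector>

// Quantize a position so visited locations can serve as cache keys.
static long key(double x, double step = 1e-3) { return std::lround(x / step); }

// Mean shift with path caching: on a cache hit the stored mode is reused and
// propagated to the current path; on convergence the whole path is cached.
double findModeCached(const std::vector<double>& samples, double x,
                      double bandwidth, std::map<long, double>& cache,
                      double threshold = 1e-6, int maxIter = 100) {
    std::vector<long> path;
    for (int i = 0; i < maxIter; ++i) {
        auto hit = cache.find(key(x));
        if (hit != cache.end()) {                 // path already computed
            for (long k : path) cache[k] = hit->second;
            return hit->second;
        }
        path.push_back(key(x));
        double sum = 0.0;
        int count = 0;
        for (double s : samples)
            if (std::abs(s - x) <= bandwidth) { sum += s; ++count; }
        double next = count > 0 ? sum / count : x;
        if (std::abs(next - x) < threshold) {
            for (long k : path) cache[k] = next;  // store mode along the path
            return next;
        }
        x = next;
    }
    return x;
}
```

A second pixel whose trajectory crosses a cached position terminates without any further window computations, which is where the speed-up comes from on real images.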

Segmentation and labelization then follow dedicated steps:

  • A label image is created with one label per mode. This step results in an over-segmented image.
  • The label image is transformed into a label map with adjacency. Each label object is represented by its mode value and its spectral value.
  • An adjacency matrix is created over the label objects (two label objects have to be fused if they are adjacent and their spectral value difference is under the range value).
  • Objects are fused using the adjacency matrix.
  • A minimum object size strategy is applied (WIP): a background label is assigned to each label object under the minimum object size threshold.
  • The resulting label map is transformed back into a label image.
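The minimum-size strategy above, which distinguishes the new framework from EDISON's forced fusion of small regions, can be sketched as follows (illustrative names, not OTB code):

```cpp
#include <cassert>
#include <map>
#include <vector>

// Instead of fusing small regions with their nearest neighbour (the EDISON
// behaviour), assign a dedicated background label to every region whose
// pixel count falls below minRegionSize.
std::vector<int> relabelSmallRegions(const std::vector<int>& labelImage,
                                     int minRegionSize, int backgroundLabel) {
    std::map<int, int> sizes;
    for (int l : labelImage) ++sizes[l];          // count pixels per label
    std::vector<int> out(labelImage.size());
    for (std::size_t i = 0; i < labelImage.size(); ++i)
        out[i] = sizes[labelImage[i]] < minRegionSize ? backgroundLabel
                                                      : labelImage[i];
    return out;
}
```

This preserves the spectral identity of the remaining regions: a one-pixel region is marked as background rather than merged into a neighbour with a very different spectral value.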
Command line example

The following command line launches the new mean shift filter test with a spatial radius of 4 pixels, a range radius of 30, a threshold of 0.1, and a maximum of 100 iterations.

otbMeanShiftImageFilter2 ${INPUTDATA}/qb_RoadExtract.img ImageFilterSpatialOutput.tif FilterSpectralOutput.tif FilterMetricOutput.tif FilterIterationOutput.tif FilterLabelOutput.tif 4 30 0.1 100

Results

This example illustrates the filter outputs (input image and spectral output):

[Figures: Nantes Extrait.jpg (input image), Nantes Extrait MS.jpg (spectral output)]


TODO List

  • Implement the complete segmentation step from the filtered output
  • Implement new kernels: Gaussian, Laplacian, Epanechnikov (only Uniform currently) and metrics (only the L2 norm currently)
  • Create a MeanShift application