Bibliographic Metadata

Making texture descriptors invariant to blur
Author: Gadermayr, Michael; Uhl, Andreas
Published in: EURASIP Journal on Image and Video Processing, London, 2016, Vol. 2016, Issue 14, pp. 1–9
Published: Springer Open, 2016
Document type: Journal Article
Keywords (EN): Feature extraction / Invariance / Robustness / Texture recognition
Project/Report number: FWF 24366
URN: urn:nbn:at:at-ubs:3-4715 (Persistent Identifier)
The work is publicly available.
Abstract (English)

Besides high distinctiveness, robustness (or invariance) to image degradations is very desirable for texture feature extraction methods in real-world applications. In this paper, the focus is on making arbitrary texture descriptors invariant to blur, which is often prevalent in real image data. From previous work, we know that most state-of-the-art texture feature extraction methods are unable to cope with even minor blur degradations if the classifier's training stage is based on idealistic data. However, if the training set suffers similarly from the degradations, the obtained accuracies are significantly higher. Exploiting this knowledge, our approach increases the level of blur of each image to a certain threshold, based on the estimate of a blur measure. Experiments with synthetically degraded data show that the method is able to achieve a high degree of blur invariance without losing too much distinctiveness. Finally, we show that our method is not limited to ideal Gaussian blur.
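The core idea described in the abstract, equalizing the blur level of all images before feature extraction, can be sketched roughly as follows. This is an illustrative stand-in, not the paper's exact procedure: the blur measure here (variance of the Laplacian, a common sharpness estimate), the Gaussian blurring step, and all parameter names (`target_sharpness`, `sigma_step`, `max_sigma`) are assumptions chosen for the example.

```python
import numpy as np
from scipy import ndimage


def laplacian_variance(img):
    # Variance of the Laplacian: a common sharpness measure
    # (higher = sharper). The paper's actual blur measure may differ;
    # this is an illustrative stand-in.
    return ndimage.laplace(img.astype(float)).var()


def normalize_blur(img, target_sharpness, sigma_step=0.25, max_sigma=10.0):
    # Blur `img` with increasing Gaussian sigma until its sharpness
    # drops to `target_sharpness` (or `max_sigma` is reached), so that
    # all images end up at a comparable blur level before feature
    # extraction. Parameters and stopping rule are assumptions.
    out = img.astype(float)
    sigma = 0.0
    while laplacian_variance(out) > target_sharpness and sigma < max_sigma:
        sigma += sigma_step
        out = ndimage.gaussian_filter(img.astype(float), sigma)
    return out


# Usage: bring a sharp image down to the blur level of a blurrier one.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))                      # stand-in texture image
blurred = ndimage.gaussian_filter(sharp, 1.0)     # synthetically degraded copy
target = laplacian_variance(blurred)              # threshold from the blurrier image
equalized = normalize_blur(sharp, target)
```

After this step, descriptors computed on `equalized` and `blurred` see images at a similar degradation level, which is the condition under which, per the abstract, classification accuracies stay significantly higher.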

CC-BY License (4.0): Creative Commons Attribution 4.0 International License