Bibliographic Metadata

Title: Making texture descriptors invariant to blur
Author: Gadermayr, Michael; Uhl, Andreas
Published in: EURASIP Journal on Image and Video Processing, London, 2016, Vol. 2016, Issue 14, pp. 1-9
Published: Springer Open, 2016
Language: English
Document type: Journal Article
Keywords (EN): Feature extraction / Invariance / Robustness / Texture recognition
Project/Report number: FWF 24366
ISSN: 1687-5281
URN: urn:nbn:at:at-ubs:3-4715 (Persistent Identifier)
DOI: 10.1186/s13640-016-0116-7
Restriction information: The work is publicly available
Files
Making texture descriptors invariant to blur [1.44 MB]
Abstract (English)

Besides high distinctiveness, robustness (or invariance) to image degradations is highly desirable for texture feature extraction methods in real-world applications. In this paper, the focus is on making arbitrary texture descriptors invariant to blur, which is often prevalent in real image data. From previous work, we know that most state-of-the-art texture feature extraction methods are unable to cope even with minor blur degradations if the classifier's training stage is based on idealized data. However, if the training set suffers similarly from the degradations, the obtained accuracies are significantly higher. Exploiting this knowledge, the proposed approach increases the level of blur in each image to a certain threshold, based on the estimate of a blur measure. Experiments with synthetically degraded data show that the method is able to achieve a high degree of blur invariance without losing too much distinctiveness. Finally, we show that our method is not limited to ideal Gaussian blur.
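To illustrate the idea described in the abstract, the following is a minimal sketch of blur equalization before feature extraction. The abstract does not specify the blur estimator or the smoothing schedule, so a variance-of-Laplacian sharpness proxy and incremental Gaussian filtering are assumed stand-ins; the function names (blur_measure, equalize_blur) and parameters (sigma_step, max_sigma) are hypothetical and not taken from the paper.

import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def blur_measure(image):
    # Sharpness proxy: variance of the Laplacian (higher = sharper).
    # Assumed stand-in for the paper's blur estimator.
    return laplace(image.astype(np.float64)).var()

def equalize_blur(image, target_sharpness, sigma_step=0.25, max_sigma=8.0):
    # Re-blur the original image with growing Gaussian sigma until its
    # estimated sharpness drops to (or below) the chosen threshold.
    blurred = image.astype(np.float64)
    sigma = 0.0
    while blur_measure(blurred) > target_sharpness and sigma < max_sigma:
        sigma += sigma_step
        blurred = gaussian_filter(image.astype(np.float64), sigma=sigma)
    return blurred

if __name__ == "__main__":
    # Bring a sharp image down to the blur level of a degraded one,
    # so both can be fed to the same texture descriptor afterwards.
    rng = np.random.default_rng(0)
    sharp = rng.random((128, 128))            # stand-in texture image
    degraded = gaussian_filter(sharp, sigma=2.0)
    target = blur_measure(degraded)           # threshold from the blurrier image
    equalized = equalize_blur(sharp, target)
    print(blur_measure(equalized), target)    # now comparable sharpness values

In practice the same equalization would be applied to both training and test images before extracting any texture descriptor, which is what makes the descriptor effectively blur-invariant under the assumptions above.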

License
Creative Commons Attribution 4.0 International License (CC BY 4.0)