
Views selection for SIFT based object modeling and recognition

BRUNO, Alessandro;
2016-01-01

Abstract

In this paper we focus on automatically learning object models in the framework of keypoint-based object recognition. The proposed method uses a collection of views of each object to build its model. For each object the collection consists of N×M views, obtained by rotating the object around its vertical and horizontal axes. Since keypoint-based recognition over the complete set of views is computationally expensive, we define a selection method that creates, for each object, a subset of the initial views that visually summarizes the characteristics of the object and is well suited for recognition. Views are selected at the maxima and minima of a function, based on the number of SIFT descriptors, that evaluates view similarity and relevance. Experimental results for recognition on a publicly available dataset are reported.
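
Since no files are attached to this record, the method can only be inferred from the abstract. Below is a minimal, hypothetical sketch of the view-selection idea it describes: score each view with a SIFT-based function and keep the views lying at local maxima and minima of that score. The use of OpenCV, the consecutive-view matching score, the Lowe ratio threshold of 0.75, and the function names (`sift_features`, `match_count`, `select_views`) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch only: the abstract states that views are selected at the
# maxima/minima of a function based on the number of SIFT descriptors; the
# concrete score used here (ratio-test matches between consecutive views) is
# an illustrative assumption, not the paper's definition.
import cv2


def sift_features(image):
    """Detect SIFT keypoints and descriptors on a grayscale view."""
    sift = cv2.SIFT_create()
    return sift.detectAndCompute(image, None)


def match_count(desc_a, desc_b, ratio=0.75):
    """Count SIFT matches that survive Lowe's ratio test."""
    if desc_a is None or desc_b is None or len(desc_a) < 2 or len(desc_b) < 2:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = 0
    for pair in matcher.knnMatch(desc_a, desc_b, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good += 1
    return good


def select_views(views):
    """Return indices of views lying at local extrema of the SIFT score.

    `views` is the flat list of N x M grayscale images obtained by rotating
    the object around its vertical and horizontal axes.
    """
    descriptors = [sift_features(v)[1] for v in views]
    # Score view i by how many of its descriptors match view i+1: high values
    # indicate redundancy with the neighbouring view, low values an abrupt
    # appearance change, so both kinds of extrema are kept as representative.
    scores = [match_count(descriptors[i], descriptors[i + 1])
              for i in range(len(views) - 1)]
    selected = []
    for i in range(1, len(scores) - 1):
        is_max = scores[i - 1] <= scores[i] >= scores[i + 1]
        is_min = scores[i - 1] >= scores[i] <= scores[i + 1]
        if is_max or is_min:
            selected.append(i)
    return selected


# Example usage (paths are placeholders):
# views = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in sorted(glob.glob("obj/*.png"))]
# model_views = [views[i] for i in select_views(views)]
```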
Year: 2016
ISBN: 9781509019298
Keywords:
Feature matching
Object Modeling
Object Recognition
SIFT
Media Technology
Signal Processing
Files in this item:
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11699/74335
Citations:
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 1