Search-based automatic image annotation using geotagged community photos

Abstract: In the Web 2.0 era, platforms for sharing and collaboratively annotating images with keywords, called tags, became very popular. Tags are a powerful means for organizing and retrieving photos. However, manual tagging is time-consuming. Recently, the sheer amount of user-tagged photos available on the Web has encouraged researchers to explore new techniques for automatic image annotation. The idea is to annotate an unlabeled image by propagating the labels of community photos that are visually similar to it. Moreover, an ever-increasing number of community photos are also associated with location information, i.e., they are geotagged. In this thesis, we aim at exploiting this location context and propose an approach for automatically annotating geotagged photos. Our objective is to address the main limitations of state-of-the-art approaches in terms of the quality of the produced tags and the speed of the complete annotation process. To achieve these goals, we first deal with the problem of collecting images with their associated metadata from online repositories. Accordingly, we introduce a strategy for data crawling that takes advantage of location information and the social relationships among the contributors of the photos. To improve the quality of the collected user tags, we present a method for resolving their ambiguity based on tag-relatedness information. In this respect, we propose an approach for representing tags as probability distributions based on the Laplacian Score feature-selection algorithm. Furthermore, we propose a new metric for calculating the distance between tag probability distributions by extending the Jensen-Shannon Divergence to account for statistical fluctuations. To efficiently identify the visual neighbors, the thesis introduces two extensions to the state-of-the-art image-matching algorithm known as Speeded Up Robust Features (SURF).
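The abstract names the Jensen-Shannon Divergence (JSD) as the starting point of the proposed tag-distance metric but does not detail the extension for statistical fluctuations. As a point of reference only, here is a minimal sketch of the standard JSD between two tag probability distributions; the function names are illustrative, not from the thesis:

```python
import numpy as np

def _kl(p, q):
    """Kullback-Leibler divergence (base 2), restricted to the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jsd(p, q):
    """Standard Jensen-Shannon divergence between two tag distributions.

    Symmetric, bounded in [0, 1] with base-2 logarithms; the thesis extends
    this baseline to account for statistical fluctuations in the tag counts.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()      # normalize to probability distributions
    m = 0.5 * (p + q)                    # midpoint distribution
    return 0.5 * _kl(p, m) + 0.5 * _kl(q, m)
```

For identical distributions the divergence is 0; for distributions with disjoint support it reaches its maximum of 1.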
To speed up the matching, we present a solution for reducing the number of compared SURF descriptors based on classification techniques, while the accuracy of SURF is improved through an efficient method for iterative image matching. Furthermore, we propose a statistical model for ranking the mined annotations according to their relevance to the target image. This is achieved by combining multi-modal information in a statistical framework based on Bayes' Rule. Finally, the effectiveness of each of the mentioned contributions, as well as that of the complete automatic annotation process, is evaluated experimentally.
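The abstract only states that multi-modal information is combined via Bayes' Rule; the thesis's actual model is not reproduced here. The sketch below illustrates the general idea under a naive independence assumption between the modalities (visual and geographic evidence) — all names, factors, and probability tables are hypothetical:

```python
def rank_tags(tags, prior, vis_lik, geo_lik):
    """Rank candidate tags by posterior relevance via Bayes' Rule,
    assuming the modalities are conditionally independent given the tag:

        P(tag | visual, geo) ∝ P(tag) * P(visual | tag) * P(geo | tag)

    `prior`, `vis_lik`, and `geo_lik` map each tag to its prior probability
    and per-modality likelihoods (illustrative inputs, not the thesis model).
    """
    scores = {t: prior[t] * vis_lik[t] * geo_lik[t] for t in tags}
    z = sum(scores.values()) or 1.0       # normalizing constant
    return sorted(((t, s / z) for t, s in scores.items()),
                  key=lambda ts: -ts[1])  # most relevant tag first
```

For example, a tag with equal prior but stronger visual and geographic support would be ranked above its competitors.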
Document type: Theses

Cited literature: 182 references

https://tel.archives-ouvertes.fr/tel-01208008
Contributor: Abes Star
Submitted on: Thursday, October 1, 2015 - 4:37:08 PM
Last modification on: Friday, May 17, 2019 - 10:19:36 AM
Long-term archiving on: Saturday, January 2, 2016 - 11:30:30 AM

File

these.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-01208008, version 1

Citation

Hatem Mousselly Sergieh. Search-based automatic image annotation using geotagged community photos. Artificial Intelligence [cs.AI]. INSA de Lyon; Universität Passau, 2014. English. ⟨NNT : 2014ISAL0084⟩. ⟨tel-01208008⟩

