“Interactively browsing large image collections” by Richter, Eitz and Alexa

  • © Ronald Richter, Mathias Eitz, and Marc Alexa

Title:

    Interactively browsing large image collections

Abstract:


    Manually locating an image in a large collection has become infeasible given the rapid recent growth of such collections. Nowadays, even private collections easily contain tens of thousands of images; public collections have long passed the billion-image mark. Current approaches for finding images in large collections therefore try to narrow the candidate set by returning only those images that match certain properties defined by a query. Such properties can include keywords, semantic information associated with the images, similarity to an example image, a rough sketch of the desired outlines, or any combination thereof.
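
    The "similarity to an example image" strategy above is typically realized as a nearest-neighbor search over per-image feature vectors. The sketch below is a minimal, hypothetical illustration of that idea rather than the paper's own method: a coarse color histogram stands in for a real descriptor such as GIST, and random arrays stand in for an actual image collection.

        import numpy as np

        def color_histogram(image, bins_per_channel=8):
            # Concatenate one histogram per color channel and normalize to unit sum.
            hists = [np.histogram(image[..., c], bins=bins_per_channel, range=(0, 256))[0]
                     for c in range(image.shape[-1])]
            h = np.concatenate(hists).astype(np.float64)
            return h / max(h.sum(), 1.0)

        def k_nearest(query, descriptors, k=5):
            # Indices of the k descriptors with the smallest Euclidean distance to the query.
            distances = np.linalg.norm(descriptors - query, axis=1)
            return np.argsort(distances)[:k]

        # Placeholder data: 1,000 random 64x64 RGB "images" standing in for a collection.
        rng = np.random.default_rng(0)
        images = rng.integers(0, 256, size=(1000, 64, 64, 3), dtype=np.uint8)
        descriptors = np.stack([color_histogram(im) for im in images])

        query = color_histogram(images[0])          # query by example: image 0
        print(k_nearest(query, descriptors, k=5))   # indices of the 5 most similar images

    In a real system, the linear distance scan would be replaced by an index structure (for example, branch-and-bound k-nearest-neighbor search) so that queries stay interactive at this collection size.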

