Museums and the Web

An annual conference exploring the social, cultural, design, technological, economic, and organizational issues of culture, science and heritage on-line.


3D Viewpoint-based Content Exploration of 3D Digital Archive


Title: 3D Viewpoint-based Content Exploration of 3D Digital Archive
Publication Type: Conference Paper
Year of Publication: 2005
Authors: Kadobayashi, R.
Secondary Title: Digital Culture and Heritage. Proceedings of ICHIM05. Paris
Keywords: 3D models, 3D viewpoint-based photo search, content browsing method, digital archive, ichim, ichim05
Abstract

We propose a novel approach to multimedia content exploration that enables users to search and browse photographs and related content in a simple, intuitive way while viewing 3D content. The approach is based on 3D viewpoint-based image retrieval, which is especially useful for searching collections of archaeological photographs that contain many different images of the same object. Our method lets users retrieve images that show the same object from a different view, and browse groups of images taken from similar viewpoints. In addition, the information attached to each image, such as a description and URLs to other resources, is displayed automatically so that users can learn about the objects in the current 3D scene. We also propose using 3D scenes as query-by-example, sparing users the problem of formulating appropriate queries. This combination gives users easy access not only to photographs but also to archived information. The 3D viewpoint-based method can also be used when viewing 3D models or walking through 3D virtual spaces: to view a particular scene, a user simply chooses one or more photographs in the digital archive. The system then automatically detects the viewpoints of the photographs and uses them to render the corresponding 3D scenes, helping users view the 3D models and scenes easily. A prototype system that uses data from an actual archaeological site is also introduced.
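Since only the abstract is available, the retrieval details are not specified; the following is a hypothetical sketch of how 3D viewpoint-based photo ranking could work, assuming each archived photograph carries an estimated camera viewpoint (position plus view direction) and that similarity combines positional distance with angular difference. All names (`Viewpoint`, `rank_photos`, the weight `w_angle`) are illustrative, not from the paper.

```python
import math
from dataclasses import dataclass

@dataclass
class Viewpoint:
    """Hypothetical camera viewpoint: position (x, y, z) and a unit view direction."""
    pos: tuple
    dirn: tuple

def viewpoint_distance(a: Viewpoint, b: Viewpoint, w_angle: float = 1.0) -> float:
    """Combine Euclidean distance between camera positions with the angle
    (in radians) between view directions; w_angle trades off the two terms."""
    dp = math.dist(a.pos, b.pos)
    # Clamp the dot product to avoid domain errors from floating-point noise.
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a.dirn, b.dirn))))
    return dp + w_angle * math.acos(dot)

def rank_photos(query: Viewpoint, photos: dict) -> list:
    """Return photo ids sorted by viewpoint similarity to the query scene,
    closest viewpoint first — e.g. the user's current 3D camera as the query."""
    return sorted(photos, key=lambda pid: viewpoint_distance(query, photos[pid]))
```

Under this sketch, the current 3D scene's camera serves directly as the query-by-example, and the same machinery runs in reverse: picking a photograph yields its stored `Viewpoint`, which the renderer can use as the camera for the 3D scene.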

Notes

Abstract only. No paper submitted.

URL: http://www.archimuse.com/publishing/ichim05/RiekoKadobayashi.pdf