
Museums and the Web

An annual conference exploring the social, cultural, design, technological, economic, and organizational issues of culture, science and heritage on-line.

Mixing Realities to Connect People, Places, and Exhibits Using Mobile Augmented-Reality Applications

Abstract

With mobile augmented-reality technologies and applications, museums can extend their relationship with visitors beyond physical boundaries to engage them further in discovery-based learning.

Using freely available mobile augmented-reality (AR) authoring platforms and publishing tools, the Exploratorium is experimenting with extensions to both its physical and web exhibit spaces to allow visitors to interact with exhibits and natural phenomena around the San Francisco Bay Area. Applications developed with two platforms, Layar and Junaio, allow mobile users with AR-capable smartphones (equipped with a camera and GPS) to explore their surroundings through the devices’ live camera views and to interact with overlaid multilayered, geo-referenced information. Interactive content “layers” let visitors with compatible devices, including iPhone and Android smartphones, see annotated information about physical exhibits and locations in outdoor spaces, leave “markers” of their visit experience in places where others can see them, and explore temporal data about objects and locations. While it’s possible to get started right away developing content and user experiences with mobile AR, advances in mobile image processing, tagging technologies, and wearable computing – combined with the convergence of distributed networks and location-aware applications – will help define more intimate modes of human-computer interaction that museum exhibit and experience developers can further explore.
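To make the idea of a geo-referenced content “layer” concrete, the sketch below models points of interest as records with coordinates and attached media, serialized as the JSON payload an AR browser would fetch from a layer endpoint. The field names (`hotspots`, `media_url`, etc.) are illustrative assumptions, not the exact Layar or Junaio schema.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class PointOfInterest:
    """One geo-referenced item in an AR content layer (illustrative schema)."""
    poi_id: str
    title: str
    lat: float       # latitude in decimal degrees
    lon: float       # longitude in decimal degrees
    media_url: str   # image, audio, or video shown in the camera overlay

def layer_response(pois):
    """Serialize points of interest as a JSON payload for an AR browser."""
    return json.dumps({"hotspots": [asdict(p) for p in pois]})

# Hypothetical exhibit annotation (coordinates and URL are placeholders).
exhibit = PointOfInterest(
    poi_id="fog-bridge",
    title="Fog over the Golden Gate",
    lat=37.8199,
    lon=-122.4783,
    media_url="https://example.org/media/fog.jpg",
)
print(layer_response([exhibit]))
```

In a deployed layer, the authoring platform queries an endpoint like this with the user’s GPS position and renders the returned hotspots over the live camera view.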

This paper discusses content creation for mobile AR experience design using these two platforms, strategies for incorporating preparation of mobile AR-ready information into a digital content creation workflow, and measuring impact via mobile analytics. It covers key processes including geotagging different types of assets (images, videos, audio, and 3-D objects) and methods of interfacing the AR authoring platforms with content in external systems, including event calendars, media portal/content management systems, and social-sharing sites such as YouTube and Flickr.
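A core step in serving geotagged assets to a mobile AR client is filtering them by distance from the user’s GPS position. As a minimal sketch (the asset records and locations below are hypothetical), this uses the standard haversine formula to return only assets within a given radius:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6371000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def nearby(assets, user_lat, user_lon, radius_m=500):
    """Return geotagged assets within radius_m of the user's position."""
    return [a for a in assets
            if haversine_m(user_lat, user_lon, a["lat"], a["lon"]) <= radius_m]

# Hypothetical geotagged assets near the Exploratorium's outdoor spaces.
assets = [
    {"title": "Wave Organ", "lat": 37.8084, "lon": -122.4403},
    {"title": "Palace of Fine Arts", "lat": 37.8029, "lon": -122.4484},
]
# A user standing near the Palace of Fine Arts sees only that asset.
print([a["title"] for a in nearby(assets, 37.8029, -122.4480, radius_m=200)])
```

The same distance check applies regardless of asset type (image, video, audio, or 3-D object); only the attached media URL differs per record.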

Type: 

Paper - in formal session

Authors

Rob has been working in online media and technology since 1994. He directs web and web infrastructure projects at the Exploratorium in San Francisco and is interested in using 3D graphics, photography, virtual worlds, augmented reality, digital audio and video applications, geospatial data,...