published: March 2004
What Clicks? An Interim Report on Audience Research
Jim Ockuly, Minneapolis Institute of Arts, Minneapolis, Minnesota, USA.
The Minneapolis Institute of Arts is conducting a major research and development project assessing its audiences' awareness, usage, and satisfaction regarding its interactive media/Web resources. The Institute, an encyclopedic art museum, produces and maintains two Web sites (http://www.artsmia.org and http://www.artsconnected.org, the latter in conjunction with Walker Art Center). It also provides its physical visitors with a host of interactive media programs in the museum itself, from a museum directory to a large number of permanent-collection-based programs located throughout the building. The research has been designed to a) measure audience awareness, usage, and satisfaction regarding these resources; b) respond to those findings with improvements; then c) re-measure to gauge the effect of the improvements. A further goal of this project is to share its logic model, methodology, instruments, and findings with the museum community. To date, the benchmarking data has been compiled and analyzed. Production and marketing work are now in progress. This paper is a mid-project report on what we've learned so far and where we're going. It includes detailed descriptions of the instruments used for data collection (from Usability Lab to Web and in-gallery surveys), presentation of the initial findings, and a discussion of how those findings informed the actions now being taken. What Clicks? is funded by an Institute of Museum and Library Services (IMLS, http://www.imls.gov) National Leadership Grant.
Keywords: audience research, evaluation, interim report
In 2000, The Minneapolis Institute of Arts (MIA) received a National Leadership Grant from the Institute of Museum and Library Services (IMLS, http://www.imls.gov). The grant's purpose was to fund a media- and technology-oriented audience research and development project. The project, titled What Clicks?, is now underway. This is an interim report on the project's progress, findings, and activities to date.
What Clicks? Project Profile
The specific purpose of the What Clicks research and development project is to assess how effectively The Minneapolis Institute of Arts' digital media resources serve their audiences. This includes measurement of audience awareness, use, and satisfaction as it applies to the museum's Interactive Directory, Interactive Learning Stations, and Web site. Further, the project aims to interpret and react to the initial baseline findings by setting aside significant time to make improvements to the above resources, and then to re-measure in an effort to gauge the impact those improvements have made. Ultimately, it is hoped that the What Clicks project will benefit The Minneapolis Institute of Arts and its audience through the identification of audience needs, and in turn benefit other museums and their audiences through the publication of both the project's process and its findings.
Established in 1883 and now considered one of the top ten art museums in the United States, The Minneapolis Institute of Arts has built an encyclopedic collection of approximately 100,000 objects dating from classical to contemporary times. The Institute offers regular programs that include permanent collection display, special exhibitions, lectures, classes, and tours. It also has a longstanding record of using media and technology to help connect its audiences with art, particularly under current Museum Director Evan Maurer.
The Minneapolis Institute of Arts employs a staff of seven full-time professionals in its Interactive Media Group, or IMG. The IMG works with other museum staff to design, implement, and support the Institute's media and technology resources.
Currently, the museum houses an Interactive Directory as an information aid for visitors, and 17 Interactive Learning Stations situated throughout the building. The museum also offers both special exhibition and permanent collection audio tours, as well as an electronic Daily Events screen.
For the off-site audience, the Institute provides two Web sites: the general museum site (http://www.artsmia.org) and ArtsConnectEd, a resource designed for K-12 teachers and students (http://www.artsconnected.org). ArtsConnectEd is a joint project of The Minneapolis Institute of Arts and Walker Art Center (http://www.walkerart.org).
The museum recorded approximately 500,000 on-site museum visits in fiscal year 2001/2002, while the Institute's general Web site logged roughly 2.5 million visits, and ArtsConnectEd approximately 500,000. For several years, on-line visitation has grown consistently at a rate of around 50% per year.
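Sustained growth of around 50% per year compounds quickly. A minimal sketch, using the roughly 2.5 million artsmia.org visits reported above as a starting point, illustrates the compounding; the projected future-year figures are purely illustrative, not reported data:

```python
# Minimal sketch: project Web visitation under sustained ~50% annual growth.
# The 2.5 million starting figure comes from the paper; later years are
# hypothetical projections for illustration only.

def project_visits(start: float, rate: float, years: int) -> list:
    """Return visit totals for year 0 through `years`, compounding at `rate`."""
    totals = [start]
    for _ in range(years):
        totals.append(totals[-1] * (1 + rate))
    return totals

for year, visits in enumerate(project_visits(2_500_000, 0.50, 3)):
    print(f"Year {year}: {visits:,.0f} visits")
```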
Interactive Directory
The Directory consists of three touch-screens that are used from a standing position or from a wheelchair. It is located in the inner lobby at the museum's most frequently used entrance. Contents include information about special exhibitions, permanent collection galleries, lectures, films, Family Days, tours, membership, and amenities (restrooms, coat check, cafes, etc.).
Interactive Learning Stations
The museum's 17 Interactive Learning Stations each concentrate on a specific area of The Minneapolis Institute of Arts' permanent collection (e.g. photography, Prairie School architecture, African art, etc.). They provide further content and context for works of art on display. Early thinking was to avoid creating a centralized media ghetto. Most are installed in discrete spaces in or near the galleries whose objects they address. Some are installed in plain view in permanent collection galleries. They range from video "jukeboxes" with a small set of linear segments to highly interactive Web programs with database components.
Web Site
The Web site acts as both an aid for museum visitation and an on-line art resource. Its major sections are The Collection, Special Exhibitions, Events, Visit, General Info, Education, Interactive Media, Join, Shop, and Electronic Postcards. There are also prominent links on the home page to ArtsConnectEd, press releases, and highlighted exhibitions and events. The site has been noted for its in-depth permanent collection programs (e.g. Modernism, Arts of Asia) and on-line curriculum units (also available through ArtsConnectEd). Awards and high use have testified to the quality of these programs. There is presently no e-commerce or transaction-based activity on the site.
What Specifically is What Clicks Measuring?
Awareness > Use > Satisfaction
Audience awareness, use, and satisfaction, with each condition leading to the next, constitute the What Clicks mantra. Because of the breadth of the Institute's electronic media resources and the impossibility of studying audience relationships within that entire range, there was a desire to limit the scope of the What Clicks project. It was decided that the main focus of What Clicks in terms of media resources would be the Interactive Directory, two of the 17 Interactive Learning Stations (Arts of Asia and African Art and Culture), and the museum's general Web site, artsmia.org. General public awareness of all of these resources was also measured, and basic demographics were captured to see if awareness levels were consistent across groups.
Baseline Research > Interpretation/Analysis > Enrichment/Redesign > Follow-up Research
The project process can be reduced to a general arc, starting with baseline audience measurement for benchmarking; then analysis and interpretation of findings; a six-month period of enrichment and redesign; then a second round of audience measurement, performed exactly a year after the baseline measurement and conducted with identical instruments. Interim and summary reporting, both internally (to MIA staff) and to the field, are woven into the process. The project is currently in the enrichment and redesign phase.
The first step was to form a project team that included the museum's Assistant Director and members of the Interactive Media Group, the Education Division, the External Affairs Division (which includes Marketing and Communications), and the museum's Visitor and Member Services department. These people would work directly with the research data and are primary decision-makers in matters regarding the Institute's media programs.
Logic Model and Evaluation Plan
To focus the project team's thinking, a consultant (evaluator Mary Ellen Murphy) was hired to work with the group in developing a project evaluation plan, including a logic model. This also addressed a grant component, since an outcome-based process and final report had been requested by the Institute of Museum and Library Services. Before any audience research was done, the project team attempted to envision a process whereby the initial goals of the project might be met (the goals being to increase audience awareness, use, and satisfaction regarding the Institute's electronic media resources, and to share the process that led to that outcome with peer institutions). The resulting logic model is expressed in a spreadsheet whose columns are headed by guiding questions.
The project team met several times in this phase, each time getting closer to consensus, an important condition if the project was to get off to a good start. This process revealed the kinds of questions that the team was most interested in answering and eventually informed the survey instruments themselves.
An evaluation plan was developed so there would be a way for the project team to measure success after what was sure to be a long and complex process, and also to provide some early, agreed-upon project structure. The plan was organized as a matrix.
Survey Instruments and Methodology
With an agreed-upon logic model and evaluation plan in hand, a consulting firm (Cincinnatus, Inc., http://www.cincinnatus.com) was hired to develop the instruments to be used in the audience research and then to carry out the baseline study itself. The project team and consulting firm together identified a set of research instruments.
There was also existing data from a previous audience survey that included some questions about media and technology, as well as demographic information for comparison.
Each instrument was developed collaboratively among the project team, the research firm, and the IMG. Feeding into this process were the logic model and evaluation plan developed earlier.
Internal Focus Group
The What Clicks project came at a time in the Institute's development when a great deal of successful work in the realm of interactive media had already taken place over the course of more than a decade. To identify the best practices that had contributed to the museum's success so far, a roundtable discussion was held. It was facilitated and recorded by members of the Cincinnatus consulting firm, and included several members of the Interactive Media Group and the Chair of The Minneapolis Institute of Arts' Education Division. A transcript of the conversation resulted, as did lists of Best Practices and Goals. This instrument was implemented before any of the others and proved to be a catalyst for thinking about how far the museum had come, and where it was hoping to go next in terms of interactive technology.
Usability Lab Study
The Institute was fortunate to partner with Minneapolis-based Target Corporation in the Usability Lab Study. While the What Clicks project team brought questions, ideas, and desired outcomes to Target, it was Target's well-developed process, their ability to accommodate a slightly unusual (read: non-commercial) client, and their newly reinstalled facilities that made this study work.
Three days of testing were planned, with three subjects (users) per day. (In the end, one subject was unable to attend, so the total was eight.) The users had been screened to ensure a good mix of people who had personally visited the Institute versus those who had not; museum members versus non-members; and people with a high degree of interest in art versus those with moderate interest. Experience with Web design disqualified potential subjects. A monetary incentive of $50 was offered to compensate each user for the time commitment.
At the start of each session, the subject was led into the facility and briefly interviewed. Some of the initial screening questions were repeated (Have you been to The Minneapolis Institute of Arts?, for example) and, for the first time, users were asked if they'd been to the Institute's Web site.
Then, sitting at a monitored computer setup with a browser set to an unrelated Web site, each user was asked to find The Minneapolis Institute of Arts' site. Following this step, users were given several scenarios, one at a time, with several minutes to complete each. The facilitator got them started, then left the room while encouraging them to verbalize their experience and impressions as they went. Audio and video recordings were made as the sessions unfolded. In the control room, members of the project team and other Institute staff observed the tests and recorded their own comments.
The first scenario given the users was simply to browse the site based on their personal interest. The next was to imagine they had visitors coming in a few weeks and wanted to get an idea of what would be going on at the museum at that time. Then, with their visitors' arrival just days away, they were asked to get more logistical information. Sometimes the facilitator would re-enter the room to ask a specific question like, "Would you be able to see [a particular film] as part of your visit?" or "What would you do if your visitors' children were especially energetic?"
Other scenarios were designed to get users to specific parts of the site. In one, users were encouraged to research the work of Frank Lloyd Wright. Ideally, this would get them into the Permanent Collection section as well as to one of the site's many rich interactive sections, in this case the Unified Vision project (http://www.artsmia.org/unified-vision). Another scenario encouraged users to find and send an E-Postcard.
After each session, the project team would meet in a conference room with Target staff to review the session and make notes about observations. This led to a list of findings, some of which turned up time and again over the 3-day period. This list became an important working document. The IMG is relying heavily on this list during enrichment and redesign.
Technology Awareness Survey
In an effort to measure general awareness of museum technology among museum visitors, 379 visitors aged 15 or older were randomly intercepted on their way into the museum and given an interviewer-administered questionnaire. This took place between August 2 and August 15, 2002. The questions were designed to get at visitor awareness of various facets of the museum (restaurant, coffee shop, Interactive Directory, etc.), and allow for comparisons between technology-based and non-technology-based amenities. Visitors were also asked when, if ever, they'd last visited, and basic demographic data was gathered. A small incentive gift was offered (a packet of MIA postcards).
Interactive Directory Survey
A total of 128 Interactive Directory users were given a self-administered survey upon being observed using the Directory. This took place between August 16 and 29, 2002. The original sampling plan based on usage history was abandoned because of unexpectedly low traffic during the study period, and, instead, the researchers intercepted everyone who touched the Directory screen. The questions measured awareness (e.g. How did you first become aware of the Interactive Directory?), motivation (e.g. What initially motivated you to use the Directory?), and satisfaction (users chose from a list ranging from Extremely Satisfied to Not at All Satisfied). Further questions asked specifically what could be done to improve the Directory, and, again, basic demographic information was captured. Packages of MIA postcards were given as an incentive for participation.
Learning Stations Survey
Museum volunteers intercepted 105 users of the two Interactive Learning Stations between August 16 and September 1, 2002. Because two Learning Stations in different parts of the building were being studied, motion detectors and pagers were used to alert the volunteer when a Station was in use. Each Learning Station's motion detector triggered an auto-dialer which caused the volunteer's pager to indicate which Station was occupied. The volunteer could then go to the Station in time to intercept the user. Users were given a self-administered questionnaire similar to the one used for the Directory. Again, packages of postcards were offered to survey participants.
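The alert chain described above (motion detector triggers auto-dialer, auto-dialer pages the volunteer) can be sketched as a simple event flow. The function and message wording below are hypothetical, purely to illustrate the routing logic:

```python
# Hypothetical sketch of the intercept alert chain: a motion detector at a
# Learning Station fires, the auto-dialer builds a pager message naming the
# occupied Station, and the volunteer is paged in time to intercept the user.

STATIONS = {1: "Arts of Asia", 2: "African Art and Culture"}

def on_motion_detected(station_id: int) -> str:
    """Simulate the auto-dialer: return the pager message for a station."""
    name = STATIONS.get(station_id)
    if name is None:
        raise ValueError(f"Unknown station: {station_id}")
    return f"Station {station_id} ({name}) is in use -- intercept user"

# A motion event at the Arts of Asia station pages the volunteer:
print(on_motion_detected(1))
```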
Online Web Survey
Between August 28 and September 11, 2002, a pop-up survey appeared as viewers entered the artsmia.org Web site. The pop-up window, designed to complement the site's aesthetic, contained text inviting visitors to participate in a survey by providing an email address. Those who complied received an email invitation to complete the on-line survey; it was essential that visitors fill out the survey post-visit. If visitors initially declined, a second window popped under their main browser window, with the idea that visitors would eventually see it upon exiting their browser. The pop-under again invited participation through email, but also offered the option of completing the survey immediately. A chance at a $500 gift certificate to Amazon.com was offered as an incentive. A total of 573 people completed the survey. Considering that there were 35,357 site visits during the survey period, the response rate was 1.62%. The on-line survey was designed to measure awareness, use, and satisfaction regarding the Web site itself, and collected demographic information consistent with the other instruments.
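The response rate reported above follows directly from the two counts; a quick sketch confirms the arithmetic:

```python
# Verify the on-line survey response rate reported in the paper:
# 573 completed surveys out of 35,357 site visits during the survey period.

completed = 573
site_visits = 35_357

response_rate = completed / site_visits * 100
print(f"Response rate: {response_rate:.2f}%")  # → Response rate: 1.62%
```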
With reams of data from the surveys and countless subjective impressions from the more observation-based instruments, the project team and the members of the Interactive Media Group faced the formidable tasks of interpretation and analysis. A pattern quickly emerged. It was immediately clear that satisfaction was extremely high across the board. Once visitors found their way to the Institute's electronic resources, they generally reported positive experiences with the Directory, Learning Stations, and Web site. On the other hand, general awareness and, in some cases, usage, scored relatively low. This suggested that the biggest opportunity would be in getting more people to the resources. In brainstorming actions to be pursued in the enrichment phase, emphasis was placed on strategies that might increase awareness and use.
Of course, much light was shed on possible satisfaction enhancements as well, particularly through the in-depth research such as the usability lab study. Some relatively simple methods were also employed subsequently to get at the satisfaction questions. For example, Visitor and Member Services staff and Security personnel confirmed in interviews that museum visitors' top request is for help in locating specific works of art or types of art in the museum; this information supported survey data as well as decisions about improvements for the Interactive Directory.
It should be noted that curatorial perspective was brought to the project at this point, and that the initial findings were reported to the entire staff. In fact, an internal communications plan was developed to get staff thinking and talking about the project, and to help generate interest in and support for eventual changes.
Interactive Learning Station Findings
Web Site Findings
From Analysis to Action
The next step was to approach the enrichment and redesign phase with four separate but related projects in mind. Three would be based on the medium-specific resources (Interactive Directory, Learning Stations, and Web site) and the fourth would be a general marketing campaign that would raise awareness of all of the above.
The project team, having worked through an analysis of the data with input from the IMG, decided to form four working groups with each group addressing one of the major projects to be completed during enrichment and redesign. Each working group was to be chaired by a member of the project team, and was to include resource members from the IMG, Visitor and Member Services, and The Minneapolis Institute of Arts' External Affairs division (which includes Marketing and Communications). The working groups would meet anywhere from one to three times and report their recommendations back to the project team.
This process got underway and led to a clearer picture of what actual action steps were to be taken. In the course of planning these actions, the project team drew up a list of criteria that specific ideas would have to meet.
The process yielded a realistic set of proposed actions that were then taken to the Museum Director, who, having been primed with regular project updates, provided a thoughtful and thorough review. Discussions between the Director and the project team took place and, ultimately, the Director's approval paved the way for enrichment and redesign.
Case by Case: Recommended Actions
As the project is currently in the midst of the enrichment and redesign phase, what follows are general descriptions of the kinds of actions being taken in the four major projects.
Interactive Directory
Key ideas to emerge regarding the Directory include recasting the concept to make it clearer to visitors and to better address their immediate needs: What's going on right now in the museum? How can I find a specific work of art or type of art? The plan is to redesign the Directory's interface, contents, and installation in its current location. Next, additional locations will be tested. The Directory, currently a multimedia program not connected to the event database or collections management system that feed the Web site, will be rewritten as a Web-based program. In response to findings indicating a desire for livelier graphics and images, a video preview of the museum's galleries and other offerings is under consideration. This preview would most likely be incorporated into the design as an additional, non-interactive screen. The Directory will also be included in the general marketing campaign as an important component of in-museum technology.
Interactive Learning Stations
Because visitors find the Learning Station contents highly satisfying but have difficulty finding the Stations in the first place, most of the changes will be physical ones. Work is underway to make the Stations more visible through consistent signage and livelier attract screens. In some cases, dramatic changes to installations will be made, including the addition of two-sided vitrines in walls that currently hide the Stations. To better incorporate these resources in the galleries, closer to the art, several instances have been identified where small LCDs can be installed next to objects. These screens will display object-specific video clips. If successful, this idea will open a whole new avenue for in-museum media. Also, the general marketing campaign will attempt to raise visitor awareness of the Interactive Learning Stations as effective tools for learning about the collection.
Web Site
Again, with awareness being a key factor, the major effort for the Web site will be to get more people to the site. Planned steps include featuring the site's URL more frequently and prominently in print publications and on take-away items, making better use of the members' magazine as a promotional vehicle, asking museum staff to incorporate the URL in their email signatures, and purchasing on-line ads. The findings of the usability lab will guide changes to the site's navigation, structure, and contents. Also under consideration is the addition of a new section featuring some of the Institute's recent acquisitions, a concept mirrored by the ongoing Recent Acquisitions exhibitions in the museum itself.
Marketing Campaign
As of this writing, a museum technology marketing campaign is being designed to attract more people to the Web site, Learning Stations, and Directory. The campaign will be timed to have had substantial impact by August 2003, when re-measurement is done. A healthy portion of the grant funding has been set aside for this very purpose.
Questions and Issues
As we've worked to get a grasp of the findings, proposed enhancements, and process complexities, several questions have arisen.
Since this is an interim report, ultimate conclusions are still to come. The proof of the pudding, as they say, is in the eating. While much has been learned, it remains to be seen what impact the current phase of enrichment and redesign will have on audience awareness, use, and satisfaction. The emphasis is now on implementation structure and creative development. Currently, the project team is meeting monthly, the working groups are each keeping regular meeting schedules, the IMG has its own structure of project leadership and coordination, and ongoing museum-wide communication is taking place.
We look forward to the follow-up measurement and to sharing the results with our museum colleagues.