
Museums and the Web

An annual conference exploring the social, cultural, design, technological, economic, and organizational issues of culture, science and heritage on-line.

Exploring the Relationship between Visitor Motivation and Engagement in Online Museum Audiences

Silvia Filippini Fantoni, Rob Stein, Gray Bowman, Indianapolis Museum of Art, USA

http://www.imamuseum.org/ 

Abstract

In this paper, the authors will describe the rationale, methodology, and results of a series of studies that have been conducted with visitors to the Indianapolis Museum of Art website. The objective of the studies is to better understand people’s motivation for visiting the site and whether this motivation has an impact on the way they engage online. The hope is that these results will provide a reference dataset, and a replicable model for other museums that are interested in better understanding their online audience and in conducting similar studies for their own web efforts.

Keywords: online motivation, online behavior, Google Analytics, users, online visitors, evaluation

1.   Introduction

In the past few years, most museums have witnessed a growing number of visitors to their websites and other online platforms.  For the Indianapolis Museum of Art (IMA), those online visits easily surpass the number of visitors to the museum’s physical campus. Despite this success, however, very little is known about this audience: who they are, why they come to our website, how they engage with the site, and what experiences they might take away from their visit.

Information about online users is available to museums via powerful and easy-to-use web statistics tools like Google Analytics. Recent work by Finnis, Chan, and Clements (2011) describes a set of best-practices and analytical approaches for evaluating online success. However, many of these techniques tend to focus on the technical details of visits to a website, and—used in isolation—do not provide a deep understanding of more abstract information about user needs, motivations, and satisfaction. Without this information, it can be difficult for museums to effectively design, promote, and evaluate online content and services. 

User segmentation has been the basis of marketing practice for more than fifty years, during which it has been one of the most popular techniques for understanding consumers and attempting to provide a predictable model of behavior. Museums have also used this approach as a way to better understand and identify their audiences. Visitors have traditionally been segmented based on demographic information collected through surveys or on observed behaviors.

More recently, however, several museum researchers (Hood, 1988; Moussouri, 1997; Doering and Pekarik, 1999; Packer and Ballantyne, 2002; Morris, Hargreaves, and McIntyre, 2004; Falk, 2009), working independently of each other, have attempted to go beyond a demographic categorization of visitors to determine more thoughtfully what motivates people to visit museums, and to consider the implications of these motivations for the visitor experience.

Falk (2009) uses qualitative data to illustrate that demographic characteristics, type of museum, time of year, and group composition are not enough to truly understand and predict visitor behavior.  On the contrary, he believes that each visitor experience is the synthesis of an individual's identity-related motivations and how the museum is perceived to satisfy the needs and interests that are the consequence of that motivation.

In particular, he identifies five main identity-related motivations for visiting museums and other cultural organizations:

  1. Explorer: motivated by a need to satisfy personal curiosity and interest in an intellectually challenging environment
  2. Facilitator: motivated by the wish to engage in a meaningful social experience with someone whom they care about in an educationally supportive environment
  3. Experience Seeker: aspires to be exposed to the things and ideas that exemplify what is best and intellectually most important within a culture or community
  4. Professional/Hobbyist: possesses the desire to further specific intellectual needs in a setting with a specific subject-matter focus
  5. Recharger: motivated by the yearning to physically, emotionally, and intellectually recharge in a beautiful and refreshing environment.

This model offers a user-friendly, near-intuitive framework by which to examine and approach how museum professionals assist or inhibit the visitor experience and, as such, it has been adopted by many museums and other cultural organizations, including the IMA, as a way to segment their onsite audience and predict behavior in the physical museum.

As the Falk model shows, complex analyses that correlate motivation with actual behavior are common practice in the visitor studies field. Similar methods, however, have not been extensively applied for online audiences.  Following up on the pioneering work done by researchers such as Haley Goldman and Schaller (2004) and Peacock and Brownbill (2007), who were the first to discuss methods for addressing this kind of audience segmentation for online visitors, the IMA decided to conduct a series of studies. The objective of these studies was to better understand our online visitors’ motivation and whether this motivation has an impact on the way people engage with the museum’s website, in the hope that this would lead to the identification of segments that could be used as a reference in the evaluation and the development of content and services.

In the next sections of this paper, we will describe in detail the various steps that were taken in this process, from identifying the reason why people come to our website (Step 1), to tracking (Step 2) and comparing (Step 3) online behavior across different motivations.

2.   Step 1: Identifying online motivational categories

The first step in our attempt to better understand people’s motivations for visiting our website was to identify what these reasons might be. While Haley Goldman and Schaller (2004) and Peacock and Brownbill (2007) had already provided classifications of online motivations for museum websites, we felt that each museum was different and did not want to enter our research with preconceived assumptions about what such motivations might be. Therefore, we decided to collect feedback directly from our visitors by asking them to tell us, in their own words, the main reason for their visit to the website.

Figure 1: Screenshot of the IMA website’s homepage showing the promo for the first survey

In order to do so, we created an open-ended, one-question survey, which we posted on our website for a period of three weeks, during which we collected a total of 119 responses. A promo with a link to the survey appeared on every page of the main website (not the mobile version), as shown in figure 1. 

Four different people in the Web and Research and Evaluation teams individually reviewed the responses. The motivational categories that each came up with were presented and discussed at various meetings, as a result of which we were able to narrow them down to the following five:

  • Plan a visit to the museum
  • Find specific information for research or professional purposes
  • Find specific information for personal interest
  • Engage in casual browsing without looking for something specific
  • Make a transaction on the website.

The response rate was relatively low, mainly because the survey had limited visibility on the site, especially for people with large screens: the promo would appear outside of their main field of vision. Nonetheless, we had enough variety in the answers provided to allow us to carry out an analysis and come up with the above-mentioned categories.  

The definition of these motivational categories for online visitors was not based on content that people were referencing in their responses but rather on the type of activity that people came to do on the website. In this respect, our categories follow the example of the other motivational models used for onsite visitors, which are based on more abstract concepts rather than on more common variables, such as whether someone comes to visit the permanent collection or a temporary exhibition.

The five resulting categories are very much in line with those proposed by Peacock and Brownbill (2007), the only difference being that we chose to distinguish people who come to find specific information for professional reasons from those looking for information out of personal interest. This distinction helps separate users who are intrinsically motivated by personal interest from those who are seeking information in relation to their occupation or academic studies.

As part of our process of definition of the online motivational categories, we also compared the open-ended responses provided by visitors to various models of onsite motivation, including the Falk approach, which, as explained above, we use to segment our physical audience at the museum.  As a result of this analysis, we came to the conclusion that motivations for visiting museum websites differ significantly from the motivations of visitors to physical museums.

As Falk himself pointed out in one of his recent publications (Ellenbogen, Falk, and Haley Goldman, 2007), “[P]hysical museum goers are seeking experiences – learning experiences perhaps – but experiences nonetheless. In contrast, the Internet was created for resource-sharing and communication. This distinction shapes the current differences in motivation in the two venues.” Falk also attributes the difference in motivation between physical and online visitors to the fact that, “[P]hysical museum visits have high opportunity costs such as investments of time, efforts and money … in contrast, a virtual visitor would typically only invest a small fraction of time and efforts in their visit” (Ellenbogen, Falk, and Haley Goldman, 2007). While motivations for visiting the physical and virtual museum are different, it might be interesting to explore in future research whether visitors who self-selected “planning a visit” as their primary reason for visiting the IMA website may align with Falk’s onsite motivation for that visit.

3.   Step 2: Tracking online behavior across the five motivational categories

After identifying the main online motivational categories and comparing them to those of physical visitors, we moved to the next step of our research: trying to understand whether, based on their declared motivation, people interact differently with our website. 

In order to do so, we posted the exact same question on our website (“What is your main reason for visiting the IMA website today?”), but, instead of making it open-ended, we provided the five categories described above as possible options for people to choose from. Participants could only choose one option: the one that best represented the main reason for visiting.

Given the low response of the first survey, we wanted this second one to be more visible, so we decided to make it quite prominent on the website, as shown in figure 2.

Figure 2: Screenshot of the IMA website’s homepage showing the “pop-up” of the second survey

The pop-up question and answer options appeared on the first page of the website the visitor accessed, with the exception of the mobile site. Three buttons were available on the pop-up: Submit, Not Right Now, and Do Not Ask Again. If users answered the question and clicked on Submit, they would not be asked the same question again, although we reserved the right to ask other questions in the future. If they clicked on Not Right Now or closed the pop-up manually, we would not ask them the question again for another twelve hours. If they clicked on Do Not Ask Again, we would not ask them this or any other question in the future. Since we used client-side cookies to determine which option the visitor had chosen, if people cleared their cookies between visits to the IMA website, the question would appear again.
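To make this suppression logic concrete, the following is a minimal sketch of how such cookie-based rules can be expressed in JavaScript. The cookie names and the readCookie/writeCookie helpers are hypothetical, introduced purely for illustration; this is not the IMA's actual implementation.

    var TWELVE_HOURS = 12 * 60 * 60 * 1000; // suppression window, in milliseconds

    function shouldShowSurvey() {
      if (readCookie('surveyAnswered') === 'yes') return false; // already submitted an answer
      if (readCookie('surveyOptOut') === 'yes') return false;   // clicked "Do Not Ask Again"
      var snoozedAt = readCookie('surveySnoozedAt');            // clicked "Not Right Now" or closed the pop-up
      if (snoozedAt && Date.now() - Number(snoozedAt) < TWELVE_HOURS) return false;
      return true;
    }

    function onSubmitClicked()      { writeCookie('surveyAnswered', 'yes'); }
    function onNotRightNowClicked() { writeCookie('surveySnoozedAt', String(Date.now())); }
    function onDoNotAskClicked()    { writeCookie('surveyOptOut', 'yes'); }

Because these cookies are stored on the client, clearing them (as noted above) makes the question reappear on the next visit.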

To track survey results and the behavior of the visitors across the five responses, we used the Custom Variables functionality of Google Analytics and a small amount of JavaScript. When a user selected an option and submitted the survey, JavaScript prevented the normal form submission. Instead, it used the Google Analytics API to set a session-level custom variable keyed to the specific survey question and carrying the value of the answer.  A custom event was then fired that served to notify Analytics of the custom variable.
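As a rough illustration of this mechanism, the snippet below uses the classic asynchronous (ga.js) Google Analytics API that was current at the time. The custom variable slot, variable name, and event category and action shown here are illustrative placeholders, not necessarily those used on the IMA site.

    function recordSurveyAnswer(answerValue) {
      // Set a session-level custom variable (slot 1, scope 2 = session) keyed to
      // the survey question and carrying the value of the selected answer.
      _gaq.push(['_setCustomVar', 1, 'online-motivation', answerValue, 2]);

      // Fire a custom event so that a hit carrying the custom variable is sent to Analytics.
      _gaq.push(['_trackEvent', 'Survey', 'motivation-answer', answerValue]);

      // Returning false from the form's onsubmit handler prevents the normal form submission.
      return false;
    }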

Custom Variables were then used as custom segment filters in Google Analytics, enabling the administrator to study the patterns of users who answered in a particular way.

The motivation question survey was available on the website for a period of roughly four weeks, from December 23, 2011, to January 18, 2012. During this period, a total of 4,074 unique visitors responded to the question, and their behavior was tracked in Google Analytics. This corresponds to 7.7 percent of the total number of unique visitors who came to the IMA website during that period, and to a 1.48-percent margin of error. The margin of error is a statistic expressing the confidence level of a survey, and it is usually calculated whenever a population is incompletely sampled. It can be determined using an algebraic formula or an online calculator (e.g., http://americanresearchgroup.com/moe.html), given the population size (in this case, the total number of visitors to the website for the period in question), the sample size (the number of respondents), and the expected response distribution (conventionally set to 50 percent, the most conservative assumption, which corresponds to respondents’ answers being evenly split). The smaller the margin of error, the more faith one should have that the survey's reported results are close to the "true" figures; that is, the figures for the whole population. In this case (a 1.48-percent margin of error), the level of representativeness of the sample is very high, as most of the surveys we conduct at the museum have a margin of error closer to 5 percent.
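For reference, the reported margin of error can be reproduced with the standard formula for a proportion, including a finite population correction. The sketch below assumes a 95 percent confidence level (z = 1.96) and a 50 percent response distribution; the population size of roughly 53,000 unique visitors is derived from the figures above (4,074 respondents divided by 0.077).

    function marginOfError(sampleSize, populationSize) {
      var z = 1.96; // z-score for a 95 percent confidence level
      var p = 0.5;  // most conservative response distribution (answers evenly split)
      var standardError = Math.sqrt((p * (1 - p)) / sampleSize);
      var finitePopulationCorrection = Math.sqrt((populationSize - sampleSize) / (populationSize - 1));
      return z * standardError * finitePopulationCorrection;
    }

    // marginOfError(4074, 52909) returns approximately 0.0148, i.e., 1.48 percent.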

4.   Step 3: Comparing online behavior across the five motivational categories

The results indicate that interesting differences of behavior exist amongst the five categories. An overview of the differences in online behaviors across the five motivations is provided in figure 3.

Figure 3: Table providing an overview of online behaviors across the five motivational categories

Based on visitors’ responses to the online motivation question, the main reason people came to the website was to plan a visit (50 percent), followed by looking for specific information for personal (21 percent) and professional (16 percent) reasons. While 10 percent of respondents came to the website to engage in casual browsing without looking for something specific, only 2.6 percent of participants visited the website to make a transaction (figure 4).

Figure 4: Chart presenting the choice of motivation made by the 4,074 respondents

While a quick comparison with the responses provided to the open-ended question shows fairly similar results, we have to consider that the responses in both cases (particularly in the first, less visible, survey) are probably skewed (as in all online surveys) in favor of the categories with a higher percentage of repeat visitors or a longer average time on the website (i.e., the motivations of “find specific information for professional purposes” and “make a transaction”).

When it comes to the average time spent on the IMA website (figure 3, row 1), data show that people who come to make a transaction or to search for something specific for professional reasons spend longer on the site. It is unclear, however, whether this is because they are more engaged with the site and its content or because they cannot find what they were looking for. Looking at the average number of pages could help us shed some light on this issue.

With regard to the average number of pages visited (figure 3, row 2), people who come to make a transaction not only have the highest average time spent on the site but also the highest average number of pages visited (15.4), followed by people who look for something out of personal interest (8.34) or engage in casual browsing (7.72). People who come to look for something specific for professional reasons and those planning a visit have the lowest average number of pages (7.4 and 6.8 respectively).

However, if we take the average number of seconds spent on the site and divide it by the average number of pages for each motivation (thus obtaining the average time spent per page across the five motivations; see figure 5), we can see that people who come to look for specific information for professional reasons and to plan a visit spend, on average, more time on individual pages than those coming for other reasons, which possibly indicates a higher level of engagement with the content.
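To illustrate with hypothetical figures: a motivation averaging 330 seconds per visit across 7.4 pages works out to roughly 45 seconds per page, while one averaging 400 seconds across 15.4 pages works out to only about 26 seconds per page, despite the longer overall visit. (The session durations here are invented for the sake of the example; the page counts are those reported above.)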

Figure 5: Chart showing the average time per page (in seconds) across the five motivations

As for the high average time spent by visitors who come to make a transaction, it is not surprising that they take longer, since the process of deciding what to purchase and going through the actual transaction can be quite long. However, the conversion rate of visitors who declared having come to the website to make a transaction is 37.38 percent (the conversion rate is the ratio of visitors who convert casual content views or website visits into a desired action, in this specific case the purchase of products or tickets). While this is significantly higher than the general conversion rate of the site for that period (0.22 percent) and than the rate for people who come for other motivations (figure 3, row 6), it still means that over 60 percent of those who declared having come to the website for a transaction end up not making one. Pricing could be a factor, as could the fact that they might not have found what they were looking for. The length and complexity of the checkout process might also play a role. Further research is necessary to understand the reasons behind these results.
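As a back-of-the-envelope illustration rather than a figure reported by Analytics: 2.6 percent of the 4,074 respondents corresponds to roughly 106 visitors who declared a transaction motive, so a 37.38 percent conversion rate represents approximately 40 completed purchases, with the remaining 66 or so leaving without transacting. Note that Google Analytics computes conversion rates per visit rather than per respondent, so this is only an approximation.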

Looking at conversion rates for the other motivations (figure 3, row 6), they are either null (for people coming to find specific information for professional reasons or just to engage in casual browsing) or quite low (0.39 percent for those who came to plan a visit and 0.35 percent for those who came to look for something specific out of personal interest). The latter are, however, higher than the overall conversion rate for the period in question, which is 0.22 percent, suggesting that even if the site could benefit from more cross-promotional opportunities for membership, ticket, and shop item purchases, these might not make a huge difference in improving conversion rates.

When it comes to the location from which visitors access our website (figure 3, row 4), there is a higher percentage of people from outside the U.S. looking for specific information for personal or professional reasons, or engaging in casual browsing, while transactions and planning a visit are more associated with a domestic population. This is not surprising, given that international visitors make up only a small share of our physical audience. On the other hand, our collections, as well as our institution, are recognized internationally and are therefore more likely to attract people from all over the world who are doing specific research for professional reasons, most of whom are also frequent visitors to the site.

Looking at the distribution of repeat versus new visitors across all five categories (figure 3, row 3), the highest percentage of repeat visitors is indeed amongst those who come to make a transaction (43.3 percent) and those who come to find something for professional reasons (33 percent). People who come to plan a visit or just to browse the site without a specific purpose are the least likely to be repeat visitors (24.18 percent and 23.53 percent respectively).

When considering the type of device that people used to access our website (laptop, desktop, and iPad versus mobile; figure 3, row 5), data show a slightly higher percentage of mobile users amongst those who come to plan a visit and a slightly lower percentage amongst those who engage in casual browsing on the site. These results are quite logical, as people who are coming to prepare for their visit might quickly look up information on their mobile devices, while those who want to browse without a specific purpose might be less inclined to do so from a mobile phone. We have to consider, however, that these results are biased: the data only refers to those who used a mobile device to access the main website rather than the mobile version of the site, where, as explained above, the survey was not available.

A comparative analysis of the top ten content pages across the five motivations is very much in line with the reasons indicated for the visit (figure 6). People coming to plan a visit are more likely to look up visiting information (visit, dining, and directions), events (programs, calendar, and exhibitions), and, to a lesser extent, general information about the museum and its collections. People who come to carry out transactions focus on the membership, donation, and shop sections of the site, while those visiting to find information for professional reasons mostly concentrate on art (collection, collection search, IMA search, exhibitions, American art, individual artworks) and job-related pages (residency, apply for residency, and jobs). People visiting to search for specific information out of personal interest or to browse without looking for something specific seem to have less of a clear focus than the other three motivations, as they move across various areas of the site, from facility rentals to jobs, calendar, programs, visit, residency, collection, exhibitions, and about.

Figure 6: Top ten content pages across motivations. Note that most of the top pages in the “Make a transaction” category are related to the cart (e.g., check out, check out review, check out complete, login, user, etc.). The ones reported here are related to actual content pages.

The keywords searched on the site also reinforce the results of the top content page investigation. The most popular keywords used by people who are planning a visit relate to parking and admission. Those who are looking for information for personal reasons use keywords mostly associated with weddings, while people searching for professional reasons use terms related to specific objects, the collection, and staff. For the other two categories, we observed less-clear patterns in the keywords analysis. The keywords used by people browsing the website vary from “ArtBabble,” to specific objects, to art classes at the museum. Website visitors who come to make a transaction use keywords related to shop items.

For our comparative analysis, we also looked at the percentage of searches (both collection and site wide) and downloads (PDFs, DOCs, and images) that have been carried out on the website across motivations, and it appears that these happen more frequently when people visit for research purposes (particularly for professional reasons), even though the percentage of visits in which any of this type of activity occurs is still quite limited (figure 3, rows 5, 8, and 9).

Figure 7: Table representing the source of traffic across the five motivations

The last aspect that we considered for our comparative analysis is the traffic source—that is, where website visitors come from according to Analytics, as summarized in figure 7. It is not a coincidence that higher percentages of direct traffic (visitors who visited the site deliberately, rather than stumbling across it) are registered for the two motivations that have the highest number of repeat visitors (namely, people looking for information for professional reasons and people who come to make a transaction), while the highest percentage of search traffic comes from people planning a visit, who are mostly new visitors. Referral traffic—that is, visitors who arrive at the IMA website via links from other websites—is higher mostly for people coming to look for something for personal or professional reasons, as well as to engage in casual browsing, but it is surprisingly low for people planning a visit. This indicates that we could possibly improve the way in which we promote the IMA on other travel or local websites.

5.   The motivational online segmentation

Our research has shown that motivation is a key variable in understanding visitors’ experiences online and can be used as a way to segment our audience and predict their behavior on the website. While the behaviors described above are in line with what we had long suspected visitors with different motivations come to do on our website, we now have useful data to back this up.

In particular, we have learned that people who visit the website to search for specific information for professional reasons seem to be more engaged with the site. They spend more time per page on average, perform more downloads and art searches, and include a higher percentage of repeat visitors. They are also more likely to come directly to the site or through referrals from other sites. The majority of visitors in this category are interested in our collections and in exploring job opportunities.

People that come to the website to plan a visit to the museum are mostly domestic and local (at least in the case of our website; results might be different for museums that are more of an international tourist destination). These represent the highest percentage of our website’s new visitors. They tend to spend less time on the website than visitors from other categories, but their level of engagement with the content is relatively high, at least when it comes to the average time spent on a page, which is higher than that of most of the other motivations. They tend to focus on the visit and program sections of our site.

Visitors who come to make transactions spend the longest time on the website and on average visit the highest number of pages. However, their level of engagement with the content is relatively low, and, at least in the case of our website, their conversion rate is not as high as we might wish. 

Visitors coming to engage in casual browsing do not spend much time on the site or visit many pages, and their level of engagement with the content is lower than that of other categories with regard to both the average time per page and the number of repeat visitors. Their level of engagement, however, is slightly higher when it comes to performing art searches and downloading information. Content-wise, they are less focused than other categories and have commonalities both with people planning a visit and with people searching for information for professional reasons.

People looking for specific information out of personal interest exhibit behavior similar to that of those who engage in casual browsing when it comes to the average time per page and the percentage of repeat visitors. On the other hand, their frequency of art searches and downloads is more comparable to that of people who come to search for information for professional reasons. These visitors look for content similar to that sought by people who come to plan a visit, and they share a similar conversion rate as well.

6.   Conclusions and next steps

Thanks to the studies we have conducted so far, we now have a better understanding of the reasons why people come to our website and how their behaviors differ. However, there are still many open questions, particularly regarding whether people felt that they could accomplish what they came to do on the site. The low conversion rate for people coming to carry out a transaction, and our analysis of the keywords entered in the IMA search engine, seem to suggest that people might encounter some difficulties in using the site. The next step in our research is, therefore, to try to better understand whether people are satisfied with their experience across the five motivations, and then to look at what we might do to help them if they are not.

In order to address this issue, we are planning another quick two-question survey, which will attempt to correlate motivation, behavior, and satisfaction. We are expecting to launch this next phase of our study soon, and hopefully we will be able to report about this at the conference in San Diego.

In the future, we are also interested in exploring variables besides motivation that might have an impact on visitors’ online experience, such as familiarity with the website, familiarity with the museum and/or subject matter, how they access the site (via mobile, iPad, laptop and desktop, etc.), from where they access the site (at home, at work, in a library), the social context of the visit, etc.

References

Doering, Z.D., and A. Pekarik. (1999). “Strangers, Guests, or Clients? Visitor Experiences in Museums.” Curator: The Museum Journal, 42: 74–87.

Ellenbogen K., J. Falk, and K. Haley Goldman. (2007). “Understanding the Motivations of Museum Audiences.” In Marty, P.F., and K.B. Jones (eds.). Museum Informatics: People, Information, and Technology in Museums. Routledge Studies in Library and Information Science.

Falk, J. (2009). Identity and the Museum Visitor Experience. Walnut Creek, CA: Left Coast Press.

Finnis, J., S. Chan, and R. Clements. (2011). “Let’s Get Real: How to Evaluate Online Success?” WeAreCulture24 | Action Research. Available at: http://weareculture24.org.uk/projects/action-research/

Haley Goldman, K., and D. Schaller. (2004). “Exploring Motivational Factors and Visitor Satisfaction in On-Line Museum Visits.” In J. Trant and D. Bearman (eds.). Museums and the Web 2004: Proceedings. Toronto: Archives & Museum Informatics. Consulted September 27, 2011. Available at: http://www.archimuse.com/mw2004/papers/haleyGoldman/haleyGoldman.html

Hood, M. (1988). “Leisure criteria of family participation and nonparticipation in museums.” In Butler, B., and M. Sussman (eds.). Museum visits and activities for family life enrichment. New York: Haworth Press, 151–169.

Morris, G., J. Hargreaves, and A. McIntyre. (2004). Tate Through Visitors’ Eyes. Report, January 2004.

Moussouri, T. (1997). Family Agendas and Family Learning in Hands-On Museums. Unpublished Ph.D. thesis. Leicester, England: University of Leicester.

Packer, J., and R. Ballantyne. (2002). “Motivational Factors and the Visitor Experience: A Comparison of Three Sites.” Curator: The Museum Journal, 45: 183–198.

Peacock, D., and J. Brownbill. (2007). “Audiences, Visitors, Users: Reconceptualising Users Of Museum On-line Content and Services.” In J. Trant and D. Bearman (eds.). Museums and the Web 2007: Proceedings. Toronto: Archives & Museum Informatics. Published March 1, 2007. Consulted September 27, 2011. Available at: http://www.archimuse.com/mw2007/papers/peacock/peacock.html

For more information on how to set up and use advanced user segments in Google Analytics, visit the following link (consulted January 31, 2012): http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=108040