Museums and the Web 2011 (MW2011): Best of the Web: Review Criteria
Pending input from the Best of the Web 2011 Panel.
The MW2011 Best of the Web Panel will review sites in nominated categories, and rate them according to the following Review Process and Evaluation Criteria. For full contest details see the MW2011 Best of the Web Nomination, Review and People's Choice Voting Process.
Version 1.6 – J. Trant.
(with input from the MW2010 Judges - pending 2011 review)
Nominations for sites to be considered by the Best of the Web 2011 Review Panel will be accepted until February 21, 2011.
A site can be nominated for the Best of the Web 2011 if:
- it was launched or significantly enhanced in 2010 (with the exception of sites nominated in the Long-lived category) and
- it officially represents a museum (see the ICOM definition).
Anyone can nominate a site. Nominations from those other than the site's designer or host are encouraged.
Conflict of Interest
No site will be considered for a Best of the Web Award if a judge is associated with it in any way. Nominated sites will be reviewed before judging begins to ensure no conflicts exist. Judges associated with nominated sites will be asked either to step down from the panel or to have the site withdrawn from competition.
Each panel member will review sites in a particular category (or categories) and then participate in the selection of the Best of the Web.
The Timeline for Review follows. Criteria for evaluation are outlined below.
Nominations Close: February 21, 2011.
First Stage: Preliminary Category Review
Deadline: March 4, 2011
- two Judges review each nominated site, providing rankings for each of the identified criteria
- where there are a large number of sites nominated in a category, preliminary review will be divided among the Judges for the category.
- each judge will review approximately ten (10) web sites.
In the Preliminary Review:
- sites are assigned points (from 0 to 5) in each evaluation category
- sites are flagged for the category short list
- a maximum of 5 sites can be short-listed for any category; these are the semi-finalists.
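The short-listing step above can be sketched as a small helper that keeps at most five semi-finalists per category, ranked by total score. This is an illustrative sketch only: the site names are invented, and tie handling is an assumption the document does not specify.

```python
# Sketch of the preliminary short-listing step: at most five semi-finalists
# per category, taken in descending order of total points.
# NOTE: tie-breaking behaviour is an assumption, not specified in the rules.

def shortlist(scores: dict, limit: int = 5) -> list:
    """scores maps site name -> total points; return up to `limit` names."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:limit]

# Hypothetical category with six nominated sites:
print(shortlist({"A": 21, "B": 18, "C": 23, "D": 15, "E": 20, "F": 19}))
# ['C', 'A', 'E', 'F', 'B']
```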
Second Stage: Select Semi-Finalists / Category Winners
Deadline: March 18, 2011
- all category Judges review the short list of sites
- sites are again assigned points (from 0 to 5) in each evaluation category
- sites are flagged for the category short list
- a category winner is decided
- possible Best of the Web sites are identified (finalists)
Third Stage: Select Best of the Web
Deadline: March 25, 2011
- all Judges review the best from each category and sites identified as finalists
- all Judges rank all finalists in all categories.
- Best of the Web is identified
- honorable mentions are identified
On-line voting for People's Choice Award
March 25 - April 7, 2011
- Users registered on the conference community site at http://conference.archimuse.com may cast one vote for a People's Choice site.
Final Stage: Awards Presentation
Deadline: April 8, 2011
- all finalists are contacted and asked to prepare a three-minute (maximum) tour of their site in a web-accessible format
- finalists' presentations are used to demonstrate the sites at the awards presentation, and are mounted on the MW2011 Web site.
All sites will be evaluated using the same set of criteria. Judges will assign a score from zero (0) to five (5) points in each of the following areas, to create a total score out of twenty-five (25). In addition, Judges will offer written comments on the sites.
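The arithmetic of this scheme can be sketched as a small validator that totals a judge's 0–5 ratings into a score out of 25. The area names below are illustrative placeholders, not the official criterion list.

```python
# Sketch of the MW2011 scoring scheme: a judge assigns 0-5 points in each
# of five areas, giving a total out of 25.
# NOTE: the area names are placeholders, not the official criteria.

AREAS = ["category_fit", "content", "functionality", "interface", "impact"]

def total_score(ratings: dict) -> int:
    """Validate one judge's ratings and return the total out of 25."""
    if set(ratings) != set(AREAS):
        raise ValueError("a rating is required for every area")
    for area, points in ratings.items():
        if not 0 <= points <= 5:
            raise ValueError(f"{area}: points must be between 0 and 5")
    return sum(ratings.values())

print(total_score({"category_fit": 4, "content": 5,
                   "functionality": 3, "interface": 4, "impact": 5}))  # 21
```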
Appropriate to Category
Is the site an excellent example of the category where it is nominated? Does it have all of the characteristics identified?
Content
Reflect on the information or experience delivered by the site. Was the content or experience offered:
- regularly updated?
- open to user contributions?
- relevant to and supportive of user goals?
Functionality + Technical Approach
Assess the choice of technology and the functions used to deliver the site's content and to construct the site.
Was the technology chosen:
- well executed? (did it work?)
- appropriate to the content?
- appropriate to the user?
- supportive of interaction with the content?
- supportive of the user experience (search, print)?
Interface: Visual Design and Usability
Consider how the site was designed and presented visually. Was the visual presentation of the site:
- visually appealing?
- supportive of user tasks?
- sympathetic to the content?
- appropriate to the target user?
Exploiting the Web
Review the ways in which the site took advantage of the Web, explored relationships between objects or ideas, and encouraged the user to engage with the content presented, with the sponsoring institution, and/or with other users. Did the site:
- provide appropriate links amongst related content and content areas?
- encourage user input?
- support interaction amongst users?
- enable contribution of user content?
- indicate how to contact the institution?
- remain accessible?
Impact
Consider the impact the site had on you and has on its community.
- Was your experience memorable?
- Did you want to go back?
- Did you stay a long time?
- Did you have fun?
- Did it make you smile or think?
A rubric for ranking
5 – Answers to all the questions listed under this criterion are a resounding “YES”. Can be pointed to as an exemplary model and inform best practices with respect to this criterion. No suggestions for improvement.
4 – Answers to most of the questions listed under this criterion are “YES” and the others are a “near miss”. Only one or two minor suggestions for improvement. Many aspects could be used as exemplars of outstanding practice.
3 – Answers to most of the questions listed under this criterion are “YES”, but there are one or two aspects that need major improvements OR the answers to the questions are a qualified “YES” but there are many minor suggestions for improvement.
2 – Answers to less than half of the questions listed under this criterion are “YES”, but there are one or two notable strengths.
1 – Answers to almost all of the questions listed under this criterion are “NO.”
0 – Answers to all of the questions listed under this criterion are “NO.” Can be pointed to as an example of practices to be avoided.
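A rough numeric reading of the rubric above can be sketched as a function mapping the fraction of “YES” answers to a 0–5 score. The thresholds are assumptions chosen to mirror the wording (“all”, “most”, “less than half”, “almost all NO”); the written rubric also weighs the severity of needed improvements, which a fraction alone cannot capture.

```python
# Approximate numeric reading of the 0-5 rubric: score by the fraction of
# "YES" answers under a criterion.
# NOTE: the threshold values are assumptions, not part of the published rubric,
# and the rubric's qualitative judgments (severity of improvements needed,
# notable strengths) are not modelled here.

def rubric_score(yes_answers: int, total_questions: int) -> int:
    frac = yes_answers / total_questions
    if frac == 0.0:
        return 0   # all answers are "NO"
    if frac <= 0.15:
        return 1   # almost all answers are "NO"
    if frac < 0.5:
        return 2   # less than half "YES"
    if frac < 0.8:
        return 3   # most "YES", but major or many minor improvements needed
    if frac < 1.0:
        return 4   # most "YES", the rest near misses
    return 5       # a resounding "YES" throughout
```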
- CIDOC Multimedia Working Group, Multimedia Evaluation Criteria, 1997 Revised Draft, J. Trant, Chair. Available http://www.archimuse.com/cidoc/
- Webby Awards, Judging Criteria. 2004.
- Public History Resource Center. Evaluating Web Sites. Debra DeRuyver, Jennifer Evans, James Melzer and Emma Wilmer. 2000. Available http://www.publichistory.org/evaluation/index2.html
- Public History Resource Center. Rating System for Evaluating Public History Web Sites. Debra DeRuyver, Jennifer Evans, James Melzer and Emma Wilmer. April 30, 2000. Available http://www.publichistory.org/reviews/rating_system.html
- Carleton Center for Public History. Canadian History Website Reviews – Notes for contributors. (n.d.) Available: http://www.carleton.ca/canweb/notestocont.html