COI ran a consultative review of guidance on measuring website costs, quality and usage. The Review started on 27 March 2009 and ran for three weeks under the banner of ‘Improving Government Online’.
Because the proposed end-users of the guidance were already well-disposed to online collaborative editing tools, and because of the successes of other ‘early adopter’ departments, the Digital Policy Review team was persuaded to try a new approach: using a range of ‘social media’ applications to place the draft documents in the public domain for open review and comment.
I advised on the applications to use and their set-up (reporting the process in an earlier post). I also carried out an evaluation independently of the Review team, so as to capture not only their specific experience but also to encourage wider evaluation and critique of the use of ‘commentable’ or ‘interactive’ documents by government reviews and consultations. The key findings were:
- The format of the Review attracted participation by a small, knowledgeable group of end-users;
- The new Review format generated a greater number of comments that provided a more precise set of amendments for the Review team to consider;
- The Review team was unfamiliar with the format but quickly found their rhythm and became more confident;
- The Review team will use the format for future reviews because it strengthened the quality of the guidance and made the process easier despite generating greater traffic than usual;
- Future use of this review format would be welcomed by participants and spectators who interacted with this exercise;
- The format allowed the Review team to indicate where they made specific amendments to the guidance as a result of reviewer input.
A Twitter account (@digigov) and hashtag (#digigov) were created to help the Review team promote and monitor interest in the Review off the core review site. A Netvibes dashboard was set up to automatically collate any links to the Review or uses of the hashtag. Finally, Uservoice was used to capture feedback over the duration of the exercise, and Wufoo was used at the close to survey participants.
The WordPress blog software provides basic website traffic statistics and these were supplemented by Google Analytics to allow the monitoring of traffic to and within the Review site.
Over the duration of the exercise, 87 comments on the documentation were received from 44 individual participants. The website received a total of 691 unique visitors who viewed a total of 1,997 pages. Half of the visitors visited once, with nearly 25% visiting more than 8 times.
The average time spent on the site was 2 minutes and 11 seconds, though the average time on the day that the Review launched was 6 minutes 25 seconds. Search engines generated 44% of traffic to the site compared with 33% from referring sites and 22% coming from direct traffic.
User Feedback During the Review
From launch, participants and spectators were able to give feedback on the Review process using the Uservoice-supported survey forum, which was regularly monitored by the Review team.
Four points of feedback were received with the backing of 16 individuals. The Review team actioned two of these immediately: adding a sitemap and making PDF versions of the documentation available. Another suggestion (to continue the discussion via a blog) will be taken up following the exercise based on pre-existing plans, while the fourth point related to features of the ‘Commentariat’ theme and was not actionable by the team.
Post-Review User Feedback
At the close, participants and spectators were invited (via email, Twitter post and a link on the Review website homepage) to complete an online evaluation survey consisting of 10 questions (hosted on Wufoo survey software). Twenty people completed the survey, all of whom had interacted with the Review site either by commenting on or reading the draft documentation.
Most of the survey respondents identified themselves as digital media professionals working in central government. Two respondents were from the private sector, two were from academia and one was from another public sector organisation.
Respondents were asked where they had first become aware of the review: the most common answer (8) was Twitter, which the Review team was using to issue regular updates on the progress of the Review, while the second most common source was via a blog post (4).
Conscious that this was a new means of carrying out a consultative review, respondents were asked if the purpose of the exercise was clear: one respondent said ‘no’ and three thought it was partially clear, but most said they were satisfied with the orientation provided. Indeed, all but one respondent said that they had participated in some form of online collaborative editing prior to this exercise.
The Review site offered users both a means of accessing the draft documentation and a means of commenting on each individual section of the guidance. Most respondents (15) said that they read the draft documents online, either wholly or in sections, while others downloaded the documents first (one respondent said that they did not read the documents at all). All of the respondents said they commented on the guidance.
Respondents were invited to identify any other ways in which they interacted with the exercise: responses included following the Twitter activity (5), reading other people’s comments (5) and promoting the Review to others (4).
Respondents were given the opportunity to specify one aspect of the exercise they thought worked well and another they thought could have been better. Of the things that worked well, respondents praised the openness of the exercise and the combined use of the various social media applications. Of the things that could be done better, most of the comments related to the style in which the guidance was written and to the visibility of the Review, which was regarded as too low.
The final question asked respondents if they would participate in a similar exercise in the future: 16 said they would, two were unsure and only two said ‘no’.
Review Team Interview
The Review team was interviewed after the exercise had closed and it had completed the analysis of user comments. The Review team was asked to reflect on the experience: how it compared to the review processes used in the past, what worked well and what could have been done better.
The Review team was almost entirely positive about the method trialled here. Past reviews received feedback from a closed network via email; in this case, however, the document and the review exercise were subject to a greater level of scrutiny, which attracted a larger pool of reviewers who made more precise amendments. The Review team judged that this not only benefited the guidance but also made the process of managing the Review easier (for example, by making document control simpler and by showing whether the documents had reached the right reviewers).
Going forward, the team said that they would still use a service like Twitter to issue updates and promote the exercise, but that they would be more aware of the need to post regular updates and to direct debate off Twitter and onto the reviewable document itself. The team also said that it had learnt much about the process of moderating comments, such as how quickly to review and publish them and how the team could interact with those commenting on the site.
Overall, the Review team thought that its administration of the process had been good for a ‘first attempt’, and when asked if they would use this method of review in the future, the team were definite that they would. They were certain that this approach to opening guidance for review suited the end-users and, although uptake in this first instance was relatively low, they were confident that awareness and interest would increase over time, with participation following suit.
Because the format required participants to attach their comments to particular sections of a document, the Review team were subsequently able to specify where they had taken on input and made amendments. These were reported as follows:
User satisfaction guidance updated (TG126)
- Alternative methods of maximising user satisfaction highlighted (e.g. user testing, polling user suggestions, competitor analysis etc.);
- Minor amendments to the core question set including standard answers to user profile questions;
- Added caution to avoid excessive survey length and encouragement to include an opt-in for follow-up e-mail communication.
Simple, jargon-free editorial content (TG126)
- Guidance on editorial quality updated to emphasise the importance of simple jargon-free content and linked to the Usability Toolkit.
Usability and user testing (TG126)
- Summary of the content of the Usability Toolkit added;
- Guidance added on user testing and the importance of prioritising on key site objectives and user groups.
Strengthened links to other guidance documents
- Minimise broken links: Added link to Managing URLs (TG125);
- The need to archive out of date material: Link added to Archiving websites (TG105).
Continue the discussion via a blog
Many areas discussed by participants were very broad and challenging in scope. We see here the potential to open up and sustain a dialogue with those who share a passion for digital communications. To facilitate this debate, we will set up a blog to promote better collaboration on digital policy guidance and to discuss some of the wider issues raised in the Review including the following:
- Understanding the cost-benefit of government’s overall digital engagement, not just websites;
- Re-use of data and information: impact on costs, usage and quality;
- The challenge of measuring Visit Duration as websites increasingly deliver content through Flash and AJAX interfaces.
A note of these amendments has been circulated to participants by the Review team.
I would like to thank my colleagues in the Digital Policy team at COI for getting me involved and for allowing me to report the results (Adam Bailin in particular). I enjoyed working with Seb Crump in setting up the tools, and would like to thank Steph Gray for his work on developing the ‘Commentariat’ theme. Finally, thank you to those who took the time to complete the evaluation survey. Much appreciated.