Roadmap elements

This is a write-up of a discussion the UK government’s community of product managers held to agree which elements are important to include in our product or service roadmaps.

We think it’s worth agreeing these elements so that roadmaps are consistent and can be understood across our organisations, regardless of who’s looking and where they are viewed.

The elements are being posted here to invite comments and suggestions before being published as guidance on the Service Manual.

***

A. Products should be judged on their delivery of value to users. Roadmaps are a tool to express what the value of a product will be and how that value will be understood and released in stages.

B. Each product has its own roadmap, or is represented on a roadmap providing a collective view of products related to a service. If user needs call for more than one instance (eg one on a wall and one online), every version of the roadmap must be kept in sync.

C. Roadmaps are developed iteratively and regularly. Creation and iteration of a roadmap is a collaborative effort led by the product manager with the delivery team, stakeholders and the product’s users.

D. The roadmap is open and available in the public domain (unless there is a very good security reason not to).

E. Roadmaps use simple illustration and plain English to provide information. This makes it easy for anyone to pick up and quickly understand how and why the product is being developed. They tell people where else to go if they need additional in-depth information (such as a product backlog or blog).

F. Roadmaps are framed by a vision explaining the ultimate outcome a product is trying to deliver. The roadmap is broken down into objective-based missions, each of which gets us closer to achieving the long-term vision.

G. Missions are outcome-based objectives with an explanation of what the goal is (often a problem to be solved) and how progress will be measured. Agreeing these missions involves conversations with users, the organisation’s leadership and the team doing the delivery.

H. Missions are mapped over time (eg quarters). The segments of time on a roadmap are consistent. When the roadmap contains phases (such as discovery or beta) these should be timeboxed.

I. Roadmaps show what’s been done and what is coming next in order of priority. The method of prioritisation is transparent and consistent.

J. Roadmaps enable us to plan for change. They capture intent, not solutions. The further in the future a mission is, the more uncertain it is. The closer the mission gets, the more is learned and the more confident we become that it is the right thing to work on. Things can be dropped from roadmaps.

K. Each roadmap provides an explanation of who it’s for, how to read it, who maintains it, how often it is updated, and how to contribute to its development.

L. When roadmap software is used, the content should be exportable (preferably via API) to enable reuse.
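
To make element L concrete, here is a minimal sketch of what ‘exportable via API’ could look like in practice: a short script that pulls roadmap items from a hypothetical roadmap tool’s API and writes them to a CSV file for reuse. The endpoint, token and field names are assumptions for illustration only, not any particular product’s API.

```python
# Minimal sketch, assuming a hypothetical roadmap tool with a JSON API.
# The URL, token and field names are illustrative only.
import csv
import requests

API_URL = "https://roadmap-tool.example/api/v1/items"  # hypothetical endpoint
API_TOKEN = "replace-with-your-token"

response = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
items = response.json()  # assume a list of mission records

# Write the roadmap content to CSV so it can be reused elsewhere
with open("roadmap_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["mission", "outcome", "measure", "timebox", "status"]
    )
    writer.writeheader()
    for item in items:
        writer.writerow({field: item.get(field, "") for field in writer.fieldnames})
```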

Product Manager or Product Owner?

In government, in practice, they’re synonymous. Because we care more about the responsibilities than titles, right?

But sometimes it’s a thing for some people. They raise it as if to suggest it’s an unresolvable ‘chicken or egg’ type of question. This is how I explain it.

In GDS we use Product Manager.

We don’t think Product Owner is wrong. Product Manager is just more us.

Product Owner is a Scrum thing. Like Scrum Master is a Scrum thing. We like Scrum but it’s not the only method we use. We want the people managing our products to be able to use a range of delivery methods. So using Product Manager is a signal of intent.

Product Owner is a term that’s used less frequently than Product Manager. It makes sense for us to use a title that people relate to when we are trying to attract the highest number of quality candidates to product management jobs in government.

Search trends for ‘Product Manager’ vs ‘Product Owner’
Worldwide results for ‘Product Manager’ jobs on LinkedIn
Worldwide results for ‘Product Owner’ jobs on LinkedIn

Product Manager is used by the majority of government organisations. And when digital, data and technology people from across government got together en masse recently to agree the roles and responsibilities we need to deliver better services, we settled on Product Manager. Saying ‘the majority’ is an acknowledgement that the agreement is new and is taking some time to percolate through.

Where it gets a bit confusing is that a few departments have both Product Owners and Product Managers. In these cases the Product Owners come ‘from the business’ and don’t have product management responsibilities. They are subject matter experts, and they are very welcome: they help delivery teams understand the policy and operational realities. But we would do better to call them something else – maybe just referring to them by their actual policy or operations role titles. That would make things clearer.

Fundamentally, role titles evolve. We should hold onto them loosely to protect against hubris.

What we must spend more time and effort on is making sure that when we configure teams we give people a combination of responsibilities that are compelling and commensal. Google it.

Another tour of duty

I’m finishing up at MyGov and the Scottish Government’s Digital Directorate. It’s been a short gig but it’s been well-played.

Scotland’s is a digital future

I am pleased to have had the opportunity to represent MyGov. I’ve done my bit by negotiating the team through a period of intense pressure while also bringing longer-term stability to the resources in the programme, and mapping out an ambitious roadmap for future delivery.

My decision is not a reflection on MyGov. I believe in the mission and I especially believe in the team. You have a huge amount of talent; you are going to continue to make awesome products and services for Scotland, and make people’s lives better as a result of your efforts – of that I am certain.

You are a resilient and self-supporting bunch, plus you have great leaders in Colin, Graham and Rachel, who will ably see you through this period as you stand up beta.gov.scot and begin developing mygov.scot again with gusto. And then there are the new starters lined up to join the team, who will complement your abilities and effort.

I’m not leaving government and I’ll remain a strong and active supporter of MyGov.

Back to the future

I’m really pleased to have been invited onto another tour of duty with the Government Digital Service. This time around I’ll be serving the community of product and service managers in GDS and around government in the UK – helping them to stay connected with one another, share their skills and get people excited about their craft.

I’m ridiculously excited about this new role. I’m looking forward to being back with old colleagues of course, but 3 years is a long time to be away and I’m even more inspired about working with the new guard there. There is so much energy in this new era for GDS and the challenges are compelling.

It feels like a good time to be going back to something different.

More on that anon.

Thank you @mygovscot

For the time being, it’s thank you, goodbye and good luck to my colleagues and friends in the Scottish Government.

Continuous improvement as a product manager

Continuous improvement – it’s the way to develop products. It’s also the way to be a better product manager.

I spend more time recruiting and directing product managers than I do being hands on with products these days. But because I love my trade and want to be the best coach that I can, I am always trying to up my game by learning new things.

Here are three ways I’ve been trying to improve as a product manager recently:

Better the devs you know

I am a product person who appreciates programming.

If I have a better understanding of the developer’s craft then my product vision is optimised, my explanation to the engineering team is more articulate and my appraisal of their efforts is better judged. Surely.

To that end, I’ve been steadily working my way through Codecademy’s courses, and by way of revision I’ve installed some of the Sololearn course apps on my phone, which are a better format for the commute.

Don’t send me your pull requests anytime soon. But it’s definitely broadened my thinking about the art of the possible… and the trade-offs. And it’s good to look at a team I work with every day and see them in sharper focus.

The Jobs at hand

Ever since I started working in product management, it’s involved user stories.

And that’s been going great, because user stories place more importance on the task someone is trying to complete when they are using your product than on who they are or what the business’s requirements are.

But I’ve been hearing more and more about Jobs-to-be-Done and its application to product development, and I’m intrigued. It might be that ‘Job stories’ offer an even more relentless focus on what someone is trying to do and what their motivation is than ‘User stories’.

It might be that it’s time to make a switch or maybe there’s room for both in the mix of techniques I can use to make sure we are meeting user needs.

I’ve started asking my product managers to explore the possibilities of Jobs-to-be-Done, and Intercom has recently released a book on how they made the switch from user stories, which reads as a good, thorough introduction.
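
As a rough illustration of the difference, here is a small sketch of the two formats as they are commonly templated (the job story wording follows the format popularised by Intercom). The passport example and its wording are hypothetical, not taken from any real backlog.

```python
# Minimal sketch contrasting the two formats. The templates follow their
# commonly quoted forms; the passport example is hypothetical.

user_story = "As a {persona}, I want {action}, so that {benefit}.".format(
    persona="citizen renewing a passport",
    action="to save a part-completed application",
    benefit="I can finish it later without re-entering my details",
)

job_story = "When {situation}, I want to {motivation}, so I can {outcome}.".format(
    situation="I'm interrupted partway through a renewal",
    motivation="pick up exactly where I left off",
    outcome="finish the application without re-entering my details",
)

print(user_story)  # persona-led: who the user is comes first
print(job_story)   # situation-led: context and motivation come first
```

The difference looks small on the page, but the job story drops the persona entirely and leads with the situation and motivation, which is the more relentless focus its proponents argue for.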

What a release

I’ve written a lot of release notes but I have started reading a lot as well.

It’s really easy just to hit ‘update’ and ignore the release notes; even easier never to acknowledge that release notes might be a thing produced for the service you use. I suspect that release notes are read only slightly more often than updates to T&Cs.

When you start to study release notes, you see there’s more variation between them than you’d first assume. Technical. Conversational. Brief. Detailed. But you can tell the difference between good and poor, well judged and lazy. Good products ought to have good release notes, but they don’t always.

I’m going to keep a book of examples, learn from them and see if we can get more users to take note of our release notes and use them as a means of engaging in a dialogue about what we’re developing and why.

Three improvements on before. But there is so much more to do. 

GOV.UK is going Worldwide

This post originally appeared on the Government Digital Service blog

Today we are very pleased to release the Worldwide section of GOV.UK, which explains the structure and activities of British government organisations in over 200 locations around the world.

Worldwide (www.gov.uk/government/world) is the new home on the web for the overseas web presences of DFID and FCO, and much of UKTI’s international-facing content (ahead of a wholesale transition later this year). These location profiles will be frequently updated to set out the government’s response to international events, present case studies of diplomacy, development and trade in action, and provide information about senior staff responsible for overseeing that activity.


Three months on – Inside Government’s traffic, demand and engagement in numbers

This post originally appeared on the Government Digital Service blog

They can’t tell the whole story but digital analytics are a useful and readily available source of information about how people are interacting with the GOV.UK platform and content.

Inside Government – traffic, demand and engagement numbers so far

This post originally appeared on the Government Digital Service blog

It’s early days for Inside Government but we wanted to share some analytical data on user traffic, demand and engagement.

Inside Government is just three weeks old. We launched with five government organisations on 15 November and it felt good to get going. With no time to waste, another four departments will be joining the original five in a few days’ time.

What we know about the users of Inside Government

This post originally appeared on the Government Digital Service blog

Publishing all the government’s corporate information on a shared platform hasn’t been done before. So the Inside Government team have been building a profile of the site’s users over the months as we’ve been developing the product. This is what we think we know about them.

Have I got government news for you?

This post originally appeared on the Government Digital Service blog

Government is a major news source. As central government departments begin the transition to GOV.UK they’ll be publishing their news in a single place, and we have a fantastic opportunity to improve the user experience of this high profile content. Here’s how we plan to start doing that.

Feedback isn’t just for Cobain and Hendrix – what we heard from the Inside government beta

This post originally appeared on the Government Digital Service blog

‘I love this site! …This is perhaps the finest example of a government website in the history of the Internet.’
Member of the public

‘What idiot thought a single web site was a good idea? The separate ones are bad enough.’
Civil servant

These are genuine comments at the extreme ends of the feedback we received for the Inside government beta. Over the six weeks of the beta we received a lot more in between, and we were grateful for every last item of praise and criticism.

This post is about how we captured that feedback, what we learned from it and what we are going to do as a result.

Recap

We released Inside government as a beta on 28 February 2012, following 24 weeks of iterative development. The idea was to test – on a limited scale – the site that may come to accommodate all departmental corporate websites. The beta ran for six weeks in the public domain and involved 10 pilot departments (BIS, Cabinet Office, DCLG, Defra, DFID, DH, FCO, HMRC, MOD and MOJ).

Inside government ‘department’ page on an iPad

By releasing Inside government we were testing a proposition (‘all of what government is doing and why in one place’), and two supporting products (a frontend website and a content management system). With this in mind, we wanted to ensure that we captured feedback from the public and from colleagues across government.

We wanted to know if – having used the site – people thought it was a good idea and whether it should be developed further. And besides testing the viability, we hoped that feedback would prove a rich source of ideas and steering on what was important for us to concentrate on in subsequent iterations.

Open feedback channels

The most obvious and noisy sources of feedback were, of course, ‘open channels’, which included Twitter, email, our service desk and our GetSatisfaction forum – and some people even had our phone numbers.

The value of these routes lies in their diversity and in the opportunity they gave the project team to engage directly with end users. They were particularly popular with public users, but we were pleased to see that civil servants also took up these opportunities to put forward their views and get into discussion with others.

Here’s a cross-sample of the comments that came in:

‘Love the product! The concept of GOV.UK is the right way to go.’
Member of the public

‘The design feels far more people-friendly, the language (of the site architecture as well as the content) feels like it has the right balance between being friendly and expert’
Member of the public

‘I have looked through the test website and I think it is an excellent idea. I like the news reports and I like the idea of one website for all departments.’
Civil servant

‘I think the concept of gov.uk is sensible but there’s a way to go yet to get this website working well.’
Civil servant

‘Looks very basic and home made’
Civil servant

‘I don’t know why you’d need a new website. Why not just add a section on how government works to direct.gov (sic)’
Civil servant

Our open channels were a particularly rich source of product enhancement ideas. Examples included: adding a section on the mechanics of how government works, incorporating section-specific searches, and suggestions of what data to include in feeds.

While there was lots coming through the open channels, much of it was granular and from an engaged audience predisposed to take an interest and have an opinion. Highly valued stuff but only part of the picture, and so our evaluation squeezed three further tests into the time available.

User interviews

Inside government should be open and accessible to everyone and we expect the core users to be people with a professional or deep thematic interest in the policies and workings of government.

To get qualitative insights into how these users used and rated gov.uk/government, we arranged 12 face-to-face interviews (with professionals from academia, charities, media and the private sector). Each participant knew that they would be asked about their internet usage but they were not aware which site(s) they would be discussing.

Interviews were conducted on a one-to-one basis by a trained facilitator and began with a discussion of how the individual used central government’s current websites. It was evident that although a user gets to know their way around a particular section of a site, when they move off that section or onto another department’s site the inconsistencies present real frustrations.

‘It is tricky because currently you have to go to each department individually and its only done with civil servants in mind.’
Member of the public

Department pages on Inside government beta (top) vs current departmental sites

The facilitator then pulled up Inside government. The participants began with a cursory browse and were then set tasks designed to move them through the site’s content and functionality. As they carried out the tasks, the facilitator asked them to comment on how well they felt the site was performing.

We learned that these professionals found the user interface clear and intuitive. There were common criticisms, a number of which were arguably down to the ‘rough’ nature of the beta but are important to heed as the site develops and takes on more content. These included issues with long lists, the visibility of ‘related content’ in columns, and the depth of content (some participants worrying that it was being ‘dumbed down’). Perhaps the most interesting finding was that participants wanted more of a departmental lead in the navigation, rather than the thematic approach that we were trialling.

Fundamentally, when asked, the participants were able to articulate who is responsible for the site, what its purpose is and who the users are likely to be. In this sense, the proposition of Inside government was clear and, they said, with some development this was a product that they welcomed because it would make their work easier.

Usability testing

With insights from the core users bagged, we were hankering after feedback from general public users (who we expect to be infrequent visitors to corporate sections of government websites and who have basic or no prior knowledge of the machinery of government). What would they think of the proposition and would they be able to find what they wanted easily?

Using GDS’s summative test methodology, we put Inside government to the scrutiny of a panel of 383 users. The respondents were a mix of ages and genders, geographically spread throughout the UK; they participated in their home environments and there was no moderator or facilitator.

The tests used a range of measures to assess performance (such as journey mapping and completion times). Participants were prompted (by software) to find specific information (facts and figures) on the site through five tasks (which were tracked), and at the end of the tasks they were asked a series of questions about their experience.

We were pleased to see that on all but one of the tasks the successful completion rate was above 60%, and more than 50% of participants said they found it very or quite easy to complete the tasks. That was a positive overall trend on an unfamiliar and content-heavy site.

Below the overall trend, a few issues of concern were flagged up. Just over a third of participants found it difficult to complete the tasks, especially when it came to finding a specific piece of information on a page. Most participants described the content as ‘straightforward’, ‘to the point’ and ‘up to date’, while some said it was ‘longwinded’ and ‘complicated’. Over 50% of participants said they thought Inside government contained ‘the right amount of information’ but 39% thought there was too much.

From these findings, we were satisfied that in the beta we had built a site usable by the general public, but there is clearly a great deal of work still to be done to produce an excellent ‘product for all’.

Comparative CMS tests

We wanted to tackle one of the common complaints government digital teams have about their digital operations: cumbersome, convoluted and costly content management systems. So we decided to try building a CMS from scratch that would be stable, cheap to run, include only the functionality required to manage the Inside government site, and be easy to use by even the most inexperienced government staff.

A view of the Inside government CMS

To test what we produced, we ran structured testing in departments, using their standard IT and setting the Gov.uk publishing app head-to-head against their existing CMS in a series of tasks. The aim was to assess the performance and usability of the beta publishing application, and to understand where government publishers found it better than, worse than or of equivalent quality to what they were currently using. And we wanted to get their ideas for further iteration.

We ran tests in six departments, against five incumbent products, with seven participants from digital publishing teams. The participants were a mix of those with previous experience of using the Gov.uk publishing app and those with none, and each was given up to five tasks representing common ‘everyday’ CMS functions.

In all but one task, users completed the tasks faster or in comparable times. The one slower task was down to the users’ lack of experience with the Markdown formatting method (and even then it was only in one specific area – bulleted lists).
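
For anyone unfamiliar with the formatting that tripped people up, here is a small, hypothetical illustration of Markdown’s bulleted-list syntax and the HTML it produces, using the open source ‘markdown’ Python package – not the actual Gov.uk publishing app.

```python
# Illustration only: Markdown bulleted-list syntax rendered to HTML with the
# open source 'markdown' package (pip install markdown). This is not the
# Gov.uk publishing app.
import markdown

source = """Our priorities this quarter:

* improve findability
* make departments more prominent
* keep the content plain without dumbing it down
"""

print(markdown.markdown(source))
# Produces (roughly):
# <p>Our priorities this quarter:</p>
# <ul>
#   <li>improve findability</li>
#   <li>make departments more prominent</li>
#   <li>keep the content plain without dumbing it down</li>
# </ul>
```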

Taking into consideration all the tasks they were asked to complete during the tests, the users made positive comments about their overall experience of the Gov.uk publishing app. For most it was easier to use, better laid out and faster than their current content management system because it was customised to their specific professional needs.

‘The CMS is a dream – especially compared with the current [product name removed] system. It’s fast, user friendly and intuitive. It’s also easier to use visually.’
Civil servant

What we learned

We didn’t get to do as much testing as we would have liked. Time was against us. But from what we did do, we learned a lot.

Problems include:

  • Big improvements in findability need to be made if Inside government is to be able to cope with the weight of content from all departments and agencies
  • We need to adjust to make departments more prominent in the navigation and on pages
  • In trying to improve the readability and comprehension of the corporate content, Inside government needs to be careful not to over-simplify

Noted.

Overall, we tested well and the positives rang out louder:

  • People understood the concept and valued the proposition
  • The site design was applauded
  • The publishing app was lean and easily stood up to the pressures of real use

People were impressed by what GDS and the 10 participating departments had achieved and wanted to see more. No doubt, there is development to do but this thing can work.

Crucially, by conducting these tests we now have benchmarks against which to measure the performance of future releases of Inside government. This data was not previously available to the Inside government team in the way it was for those involved in the other betas to replace Directgov and Business Link.

We will plough the learning into forthcoming iterations and we will continue to run testing regularly and report back on the findings.

If you want to tell us about your experience of the Inside government beta, leave a comment or drop us an email.

Ross Ferguson is a Business Analyst for the Inside government project.