Who should run the sprint (review) demo?

May 20, 2016 · Corporate Life, Lean, Productivity, Scrum Methodology

The demo is an important part of the Sprint Review ceremony. However, there is a lot of contention over who should run the demo and how much preparation should go into it. Here are a few ideas to help make your demos more effective without decreasing efficiency.

There are many camps here. Some say whoever developed the functionality should be the one to demonstrate it. When this is the case, though, “completed” features and stories are usually shown in a local environment rather than deployed to a QA environment. Depending on your definition of done, this may be acceptable, but in general, demos should be run from a deployed environment, as that indicates at least one successful build, deploy, and integration. Additionally, since there are [hopefully] many stories being delivered that were worked on by various team members, the projector will be switched frequently between machines, yielding a choppy, dragged-out demo. This invites more problems with getting things to display, opening different systems, and general machine and screen-sharing issues.

Another stance is for the product owner to run the demo. This would likely mean that all the functionality is deployed somewhere the product owner has access to it, and that they can describe the business value being delivered to the stakeholders. It’s a great way for the PO to better understand the development work that is happening, too. The downside is that while the product owner is demoing, they cannot as effectively answer questions or observe stakeholder reactions to the new functionality. This was the biggest factor that swayed my opinion toward the next camp: one team member.

I DO think that whoever developed the work should demo the work. However, we work together as a team, so the entire team should really be demoing the delivered functionality (the potentially shippable product increment(s)). I suggest that a different team member each sprint (including the ScrumMaster) demonstrate all the functionality finished during the sprint. The team should ensure that all the functionality is deployed into a QA environment so that the presenter only needs to go to one place to access it. First, this proves it can run from a deployed environment instead of someone’s local machine. Second, it gives everyone a chance to understand what the rest of the team has been working on if they were not directly involved, which helps build cross-functionality.

This also allows the product owner to interact with the stakeholders without having to run the demo. They can describe the stories and the business value, then let the demo run while they capture feedback and reactions from stakeholders. The ScrumMaster can help facilitate capturing the feedback or explaining the demo as well.

As far as preparation goes, how much time should be put into a sprint review demo? It likely covers only two weeks of features, so investing a large amount of time every two weeks yields very low returns. If one person is running the demo, they should spend approximately 15–30 minutes with each team member to understand what that person developed and how to demo it effectively in the environment. They should also ensure that every application that needs to be open and ready is, and that each is working as expected. Demos should always run in real time, not be recorded beforehand. No more than one to two hours should go into sprint review demo prep for a two-week sprint. Obviously, more preparation should go into a release-level demo.

If the demo is going to be reused to help others understand certain functionality, more preparation is acceptable, but it should be limited and remain sustainable. For example, one team once spent about 12 hours updating a paper user flow for a demo with leadership. That was not sustainable (people were in the office until 1:00am), and though the flow was regularly used afterward, updating it was a hassle, and leadership began expecting it to always be up to date. That led to a conversation about the trade-off between doing more new work and updating a demonstration while completing less work. It’s important to balance demonstrating functionality for leadership against actually delivering it. Separate leadership demos (outside of the review) should be few and far between; otherwise they are not sustainable and are essentially throw-away work and waste.

Demos are a great tool for increasing motivation, driving conversation and feedback, and improving transparency. It is important to balance the effectiveness and the efficiency of the demo to make sure all parties get the right benefits with the least amount of waste.


5 Responses

  1. The best sprint reviews I’ve seen had the stakeholders doing the demo. We set up several stations with a flip chart at each one. Members of the development team were at each station, explaining the work to the stakeholder at that station and answering questions they had. They also recorded the questions and suggestions of the stakeholder on the flip chart.

    Most of the stakeholders went off-script pretty quickly, but they came to appreciate how much work had been done. It was far more effective than any demonstration that they could passively watch.

  2. I like your idea, Natalie, of having members of the team demo their work. That is the way I have seen it done most often, although George’s suggestion of having the stakeholders do it sounds like a good thing to try.
