Review Your Sprint Review

Imagine you are a stakeholder in an organization that is adopting agile practices for software delivery. The teams have chosen Scrum as their agile management process. As part of this new process, you are informed that you will be able to inspect the team's delivery every 2 weeks through an event called a "Sprint Review". No other education has taken place from your standpoint, so you quickly do a Google search on "Sprint Review". You then seek out some members of the team to educate you on your role in the process.

Hopefully by then a ScrumMaster can pick up the task of educating the organization, and a short sit-down occurs where the ScrumMaster explains (at a very high level) what the teams are striving for with Scrum. It sounds exciting, and everyone loves viewing progress, so you go about your day waiting for the 2 weeks to go by so you can see the Sprint Review.

The morning of the Sprint Review, you sit down in the conference room. The team and others from the company start strolling in to take their seats. You look at your watch: it's 8:59, and the meeting starts at 9:00AM. After about 10 minutes of the team talking some other "geek" language, laughing, and having fun, you realize that it is now 9:09AM. The ScrumMaster begins the meeting and states that we are here for a Sprint Review and that the goal of this activity is to provide feedback on the "done" items in the sprint. He quickly looks around the room and asks the team, "Who's first?"

After about 2 minutes of the team discussing who should go first, John, a developer on the team, decides to go first and starts hooking his computer up to the projector. He struggles to remember the URL he is supposed to be using to show the software (this is a demonstration after all, so I am glad he doesn't have a deck). Once he finds the URL about 3 minutes later, he can't remember the username and password. After a few more minutes he does remember them, but it turns out the validation service is not running in the QA environment. So he RDPs into the machine to restart the service and, after about 5 minutes, tries again.

At this point, we have wasted a ton of valuable time, not only from the team's perspective but also the stakeholders'. I tell this story because I see it happen time and time again. Scrum tells us not to spend a ton of time prepping for a Sprint Review. I totally agree with that, but I highly recommend that the team get together before the Sprint Review, lock eyes with each other, and validate that each and every item being demoed is in fact DONE. I suggest you then agree on who will present each item (it shouldn't always be the same person for all features). Ensure that the person giving the demo understands the functionality, the acceptance criteria, and the data they will use for the application demo. I advise teams to choose real-world data so the stakeholders can see how the application might actually look, without names like Test1, Test30923u, or random characters. Rehearse it a bit, because the more time we waste or the more problems we encounter in the review, the more confidence and trust get whittled away from our stakeholders.

I also like the ScrumMaster to produce a Sprint Review agenda document: just a lightweight document that summarizes the sprint, the dates, the team members, the next sprint, etc. This ensures that people who couldn't make the meeting can review what was shown (we could even record the meeting and put that link at the front of the document). We should file these in a central location that is easily accessible. I also like to expose the team's metrics over the last few sprints, show the impact to current release schedules, and perhaps give a status of said milestones/releases. Then I like to list every single backlog item in order of presentation, along with who is presenting, its size, etc. This way the group can follow along and see the actual user story with its acceptance criteria. We can have a brief section outlining what we tried to get done but didn't, speaking only to why and what we will do better next time.

It is great to treat this time as professionally as possible, to show our stakeholders that we are a cohesive and professional team that can be relied upon to do the right things and build the software in the right way.

Here is a quick checklist you can use to make sure your team is prepared for a Sprint Review:

  • Reconcile what is done / what is not / and why
  • Agree on who will present which item and in which order
  • Ensure good real-world data exists for your demo and that the Definition of Done for the story can be followed
  • Prepare the sprint review agenda document
  • Start the activity on time
  • Allow the Product Owner to elaborate on the goal of the sprint
  • Review metrics, schedules, impacts, etc.


