As per the Agile gods, your teams and stakeholders get together every sprint for a review. The idea is to get immediate feedback, helping everyone steer the boat in the best direction. Another goal of the sprint review is mutual accountability: the scrum team measures its progress, and stakeholders recommit to their sponsorship.
Alas, in real life, sprint reviews often flounder. Instead of a useful ceremony, they become wasteful box-ticking exercises, with dwindling participation, boring presentations, and an apathetic audience. A large number of stakeholders, competing priorities, too much juggling, remote working, and Zoom oversaturation: these are only some of the factors working against good sprint reviews.
Is this your organization? Are you the project manager who needs to drag people into sprint reviews in the slim hope of getting even a single new insight?
Why not try a different approach?
This is how one organization transformed its sprint review process.
The key ideas were simple:
Share data with everyone: developers and business leaders alike.
Spare everyone the face-to-face presentation. Recognizing that not everyone needs to hear everything, they made all the information available on Confluence. Everyone had access to the data and could dig into it, skim through it, or skip it altogether. Their choice.
Elicit asynchronous feedback through Confluence. The project manager committed to following up on each comment raised in Confluence.
Replace the face-to-face review meeting with a face-to-face AMA (ask me anything) session. Anyone could raise any question for discussion and debate.
Confluence was chosen as the central tool for this new system. As the organization's intranet, it was the natural choice. From the CEO to the technical writer, from developers to marketers, everyone was comfortable using Confluence.
As they wanted to get everyone engaged, using a platform everyone already liked gave them a head start.
How do you organize a Confluence area for effective sharing of sprint data?
The PMO had a dedicated space for sharing information about the project. The project manager added the sprint reviews as a sub-tree in this space.
The list of information for each review was pretty long. They wanted to make it easy for people to locate the bits that interested them, so they split the data into several pages. The structure was consistent across all sprints. Each sprint review would have a collection of pages like this:
The main page. It included:
The list of other pages related to this review. This was the list of child pages. The Confluence macro “Children Display” generates this list, so there is no need to update it manually. The PM configured the macro to show the excerpt of each child page, which made the main page more informative. People liked that it helped them judge whether they wanted to spend time on a child page.
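For reference, the Children Display macro in Confluence storage format looks roughly like this (parameter names based on Confluence's macro documentation; treat the exact values as an assumption to verify in your own instance):

```xml
<!-- Children Display macro, configured to show each child page's excerpt -->
<ac:structured-macro ac:name="children">
  <!-- "simple" renders each child page's excerpt under its link -->
  <ac:parameter ac:name="excerptType">simple</ac:parameter>
  <!-- list only the direct children of the current page -->
  <ac:parameter ac:name="depth">1</ac:parameter>
</ac:structured-macro>
```

In the editor you rarely touch this markup directly; inserting the macro and ticking the excerpt option produces the equivalent configuration.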
A page for performance charts, like the burndown chart. The scrum master downloads these reports and adds them to Confluence at the end of each sprint.
A page for the Jira issues in the sprint. Most of the stakeholders had no access to Jira, so the PM used the Jira Snapshots for Confluence app to create this page. With Jira Snapshots, there is also a static copy of the Jira data, which helps keep the scrum team accountable. See more about this in the next section. (Disclosure: I am with the vendor of the Jira Snapshots app.)
A page with the bugs fixed in this sprint. This page uses a Jira Snapshots report, for the same reasons described for the sprint-issues page.
Each demo is captured in a video, on its own page. Having each demo on a separate page makes it easier to find the demo you are interested in. The page excerpt for each video page describes the scope of the demo and includes a shout-out to the specific people whose feedback is sought.
The last page contains the recording of the sprint AMA session, so people who can't make the AMA can stay in sync.
Another advantage of this standardization is that it's easy to collect the data for each sprint.
This is the simple process they go through to make the data available for each sprint:
Step 1: During sprint grooming, copy the “sprint review tree template” page, along with all its descendants, to create the pages for the new sprint. Each page title contains a # placeholder; upon copy, Confluence replaces the # with the name of the new sprint.
Step 2: When the sprint starts, the scrum master triggers the Jira snapshot on the Jira data pages. From that point on, everyone knows what was planned for the sprint.
Step 3: While the sprint is ongoing, whenever a story is done, the developer may record a demo video. They upload it to Confluence immediately and move on. The scrum team loves this because it means less context switching at the end of the sprint.
Step 4: At the end of the sprint, the scrum master copies the performance charts to Confluence. They also trigger a new snapshot to capture the Jira data at the end of the sprint.
Step 5: Two days later, the team holds the sprint AMA session and uploads the recording to Confluence.
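The placeholder substitution in Step 1 can be sketched in a few lines of Python. The template titles below are hypothetical examples, and in practice Confluence's copy-with-replace does this work for you; the sketch only illustrates the convention:

```python
# Hypothetical template titles; "#" is the placeholder that gets
# replaced with the sprint name when the template tree is copied.
TEMPLATE_TITLES = [
    "# - Sprint review",
    "# - Performance charts",
    "# - Sprint issues",
    "# - Bugs fixed",
    "# - AMA recording",
]

def titles_for_sprint(sprint_name, templates=TEMPLATE_TITLES):
    """Mimic the copy step: swap the '#' placeholder for the sprint name."""
    return [title.replace("#", sprint_name) for title in templates]

print(titles_for_sprint("Sprint 42"))
```

Because every sprint tree is produced from the same template, page titles stay predictable, which is what makes the rest of the process repeatable.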
Use Jira snapshots to open up the Jira silo
Previously, Jira was a silo in this organization. Most stakeholders did not have access to Jira, or did not know how to dig information out of it.
This created friction in the sprint review process. Developers and sponsors did not have the same information about the stories in the sprint, so everyone wasted time on questions and quarrels around this fundamental point. There was suspicion and mistrust.
The Jira Snapshots app solves this in a straightforward way. It copies Jira data into Confluence and displays it there in a table. The data is time-stamped and static, so anyone with view permission on the page sees exactly the same data. Snapshots keep their history, which makes comparing sprint content between the start and the end of the sprint a snap (sorry, could not resist).
Did you ever argue with someone about what was actually in scope for the sprint? With Jira Snapshots, this is a non-issue: anyone can look at the snapshot from the start of the sprint and see.
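Conceptually, that start-versus-end comparison is just a set difference over issue keys. Here is a minimal sketch (illustrative only; the app presents this comparison for you, and the issue keys are made up):

```python
def scope_changes(start_keys, end_keys):
    """Compare two snapshots of a sprint's issue keys.

    Returns the issues added after the sprint started and the
    issues removed from the original scope.
    """
    start, end = set(start_keys), set(end_keys)
    return {
        "added": sorted(end - start),
        "removed": sorted(start - end),
    }

# Example: PROJ-3 was dropped mid-sprint, PROJ-9 was pulled in.
changes = scope_changes(
    ["PROJ-1", "PROJ-2", "PROJ-3"],
    ["PROJ-1", "PROJ-2", "PROJ-9"],
)
print(changes)  # {'added': ['PROJ-9'], 'removed': ['PROJ-3']}
```

The time-stamped snapshots are what make this comparison trustworthy: nobody can quietly rewrite what the sprint scope was.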
The new sprint review process turned out to be a pivotal change
This new sprint review process propelled the organization forward. Now they were discussing real issues. Topics like how to scope their sprints, what is a priority, and who their customer is were analyzed, dissected, and improved.
It's not that they no longer had challenges, but these were different challenges. Some people said that everything got less “political”. You can see this in even the smallest gestures: developers put more heart into their demo videos, and product managers are more thoughtful about how they provide feedback. The language of "us" and "them" no longer refers to the scrum team (us) and the people outside the team (them). Today, "us" means everyone in the organization.
How are your sprint reviews going? Do you have your tips and tricks to share?