Multi-Teams Sprint Review as Open Space. PandaDoc Experience

Evgeniy Labunskiy · Published in PandaDoc Tech Blog · May 31, 2021

We’re all familiar with the “one-team sprint review” concept. It’s easy to manage, the agenda is often simple, and it provides solid focus. But as soon as your company starts to grow, the number of teams also grows.

Today at PandaDoc, we have 18+ teams working together on one product — with a single sprint review in which each of these teams participates. And you can do it, too.

Read on and I’ll detail how you can manage a multi-team sprint review that is scalable to an almost unlimited number of participants and teams.

The Challenge

Given the rapidly growing number of teams we now have at PandaDoc, it’s impossible to simply extend the one-team sprint review we used before. But we tried: for a while, we ran one long meeting in which each team presented one by one. As we kept growing, the approach turned problematic:

  • Meetings became longer and longer — up to two and a half hours. It was difficult to maintain focus while tracking demos, and it turned into an overload of information.
  • Since people weren’t focused on team presentations, there was barely any feedback.
  • There was seemingly never enough time to provide a demo — despite the length of each meeting, things often felt rushed.
  • It became clear this approach was not scalable. What would we do once we grew to 30 teams? Would our sprint review turn into a four-hour ordeal?
  • And not least of all, these meetings began to veer toward boring.

It’s fair to say this approach worked until we reached 10–12 teams. After that, the degradation of the overall experience was painfully apparent.

It was time for our scrum masters to convene and fully revise our sprint review approach.

Multi-team Sprint Review

When we began redesigning this event, we agreed on its core principles:

  • It should stimulate feedback, with enough time allotted to each team’s feature for demo and Q&A
  • It should be easy and interactive
  • It should be scalable, with the ability to handle at least 30 teams
  • It should be valuable to a wide audience

Based on these goals, we decided to use the Open Space approach for facilitating our sprint reviews.

How Open Space Works

Open Space is a way to efficiently organize people around an open agenda. It has become very popular at “unconference”-type events. The agenda is participant-driven: attendees suggest discussion topics and self-organize around them.

Here are some general principles, as applied at Scrum Ukraine’s Agile Coaching Camp in 2019.

Create ample agenda space to add topics. With the help of a matrix containing time slots and rooms, participants suggest a topic to be discussed in a specific room during a specific time slot.

Here you see that B1-S2 (across the top) are rooms, and slots 1–6 (along the left) are time slots. This particular board contains a 7x6 matrix, with 42 available slots for topics.
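If it helps to think of the board as a data structure: below is a minimal Python sketch of an agenda matrix in which participants claim open room/slot cells for their topics. The AgendaBoard class and its names are purely illustrative assumptions for this article, not part of any tool we actually use.

```python
# Illustrative sketch of an Open Space agenda board: a matrix of rooms x time
# slots where participants claim open cells for their topics.

class AgendaBoard:
    def __init__(self, rooms, slots):
        self.rooms = rooms  # e.g. ["B1", "B2", ..., "S2"]
        self.slots = slots  # e.g. ["Slot 1", ..., "Slot 6"]
        # One cell per (slot, room); None means the cell is still open.
        self.cells = {(s, r): None for s in slots for r in rooms}

    def suggest(self, slot, room, topic):
        """Claim an open cell for a topic; refuse if it is already taken."""
        if self.cells[(slot, room)] is not None:
            raise ValueError(f"{room} at {slot} is already taken")
        self.cells[(slot, room)] = topic

    def open_cells(self):
        """Return the (slot, room) pairs that still have no topic."""
        return [key for key, topic in self.cells.items() if topic is None]


# A 7x6 board like the one in the photo gives 42 available cells.
board = AgendaBoard(rooms=[f"Room {i}" for i in range(1, 8)],
                    slots=[f"Slot {i}" for i in range(1, 7)])
board.suggest("Slot 1", "Room 1", "Example team: feature demo")
print(len(board.open_cells()))  # 41 cells left for other topics
```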

Make clear suggestions for topics. After the agenda space is created, participants suggest topics to fill available time slots.

Open the event and establish guidelines. Once the agenda is complete, facilitators open the event, participants select where to go, and the first round of discussions begins.

Invoke the “law of two feet.” If, during the course of the gathering, any person finds themselves in a situation where they are neither learning nor contributing, they can go to some more productive place.

How PandaDoc’s Sprint Review Uses Open Space

Now let’s see how we’ve adopted the Open Space format at PandaDoc.

We use Miro widely within our organization, including for our agenda board:

We have a 3x5 matrix that we can extend to 4x5 or 5x5, if necessary. This agenda is filled during the sprint as teams add presentation topics to available time slots.

On Thursday, the day before our biweekly sprint review, we share this agenda with the whole organization and invite people to join. This brings transparency to the review process and lets participants see in advance which topics they’d like to discuss.

We host our sprint reviews in Zoom. Each meeting is 85 minutes long (sometimes shorter, never longer) and includes three segments: pitch, demos, and finalization.

Pitch: 20–30 minutes

During this segment, each team pitches its demo session and invites people to join, while also showing the overall evolution of its work.

This segment is very important because, in a short time, it:

  • Shows the common roadmap detailing our deliverables and plans
  • Shares the current state of product delivery
  • Gives an overall understanding of the team’s direction

After this segment, a facilitator (usually a scrum master) shares the final agenda and explains the rules of our Open Space. (We briefly explain rules each time a new teammate joins.)

We also explain the mechanics of Zoom breakout rooms: we remind participants to install the latest version of Zoom, which lets them switch between rooms on their own, and show how to select and move between rooms during the meeting.

After this, the main facilitator opens rooms and our demo segment begins.

Demos: 50–60 minutes

Each room has a moderator (again, usually a scrum master) who leads the discussion, keeps an eye on the clock, and records the demo. We record all sessions so that anyone who is unable to join can watch them later.

Our pattern is to create five rooms: three for demos (according to the schedule), and two more called “water cooler” and “smoking room,” where people can continue discussing certain topics and features.

Finalization: 5 minutes

After the demo segment, we reserve five minutes for everyone to come back together in the same room, where anyone can offer feedback, comments, and appreciation.

Feedback and Next Steps

At the time of writing, we’ve conducted seven sprint reviews using the Open Space format and have had great success with it, judging by the reactions we’ve gotten from our team.

So, even though people are happy overall with our new sprint review format, we still plan to make several improvements:

  • Feedback remains limited, so we hope to increase its quality and regularity
  • The scope of demos is usually concentrated around sprint achievements, so we’d like to also discuss each feature’s “story” — not only its present, but its past and future
  • Finally, we hope to demo elements not connected with actual delivery, such as user research results and design review — which may involve even more participants in each demo

I hope this article was useful for you! Share, comment, and give feedback!

Evgeniy Labunskiy is an Agile Coach, Trainer, and Head of Agile Practices @ PandaDoc.