The delivery process model
The aim of this meeting is to create the sprint backlog (a subset of the product backlog) and the sprint goal. The delivery team plans the upcoming sprint. The meeting should last no more than 4 hours.
- Talk through the ultimate goal for the project
- Talk through the user stories
- Developers will break each user story into tasks, giving estimates for each one
- The product owner will sign off the priority order for the first sprint
- After this meeting, the task list is set and there is no going back.
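The estimation step above can be sketched as a simple capacity check. This is a minimal illustration only; the task names, hour estimates and capacity figure are all hypothetical:

```python
# Minimal sketch of a sprint capacity check (all figures are hypothetical).
# Each task carries the developers' estimate in hours.
tasks = {
    "Build login form": 8,
    "Validate email field": 4,
    "Write acceptance tests": 6,
}

sprint_capacity_hours = 60  # hypothetical team capacity for the sprint

total_estimate = sum(tasks.values())
print(f"Estimated work: {total_estimate}h of {sprint_capacity_hours}h capacity")

if total_estimate > sprint_capacity_hours:
    print("Over capacity - ask the product owner to deprioritise some stories")
else:
    print("Within capacity - the sprint backlog is feasible")
```

If the estimates exceed capacity, the product owner's priority order decides which stories drop out before the task list is locked.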
At the daily standup, each member of the team answers 3 questions:
- What did you do yesterday?
- What will you do today?
- Do you have any blockers?
There are only so many different ways to demonstrate a product, but the key is not to go into the technicalities of how something has been done. We prefer to demo a product then and there rather than recording a demo and pressing play, so it is better to have all the right people in the same room. That said, we are a digital team, so we can look at including people by video conference or recording the demo for those who can’t attend.
A demo should follow a similar route each time:
- Create a story – use personas to guide the end to end process
- Plan what to do if something doesn’t work the way it should
- Be prepared for questions
How to run a demo
The scrum master hosts the demo, acting as compère as the developers talk through their tasks in the backlog, explaining the acceptance criteria and showing how they work in practice on the big screen.
Allow 2 hours for the meeting and book a private space with a big screen. Make sure you arrive early with an HDMI cable, and check that it works before anyone gets there.
A demo could follow these steps:
- Sprint backlog review (complete/incomplete tasks)
- MVP realignment – completed features
- Burndown (where we are with cost and scope)
- Current dashboard/blockers
- Game changers
- Budget view
- Project overview
- Milestones/next steps
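The burndown step above can be computed from daily totals of remaining work. A minimal sketch, with hypothetical figures:

```python
# Minimal burndown sketch (all figures are hypothetical).
# remaining[i] is the story points left at the end of day i+1.
total_points = 40
sprint_days = 10
remaining = [40, 36, 33, 30, 28]  # points left after each elapsed day

ideal_per_day = total_points / sprint_days  # ideal rate: 4 points/day
days_elapsed = len(remaining)
ideal_remaining = total_points - ideal_per_day * days_elapsed

print(f"Ideal remaining after day {days_elapsed}: {ideal_remaining}")
print(f"Actual remaining: {remaining[-1]}")
if remaining[-1] > ideal_remaining:
    print("Behind the ideal line - flag scope or cost at the demo")
```

Comparing the actual line against the ideal line is what lets the demo show "where we are with cost and scope" at a glance.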
A sprint retro is usually the last thing done in a sprint. It is crucial for us to keep improving continually throughout the project we are working on. Retro styles will change depending on the team and project, but the outcome is always the same. Blame is not the name of the game in a retro, but you do need to be open and honest: if something isn’t working as it should, say it. Remember the good stuff is worth shouting about too!
How to run a retro
One person should host the retro and decide on questions for the team to talk about.
If you’re hosting, pick broad questions that allow the team to set the agenda, rather than strictly setting it yourself.
Retros should have an open atmosphere where every member of the team can speak honestly and feel confident that their colleagues will listen.
Allow 60 to 90 minutes for the meeting and use a private space where you can stick post-it notes to the wall.
A basic retro could follow these steps:
- The host explains the questions at the beginning and sticks a post-it note to the wall for each question.
- Each team member writes down one or more answers for each question on post-it notes and sticks them to the right part of the wall.
- The group discusses issues as they come up, or at the end.
- The host decides on actions to fix any problems raised, and assigns them to members of the team.
You could choose to cover 3 or 4 of the following topics:
- what went well in the last iteration
- what went badly in the last iteration
- what’s puzzling the team or what the team doesn’t understand
- who the team wants to thank (eg other members of the team)
These topics are just examples; there are many different types of retro. You can find more in the Retrospective wiki.
If you’re hosting the retro, you should pick topics which you think will prompt useful discussions in your team, for example on transparency, team learning or your working process.
Make a list of actions
You should use the information you get from your retro to improve your work and your working environment.
Make a list of actions that you’ll carry out to fix the problems that your team highlighted and assign them to people in the team.
You should aim to get the actions done before the next retro.
A tech review is a chance to take a step back and look at what you have done through your peers’ eyes. Unlike the demo, this is where you do look at the how and the why.
Tech reviews are not only for the benefit of those on the project team. For other team members, it’s about knowing what’s being developed around you and making sure you keep up to date with the learning that comes from each project.
Service testing and refactoring
Service testing looks at what is in the test/UAT environment. Testing isn’t just about running the process from end to end with all the correct answers; testers should be trying to break it. What happens if I do this? What if I put something completely random in this box instead of a date? Feeding back findings is the important bit: is it a bug (something that should work but doesn’t) or is it a new feature that needs adding to the backlog?
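The "try to break it" approach above can be sketched as a simple negative test. The `parse_date` helper here is hypothetical, standing in for whatever date field the service actually exposes:

```python
from datetime import datetime

def parse_date(value: str) -> datetime:
    """Hypothetical stand-in for the service's date-field validation."""
    return datetime.strptime(value, "%Y-%m-%d")

# Happy path: a well-formed date should parse.
assert parse_date("2024-03-01").year == 2024

# Negative tests: completely random input should be rejected, not accepted.
for bad_input in ["banana", "31/02/2024", "", "9999-99-99"]:
    try:
        parse_date(bad_input)
    except ValueError:
        pass  # rejected as expected - not a bug
    else:
        print(f"Bug: {bad_input!r} was accepted")  # feed this back to the team
```

Anything the loop flags goes back to the team as either a bug to fix or a new feature for the backlog.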
Also known as backlog grooming, backlog refinement is where the scrum master/business analyst works with the PO to refine the backlog. This could mean adding user stories for newly identified features or ensuring the backlog is in priority order (high to low), ready for the next sprint planning. Remember the PO owns the backlog, so this needs to be done together, keeping in mind the time, cost and scope of the project.
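Keeping the backlog in high-to-low priority order, as described above, amounts to a simple sort. A minimal sketch with hypothetical stories and priority scores:

```python
# Minimal sketch of keeping the backlog in priority order
# (stories and priority scores are hypothetical).
backlog = [
    {"story": "Export report as CSV", "priority": 2},
    {"story": "User can reset password", "priority": 5},
    {"story": "Dark mode", "priority": 1},
]

# Sort high to low, ready for the next sprint planning.
backlog.sort(key=lambda item: item["priority"], reverse=True)
print([item["story"] for item in backlog])
```

However the scores are decided, the PO owns them; refinement just keeps the ordering and content up to date.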
Once the sprint cycle has finished and you have received product sign-off from your PO, it is time to go through the release tasks. This is where the developers ensure everything is set up for the production environment and the scrum master ensures support processes are ready. Set aside 3 days for this at the end of the project, before the day of release.
At the end of the last sprint, we go through the feature list and make sure that all requirements have been completed to the agreed acceptance criteria.
If the product owner gives consent to go live, the scrum master will set a date and begin the go-live task list.
When the service has gone live, it’s important to hand over the backlog to the product owner – they will add any new requests to the list and come back when there is enough to warrant the next iteration.
- All tasks have been completed to an acceptable standard and signed off by the product owner

Minimum viable product (MVP)
- The product exists and meets the minimum requirements of the customer
- The product has been tested by real users, automated checks and peers in a tech review