insights/review.md
3 additions & 2 deletions
@@ -43,6 +43,7 @@ This is a **self** assessment review:
 - Scores help the team focus attention on where to concentrate improvement work.
 - Scores can help teams communicate and escalate issues outside the team.
 - Scores cannot be compared between teams, but they can help spot common issues which would benefit from coordinated effort between and across teams.
+- This is for use by *whole teams*, which means everyone involved in the delivery & operation of the product, not defined by organisation boundaries (for example, if specialisms such as service management and cyber are based in a separate area of the organisation, they are still part of the team and should be included in the review session).

 ## Metrics
@@ -254,7 +255,7 @@ Good facilitation can help teams get the most out of the review and it is recommended…

 ### Facilitator responsibilities

-- Ensure the session has the right people in it.
+- Ensure the session has the right people in it - see note above about "whole team" not being defined by organisation structures.
 - Ensure all participants understand the [purpose](#purpose) of the review.
 - Ensure a full and accurate review is done, considering all aspects.
 - Ensure actions are identified and recorded.
@@ -264,7 +265,7 @@ Good facilitation can help teams get the most out of the review and it is recommended…
 - Recommended group size is 3–8 team members for each session.
 - Recommended duration is 3–4 hours, either in one block with breaks or in multiple sessions (but aim to avoid long gaps between sessions).
 - Ask a member of the Software Engineering Quality Assessments team (currently Andrew Blundell, Daniel Stefanuik, David Lavender, Sean Craig, Nick Sparks, Ezequiel Gomez) to create a blank spreadsheet for the review from the [template](review-template.xlsx). This will be stored centrally and shared with the team.
-- Work with a member of the team to fill in the _Project_ and _What's it do_ fields before the session. This is typically uncontroversial and saves time in the session.
+- Work with a member of the team to fill in the *Project* and *What's it do* fields before the session. This is typically uncontroversial and saves time in the session.
 - To save time, fill in the list of participants before the session, but verify and update as necessary at the start of the session.
 - Be ready for the session with the review spreadsheet and this document open such that both can easily be seen by participants by sharing your screen (assuming the review is not being held in person).