
Commit df3478b

Folder tidy (#153)

* Organising folders
* Fix links
* Links again
Parent: 98fe3f4 · Commit: df3478b

20 files changed, with 17 additions and 17 deletions

README.md

Lines changed: 2 additions & 2 deletions

@@ -28,8 +28,8 @@ The framework is a companion to:
The framework consists of:

* [Engineering principles](principles.md)
-* [Engineering quality review tool](review.md)
-* [Communities of practice guidelines](communities-of-practice.md) and active communities:
+* [Engineering quality review tool](insights/review.md)
+* [Communities of practice guidelines](communities/communities-of-practice.md) and active communities:
* [Product Development Test Automation Working Group](communities/pd-test-automation-working-group.md)
* [Product Development Engineering CoP](communities/pd-engineering-cop.md)
* [Product Development Cloud PaaS Working Group](communities/pd-cloud-working-group.md)

communities-of-practice.md renamed to communities/communities-of-practice.md

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
# Communities of practice

-This is part of a broader [quality framework](README.md)
+This is part of a broader [quality framework](../README.md)

A community of practice (CoP) is a group of people who "share a concern or a passion for something they do and learn how to do it better as they interact regularly" ([Introduction to communities of practice](https://wenger-trayner.com/introduction-to-communities-of-practice/)). This document gives guidance on forming and running a CoP.

communities/pd-engineering-cop.md

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
# Engineering CoP (Community of Practice)

-This is part of a broader [quality framework](../README.md) and is one of a set of [communities of practice](../communities-of-practice.md).
+This is part of a broader [quality framework](../README.md) and is one of a set of [communities of practice](communities-of-practice.md).

There is some overlap with the [Test Automation Working Group](pd-test-automation-working-group.md).

communities/pd-test-automation-working-group.md

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
# Test Automation Working Group

-This is part of a broader [quality framework](../README.md) and is one of a set of [communities of practice](../communities-of-practice.md)
+This is part of a broader [quality framework](../README.md) and is one of a set of [communities of practice](communities-of-practice.md)

## Subject

communities/security-cop.md

Lines changed: 2 additions & 2 deletions

@@ -1,6 +1,6 @@
# Secure Engineering CoP (Community of Practice)

-This is part of a broader [quality framework](../README.md) and is one of a set of [communities of practice](../communities-of-practice.md)
+This is part of a broader [quality framework](../README.md) and is one of a set of [communities of practice](communities-of-practice.md)

## Subject

@@ -18,7 +18,7 @@ For secure development and operation of systems:
* Provide advice and guidance as requested
* Facilitate good practice discussions, and curate relevant principles, practices and patterns within the [Software Engineering Quality Framework](../README.md)
* Curate relevant Training Pathways within the [Software Engineering Quality Framework](../README.md)
-* Curate relevant section(s) within the [Software Engineering Review Tool](../review.md)
+* Curate relevant section(s) within the [Software Engineering Review Tool](../insights/review.md)
* Provide supplementary learning in relevant areas (e.g. workshops)
* Discuss, disseminate and feedback on the output of the Cyber Design Authority
* **Build** knowledge:

continuous-improvement.md

Lines changed: 1 addition & 1 deletion

@@ -81,7 +81,7 @@ The benefits of improving these areas are:

## Identifying improvement opportunities

-Regular team retrospectives are an effective way to identify improvement opportunities and actions. Another potential source is periodic reviews using tools such as the [AWS](https://aws.amazon.com/architecture/well-architected/) or [Azure](https://azure.microsoft.com/en-gb/blog/introducing-the-microsoft-azure-wellarchitected-framework/) Well-Architected Frameworks and the [NHS Digital quality review](review.md). And of course, tech debt is often uncovered in the course of making changes to a system.
+Regular team retrospectives are an effective way to identify improvement opportunities and actions. Another potential source is periodic reviews using tools such as the [AWS](https://aws.amazon.com/architecture/well-architected/) or [Azure](https://azure.microsoft.com/en-gb/blog/introducing-the-microsoft-azure-wellarchitected-framework/) Well-Architected Frameworks and the [NHS Digital quality review](insights/review.md). And of course, tech debt is often uncovered in the course of making changes to a system.

As discussed in [Benefits](#benefits), in high-level terms the opportunities for reducing waste or improving quality tend to be in these areas:

metrics.md renamed to insights/metrics.md

Lines changed: 3 additions & 3 deletions

@@ -1,14 +1,14 @@
# Engineering metrics

-This is part of a broader [quality framework](README.md)
+This is part of a broader [quality framework](../README.md)

## Purpose & usage

These hard figures help us to measure the effect of improvement work over time, and should be tracked on a monthly basis.

Also, these metrics are intended to be considered as part of [engineering reviews](review.md). We recommend tracking these metrics on an Engineering Quality dashboard for teams to track their progress with improvements in line with the engineering reviews, for example:

-![Example Dashboard](images/quality-dashboard.png)
+![Example Dashboard](../images/quality-dashboard.png)

These metrics are obviously limited to engineering concerns, and are only one part of the picture. For example, we don't want to build the wrong thing, even if we build it in a great way! We recommend that these metrics form part of a broader set of health indicators, including data relating to user satisfaction.

@@ -19,7 +19,7 @@ These metrics provide a fundamental level of insight, and so must be tracked:
| Measure | Definition (each calculated over the last 28 days) |
|:---|:---|
| Deployment frequency | Number of deployments.
-| Quality checks | Presence or absence of frequent, consistent enforcement (with agreed tolerances) of the [engineering quality checks](quality-checks.md) - follow that link for further details.
+| Quality checks | Presence or absence of frequent, consistent enforcement (with agreed tolerances) of the [engineering quality checks](../quality-checks.md) - follow that link for further details.

## Recommended metrics:

review.md renamed to insights/review.md

Lines changed: 4 additions & 4 deletions

@@ -1,6 +1,6 @@
# Engineering quality review tool

-This is part of a broader [quality framework](README.md)
+This is part of a broader [quality framework](../README.md)

# Contents

@@ -158,7 +158,7 @@ You may wish to score each individual component or system separately for these a
* We enjoy working with them and they support fast, reliable and safe delivery.
* The tech and architecture make testing, local development and live operations easy.
* The architecture is clean.
-* Our system is built as a set of independent services/components where appropriate (see [Architect for Flow](patterns/architect-for-flow.md)).
+* Our system is built as a set of independent services/components where appropriate (see [Architect for Flow](../patterns/architect-for-flow.md)).

### 11. Easy and safe to release
* It is easy and straightforward to release a change to production.

@@ -238,8 +238,8 @@ Good facilitation can help teams get the most out of the review and it is recomm

* It's important to set the right tone. Some teams may understandably be wary of "being assessed", particularly because the process includes an outside facilitator. It's essential that they feel safe to make an honest appraisal. Emphasise that this tool is just a way of helping teams identify how to best drive continuous improvement — a bit like a "structured retrospective".
* Remember (and remind the team) that this framework is continually evolving and "open source". Encourage them to suggest ways it can be improved and raise pull requests. As well as being a useful way to drive improvement of the framework, this encourages the idea that it is not set in stone and decreed from on high, which can build trust and engagement.
-* Help the team understand and compare where they are just now with what genuinely excellent looks like. The notes under each section try to describe what good looks like, and the [principles](principles.md), patterns and practices go into more detail. Help them trace the path to excellence by starting with achievable changes and working over time to more significant changes if relevant.
-* Be intimately familiar with the sections of the review and the supporting [principles](principles.md), patterns and practices. Try to keep conversation focused around the topic for each section, mentioning which section will cover the point being raised when suggesting that discussion be deferred.
+* Help the team understand and compare where they are just now with what genuinely excellent looks like. The notes under each section try to describe what good looks like, and the [principles](../principles.md), patterns and practices go into more detail. Help them trace the path to excellence by starting with achievable changes and working over time to more significant changes if relevant.
+* Be intimately familiar with the sections of the review and the supporting [principles](../principles.md), patterns and practices. Try to keep conversation focused around the topic for each section, mentioning which section will cover the point being raised when suggesting that discussion be deferred.
* Consider running an "ice breaker" to get the team engaged and talking in the session. This can help to reduce confirmation bias with only the more vocal members of the team contributing to the session.
* Work through the review section by section. For each, briefly outline the scope and pick out a few key points from the list of what "good" looks like, then invite the group to describe how things work for them. Keep conversation and questioning open to start with and let the conversation be led by the team. Ask specific questions to fill in any gaps based on the points under each section. Identify any actions which come up and record these. Try to keep the conversation relevant and focused — there is a lot to go through.
* Once the team has discussed the points for the section, it's time to score. A good way to do this is using "planning poker" style blind voting, which can be easily done by holding up 1 to 5 fingers. Discuss any differences and use these as a lever to bring some of the quieter team members into the conversation. Agree on a single score for the team which is recorded. Refer to the definitions of each score at the bottom of the review sheet. While emphasising that the score is not the most important part of the process, advise the team when it feels like they are being too harsh or too soft on scoring. Accurate scoring will more clearly focus attention on the right areas.

practices/security.md

Lines changed: 1 addition & 1 deletion

@@ -68,7 +68,7 @@ The remainder of this page gives more detailed and specific recommendations to b
- Be careful not to **leak information**, e.g. error messages, stack traces, headers
- **Don't trust** yourself or others! <a name='secret-scanning'></a>
- Code must be automatically scanned for secrets or other sensitive data:
-- To catch any issues early and to minimise potential exposure, scan code on developer machines *before* code is committed to the code repository. We recommend using [awslabs git-secrets](https://github.com/awslabs/git-secrets). To set this up on a Mac workstation or as part of your Jenkins pipeline, follow the examples and READMEs in [nhsd-git-secrets](../nhsd-git-secrets). Windows testing is in progress and instructions/code will be added in due course
+- To catch any issues early and to minimise potential exposure, scan code on developer machines *before* code is committed to the code repository. We recommend using [awslabs git-secrets](https://github.com/awslabs/git-secrets). To set this up on a Mac workstation or as part of your Jenkins pipeline, follow the examples and READMEs in [nhsd-git-secrets](../tools/nhsd-git-secrets). Windows testing is in progress and instructions/code will be added in due course
- As a backstop, *also* enable server-side scanning within the code repository. Recommended solution options:
- TO DO: more details... for example in [GitHub](https://docs.github.com/en/code-security/secret-security/about-secret-scanning)
- Be wary of any 3rd party JavaScript included on the page, e.g. for A/B testing, analytics
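
For teams following the git-secrets recommendation above, the commands below are a minimal sketch of a typical local setup, not taken from nhsd-git-secrets: they assume git-secrets is already installed (e.g. via Homebrew on a Mac), and the repository path is illustrative.

```sh
# Minimal sketch: enable git-secrets pre-commit scanning in one repository.
# Assumes git-secrets is already installed, e.g. `brew install git-secrets`.
cd path/to/your-repo         # illustrative path, not from nhsd-git-secrets
git secrets --install        # installs the pre-commit, commit-msg and prepare-commit-msg hooks
git secrets --register-aws   # registers the built-in AWS credential patterns
git secrets --scan -r .      # one-off recursive scan of the working tree
git secrets --scan-history   # optionally, scan the full commit history as well
```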
