# ⚙️ Ranking Workflow: PyTorch Ambassador Program

This document explains how the **automated ranking system** works for evaluating nominations to the PyTorch Ambassador Program. It is intended for **maintainers, contributors, and workflow administrators** who want to understand or improve the scoring system.

---

## 📌 Purpose

The ranking workflow enables reviewers to **evaluate nominees using a 1–5 scale** via GitHub issue comments. The system:
- Acknowledges each score when submitted
- Collects valid, unique reviewer scores
- Waits for a review window (2 hours)
- Calculates the average
- Applies a final decision label (`approved` or `rejected`)
- Posts a summary comment for transparency

---

## 🧮 How It Works

### ✅ Step 1: Reviewers Comment Their Scores

Reviewers comment directly on a nomination issue using the format:

```
Score: X
```

`X` is an integer from 1 to 5, and the match is **case-insensitive** (so `score: 3` is also accepted).

Only **one score per reviewer** is counted. If a reviewer comments more than once, only their **first valid score** is used.
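
For illustration, here is a minimal Python sketch of this parsing rule (the actual logic lives in the workflow files listed under Maintenance; the function name and the assumption that a score comment contains nothing but the `Score: X` line are ours):

```python
import re

# Accept "Score: X" in any casing, where X is an integer from 1 to 5.
SCORE_RE = re.compile(r"^\s*score:\s*([1-5])\s*$", re.IGNORECASE)

def collect_scores(comments):
    """comments: (reviewer_login, comment_body) pairs in posting order.
    Returns {reviewer: score}, keeping only each reviewer's first valid score."""
    scores = {}
    for reviewer, body in comments:
        match = SCORE_RE.match(body)
        if match and reviewer not in scores:  # later comments are ignored
            scores[reviewer] = int(match.group(1))
    return scores
```

For example, `collect_scores([("alice", "score: 4"), ("alice", "Score: 1"), ("bob", "Score: 3")])` returns `{"alice": 4, "bob": 3}`; Alice's second comment is ignored.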

---

### 🧠 Step 2: Acknowledgement Comment

As soon as a valid score is submitted, the system responds with:

```
📝 Score received from @reviewer: X
⏳ Final decision will be calculated and posted after all reviewers have submitted or in approximately 2 hours.
```

This keeps the process transparent and signals that the final decision will not be made until the review window closes.
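
The workflow posts this reply from within GitHub Actions; as a rough standalone equivalent, the same comment could be created through the GitHub REST API's "create an issue comment" endpoint. This sketch assumes the `requests` library and a token with issue-write access in the `GITHUB_TOKEN` environment variable:

```python
import os
import requests

def acknowledge_score(owner, repo, issue_number, reviewer, score):
    """Post the acknowledgement comment shown above on the nomination issue."""
    body = (
        f"📝 Score received from @{reviewer}: {score}\n"
        "⏳ Final decision will be calculated and posted after all reviewers "
        "have submitted or in approximately 2 hours."
    )
    resp = requests.post(
        f"https://api.github.com/repos/{owner}/{repo}/issues/{issue_number}/comments",
        headers={
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
            "Accept": "application/vnd.github+json",
        },
        json={"body": body},
        timeout=30,
    )
    resp.raise_for_status()
```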

---

### ⏲️ Step 3: Scheduled Finalization

A scheduled GitHub Action runs **every 2 hours** (and can also be triggered manually). It:

1. Collects all valid `Score: X` comments
2. Filters for unique reviewers
3. Calculates the **average score**
4. Applies the final label based on the threshold below (see the sketch at the end of this step)

| Average Score | Final Status |
|---------------|--------------|
| **≥ 3.0** | ✅ Approved |
| **< 3.0** | ❌ Rejected |

The workflow:
- Removes conflicting labels (if any)
- Adds the correct final decision label
- Posts a summary comment
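
A minimal sketch of this decision logic, assuming the `{reviewer: score}` mapping from the Step 1 sketch (the threshold comes from the table above; the function and variable names are ours):

```python
APPROVAL_THRESHOLD = 3.0  # from the table above

def decide(scores):
    """scores: {reviewer_login: score} with one entry per reviewer."""
    average = sum(scores.values()) / len(scores)
    label = "approved" if average >= APPROVAL_THRESHOLD else "rejected"
    conflicting = "rejected" if label == "approved" else "approved"
    # The workflow removes `conflicting` (if present) and adds `label` on the issue.
    return average, label, conflicting
```

For example, `decide({"alice": 4, "bob": 4, "carol": 3})` yields an average of about 3.67, so the issue is labeled `approved` and any `rejected` label is removed.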

---

### 💬 Step 4: Summary Comment Example

```
🧮 Final average score from 3 reviewers: **3.67**
👥 Reviewed by: @alice, @bob, @carol
📌 Final decision: **APPROVED**
```
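
The summary text could be assembled roughly like this (a sketch; the helper name is ours, and the average is formatted to two decimal places as in the example above):

```python
def summary_comment(scores, average, decision):
    """Build the summary comment from the collected scores and final decision."""
    reviewers = ", ".join(f"@{login}" for login in sorted(scores))
    return (
        f"🧮 Final average score from {len(scores)} reviewers: **{average:.2f}**\n"
        f"👥 Reviewed by: {reviewers}\n"
        f"📌 Final decision: **{decision.upper()}**"
    )
```

With scores of 4, 4, and 3 from @alice, @bob, and @carol, this reproduces the example above (average 11/3 ≈ 3.67, decision `APPROVED`).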

---

## 🧪 How to Test the Workflow

1. Submit a test nomination
2. Leave `Score: X` comments from different GitHub users
3. Wait for the scheduled action to run (or trigger manually)
4. Confirm:
   - The correct label (`approved`/`rejected`) is applied
   - The summary comment is posted
   - Conflicting labels are removed

---

## 🛠 Maintenance & Future Improvements

- Core logic lives in:
  - `.github/workflows/acknowledge-score-workflow.yml`
  - `.github/workflows/finalize-ranking-scheduled.yml`
- Ideas for improvement:
  - Restrict to nomination issues only
  - Allow score updates (override previous ones; see the sketch below)
  - Set scoring deadlines per nomination
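
For instance, the "allow score updates" idea could be as small as flipping the Step 1 rule from "first valid score wins" to "last valid score wins" (a hypothetical sketch, not current behaviour):

```python
import re

SCORE_RE = re.compile(r"^\s*score:\s*([1-5])\s*$", re.IGNORECASE)

def collect_scores_latest(comments):
    """Variant of the Step 1 sketch: a later valid comment overwrites an
    earlier one, so reviewers can revise their score before finalization."""
    scores = {}
    for reviewer, body in comments:
        match = SCORE_RE.match(body)
        if match:
            scores[reviewer] = int(match.group(1))  # last score wins
    return scores
```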

---

## 🙋 Questions?

If you need help updating or maintaining the workflow, contact a repo maintainer or open an issue in this repository.