Commit e6a93a6

Merge pull request #20 from i-walk-away/content
guidelines for contributors
2 parents 4350a93 + f7bfb19 commit e6a93a6

File tree

6 files changed, +318 -246 lines changed


CONTRIBUTING.md

Lines changed: 68 additions & 0 deletions
@@ -0,0 +1,68 @@
+# Contributing
+
+Table of contents!
+
+## Adding new lessons to pydantic quest
+
+If you want to add a new lesson to pydantic quest, do this:
+
+1. Create a new folder in [lessons/](lessons/)
+2. Populate the folder with the 4 necessary files. You can copy them from
+   [lessons/lesson-template/](lessons/lesson-template/):
+   * `lesson.yaml`
+   * `theory.md`
+   * `starter.py`
+   * `cases.yaml`
+3. Head to [lessons/index.yaml](lessons/index.yaml) and add the following:
+
+   ```yaml
+   - slug: dash-separated-lesson-name
+     order: <integer>
+   ```
+
+The `order` field controls the order in which lessons appear in pydantic quest.
+If you're not sure, just take the highest `order` that already exists in the index
+and add 1 to it. I will reorder everything myself if needed :)
+
+## Explanation of the 4 necessary files
+
+### 1. `lesson.yaml`
+
+This file currently only defines the name of the lesson. It has the following structure:
+
+```yaml
+title: "Your lesson title goes here"
+```
+
+Change the `title` field to the title of your lesson.
+
+### 2. `theory.md`
+
+The contents of this file are the lesson *body*: what the user sees on the
+left side of the screen when your lesson is selected.
+Populate it with theoretical information and an assignment.
+Refer to [lessons/lesson-template/theory.md](lessons/lesson-template/theory.md) to see the available
+custom formatting (on top of regular Markdown formatting)
+and general recommendations on designing a good lesson body.
+
+### 3. `starter.py`
+
+The contents of this Python script are what the code editor displays to the user
+by default in your lesson.
+The user will then build upon your starter script to complete the assignment.
+Refer to [lessons/lesson-template/starter.py](lessons/lesson-template/starter.py) for better insight.
+
+### 4. `cases.yaml`
+
+Test cases for your lesson. Refer to [`lessons/lesson-template/cases.yaml`](lessons/lesson-template/cases.yaml);
+you can find a *lot* of information there about how it works and how exactly to design
+your own test cases. Please let me know if anything is still unclear.
+
+## Contributor checklist
+
+Before opening a PR:
+
+1. verify `lesson.yaml`, `theory.md`, `starter.py`, and `cases.yaml` exist in your new lesson folder
+2. verify the lesson slug is added to [lessons/index.yaml](lessons/index.yaml)
+3. confirm each visible case has a clear `label` and a useful `reason`
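The first item of the checklist above can be sketched as a quick local check. This is only an illustration, not part of the repo; `missing_files` and its signature are hypothetical names:

```python
from pathlib import Path

# The 4 files every lesson folder must contain (per the contributor checklist).
REQUIRED = {"lesson.yaml", "theory.md", "starter.py", "cases.yaml"}

def missing_files(lesson_dir: str) -> set[str]:
    """Return which of the 4 required files are absent from lesson_dir."""
    folder = Path(lesson_dir)
    present = {p.name for p in folder.iterdir()} if folder.is_dir() else set()
    return REQUIRED - present
```

Running it against a new lesson folder before opening a PR shows at a glance which template files were forgotten.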

lessons/intro-model/theory.md

Lines changed: 0 additions & 2 deletions
@@ -1,5 +1,3 @@
-# Intro: BaseModel and validation
-
 Create a `User` model with fields:
 
 - name: `str`

lessons/lesson-template/cases.yaml

Lines changed: 216 additions & 18 deletions
@@ -1,36 +1,201 @@
 cases:
-# each case has:
-# - unique "name" (internal id)
-# - readable "label" (shown to learner for visible cases)
-# - hidden=true/false
-# - script that sets boolean "ok" and optional "reason"
+# This file defines the automated checks that run against the learner's submitted code.
+#
+# Read this file as "a list of small Python programs". Each item under
+# "cases" is executed separately, after the learner's solution has already been
+# loaded into memory.
+#
+# Execution model:
+# - the learner writes code in the code editor
+# - the platform executes that code first
+# - after that, each case below runs as its own small Python script
+# - those scripts can access objects created by the learner in their program
+#   (for example, if the user has created a UserProfile class, your scripts can access it)
+#
+# That is why cases below can use "UserProfile" directly even though this YAML file
+# never defines it. "UserProfile" is expected to come from the learner's
+# submission. In a real lesson, replace "UserProfile" with whatever object,
+# function, validator, or model the learner is supposed to build.
+#
+# In other words:
+# This file describes how the platform will verify the solution.
+# Lesson-defined names come from the learner's code.
+# Your overall goal here is to write small tests that verify the correctness of
+# the user's solution. If your lesson is about making a UserProfile class with
+# the field "age: int" and a strict requirement:
+#
+#   "String age should raise ValidationError, even though Pydantic usually
+#   converts string numbers to integers when they're assigned
+#   to integer fields",
+#
+# then you should create a test case that tries to instantiate UserProfile with a string age.
+# If the user's code raises no ValidationError - and therefore none is
+# caught in your script's `try / except` block - this case should fail, because
+# you just verified that the submitted UserProfile definition does not raise
+# a validation error upon receiving a string-typed age. You get the idea.
+#
+# Every case item has four fields:
+#
+# name:
+#   Stable internal identifier of a test case.
+#   Requirements:
+#   - must be unique inside this file
+#   - should use snake_case
+#   - should describe the rule being tested (e.g. "string_age")
+#
+# label:
+#   Short human-readable title for the case.
+#   Unless your test case is hidden (more info below), the user will see this label
+#   upon unfolding the "test cases" list at the end of the lesson body.
+#
+#   Good labels describe what is tested:
+#   - "string age rejected"
+#   - "username stripped"
+#
+#   Bad labels:
+#   - "string age"
+#   - "strip username"
+#   - "test field validator"
+#
+# hidden (bool):
+#   Controls whether the case is visible in the UI before the learner runs the
+#   solution.
+#   - false: visible to the learner in the public "test cases" list
+#   - true: still executed, but hidden from the public list
+#
+# script:
+#   Python code for this specific test case.
+#   It runs in the same execution context as the learner's code, so it can:
+#   - instantiate learner-defined classes
+#   - call learner-defined functions
+#   - inspect returned values
+#   - intentionally trigger validation or runtime errors
+#
+#   The platform expects each script to communicate the result through these
+#   variables:
+#   - ok:
+#       required boolean pass/fail flag
+#   - reason:
+#       optional string explaining why the case failed
+#
+# If your script raises an exception and you do NOT catch it yourself, the
+# platform catches it at the case-runner level and automatically converts it
+# into a failed case:
+# - ok becomes False
+# - reason becomes "<ExceptionClass>: <exception text>"
+#
+# For example, if this line raises a Pydantic ValidationError:
+#   profile = UserProfile(username="alice", age=24, tags=["some_string"])
+# then the case will fail automatically with a reason similar to:
+#   ValidationError: 1 validation error for UserProfile
+#
+# This means:
+# - uncaught exceptions do NOT crash the whole runner
+# - they fail only the current case
+# - if you want an exception to count as a successful validation check, wrap
+#   the code in try/except and set ok/reason yourself, like this:
+#
+#   try:
+#       <code that should raise ValidationError; if it doesn't, the rest of the try block executes>
+#       ok = False
+#       reason = "invalid input must raise validation error"
+#   except ValidationError:
+#       ok = True
+#
+# Practical rule:
+# - if the case passes, setting only "ok = True" is enough
+# - if the case fails, set "ok = False" and provide a short "reason"
+#   (e.g. "passing a `string` value to the `age` field of UserProfile should
+#   raise ValidationError")
+#
+# Write "reason" as direct feedback to the learner. It should
+# explain the broken behavior, not narrate the script:
+# - good: "blank username must raise validation error"
+# - good: "age boundary value 13 should be accepted"
+# - bad: "the assertion failed"
+# - bad: "string age test returned false"
+#
+# Recommended lesson design strategy:
+# - use visible cases for the actual contract the learner should implement:
+#   valid input, invalid input, boundary values, normalization rules, and
+#   anything else you want them to understand explicitly from the lesson
+#
+# - use hidden cases mostly for anti-cheat and anti-hardcode coverage
+# - do not hide test cases that are important for teaching the subject
+#
+# Some ideas for your test cases (not all of them apply to every lesson, of course):
+# - invalid input
+# - visible edge and boundary cases
+# - off-by-one errors
+# - missing fields
+# - wrong types
+# - multiple invalid combinations
+# - one hidden anti-hardcode case, if the lesson needs it
+#
+# Keep each case focused. Prefer five small cases over one giant script that
+# tries to test everything at once. Small cases are easier to read, debug, and
+# maintain.
+#
+# Visible happy path.
+#
+# Use at least one public case that demonstrates what "correct" looks like.
+# The learner should be able to run their code and quickly confirm the core
+# behavior works before dealing with edge cases.
 
-# visible "happy path": keep at least one simple success case for fast feedback
 - name: valid_profile
   label: valid profile
   hidden: false
   script: |
-    profile = UserProfile(username="alice", age=24, tags=["python"])
-    ok = profile.username == "alice" and profile.age == 24 and profile.tags == ["python"]
+    # "UserProfile" is expected to be created by the learner in starter.py.
+    # This case does not define that class. It assumes the learner's code has
+    # already defined it and now verifies its happy-path behavior.
+    #
+    # If the learner forgot to define one of the required fields, or defined a
+    # different schema that cannot accept these values, this constructor call
+    # may raise an exception. If that happens and you do not catch it here,
+    # the platform will automatically fail this case and turn the exception
+    # into the case's fail reason, so you don't have to manually test whether
+    # the fields actually exist:
+    profile = UserProfile(username="alice", age=24, tags=["some_string"])
+
+    # Set ok to a boolean. The platform reads this variable to determine
+    # whether the case passed.
+    ok = profile.username == "alice" and profile.age == 24 and profile.tags == ["some_string"]
+
+    # "reason" should explain the actual contract that was broken.
+    # The learner sees this message if ok is False, so it should be specific
+    # enough to tell them what to fix.
     if not ok:
-      reason = "model should accept valid data and preserve field values"
+      reason = "model should define all required fields with the correct types and accept valid input"
 
-# visible "negative path": learner immediately sees why solution is incomplete
+# Negative path.
+#
+# Failure cases are useful because they explain the lesson's rules
+# immediately. Here we say that blank usernames must not be accepted.
 - name: reject_blank_username
   label: reject blank username
   hidden: false
   script: |
+    # This case expects invalid input to be rejected.
+    #
+    # Use try/except for rules where "raising an exception" is the expected
+    # success condition.
     try:
       UserProfile(username=" ", age=24, tags=[])
       ok = False
       reason = "blank username must raise validation error"
     except Exception:
+      # An exception means the invalid input was rejected, so the case
+      # passes. It's better if you check for an exact exception type
+      # or inspect the error message.
       ok = True
 
-# hidden anti-cheat/edge case: not shown in sample list
+# Visible validation rule.
+#
+# Use a case like this if "age must be at least 13" is part of the lesson contract.
 - name: reject_too_young
   label: reject age under 13
-  hidden: true
+  hidden: false
   script: |
     try:
       UserProfile(username="alice", age=12, tags=[])
@@ -39,22 +204,55 @@ cases:
     except Exception:
       ok = True
 
-# hidden boundary check: catches off-by-one bugs
+# Boundary check.
+#
+# Boundary cases are valuable because they catch off-by-one mistakes.
+# If the rule is "age must be at least 13", you should test 13 too.
 - name: accept_boundary_age
   label: accept boundary age
-  hidden: true
+  hidden: false
   script: |
+    # Boundary checks catch common "> vs >=" and "< vs <=" mistakes.
     profile = UserProfile(username="alice", age=13, tags=[])
     ok = profile.age == 13
     if not ok:
       reason = "age boundary value 13 should be accepted"
 
-# hidden normalization case: useful when lesson requires sanitizing strings
-- name: trim_username
-  label: trim username
-  hidden: true
+# Normalization case.
+#
+# Use cases like this when the learner must normalize or clean input data.
+# This prevents solutions that only validate but do not transform values.
+- name: username_trimmed
+  label: username trimmed
+  hidden: false
   script: |
+    # This case shows how to test normalization instead of pure validation.
+    # If the lesson expects cleaned output, assert on the transformed value.
+
     profile = UserProfile(username=" alice ", age=24, tags=[])
+
+    # If the model is defined correctly, the username will be trimmed when the
+    # profile is instantiated, so profile.username should hold the processed
+    # (trimmed, in this case) value.
+
     ok = profile.username == "alice"
     if not ok:
       reason = "username should be normalized with strip()"
+
+# Hidden anti-hardcode example.
+#
+# This is a good example of when a hidden case is justified: it checks the
+# same contract as the visible happy path, but with different values. That
+# makes it harder to pass the lesson with a hardcoded object or with logic
+# that only works for the one specific input sample shown publicly.
+- name: valid_profile_different_values
+  label: valid profile with different values
+  hidden: true
+  script: |
+    profile = UserProfile(username="bob", age=31, tags=["fastapi", "backend"])
+    ok = (
+      profile.username == "bob"
+      and profile.age == 31
+      and profile.tags == ["fastapi", "backend"]
+    )
+    if not ok:
+      reason = "model should work for valid input values beyond the public example"
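The ok/reason contract and the try/except pattern documented in cases.yaml above can be exercised outside the platform. The sketch below uses a plain Python class as a stand-in for the learner's model so it runs without dependencies; the class body and its rules are illustrative, not the platform's API (a real submission would be a pydantic `BaseModel` raising `ValidationError` instead of `ValueError`):

```python
# Stand-in for a learner-defined model (illustrative only).
class UserProfile:
    def __init__(self, username: str, age: int, tags: list):
        username = username.strip()  # normalization rule: trim whitespace
        if not username:
            raise ValueError("username must not be blank")
        if age < 13:
            raise ValueError("age must be at least 13")
        self.username, self.age, self.tags = username, age, tags

# The pattern from reject_blank_username: raising IS the success condition.
try:
    UserProfile(username=" ", age=24, tags=[])
    ok = False
    reason = "blank username must raise validation error"
except Exception:
    ok = True

# The pattern from accept_boundary_age and username_trimmed:
# boundary value 13 must pass, and the username must come back trimmed.
profile = UserProfile(username=" alice ", age=13, tags=[])
ok = ok and profile.age == 13 and profile.username == "alice"
```

After the block runs, `ok` is True only if both the rejection rule and the boundary/normalization rules behave as the cases describe, mirroring how the case runner reads these variables.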
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-title: "Template: How to write a lesson"
+title: "Your lesson title goes here"
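Assuming the one-line `title: "..."` structure shown in the diff above, the lesson title can be read without a YAML dependency. `read_title` is a hypothetical helper for illustration, not part of the project:

```python
from pathlib import Path

def read_title(lesson_yaml: str) -> str:
    """Naively extract the title field from a one-line lesson.yaml."""
    for line in Path(lesson_yaml).read_text(encoding="utf-8").splitlines():
        if line.startswith("title:"):
            # strip surrounding whitespace and the quoting shown in the template
            return line.split(":", 1)[1].strip().strip('"')
    raise ValueError("lesson.yaml has no title field")
```

A real implementation would use a YAML parser; this sketch only works for the flat single-field structure the template currently has.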
