
Commit 303441b

Author: Sylvain MARIE (committed)
Message: Added an "examples" section in the documentation, pointing to pytest-patterns

1 parent 8f8fb42 commit 303441b

File tree

7 files changed: +9 −1 lines changed


docs/examples.md

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+# Examples
+
+Our examples are hosted in a separate repository so that they can be updated independently.
+
+* [data science benchmark](https://smarie.github.io/pytest-patterns/examples/data_science_benchmark/) demonstrates how `pytest` can be used as a benchmarking engine thanks to `pytest-cases` and `pytest-harvest`, to compare the performance of several regression algorithms on several datasets and produce various reports (plots, csv table...).
4 image files added (29.5 KB, 123 KB, 151 KB, 165 KB); previews not shown.

docs/index.md

Lines changed: 3 additions & 1 deletion
@@ -17,7 +17,9 @@ Did you ever think that most of your test functions were actually *the same test
 
 In addition, `pytest-cases` improves `pytest`'s fixture mechanism to support "fixture unions". This is a **major change** in the internal `pytest` engine, unlocking many possibilities such as using fixture references as parameter values in a test function. See [here](pytest_goodies.md#fixture_union).
 
-`pytest-cases` is fully compliant with [pytest-harvest](https://smarie.github.io/python-pytest-harvest/) and [pytest-steps](https://smarie.github.io/python-pytest-steps/) so you can create test suites with several steps, send each case on the full suite, and monitor the execution times and created artifacts. See also [this example](https://smarie.github.io/pytest-patterns/) (a bit old, needs to be refreshed) showing how to combine the various plugins to create data science benchmarks.
+`pytest-cases` is fully compliant with [pytest-harvest](https://smarie.github.io/python-pytest-harvest/), so you can easily monitor execution times and created artifacts. This makes it very easy to create a complete data science benchmark, for example comparing various models on various datasets, as illustrated below (from the [examples](examples.md) section):
+
+![benchmark_plots_example](imgs/0_bench_plots_example4.png)
 
 ## Installing
 

docs/mkdocs.yml

Lines changed: 1 addition & 0 deletions
@@ -5,6 +5,7 @@ docs_dir: .
 site_dir: ../site
 nav:
 - Home: index.md
+- Examples: examples.md
 - pytest goodies: pytest_goodies.md
 - fixture unions theory: unions_theory.md
 - API reference: api_reference.md

0 commit comments
