
Conversation

@jakelorocco (Contributor) commented Oct 15, 2025

Add pytest support for the docs examples so that you can run most of them in a single pytest run. I had to modify a few of the examples to make them runnable from any directory and without user input.
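The PR adds a conftest.py under the docs tree that collects example scripts as pytest items (the review diff further down shows its ExampleFile.from_parent call). Below is a minimal, hypothetical sketch of that collection pattern; apart from ExampleFile.from_parent, the class names, the subprocess-based runner, and the path filter are assumptions, not the PR's exact code:

```python
# Hypothetical sketch of a conftest.py that runs example scripts as tests.
import subprocess
import sys

import pytest


class ExampleItem(pytest.Item):
    """Run one example script as a test."""

    def runtest(self):
        # Execute the example in a fresh interpreter; a non-zero exit fails the test.
        subprocess.run([sys.executable, str(self.path)], check=True)


class ExampleFile(pytest.File):
    """Collect a single runnable item per example file."""

    def collect(self):
        yield ExampleItem.from_parent(self, name=self.path.name)


def pytest_collect_file(parent, file_path):
    # Assumed filter: treat .py files under an examples directory as examples.
    if file_path.suffix == ".py" and "examples" in file_path.parts:
        return ExampleFile.from_parent(parent, path=file_path)
```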

Also updated our codespell pre-commit hook so that we can use the #ignore:codespell syntax to skip spell checking on individual lines.
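For illustration, a line-level suppression might look like the snippet below; the misspelled token is made up for the example and only the #ignore:codespell marker itself comes from the PR description:

```python
# "teh" would normally be flagged by codespell; the trailing marker tells the
# updated pre-commit hook to leave this line alone.
EXAMPLE_SENTENCE = "teh quick brown fox"  #ignore:codespell
```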

Found and fixed bugs as a part of this:

  • The Hugging Face version of Granite Guardian was passing a duplicate model option, which caused errors
  • Added __copy__ and __deepcopy__ dunder methods to ModelOutputThunk to prevent errors when generating with sampling strategies, since sampling strategies copy actions (a sketch follows after this list)
  • Added a Raises clause to the docstring of generative slots, since validation can occasionally fail
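
As a rough illustration of the ModelOutputThunk change, the dunder methods might look something like the sketch below. The field names, and the assumption that a non-copyable in-flight generation handle is what made copying fail, are mine rather than taken from the PR:

```python
import copy


class ModelOutputThunk:
    """Simplified stand-in for the real class, which has more fields."""

    def __init__(self, value=None):
        self.value = value
        self._generate_task = None  # e.g. an in-flight asyncio task (not copyable)

    def __copy__(self):
        # Shallow copy the visible value but do not carry over the live handle.
        return type(self)(self.value)

    def __deepcopy__(self, memo):
        # Deep copy the copyable fields; anything tied to an in-flight generation
        # is reset so that sampling strategies can safely copy actions.
        new = type(self)(copy.deepcopy(self.value, memo))
        memo[id(self)] = new
        return new
```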

The output looks like a regular pytest run and identifies the example file:

================================================================ test session starts =================================================================
platform darwin -- Python 3.12.0, pytest-8.4.2, pluggy-1.6.0
rootdir: /Users/jake/code/mellea
configfile: pyproject.toml
plugins: asyncio-1.2.0, anyio-4.11.0, Faker-37.6.0
asyncio: mode=Mode.AUTO, debug=False, asyncio_default_fixture_loop_scope=None, asyncio_default_test_loop_scope=function
collected 45 items

docs/examples/agents/react.py .                                                                                                                [  2%]

Some of the examples have to be skipped since they can't easily be run in a pytest-compatible way. When pytest runs on the docs directory, a reminder is printed to run those files manually (one possible implementation is sketched after the output below):

================================================================== Skipped Examples ==================================================================
Examples with the following names were skipped because they cannot be easily run in the pytest framework; please run them manually:
101_example.py
mcp_example.py
client.py
__init__.py
simple_rag_with_filter.py
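
One way such a reminder can be produced from a conftest.py is with pytest's terminal-summary hook. This is a sketch of the general idea, not necessarily how the PR implements it; the file names are copied from the output above:

```python
# Sketch only: emit a "Skipped Examples" section at the end of the run.
SKIPPED_EXAMPLES = [
    "101_example.py",
    "mcp_example.py",
    "client.py",
    "__init__.py",
    "simple_rag_with_filter.py",
]


def pytest_terminal_summary(terminalreporter, exitstatus, config):
    terminalreporter.section("Skipped Examples")
    terminalreporter.write_line(
        "Examples with the following names were skipped because they cannot be "
        "easily run in the pytest framework; please run them manually:"
    )
    for name in SKIPPED_EXAMPLES:
        terminalreporter.write_line(name)
```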

@mergify bot commented Oct 15, 2025

Merge Protections

Your pull request matches the following merge protections and will not be merged until they are valid.

🟢 Enforce conventional commit

Wonderful, this rule succeeded.

Make sure that we follow https://www.conventionalcommits.org/en/v1.0.0/

  • title ~= ^(fix|feat|docs|style|refactor|perf|test|build|ci|chore|revert|release)(?:\(.+\))?:

@jakelorocco marked this pull request as ready for review October 15, 2025 18:52

return ExampleFile.from_parent(parent, path=file_path)

# TODO: Support running jupyter notebooks:
Contributor commented:
FYI: you could do this with nbmake https://github.com/treebeardtech/nbmake

@avinash2692 (Contributor) left a comment:

LGTM apart from a couple of things:

  • might make sense to add nbmake in this PR before merging so that we have notebooks also covered in the tests

Contributor commented:
Do you want all conftest.py files to be in the same place? Not sure how the hierarchy works with pytest.

@jakelorocco (Author) replied:

Maybe eventually, but I don't think so right now:

  • I don't think we always want the examples to run by default; we would have to write new code to disable them on GitHub runners, etc.
  • The tests are different enough that they should almost always be run separately

@jakelorocco (Author) added:

Also, it requires changing how the pytest command is invoked. I think that change is fine, but we should think about when we want to run the examples, etc.

@jakelorocco (Author) replied, quoting the review:

LGTM apart from a couple of things:

  • might make sense to add nbmake in this PR before merging so that we have notebooks also covered in the tests

I added nbmake to the comment documenting how to test the notebooks. I think we should add notebook testing eventually, but this PR is fine without it. Running the notebooks with nbmake would require rewriting them to avoid background processes.

@avinash2692 (Contributor) left a comment:

LGTM. I'll look into adding nbmake in a separate PR.

@avinash2692 merged commit e30afe6 into main Oct 23, 2025
4 checks passed
@jakelorocco deleted the jal/run-examples-as-tests branch October 24, 2025 21:11
tuliocoppola pushed a commit to tuliocoppola/mellea that referenced this pull request Nov 5, 2025
…ng#198)

* feat: add conftest to run examples as tests

* fix: fix errors with granite guardian req generation

* fix: copy behavior with mots, add tests, add raises to genslot

* fix: update codespell precommit to support ignore

* fix: add note about nbmake
