Following #177, we should review test coverage of AptaTrans to make sure everything works correctly.
Below is the test coverage for the AptaTrans-related modules.
```
=============================== tests coverage ================================
_______________ coverage: platform win32, python 3.12.9-final-0 _______________

Name                                                           Stmts  Miss  Branch  BrPart  Cover
-------------------------------------------------------------------------------------------------
pyaptamer\aptatrans\__init__.py                                    6     0       0       0   100%
pyaptamer\aptatrans\_model.py                                     80    10      14       2    85%
pyaptamer\aptatrans\_model_lightning.py                           34     0       0       0   100%
pyaptamer\aptatrans\_pipeline.py                                   42     0       6       1    98%
pyaptamer\aptatrans\layers\__init__.py                              3     0       0       0   100%
pyaptamer\aptatrans\layers\_convolutional.py                       24     0       4       0   100%
pyaptamer\aptatrans\layers\_encoder.py                             38     0       2       1    98%
pyaptamer\aptatrans\layers\_interaction_map.py                     14     0       0       0   100%
pyaptamer\aptatrans\layers\tests\__init__.py                        1     0       0       0   100%
pyaptamer\aptatrans\layers\tests\test_all_aptatrans_layers.py      78     0       2       0   100%
pyaptamer\aptatrans\tests\__init__.py                               1     0       0       0   100%
pyaptamer\aptatrans\tests\test_aptatrans.py                       140     4       0       0    97%
pyaptamer\aptatrans\tests\test_aptatrans_lightning.py              66     0       0       0   100%
```
Coverage looks good overall. Only _model.py is relatively low; however, the untested branch is the one related to loading pretrained weights, which is not crucial.
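If we did want to cover that branch, a test could save a small state dict and reload it through the weight-loading path. A minimal sketch, assuming a toy model; `TinyModel`, `load_pretrained`, and `weights_path` are illustrative names, not the actual pyaptamer API:

```python
# Hypothetical sketch of exercising a pretrained-weight-loading branch,
# similar in spirit to the uncovered one in _model.py. All names here
# are illustrative, not the real pyaptamer interfaces.
import os
import tempfile

import torch
import torch.nn as nn


class TinyModel(nn.Module):
    def __init__(self, load_pretrained=False, weights_path=None):
        super().__init__()
        self.linear = nn.Linear(4, 2)
        if load_pretrained and weights_path is not None:
            # the branch that a test would otherwise leave uncovered
            self.load_state_dict(torch.load(weights_path, weights_only=True))


# save reference weights, then reload them through the branch under test
ref = TinyModel()
path = os.path.join(tempfile.mkdtemp(), "weights.pt")
torch.save(ref.state_dict(), path)
loaded = TinyModel(load_pretrained=True, weights_path=path)
weights_match = torch.equal(ref.linear.weight, loaded.linear.weight)
```

Asserting `weights_match` in a test would then pin down the loading path without needing real pretrained checkpoints.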
As I mentioned in #178, the shape-mismatch bug was caused by introducing the attention mask in the encoders. This wasn't something we could have caught via tests, since the original behavior was not so much a bug as an unconventional way of doing things (i.e., not using an attention mask).
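For context on why the mask interacts with shapes at all, here is a minimal sketch (not the pyaptamer encoder code) of how a key padding mask must line up with the sequence dimension in `torch.nn.MultiheadAttention`; a mask built against the wrong dimension is exactly the kind of thing that surfaces as a shape mismatch:

```python
# Illustrative only: shape contract between inputs and a key padding
# mask in standard PyTorch attention, not the actual AptaTrans encoder.
import torch
import torch.nn as nn

batch, seq_len, d_model = 2, 5, 8
attn = nn.MultiheadAttention(d_model, num_heads=2, batch_first=True)
x = torch.randn(batch, seq_len, d_model)

# mask must be (batch, seq_len); True marks padded positions to ignore
key_padding_mask = torch.zeros(batch, seq_len, dtype=torch.bool)
key_padding_mask[:, -2:] = True  # treat the last two tokens as padding

out, _ = attn(x, x, x, key_padding_mask=key_padding_mask)
# output keeps the input shape regardless of masking
```

Any disagreement between the mask's second dimension and the actual sequence length raises an error inside the attention call, which is consistent with the failure mode described in #178.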
FYI @fkiraly