[ENH] AptaTrans test coverage #180

@NennoMP

Description
Following #177, we should review test coverage of AptaTrans to make sure everything works correctly.

Below is the test coverage for the AptaTrans-related modules.

=============================== tests coverage ================================
_______________ coverage: platform win32, python 3.12.9-final-0 _______________

Name                                                            Stmts   Miss Branch BrPart  Cover
-------------------------------------------------------------------------------------------------
pyaptamer\aptatrans\__init__.py                                     6      0      0      0   100%
pyaptamer\aptatrans\_model.py                                      80     10     14      2    85%
pyaptamer\aptatrans\_model_lightning.py                            34      0      0      0   100%
pyaptamer\aptatrans\_pipeline.py                                   42      0      6      1    98%
pyaptamer\aptatrans\layers\__init__.py                              3      0      0      0   100%
pyaptamer\aptatrans\layers\_convolutional.py                       24      0      4      0   100%
pyaptamer\aptatrans\layers\_encoder.py                             38      0      2      1    98%
pyaptamer\aptatrans\layers\_interaction_map.py                     14      0      0      0   100%
pyaptamer\aptatrans\layers\tests\__init__.py                        1      0      0      0   100%
pyaptamer\aptatrans\layers\tests\test_all_aptatrans_layers.py      78      0      2      0   100%
pyaptamer\aptatrans\tests\__init__.py                               1      0      0      0   100%
pyaptamer\aptatrans\tests\test_aptatrans.py                       140      4      0      0    97%
pyaptamer\aptatrans\tests\test_aptatrans_lightning.py              66      0      0      0   100%

Coverage looks good overall. Only _model.py is relatively low; however, the untested branch is the one related to loading pretraining weights, which is not crucial.
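For reference, a report like the one above can be reproduced with pytest-cov (a sketch, assuming pytest-cov is installed and run from the repository root; the exact flags used for this report are not stated in the issue):

```shell
# branch coverage for the AptaTrans modules, with missing lines listed
pytest pyaptamer/aptatrans \
    --cov=pyaptamer/aptatrans \
    --cov-branch \
    --cov-report=term-missing
```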

As I mentioned in #178, the shape mismatch bug was caused by introducing the attention mask in the encoders. This wasn't something we could have caught via tests: rather than a bug, the original behavior was a non-conventional way of doing things (i.e., not using an attention mask).
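To illustrate where such a shape mismatch comes from (this is a generic numpy sketch, not the actual pyaptamer encoder code; the function name and shape conventions are illustrative): a key padding mask of shape (batch, seq_len) must broadcast against attention scores of shape (batch, seq_q, seq_k), so introducing the mask adds a new shape constraint that unmasked code never exercised.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, key_padding_mask=None):
    """q, k, v: (batch, seq, d); key_padding_mask: (batch, seq_k), True = pad."""
    d = q.shape[-1]
    # scores: (batch, seq_q, seq_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    if key_padding_mask is not None:
        # Broadcast the mask over the query axis; if the mask's shape does
        # not line up with seq_k, numpy raises a broadcast (shape) error here.
        scores = np.where(key_padding_mask[:, None, :], -np.inf, scores)
    # softmax over keys; masked positions get exactly zero weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights
```

With the last position masked as padding, its attention weight is zero in every query row, while the output keeps the usual (batch, seq, d) shape.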

FYI @fkiraly

Metadata


Labels: enhancement (New feature or request)
