Added new test cases covering all edge scenarios for calc_std_and_verify (#386)
Codecov Report ✅ All modified and coverable lines are covered by tests.

```
@@ Coverage Diff @@
##             main     #386   +/-   ##
=======================================
  Coverage   76.35%   76.35%
=======================================
  Files          29       29
  Lines        1793     1793
=======================================
  Hits         1369     1369
  Misses        424      424
=======================================
```
tests/ops/test_validate.py (outdated):

```python
# Should fail when slightly over limit
over_limit_gps = pd.Series({
    "ant_cov_XX1": individual_std * 1.001,
    "ant_cov_YY1": individual_std * 1.001,
    "ant_cov_ZZ1": individual_std * 1.001,
})

with pytest.raises(ValueError, match=r"3D Standard Deviation.*exceeds GPS Sigma Limit"):
    calc_std_and_verify(over_limit_gps, std_dev=True, sigma_limit=0.05, verify=True)

# Should pass when slightly under limit
under_limit_gps = pd.Series({
    "ant_cov_XX1": individual_std * 0.999,
    "ant_cov_YY1": individual_std * 0.999,
    "ant_cov_ZZ1": individual_std * 0.999,
})

result = calc_std_and_verify(under_limit_gps, std_dev=True, sigma_limit=0.05, verify=True)
assert result < 0.05
```
tests/ops/test_validate.py (outdated):

```python
def test_extreme_limits(self):
    """Test with extremely small and large sigma limits."""
    normal_gps = pd.Series({
        "ant_cov_XX1": 0.01,
        "ant_cov_YY1": 0.01,
        "ant_cov_ZZ1": 0.01,
    })

    # Extremely small limit should fail
    with pytest.raises(ValueError):
        calc_std_and_verify(normal_gps, std_dev=True, sigma_limit=1e-10, verify=True)

    # Extremely large limit should pass
    result = calc_std_and_verify(normal_gps, std_dev=True, sigma_limit=1e10, verify=True)
    assert result < 1e10
```
tests/ops/test_validate.py (outdated):

```python
# Test with all numeric data (should work)
numeric_gps = pd.Series({
    "ant_cov_XX1": 0.01,
    "ant_cov_YY1": 0.01,
    "ant_cov_ZZ1": 0.01,
})

result = calc_std_and_verify(numeric_gps, std_dev=True, sigma_limit=0.05, verify=True)
expected = np.sqrt(0.01**2 + 0.01**2 + 0.01**2)
assert abs(result - expected) < 1e-10
```
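For context, the expected value in this test is simply the root-sum-square of the three per-axis values. A minimal sketch of that arithmetic (the `rss_3d` helper is illustrative, not the library's API, and assumes `calc_std_and_verify` combines the `ant_cov_*` components this way):

```python
import numpy as np
import pandas as pd

def rss_3d(gps: pd.Series) -> float:
    # Hypothetical helper: root-sum-square of the per-axis standard deviations.
    return float(np.sqrt((gps ** 2).sum()))

numeric_gps = pd.Series({
    "ant_cov_XX1": 0.01,
    "ant_cov_YY1": 0.01,
    "ant_cov_ZZ1": 0.01,
})

result = rss_3d(numeric_gps)
expected = np.sqrt(3) * 0.01  # ≈ 0.01732, comfortably below the 0.05 limit
```

With equal axes the 3D value is just sqrt(3) times the per-axis value, which is why a 0.05 limit passes here.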
The intention was to test that invalid data fails and valid data works within the same test. But as rightly said, this is already covered in the first test case, so I will remove it.
tests/ops/test_validate.py (outdated):

```python
def test_parameter_matrix_combinations(self):
    """Test all combinations of std_dev × verify × sigma_limit parameters."""
    test_gps = pd.Series({
        "ant_cov_XX1": 0.01,
        "ant_cov_YY1": 0.01,
        "ant_cov_ZZ1": 0.01,
    })

    variance_gps = pd.Series({
        "ant_cov_XX1": 0.0001,  # 0.01²
        "ant_cov_YY1": 0.0001,
        "ant_cov_ZZ1": 0.0001,
    })

    # Test matrix: std_dev=[True, False] × verify=[True, False] × sigma_limit=[strict, lenient]
    combinations = [
        (True, True, 0.02),    # std_dev=True, verify=True, strict limit (should pass)
        (True, True, 0.01),    # std_dev=True, verify=True, very strict (should fail)
        (True, False, 0.01),   # std_dev=True, verify=False, strict (should return value)
        (False, True, 0.02),   # std_dev=False, verify=True, strict (should pass)
        (False, True, 0.01),   # std_dev=False, verify=True, very strict (should fail)
        (False, False, 0.01),  # std_dev=False, verify=False, strict (should return value)
    ]

    for std_dev, verify, sigma_limit in combinations:
        data = test_gps if std_dev else variance_gps

        if verify and sigma_limit < 0.015:  # Will fail verification
            with pytest.raises(ValueError):
                calc_std_and_verify(data, std_dev=std_dev, sigma_limit=sigma_limit, verify=verify)
        else:  # Should succeed
            result = calc_std_and_verify(data, std_dev=std_dev, sigma_limit=sigma_limit, verify=verify)
            assert isinstance(result, float)
            assert result > 0
```
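If a matrix test like this were kept, `pytest.mark.parametrize` would report each combination as its own test case instead of stopping at the first failing loop iteration. A hedged sketch of that shape — the `calc_std_and_verify` body below is a hypothetical stand-in written only so the example runs, not the package's real implementation:

```python
import numpy as np
import pandas as pd
import pytest

def calc_std_and_verify(gps, std_dev=True, sigma_limit=0.05, verify=True):
    # Hypothetical stand-in: root-sum-square of per-axis std devs,
    # raising when the verified result exceeds the sigma limit.
    vals = gps if std_dev else np.sqrt(gps)  # variance mode: take sqrt first
    result = float(np.sqrt((vals ** 2).sum()))
    if verify and result >= sigma_limit:
        raise ValueError(
            f"3D Standard Deviation {result} exceeds GPS Sigma Limit {sigma_limit}"
        )
    return result

STD_GPS = pd.Series({"ant_cov_XX1": 0.01, "ant_cov_YY1": 0.01, "ant_cov_ZZ1": 0.01})

@pytest.mark.parametrize(
    "std_dev, verify, sigma_limit, should_fail",
    [
        (True, True, 0.02, False),   # lenient limit: passes (3D std ≈ 0.0173)
        (True, True, 0.01, True),    # strict limit: raises ValueError
        (True, False, 0.01, False),  # verify off: always returns the value
    ],
)
def test_matrix(std_dev, verify, sigma_limit, should_fail):
    if should_fail:
        with pytest.raises(ValueError):
            calc_std_and_verify(STD_GPS, std_dev=std_dev, sigma_limit=sigma_limit, verify=verify)
    else:
        result = calc_std_and_verify(STD_GPS, std_dev=std_dev, sigma_limit=sigma_limit, verify=verify)
        assert result > 0
```

Each tuple then shows up with its own test ID in the pytest report, which addresses the "which combination failed?" problem directly.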
This was already more or less done in #384; remove.
Yes, I think I over-engineered here a bit.
tests/ops/test_validate.py (outdated):

```python
def test_numerical_precision_small_values(self):
    """Test numerical precision with very small values."""
    tiny_gps = pd.Series({
        "ant_cov_XX1": 1e-10,
        "ant_cov_YY1": 1e-10,
        "ant_cov_ZZ1": 1e-10,
    })

    result = calc_std_and_verify(tiny_gps, std_dev=True, sigma_limit=1e-8, verify=True)
    expected = np.sqrt(3 * (1e-10)**2)
    assert abs(result - expected) < 1e-15
```
This is addressed with the floating-point edge cases below; remove.
tests/ops/test_validate.py (outdated):

```python
def test_numerical_precision_large_values(self):
    """Test numerical precision with very large values."""
    large_gps = pd.Series({
        "ant_cov_XX1": 1e6,
        "ant_cov_YY1": 1e6,
        "ant_cov_ZZ1": 1e6,
    })

    result = calc_std_and_verify(large_gps, std_dev=True, sigma_limit=2e6, verify=True)
    expected = np.sqrt(3 * (1e6)**2)
    assert abs(result - expected) < 1e3  # Allow for some floating point error
```
This test is unnecessary; remove.
tests/ops/test_validate.py (outdated):

```python
"ant_cov_XX1": 0.1 + 0.2 - 0.3,       # Should be 0, but floating point...
"ant_cov_YY1": 1.0 / 3.0 * 3.0 - 1.0,  # Should be 0
```
Split these up into separate tests so that if one fails we know which one.

Hi @johnbdesanto, I was wondering if this is actually required, as we are testing it implicitly.
tests/ops/test_validate.py (outdated):

```python
def test_single_axis_dominance(self):
    """Test cases where one axis dominates the 3D calculation."""
    dominant_axis_gps = pd.Series({
        "ant_cov_XX1": 0.05,   # Large value
        "ant_cov_YY1": 1e-10,  # Tiny value
        "ant_cov_ZZ1": 1e-10,  # Tiny value
    })

    result = calc_std_and_verify(dominant_axis_gps, std_dev=True, sigma_limit=0.1, verify=True)
    # Result should be dominated by the X component
    assert abs(result - 0.05) < 1e-6
```
This test seems unnecessary; remove.
johnbdesanto left a comment:

I've left a number of comments. Many of the tests are similar to the point of testing the same things, so I have requested that those which are redundant be removed.
@weknowthecalmat Were these tests AI-generated?

@lsetiawan A part of it. I wanted to verify that we cover all the possible combinations and don't miss any. The 4 new test cases I added per suggestion from AI are test_parameter_matrix_combinations, test_numerical_precision_small_values / test_numerical_precision_large_values, test_single_axis_dominance, and test_extreme_limits. For the others I validated my approach. I used a GPT model to brainstorm a bit for this. I think in the process it got a bit over-engineered, which I will avoid.

Hi @johnbdesanto, thanks for reviewing, and thanks for all the comments. I think I over-engineered it a bit. Looking through a much simpler lens now, I was wondering whether test_floating_point_edge_cases is required, since numpy handles this case.
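On that floating-point question: values like `0.1 + 0.2 - 0.3` are tiny nonzero doubles, and a root-sum-square over them stays negligible with no special handling. A quick illustration (assuming the 3D std is computed as a root-sum-square, as elsewhere in this thread):

```python
import numpy as np

# Classic floating-point residue: not exactly zero, about 5.55e-17.
x = 0.1 + 0.2 - 0.3
# This expression happens to round back to exactly 0.0 on IEEE 754 doubles.
y = 1.0 / 3.0 * 3.0 - 1.0

# Root-sum-square of the residues stays far below any realistic sigma limit,
# so no special-casing is needed in the function under test.
rss = float(np.sqrt(x**2 + y**2))
```

So a dedicated edge-case test would mostly be re-testing NumPy's arithmetic rather than the function itself.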
Commit messages (truncated):
- …o values, negative values, NaN/Inf handling - Boundary testing: exact sigma_limit matches, extreme limits - Error scenarios: invalid data types, missing columns, empty data - Parameter matrix: all combinations of std_dev × verify × sigma_limit - Numerical precision: small/large values, floating-point edge cases - Additional robustness tests for single axis dominance and consistency
- … (parameter matrix, numerical precision, floating-point) - Simplify boundary and invalid data types
Force-pushed from 2600c80 to 374ab8d.
@johnbdesanto I have added the changes as per the review feedback.
Added 15 test cases:
- Zero values: test_zero_values
- Negative values: test_negative_values_variance_mode
- NaN/Inf handling: test_nan_values, test_inf_values
- Boundary testing: test_exact_sigma_limit_matches, test_extreme_limits
- Error scenarios: test_invalid_data_types, test_different_column_counts, test_empty_series
- Parameter matrix: test_parameter_matrix_combinations
- Data validation: test_empty_series, test_different_column_counts
- Numerical precision: test_numerical_precision_small_values, test_numerical_precision_large_values, test_floating_point_edge_cases
- Additional robustness: test_single_axis_dominance, test_variance_vs_std_dev_consistency

Relates to issue #196.