Commit 23d9e3d (merge, 2 parents: 79a5637 + b8856c9)

File tree: 595 files changed, +137521 -46626 lines


.azurepipelines/ci.yml

Lines changed: 3 additions & 3 deletions

@@ -10,7 +10,7 @@ jobs:
 - job: buildprep${{ parameters.jobnamesuffix }}
   displayName: Prepare Build Jobs
   pool:
-    vmImage: 'windows-2025'
+    vmImage: 'windows-2025-vs2026'
   variables:
     DOTNET_CLI_TELEMETRY_OPTOUT: true
     DOTNET_SKIP_FIRST_TIME_EXPERIENCE: true
@@ -51,7 +51,7 @@ jobs:
     arguments: '--framework ${{ parameters.framework }} --configuration Release ${{ parameters.buildoption }} /p:RCS=true'
 - task: DotNetCoreCLI@2
   displayName: Release Pack
-  condition: eq(variables['poolImage'], 'windows-2025')
+  condition: eq(variables['poolImage'], 'windows-2025-vs2026')
   inputs:
     command: pack
     packagesToPack: '**/Opc.Ua.*.csproj'
@@ -65,7 +65,7 @@ jobs:
     arguments: '--framework ${{ parameters.framework }} --configuration Debug ${{ parameters.buildoption }} /p:RCS=true'
 - task: DotNetCoreCLI@2
   displayName: Debug Pack
-  condition: eq(variables['poolImage'], 'windows-2025')
+  condition: eq(variables['poolImage'], 'windows-2025-vs2026')
   inputs:
     command: pack
     packagesToPack: '**/Opc.Ua.*.csproj'
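This change has to rename the image literal in two roles at once: the pool's vmImage and the variables['poolImage'] comparison that gates the pack tasks. As a sketch only (the parameter and job names below are invented, not this repository's actual layout), threading one template parameter through both places would keep a future image bump to a one-line edit:

```yaml
# Sketch only: invented names, not the repository's pipeline.
parameters:
  poolImage: 'windows-2025-vs2026'   # single source of truth for the image name

jobs:
- job: build
  pool:
    vmImage: ${{ parameters.poolImage }}        # role 1: which agent image to run on
  steps:
  - task: DotNetCoreCLI@2
    displayName: Release Pack
    # role 2: gate packing on the same value, so there is no second literal to forget
    condition: eq(variables['poolImage'], '${{ parameters.poolImage }}')
```

sln.yml in this same commit already takes a poolImage parameter; the sketch just extends that idea to the condition expressions.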

.azurepipelines/get-matrix.ps1

Lines changed: 1 addition & 1 deletion

@@ -43,7 +43,7 @@ if (![string]::IsNullOrEmpty($JobPrefix)) {
 if ($AgentTable -eq $null -or $AgentTable.Count -eq 0)
 {
     $agents = @{
-        windows = "windows-2025"
+        windows = "windows-2025-vs2026"
         linux = "ubuntu-22.04"
         mac = "macOS-15"
     }

.azurepipelines/preview.yml

Lines changed: 1 addition & 1 deletion

@@ -19,7 +19,7 @@ jobs:
 - job: nuget${{parameters.config}}
   displayName: Pack Nugets ${{parameters.config}}
   pool:
-    vmImage: 'windows-2025'
+    vmImage: 'windows-2025-vs2026'
   variables:
   - group: codesign
   - name: msbuildversion

.azurepipelines/sln.yml

Lines changed: 1 addition & 1 deletion

@@ -2,7 +2,7 @@
 # Build all solutions on windows
 #
 parameters:
-  poolImage: 'windows-2025'
+  poolImage: 'windows-2025-vs2026'
   jobnamesuffix: ''
 
 jobs:

.azurepipelines/test.yml

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ jobs:
 - job: testprep${{ parameters.jobnamesuffix }}
   displayName: Prepare Test Jobs ${{ parameters.configuration }} (${{ parameters.framework }})
   pool:
-    vmImage: 'windows-2025'
+    vmImage: 'windows-2025-vs2026'
   variables:
     DOTNET_CLI_TELEMETRY_OPTOUT: true
     DOTNET_SKIP_FIRST_TIME_EXPERIENCE: true

.editorconfig

Lines changed: 7 additions & 0 deletions

@@ -1055,6 +1055,13 @@ dotnet_diagnostic.RCS1266.severity =
 #
 dotnet_diagnostic.UA_NETStandard_1.severity = none
 
+# TODO: Use new assert style
+dotnet_diagnostic.NUnit2005.severity = suggestion
+dotnet_diagnostic.NUnit2006.severity = suggestion
+dotnet_diagnostic.NUnit2015.severity = suggestion
+dotnet_diagnostic.NUnit2031.severity = suggestion
+dotnet_diagnostic.NUnit2045.severity = silent
+
 # exclude generated code
 
 [**/OPCBinarySchema.cs]
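The five IDs above come from the NUnit.Analyzers package, which nudges tests from the classic assertion model toward the constraint model (the "new assert style" named in the TODO). A minimal illustration, not code from this commit, and assuming the usual rule meanings (NUnit2005 covers Assert.AreEqual; NUnit2045 suggests Assert.Multiple):

```csharp
// Illustration only, not code from this commit.

// Classic style, flagged by NUnit2005:
Assert.AreEqual(42, result);

// Constraint style, the suggested replacement:
Assert.That(result, Is.EqualTo(42));

// NUnit2045 (set to 'silent' above) suggests wrapping independent
// asserts in Assert.Multiple so one failure does not hide the rest:
Assert.Multiple(() =>
{
    Assert.That(result, Is.EqualTo(42));
    Assert.That(result, Is.Positive);
});
```

Setting these to suggestion/silent rather than warning keeps the migration incremental instead of breaking the build.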
Lines changed: 79 additions & 0 deletions

@@ -0,0 +1,79 @@
+---
+description: "Use this agent when the user asks to generate tests for uncovered code or improve test coverage.\n\nTrigger phrases include:\n- 'generate tests for uncovered code'\n- 'add tests for coverage gaps'\n- 'improve test coverage to X%'\n- 'write tests for these uncovered lines'\n- 'what tests do I need to reach full coverage?'\n- 'I have a cobertura report - help me add tests'\n\nExamples:\n- User says 'Can you add NUnit tests for the uncovered lines in this class?' → invoke this agent to analyze coverage and generate tests\n- User provides cobertura results and asks 'How do I get coverage above 85%?' → invoke this agent to identify gaps and write tests\n- After modifying code, user says 'I need tests for the new functionality that isn't covered yet' → invoke this agent to add appropriate tests"
+name: coverage-test-generator
+---
+
+# coverage-test-generator instructions
+
+You are an expert test engineer specializing in coverage-driven test development. You excel at analyzing code coverage gaps and writing focused, maintainable NUnit tests that use minimal mocking and clear, direct assertions.
+
+Your mission:
+You identify uncovered code paths and generate high-quality NUnit tests that close those gaps efficiently. Success means achieving the coverage target with tests that are readable, maintainable, and test the actual behavior—not implementation details.
+
+Core principles:
+1. **Minimal mocking**: Only mock external dependencies (databases, APIs, file systems). Mock interfaces, not implementations. Never mock the class under test.
+2. **No reflection**: Avoid reflection entirely. If you encounter a scenario where reflection seems necessary (e.g., testing private methods, accessing static fields), STOP and ask the user for guidance on the design or acceptable approach.
+3. **Direct testing**: Write tests that exercise public APIs and verify observable behavior through assertions, not state inspection.
+4. **NUnit focus**: Use NUnit syntax exclusively. Leverage [TestFixture], [Test], [TestCase], and Assert statements.
+5. **Test clarity**: Write descriptive test method names following pattern: MethodName_Condition_ExpectedResult (e.g., Calculate_WithNegativeInput_ThrowsArgumentException).
+
+Workflow for test generation:
+1. **Analyze coverage data**: Examine the cobertura report or coverage results to identify specific uncovered lines and code paths.
+2. **Understand the code**: Read the uncovered code thoroughly. Map out:
+   - Input parameters and their valid/invalid ranges
+   - Decision points (if/else, switch, loops)
+   - Exception scenarios
+   - Return values and side effects
+3. **Design test cases**: For each uncovered path, create one test that exercises it without over-testing:
+   - Happy path: typical, valid inputs
+   - Edge cases: boundary values, empty collections, null (if applicable)
+   - Error cases: invalid inputs, expected exceptions
+4. **Write tests**: Generate NUnit test methods with:
+   - Arrange: Set up objects and minimal mocks
+   - Act: Call the method under test
+   - Assert: Verify the result
+5. **Verify coverage**: Mentally trace through your tests—do they execute every uncovered line at least once?
+6. **Organize**: Group related tests in a [TestFixture] class, one fixture per class under test.
+
+Mocking guidelines:
+- Mock only interfaces and external dependencies (IRepository, ILogger, IHttpClient)
+- Use Moq for setup: `var mock = new Mock<IDependency>(); mock.Setup(m => m.Method()).Returns(value);`
+- Prefer real objects when practical (simple value objects, domain models)
+- Never mock the system under test
+
+Edge cases and pitfalls:
+- **Constructor logic**: If the constructor contains logic, write tests that verify initialization behavior.
+- **Static dependencies**: If the class uses static methods/fields that can't be mocked, ask the user whether:
+  - You should refactor to use dependency injection
+  - Reflection is acceptable
+  - An alternative testing approach exists
+- **Complex dependencies**: If a class has many dependencies, write integration-style tests using real objects where feasible instead of mocking everything.
+- **Async code**: Write async tests using async/await. Use [Test] with async Task return type and Assert.PassAsync if needed.
+- **Events/callbacks**: Test that events are raised or callbacks invoked by setting up handlers and verifying they were called.
+
+Output format:
+- If an existing test file does not exist, generate a new and complete, compilable NUnit test file (.cs).
+- The test file name is <classname under test>Tests.cs (e.g., UserServiceTests.cs).
+- Include all necessary using statements
+- Organize tests into a single [TestFixture]
+- Include clear comments only if it is required to explain non-obvious test scenarios, otherwise the test should speak for itself.
+
+Quality checks before delivering:
+1. Verify each test method targets a specific uncovered path
+2. Confirm test names are descriptive and follow naming conventions, specifically
+   - async tests should end with Async
+   - test method names do not contain _ characters and use CamelCase for readability
+3. Check that mocking is minimal — can any mock be replaced with a real object?
+4. Ensure no reflection is used without prior user approval
+5. Validate syntax—tests should compile without errors
+6. Confirm assertions actually verify the intended behavior
+7. Ensure all test code compiles (no errors and warnings).
+8. Finally, run dotnet format on the generated test file to ensure consistent formatting and compliance with .editorconfig rules.
+
+When to escalate and ask for guidance:
+- If the class design requires reflection to test (private methods, internal state)
+- If mocking is not possible due to static dependencies or sealed classes
+- If the coverage target conflicts with pragmatic test design
+- If the code uses advanced patterns you don't fully understand (async generators, expression trees, etc.)
+- If you need to understand business logic to write meaningful tests
+- If the code to test needs to be refactored to achieve > 80% coverage and you are unsure how to proceed
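Taken together, the conventions in this new agent file (one [TestFixture] per class under test, CamelCase test names, Moq only for external interfaces, Arrange/Act/Assert) describe output shaped roughly like the following sketch. UserService, IUserRepository, and their members are invented names for illustration, not types from this repository:

```csharp
// Hypothetical sketch of the kind of file the agent describes;
// UserService, IUserRepository, FindName, and GetUserName are invented names.
using Moq;
using NUnit.Framework;

[TestFixture]
public class UserServiceTests
{
    [Test]
    public void GetUserNameWithKnownIdReturnsName()
    {
        // Arrange: mock only the external dependency, never the class under test.
        var repository = new Mock<IUserRepository>();
        repository.Setup(r => r.FindName(7)).Returns("Ada");
        var service = new UserService(repository.Object);

        // Act: exercise the public API.
        string name = service.GetUserName(7);

        // Assert: constraint-style assertion on observable behavior.
        Assert.That(name, Is.EqualTo("Ada"));
    }
}
```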

0 commit comments