fix(schemaUtils): preserve multiple examples instead of collapsing to first #923
Summary
getExampleData previously returned only the first entry when multiple examples were provided, silently discarding the rest.
This limited downstream features that rely on having every example available.
This PR updates the function to retain all examples when multiple are defined, while keeping behavior for single or empty example definitions unchanged.
What changed
getExampleData now (illustrated in the sketch below):
- returns a single value when exactly one example is defined (unchanged behavior)
- returns an object map when multiple examples are defined
- returns an empty string when no examples are present (backward compatible)
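The following is a minimal sketch of the intended contract, not the actual schemaUtils code. It assumes the examples arrive as an OpenAPI-style `examples` map keyed by name, with each entry wrapping its payload in a `value` field; the names and shapes here are assumptions for illustration only.

```js
// Illustrative sketch, not the actual schemaUtils implementation.
// Assumes `exampleObj` is an OpenAPI-style examples map:
//   { exampleName: { value: <payload> }, ... }
function getExampleData (exampleObj) {
  // No examples defined: keep the legacy empty-string behavior.
  if (!exampleObj || Object.keys(exampleObj).length === 0) {
    return '';
  }

  const names = Object.keys(exampleObj);

  // Exactly one example: return its value directly (unchanged behavior).
  if (names.length === 1) {
    return exampleObj[names[0]].value;
  }

  // Multiple examples: preserve all of them as a name -> value map
  // instead of collapsing to the first entry.
  return names.reduce((result, name) => {
    result[name] = exampleObj[name].value;
    return result;
  }, {});
}

// getExampleData({})                              -> ''
// getExampleData({ cat: { value: 'meow' } })      -> 'meow'
// getExampleData({ cat: { value: 'meow' },
//                  dog: { value: 'woof' } })      -> { cat: 'meow', dog: 'woof' }
```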
Why this matters
- Enables accurate handling of OpenAPI examples objects
- Prevents silent loss of example data
- Unblocks future work around generating multiple response examples (e.g. parameter-based permutations)
Tests
Added unit tests covering (see the sketch below):
- empty example object (legacy behavior)
- multiple examples preservation (new behavior)
All existing tests continue to pass.
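A hedged sketch of what such tests might look like, assuming a mocha-style describe/it setup and Node's built-in assert module; the import path and fixtures are illustrative and do not reflect the PR's actual test file.

```js
// Illustrative test sketch, not the PR's actual test file.
const assert = require('assert');
// Hypothetical path: adjust to wherever schemaUtils actually lives in the repo.
const { getExampleData } = require('../lib/schemaUtils');

describe('getExampleData', function () {
  it('returns an empty string for an empty examples object (legacy behavior)', function () {
    assert.strictEqual(getExampleData({}), '');
  });

  it('preserves all examples when multiple are defined (new behavior)', function () {
    const examples = {
      cat: { value: 'meow' },
      dog: { value: 'woof' }
    };
    assert.deepStrictEqual(getExampleData(examples), { cat: 'meow', dog: 'woof' });
  });
});
```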
Scope
This change is intentionally limited to schemaUtils.getExampleData and does not modify request/response generation logic.