Implements the FHIRPath union operator (|) for Boolean, Integer, and String types using Spark's array_union() function for deduplication.

Key changes:
- Add UnionOperator class extending SameTypeBinaryOperator
- Enhance SameTypeBinaryOperator with handleOneEmpty() hook for custom empty collection semantics
- Register union operator in BinaryOperatorType enum
- Enable union expression parsing in Visitor
- Add comprehensive DSL test suite with 61 test cases covering empty collections, single values, arrays, and grouped expressions
- Add test exclusion for cross-type unions not yet supported

Per the FHIRPath spec, union eliminates duplicates using equality semantics, and unioning with an empty collection returns the non-empty collection with duplicates eliminated.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
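The union semantics described in this commit can be modelled in a few lines of plain Python. This is an illustrative sketch of the behaviour only, not the actual implementation, which uses Spark's array_union():

```python
def fhirpath_union(left, right):
    """Model of the FHIRPath union operator (|): merge two collections,
    eliminating duplicates by equality while preserving first-seen order.
    Unioning with an empty collection still deduplicates the other side."""
    result = []
    for item in left + right:
        if item not in result:  # equality-based deduplication
            result.append(item)
    return result
```

Like Spark's array_union(), this keeps the elements of the left operand first, followed by any elements of the right operand not already present.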
feat: Add DecimalCollection support to union operator

Implements the union operator (|) for decimal values by normalizing decimal precision to DECIMAL(32,6) before merging. This enables union operations on decimals with different precisions and mixing of integer and decimal values.

Changes:
- Add normalizeDecimalType() method to DecimalCollection for type compatibility
- Add getArrayForUnion() helper in UnionOperator using Java 21 pattern matching
- Refactor union operations to use the new helper method
- Add comprehensive test coverage for decimal unions, including precision variations

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Extends the union operator (|) to support QuantityCollection using custom equality semantics. The implementation uses aggregate-based deduplication with ifnull() to properly handle non-comparable quantities (e.g., different dimensions like 'cm' vs 'cm2', or indefinite calendar durations like '1 year' vs '12 months').

Key changes:
- Add deduplicateWithEquality() helper using Spark's aggregate() function
- Handle QuantityCollection in both handleOneEmpty() and handleEquivalentTypes()
- Add asStringCollection() to QuantityCollection for the test framework
- Add comprehensive test suite with 13 Quantity union test cases
- Add exclusion for indefinite calendar duration union deduplication
- Update all union tests to follow the singular/plural pattern

The first element is retained when quantities are equal (same dimension and normalized value), and both values are kept when they are non-comparable (equality returns NULL).

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
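The three-valued deduplication described above can be sketched in plain Python, with None standing in for SQL NULL (non-comparable). The function and the toy quantity representation are hypothetical illustrations, not the project's API:

```python
def deduplicate_with_equality(items, eq):
    """Keep an item only if no already-kept item is *definitely* equal
    to it. eq() may return True, False, or None (non-comparable).
    None is treated as 'not a duplicate', so both values are retained,
    mirroring ifnull(equality, false) in the SQL aggregate."""
    kept = []
    for item in items:
        # ifnull(eq(...), False): a None comparison never discards an item
        if not any(eq(item, prior) is True for prior in kept):
            kept.append(item)
    return kept


def quantity_eq(a, b):
    """Toy quantity equality over (dimension, value) tuples:
    different dimensions are non-comparable, returning None."""
    if a[0] != b[0]:
        return None
    return a[1] == b[1]
```

With this model, ('cm', 1.0) and ('cm2', 1.0) are both kept (non-comparable), while a repeated ('cm', 1.0) is dropped.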
Extends the union operator (|) to support the Coding type with custom equality semantics based on the system, code, version, display, and userSelected fields. Follows the same pattern as QuantityCollection, using aggregate-based deduplication.

Changes:
- UnionOperator: added CodingCollection type checks and custom equality
- CodingCollection: added asStringCollection() for test formatting
- DefaultYamlTestExecutor: extended result formatting for Coding types
- CombiningOperatorsDslTest: added 17 comprehensive test cases

All tests pass: 112/112 CombiningOperatorsDslTest, 1821/1821 YamlReferenceImplTest

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Extends the union operator (|) to support the Time type with precision-aware equality semantics via TemporalComparator. Times with different precisions are treated as incomparable, and both are retained in the union.

Changes:
- UnionOperator: added TimeCollection type checks and custom equality
- TimeCollection: added asStringCollection() for test formatting
- DefaultYamlTestExecutor: extended result formatting for Time types
- CombiningOperatorsDslTest: added 23 comprehensive test cases covering same precision, different precision, and mixed precision scenarios

All tests pass: 135/135 CombiningOperatorsDslTest, 1821/1821 YamlReferenceImplTest

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Implements the union operator (|) for Date and DateTime types with precision-aware equality for deduplication. Date and DateTime types can be unioned together with implicit type promotion, but values remain incomparable due to precision differences.

Key changes:
- Added DateCollection and DateTimeCollection support to UnionOperator
- Both types use custom equality via deduplicateWithEquality() for precision-aware comparison
- Added castAs() method to DateCollection for Date to DateTime type promotion
- Migrated test expectations to the FhirTypedLiteral pattern using toDate(), toDateTime(), toTime(), and toCoding() helpers
- Added comprehensive Date, DateTime, and Date/DateTime union tests covering precision handling, empty collections, arrays, and nested expressions
- Restored QuantityCollection special handling in DefaultYamlTestExecutor

Test coverage:
- 175 tests passing in CombiningOperatorsDslTest
- All YamlReferenceImplTest cases passing with expected exclusions

Resolves #2398

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Refactors UnionOperator to eliminate explicit instanceof checks on collection types by using comparator detection instead. This makes the code more maintainable and extensible, and follows the open/closed principle.

Key changes:
- Add usesDefaultSqlEquality() marker method to the ColumnEquality interface
- Extract array operations to SqlFunctions utility methods:
  - arrayDistinctWithEquality() for custom equality deduplication
  - arrayUnionWithEquality() for custom equality merging
- Refactor UnionOperator to use comparator detection via usesDefaultSqlEquality() instead of explicit collection type checks
- Update class documentation to describe behavior in terms of type reconciliation and equality semantics

Benefits:
- Polymorphic: uses comparator type instead of collection type
- Extensible: new collection types require no UnionOperator changes
- Reusable: SqlFunctions methods available for other operators
- Maintainable: clear separation of concerns

All 175 tests in CombiningOperatorsDslTest pass.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Implements the FHIRPath is() function per specification to check if a value is of a specified type. The implementation includes:

## Core Implementation

- TypeFunctions class with is() function that accepts TypeSpecifier
- Collection.isOfType() method for type checking with proper handling of:
  * Static emptiness (EmptyCollection returns empty)
  * Runtime emptiness (returns null for empty values)
  * Singular constraint enforcement (throws error for multi-item collections)
  * Type matching via filterByType()
- ChoiceElementCollection.isOfType() override for polymorphic elements

## Parser Enhancements

- IdentifierHelper utility class for extracting identifier values from parser contexts
- Support for delimited identifiers (backtick-quoted) in type specifiers
- TypeSpecifierVisitor refactored with static factory methods for clarity
- Updated InvocationVisitor to recognize is() as a type specifier function

## Test Coverage

- TypeFunctionsDslTest with 17 comprehensive test cases covering:
  * Primitive and complex type matching
  * Namespace variations (System, FHIR, unqualified)
  * Edge cases (empty collections, multi-item errors)
  * Integration with other functions
- All tests passing

## YAML Reference Test Exclusions

Added exclusions for known architectural differences:

1. FHIR/System type namespace equivalence (fhir-r4.yaml)
   - Pathling intentionally treats FHIR.boolean and System.Boolean as equivalent
   - Violates strict FHIRPath namespace separation for query efficiency
   - Excluded: Patient.active.is(Boolean).not(), Patient.active.is(System.Boolean).not()
2. Primitive elements with extensions (6.3_types.yaml)
   - Pathling doesn't support primitive elements with only extensions (no value)
   - Related to existing issue #437
   - Excluded: Patient.name.given[3].is(FHIR.string)

Related issue: #2383

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
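The cardinality rules listed above (empty in → empty out, singleton → boolean, multiple items → error) can be sketched as follows. The names are hypothetical and the type check is abstracted to a predicate; this is a model of the semantics, not the project's code:

```python
class MultipleItemsError(ValueError):
    """Raised when is() is applied to a collection with more than one item."""


def is_of_type(collection, matches_type):
    """Model of FHIRPath is() evaluation rules:
    - an empty collection propagates as empty
    - a multi-item collection is an error (singular constraint)
    - a singleton yields a one-element boolean collection."""
    if len(collection) == 0:
        return []
    if len(collection) > 1:
        raise MultipleItemsError("is() requires a singleton collection")
    return [matches_type(collection[0])]
```

The real implementation additionally distinguishes static emptiness (known at compile time) from runtime emptiness, which this sketch collapses into one case.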
Implements the 'is' operator syntax (e.g., `value is Type`) as a binary operator alongside the existing is() function. The operator delegates to Collection.isOfType() for type checking and supports the same empty collection semantics.

Key changes:
- Add invokeWithPaths() to FhirPathBinaryOperator for operators with special evaluation needs
- Create IsOperator that extracts the TypeSpecifier from the parse tree
- Fix TypeSpecifierVisitor to parse qualified/unqualified type names
- Update Visitor to handle type expressions from the grammar
- Add comprehensive DSL tests covering primitives and complex types
- Add YAML test exclusions for known limitations (precedence, polymorphic traversal, extensions, type hierarchy)

Closes #2383

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Adds the as() function, which returns the input value if it matches the specified type, or an empty collection otherwise. This complements the is() function by returning the actual value instead of a boolean.

Key changes:
- Add as() function to TypeFunctions with the @FhirPathFunction annotation
- Add asType() method to Collection using asSingular() + filterByType()
- Register as() in InvocationVisitor.isTypeSpecifierFunction()
- Add comprehensive DSL tests covering 21 test scenarios
- Add YAML test exclusions for known namespace/precedence limitations

The implementation leverages the existing filterByType() for type matching and asSingular() for cardinality enforcement, ensuring consistency with other type operations.

Related: #2383

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Adds the 'as' operator keyword syntax (value as Type), which complements the as() function syntax. This provides the operator form specified in the FHIRPath specification for type casting operations.

Key changes:
- Add AsOperator class following the IsOperator pattern
- Update Visitor to support both 'is' and 'as' operators via a switch
- Add comprehensive DSL tests with 28 test scenarios
- Update YAML test exclusions to cover both operator and function syntax

The implementation mirrors the 'is' operator and reuses the existing Collection.asType() method for consistency with the as() function. Both operator and function syntax produce identical results.

Related: #2383

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Implements .quantity(), .quantityEmpty(), and .quantityArray() methods for FhirPathModelBuilder using FHIRPath quantity literal syntax (e.g., "10.5 'mg'"). Follows the existing Coding type pattern with:

- FhirTypedLiteral.toQuantity() for parsing
- YAML harness integration for serialization and type conversion
- TypeFunctionsDslTest updated to use the new, more concise syntax

Reduces Quantity definition verbosity from 4 lines to 1 line.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
…patibility

- Convert DSL test cases to use explicit toQuantity() instead of string literals for Quantity expectations in AsOperatorDslTest, CombiningOperatorsDslTest, ConversionFunctionsDslTest, and TypeFunctionsDslTest
- Add dual comparison strategy in DefaultYamlTestExecutor to support both string-based expectations (YAML reference tests) and struct-based expectations (DSL tests using toQuantity())
- Normalize Quantity comparisons by ignoring value_scale and canonical fields to focus on semantic equality
- Fix Scala deprecation warning by replacing JavaConverters.asScalaBuffer with javaapi.CollectionConverters.asScala
- Update YamlSupport.writeQuantity() to include all 10 Quantity struct fields

All tests passing: 1420 DSL tests, 1821 YAML reference tests

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Move type function tests for FHIR choice collections into TypeFunctionsDslTest as testTypeFunctionsOnFhirChoiceCollection(). This groups all type function tests (is, as, ofType) in a single test class for better organization. 🤖 Generated with [Claude Code](https://claude.com/claude-code) Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
…ess)

Add comprehensive test coverage for type functions (is, as, ofType) on FHIR complex type elements in testTypeFunctionsOnFhirComplexTypes().

Test coverage includes:
- ofType() function for filtering HumanName and Address collections
- is() function for type checking on singletons
- as() function for type casting with property traversal
- Namespace validation (FHIR namespace only for complex types)
- Integration scenarios combining type functions with where() and chaining
- Edge cases: empty collections, multi-item errors, failed casts

62 new test cases using a Patient resource with multiple complex type elements.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Adds a count() function that returns the integer count of items in a collection, returning 0 for empty collections per the FHIRPath specification.

Implementation:
- Add count() to ExistenceFunctions using the existing ColumnRepresentation.count()
- Returns IntegerCollection for proper FHIRPath type compliance
- Auto-discovered via the @FhirPathFunction annotation

Testing:
- Add testCount() with tests for basic types, composite types, and where() composition
- Add testCountOnFhirResource() with HAPI Patient resource tests
- All DSL tests pass (57 tests total)
- YAML reference tests pass with appropriate exclusions

Test exclusions:
- Add 4 exclusions for known issue #437 (primitive elements with extensions but no value)
- Exclusions cover: where() filtering, the as() operator, and union operations with count()

Closes #2532

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Implements a limited version of the FHIRPath resolve() function that extracts and returns type information from Reference elements, supporting type checking with the 'is' operator without performing actual resource resolution.

Type Extraction:
- Prioritizes the Reference.type field when present
- Falls back to parsing the type from the Reference.reference string
- Supports relative, absolute, and canonical reference formats
- Works with contained and logical references when the type field is provided
- Validates extracted types against the FHIR resource type pattern

Key Features:
- Type checking: resolve() is Patient, resolve() is Organization
- Array handling: preserves array alignment using zip_with()
- Filtering: automatically removes unresolvable references (nulls)
- No traversal: prevents field access on resolved references

Implementation:
- ReferenceValue: utility class for type extraction from Reference columns
  - Static method extractTypeFromColumns() for singular extraction logic
  - Vectorized extraction using zip_with() for arrays
  - Both referenceColumn and typeColumn required (@nonnull)
- ResolvedReferenceCollection: represents resolved references with type info
  - Dynamic type information for 'is' operator support
  - Prevents traversal to child fields
  - Supports ofType() filtering
- ReferenceCollection.resolve(): main entry point for resolution
  - Uses getField() to extract the reference and type columns
  - Preserves array alignment (no empty value removal)
  - Filters nulls after type extraction

Testing:
- 86 comprehensive DSL test cases in ResolveFunctionDslTest
- Tests cover: basic extraction, type priority, edge cases, HAPI resources, collections, filtering, and integration scenarios
- Added fhirReference() helper to FhirPathModelBuilder for consistent Reference creation in tests (ensures both fields are always present)

Addresses #2522

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
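The type-extraction priority described above (explicit type field first, then parsing the reference string) can be sketched as follows. The function name, the regex, and the small RESOURCE_TYPES set are illustrative assumptions; the real implementation validates against the full FHIR resource type pattern and runs vectorized over Spark columns:

```python
import re

# Illustrative subset only; the real code validates against the full
# FHIR resource type pattern.
RESOURCE_TYPES = {"Patient", "Organization", "Practitioner", "Observation"}

# Matches 'Type/id' at the end of a relative or absolute reference.
_REF = re.compile(r"(?:^|/)([A-Z][A-Za-z]+)/[A-Za-z0-9.\-]{1,64}$")


def extract_type(reference, type_field=None):
    """Prefer the explicit Reference.type field; otherwise parse the
    resource type out of the reference string. Canonical references may
    carry a |version suffix, which is stripped before matching."""
    if type_field:
        return type_field
    if not reference:
        return None
    candidate = reference.split("|", 1)[0]
    match = _REF.search(candidate)
    if match and match.group(1) in RESOURCE_TYPES:
        return match.group(1)
    return None  # unresolvable (e.g. urn:uuid:...) → filtered out later
```

Returning None for unresolvable references mirrors the described behaviour of filtering out nulls after extraction.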
Add initial FHIR Search implementation supporting resource-level search with token search parameters. This is the first phase of issue #1986.

Key components:
- FhirSearch: search query specification with builder pattern
- FhirSearchExecutor: executes the search by translating criteria to SparkSQL
- SearchParameterRegistry: hardcoded registry (gender parameter for Patient)
- TokenSearchFilter: builds SparkSQL expressions for token matching
- TokenSearchValue: parses token values (code, system|code, |code formats)

The executor uses FHIRPath to extract column values from search parameter expressions, then applies SparkSQL filter expressions to match values. Results maintain the same schema as DataSource.read().

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
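The three token-value formats handled by TokenSearchValue can be sketched in a few lines. This is a hedged model of the parsing rule from the FHIR search specification, not the project's actual class:

```python
def parse_token(value):
    """Parse a token search value into (system, code).

    Three formats per the FHIR search spec:
      'code'        -> (None, code)   match the code in any system
      'system|code' -> (system, code) match the code in that system
      '|code'       -> ('', code)     match the code with no system
    """
    if "|" not in value:
        return None, value
    system, code = value.split("|", 1)
    return system, code
```

The distinction between None (any system) and the empty string (explicitly no system) is what lets `|code` behave differently from a bare `code`.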
- Change SearchFilter.buildFilter() to take ColumnRepresentation instead of Column
- Use vectorize() in TokenSearchFilter to handle both arrays and singular values
  - Arrays use exists() to match if ANY element equals the search value
  - Singular values use direct equalTo() comparison
- Add address-use search parameter to registry (Patient.address.use)
- Add 5 tests for array-valued token search

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add StringSearchFilter for case-insensitive starts-with matching
- Add STRING case to FhirSearchExecutor.getFilterForType()
- Add 'family' search parameter for Patient.name.family
- Refactor FhirSearchExecutorTest to use @ParameterizedTest

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
…bility

- Create ElementMatcher functional interface for matching logic
- Add TokenMatcher (exact equality) and StringMatcher (case-insensitive prefix)
- Convert SearchFilter from an interface to a concrete class with composition
- Delete TokenSearchFilter and StringSearchFilter (logic moved to matchers)
- Update FhirSearchExecutor to create SearchFilter with the appropriate matcher
- Add ElementMatcherTest with 25 unit tests for matcher logic

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
- Add modifier field to SearchCriterion and parsing in the FhirSearch builder
- Create ExactStringMatcher for case-sensitive exact match (:exact on string)
- Add negated flag to SearchFilter for the :not modifier (token only)
- Handle null values correctly for :not (includes resources with no value)
- Add InvalidModifierException for unsupported modifiers per the FHIR spec
- Add 13 ExactStringMatcher unit tests and 7 modifier integration tests

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Implement range-based date matching per the FHIR specification, where both element and search values represent implicit ranges based on precision. For the default `eq` prefix, a match occurs when the ranges overlap.

- Add DateMatcher using existing FhirPathDateTime and boundary UDFs
- Register birthdate parameter for the Patient resource
- Add DATE case to FhirSearchExecutor
- Support all precision levels: year, year-month, date, datetime

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Implement FHIR date search prefixes for ordered comparisons:

- eq: ranges overlap (default, existing behavior preserved)
- ne: ranges do not overlap
- gt: resource ends after parameter
- ge: resource starts at or after parameter start
- lt: resource starts before parameter
- le: resource ends at or before parameter end

Includes a DatePrefix enum for parsing prefixes from search values (e.g., "ge2023-01-15" → prefix=GE, date="2023-01-15").

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
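The prefix extraction and implicit-range comparisons above can be sketched in plain Python, restricted to year/month/day precision for brevity. The function names are hypothetical; the comparison rules follow the bullet list in the commit message:

```python
import calendar
from datetime import date

PREFIXES = {"eq", "ne", "gt", "ge", "lt", "le"}


def split_prefix(value):
    """'ge2023-01-15' -> ('ge', '2023-01-15'); no prefix defaults to eq."""
    if value[:2] in PREFIXES:
        return value[:2], value[2:]
    return "eq", value


def implicit_range(value):
    """A date at year, month, or day precision denotes an implicit
    inclusive range, e.g. '2023' covers 2023-01-01..2023-12-31."""
    parts = [int(p) for p in value.split("-")]
    if len(parts) == 1:
        return date(parts[0], 1, 1), date(parts[0], 12, 31)
    if len(parts) == 2:
        year, month = parts
        return (date(year, month, 1),
                date(year, month, calendar.monthrange(year, month)[1]))
    return date(*parts), date(*parts)


def date_matches(element, param):
    """Apply one prefixed comparison between two implicit ranges."""
    prefix, value = split_prefix(param)
    elem_lo, elem_hi = implicit_range(element)
    par_lo, par_hi = implicit_range(value)
    return {
        "eq": elem_lo <= par_hi and par_lo <= elem_hi,  # ranges overlap
        "ne": elem_hi < par_lo or par_hi < elem_lo,     # no overlap
        "gt": elem_hi > par_hi,   # resource ends after parameter
        "ge": elem_lo >= par_lo,  # starts at or after parameter start
        "lt": elem_lo < par_lo,   # starts before parameter
        "le": elem_hi <= par_hi,  # ends at or before parameter end
    }[prefix]
```

The real implementation evaluates the same boundary logic over Spark columns via boundary UDFs, and additionally handles time-of-day precision.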
Implements number search parameters using RiskAssessment.probability as the test case. Supports decimal values with all comparison prefixes: eq (default), ne, gt, ge, lt, le. Uses simplified direct comparison without implicit range semantics based on significant figures. Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Eliminates ~130 lines of duplicate code by merging the two prefix enums into a single SearchPrefix enum. The new design separates concerns:

- SearchPrefix handles only prefix extraction (eq, ne, gt, ge, lt, le)
- Value validation moved to type-specific matchers, where the type is known

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Add support for date searches on FHIR Period fields (e.g., Coverage.period). Period boundaries are extracted from start/end STRING fields using existing boundary UDFs, with null values treated as negative/positive infinity.

Also adds Condition.recordedDate (dateTime) and AuditEvent.date (instant) search parameters to test date search on scalar date types.

Key changes:
- SearchParameterType: add allowed FHIR types per search type for validation
- DateMatcher: add type-aware boundary extraction for Period vs scalar types
- FhirSearchExecutor: extract the FHIR type from Collection for matcher selection
- SearchParameterRegistry: add Coverage.period, Condition.recorded-date, AuditEvent.date
- Add InvalidSearchParameterException for type validation errors

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
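The "null boundary means open-ended" rule can be sketched directly; here date.min/date.max stand in for negative/positive infinity (the real code uses boundary UDFs over STRING columns):

```python
from datetime import date


def period_range(start, end):
    """A Period with a missing start extends to negative infinity and a
    missing end to positive infinity, approximated by date.min/date.max."""
    lower = start if start is not None else date.min
    upper = end if end is not None else date.max
    return lower, upper
```

An open-ended Period therefore overlaps any search range on the open side, which is what makes half-specified coverage periods match.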
Implement complete token search matching with system|code syntax support:

- Rewrite TokenMatcher with type-specific matching logic for:
  - Coding (system|code matching)
  - CodeableConcept (match ANY coding in the array)
  - Identifier (system|value matching)
  - ContactPoint (value-only matching)
  - code, uri, id (simple string equality)
  - string (case-insensitive matching)
  - boolean (true/false matching)
- Add getSimpleCode() to TokenSearchValue for types that don't support system|code syntax, with validation
- Add search parameters to the registry:
  - Patient.identifier (Identifier type)
  - Patient.telecom (ContactPoint type)
  - Patient.active (boolean type)
  - Observation.code (CodeableConcept type)
- Add comprehensive unit tests for all token types
- Add integration tests for system|code syntax variants

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Implements the Strategy Pattern using enum constant-specific method bodies. Each SearchParameterType now creates its own filters via createFilter(), eliminating the 47-line switch statement in FhirSearchExecutor. This follows the Open/Closed Principle: new search parameter types can be added by extending the enum without modifying FhirSearchExecutor. Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Enables search parameters like Observation.date that use FHIRPath union expressions with different types (dateTime | Period | instant). Since the FHIRPath engine requires same-type unions, polymorphic expressions are handled at the search executor level:

- SearchParameterDefinition now stores List<String> expressions
- Each expression is evaluated separately with its own FHIR type
- Filter results are combined with OR logic

Includes the Observation 'date' search parameter with dateTime, Period, and instant support.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Copy frequently-used skills from user config to the project so they are available to all contributors.
Expose PathlingContext.searchToColumn to R users, allowing FHIR search query strings to be converted into Spark Column objects for use with sparklyr DataFrame filtering operations.
Update logback from 1.5.19 to 1.5.25, wheel from 0.43.0 to 0.46.3, and twine from 5.0.0 to 6.2.0. Regenerate Python lock file to pick up patched transitive dependencies: cryptography 46.0.5, werkzeug 3.1.5, and urllib3 2.6.3.
Add --allow-existing to the uv venv command so it succeeds when the .venv directory already exists. This fixes the CI build failure caused by maven-source-plugin forking a new lifecycle that re-runs the initialize phase.
Apache's CDN drops older Spark versions, and the archive mirror is rate-limited. This adds a GitHub Actions cache for Spark, warmed by a weekly workflow, so builds restore from cache instead of relying on sparklyr's download. The spark_install() call is now skipped when SPARK_HOME is set.
Add ReferenceMatcher with suffix-based matching logic supporting type-qualified references, bare IDs, absolute URLs, and URN references. Implement :not modifier for negated matching and :[type] modifier for constraining target resource types. Consolidate reference field constant into FhirFieldNames.
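The suffix-based matching rule can be illustrated with a minimal sketch. The function name is hypothetical and the real matcher additionally handles URN references and the :not and :[type] modifiers described above:

```python
def reference_matches(stored, param):
    """Suffix-based reference matching: the search value may be a
    type-qualified reference ('Patient/123'), a bare id ('123'), or a
    full URL. A stored reference matches when it equals the value or
    ends with '/' + value."""
    if stored is None:
        return False
    return stored == param or stored.endswith("/" + param)
```

So a stored absolute URL like `https://example.org/fhir/Patient/123` matches both `Patient/123` and the bare id `123`, while `Patient/1234` does not match `123` because the match is anchored at a `/` boundary.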
Adds URI search parameter matching with exact string equality, :below (prefix) and :above (inverse prefix) modifiers. Declares all URI family FHIR types (URI, URL, CANONICAL, OID, UUID) as allowed. Supports :not modifier for negated matching, consistent with other search parameter types.
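The :below/:above relationship is a simple prefix test in each direction. A hedged sketch (the function name and signature are illustrative, not the project's API):

```python
def uri_matches(stored, param, modifier=None):
    """URI search matching: exact string equality by default;
    :below matches stored URIs that start with the search value,
    :above matches when the search value starts with the stored URI."""
    if modifier is None:
        return stored == param
    if modifier == "below":
        return stored.startswith(param)
    if modifier == "above":
        return param.startswith(stored)
    raise ValueError(f"unsupported modifier: {modifier}")
```

:above is the inverse of :below: it finds stored URIs that are prefixes of the search value rather than extensions of it.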
Spark's exists() returns null for null arrays, so NOT(null) remained null and excluded resources with no value. The FHIR spec requires :not to include resources with no value for the parameter. Apply the same coalesce-to-false treatment already used for scalar columns.
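The fix can be modelled with SQL three-valued logic, using None for NULL. This sketch shows why coalescing before negation changes the result for resources with no value:

```python
def not_modifier(exists_result):
    """In SQL three-valued logic NOT(NULL) is NULL, which silently
    excludes resources whose array is null (no value present).
    Coalescing the match result to False *before* negating makes
    :not include those resources, as the FHIR spec requires."""
    matched = exists_result if exists_result is not None else False
    return not matched
```

Without the coalesce, `not None` would stay NULL in SQL and the row would be filtered out; with it, a resource with no value for the parameter correctly matches :not.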
Exposes FHIRPath expression evaluation as a column-producing method on PathlingContext, enabling users to use arbitrary FHIRPath expressions in Spark DataFrame operations like filter() and select(). Includes Java, Python, and R bindings with tests.
DataFrame-level convenience functions that wrap JVM Column objects in idiomatic sparklyr API, enabling pipe-based workflows with FHIRPath and search expressions.
Replace the deprecated AgnosticExpressionPathEncoder with a plain AgnosticEncoder and construct ExpressionEncoder directly, bypassing the SerializerBuildHelper/DeserializerBuildHelper round-trip. Closes #2497
Update Spark version across all modules (core, server, R library) and bump Scala from 2.13.15 to 2.13.16 to match. Remove org.jetbrains.annotations usage that was previously provided as a transitive dependency from Spark, replacing @NotNull with @nonnull and removing IDE-only annotations (@contract, @Unmodifiable). An upgrade to Spark 4.1.1 was initially attempted but abandoned due to binary incompatibility with Delta Lake 4.0.0 (the only stable 4.x release on Maven Central).
Replace placeholder comment with findings from security investigation. Risk is low — exploitation requires authenticated import access with a crafted parquet file. See #2540.
Extensions on FHIR elements nested inside choice types (e.g., value[x]) were silently dropped during encoding due to a hash mismatch in flattenExtensions. The children() method returns property names with the [x] suffix, but getProperty() expects the base name without it. Replaced the hash-based lookup with direct use of Property.getValues() to bypass the mismatch. Closes #2538
Fixed five Javadoc reference errors in the fhirpath module that were causing Maven build failures:

- Changed a reference to a Lombok-generated getter to reference the field directly in ResourceRepresentation
- Removed an obsolete documentation section referencing the deleted FhirPathEvaluator class
- Corrected ProjectionContext references to remove non-existent method references in collection classes

These documentation-only changes resolve the build errors with no impact on runtime behavior.
Overrides Spark's transitive Avro dependency (1.12.0) with the patched version to fix a code injection vulnerability in the Apache Avro Java SDK. The vulnerability allows code injection when processing untrusted Avro schemas. While Pathling does not accept user-provided schemas, upgrading eliminates the vulnerability and unblocks the CI security scan. Added CVE to .trivyignore since Avro is not bundled in library-runtime (provided at runtime) and server explicitly overrides to fixed version.
Adds comprehensive documentation for search_to_column and fhirpath_to_column library functions with examples across all supported languages. Removes incorrect documentation of server-specific _query=fhirPath feature from library API documentation. Updates resolve() function documentation with accurate behavior and limitations.