
[Update] Product Receiver Service Enhancements: Zod Validation, Field Renaming, Composite Keys, Retry Logic, and Logstash Integration #23

@amema42

Description


Product Receiver Service Enhancements

  1. Integration of Zod for Request Validation

    • Description:
      • Integrated Zod for schema validation of incoming JSON requests.
      • Defined validation schemas for both Product and Store endpoints to ensure all required fields are present and correctly formatted.
    • Impact:
      • Enhances data integrity by preventing malformed or incomplete data from being processed.
      • Reduces runtime errors related to invalid data inputs.
  2. Automatic Renaming of long to lng

    • Description:
      • Implemented middleware to automatically rename the field long to lng in incoming data.
      • Ensures compatibility with the Prisma schema, which uses lng for longitude.
    • Impact:
      • Prevents mapping errors and ensures geographical coordinates are correctly interpreted.
      • Facilitates seamless integration with front-end applications or external data sources that may use long.
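The renaming step can be sketched as a small pure function (names assumed; the service applies this inside its middleware). An already-present `lng` takes precedence, and `long` is always removed so Prisma only ever sees `lng`.

```typescript
// Assumed payload shape for the localization object in incoming requests.
type RawLocalization = { lat: number; long?: number; lng?: number; [k: string]: unknown };

// Move `long` to `lng` and drop the original key. If both keys are
// present, the existing `lng` wins.
function normalizeLng(loc: RawLocalization): RawLocalization {
  const { long, ...rest } = loc;
  if (long === undefined) return loc; // nothing to rename
  return { ...rest, lng: rest.lng ?? long };
}
```

In Express, this would run before validation, e.g. `req.body.localization = normalizeLng(req.body.localization)`.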
  3. Addition of street to the Composite Key in Localization

    • Description:
      • Updated the Prisma schema to include the street field in the composite unique key for the Localization model.
      • Composite Unique Key: @@unique([grocery, lat, lng, street])
      • Allows differentiation between stores with identical grocery, lat, and lng but located on different streets.
    • Impact:
      • Prevents duplicate entries for stores sharing the same geographical coordinates and name but differing in street addresses.
      • Enhances the precision of store management and search functionalities.
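In the Prisma schema, the model could look like the sketch below. Only the `@@unique` line is confirmed by this issue; the field types, `id` column, and `products` relation are assumptions.

```prisma
model Localization {
  id       Int       @id @default(autoincrement())
  grocery  String
  lat      Float
  lng      Float
  street   String
  products Product[]

  // Two stores with the same name and coordinates but different
  // street addresses are now distinct rows.
  @@unique([grocery, lat, lng, street])
}
```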
  4. Upsert Logic for Products Using (name_id, localizationId) as Unique Reference

    • Description:
      • Modified the upsert logic for the Product model to use the combination of name_id and localizationId as the unique reference.
      • Removed or commented out the previous use of document_id to streamline the uniqueness constraint.
    • Impact:
      • Ensures each product is uniquely associated with a specific store.
      • Reduces the risk of duplicate product entries within the same store.
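The upsert arguments can be sketched as follows. `buildUpsertArgs` is a hypothetical helper and `price` is an assumed field; the relevant real detail is that for `@@unique([name_id, localizationId])` Prisma generates a compound unique input named `name_id_localizationId` (the fields joined with underscores), which the upsert's `where` clause targets.

```typescript
// Assumed shape of an incoming product record.
interface IncomingProduct {
  name_id: string;
  price: number;
  localizationId: number;
}

// Build the arguments for prisma.product.upsert using the compound
// unique key (name_id, localizationId) as the reference.
function buildUpsertArgs(p: IncomingProduct) {
  return {
    where: {
      name_id_localizationId: {
        name_id: p.name_id,
        localizationId: p.localizationId,
      },
    },
    // Only mutable fields go in update; the identity fields stay fixed.
    update: { price: p.price },
    create: { name_id: p.name_id, price: p.price, localizationId: p.localizationId },
  };
}
```

The service would then call `await prisma.product.upsert(buildUpsertArgs(payload))`.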
  5. Implementation of Retry Logic for Product Upserts on Known Prisma Errors

    • Description:
      • Added a retry mechanism that attempts to upsert a product up to 3 times in the event of known Prisma errors (P2028, P2002).
      • Introduced a 1-second delay between retry attempts to handle transient issues.
    • Impact:
      • Increases the resilience of the service against transient database errors.
      • Enhances the reliability of product upsert operations, reducing the likelihood of failed transactions.
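The retry mechanism can be sketched as a generic wrapper (function and constant names are assumptions; the service's code may differ). P2028 is Prisma's transaction API error (e.g. a timed-out transaction) and P2002 a unique constraint violation, which can occur transiently when two upserts race.

```typescript
// Prisma error codes considered transient and worth retrying.
const RETRYABLE_CODES = new Set(["P2028", "P2002"]);

// Run `op` up to `maxRetries` times, pausing `delayMs` between attempts.
// Errors without a retryable code are rethrown immediately.
async function retryUpsert<T>(
  op: () => Promise<T>,
  maxRetries = 3,
  delayMs = 1000,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await op();
    } catch (err) {
      lastError = err;
      const code = (err as { code?: string }).code;
      if (code === undefined || !RETRYABLE_CODES.has(code)) throw err;
      // Wait before the next attempt to let transient contention clear.
      if (attempt < maxRetries) await new Promise((r) => setTimeout(r, delayMs));
    }
  }
  throw lastError;
}
```

The upsert call would then be wrapped as `await retryUpsert(() => prisma.product.upsert(args))`.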
  6. Automatic Logging to Logstash on Successful Product Upserts

    • Description:
      • Updated the Logstash configuration to handle the new logging requirements.
      • After each successful product creation or update, the service sends product data to Logstash for processing and indexing.
    • Updated Logstash Configuration:
      input {
        beats {
          port => 5044
        }
      
        tcp {
          port => 50000
          codec => json_lines
        }
      }
      
      filter {
        if [localization] {
          mutate {
            # Rename 'lng' to 'lon' for Elasticsearch compatibility
            rename => { "[localization][lng]" => "[localization][lon]" }
          }
          mutate {
            # Create the 'location' field as a string "lat,lon"
            add_field => { "location" => "%{[localization][lat]},%{[localization][lon]}" }
          }
        }
        if ![name] and [name_id] {
          mutate {
            copy => { "name_id" => "name" }
          }
        }
      }
      
      output {
        elasticsearch {
          hosts => "http://elasticsearch:9200"
          index => "products"
          document_id => "%{id}"
          # doc_as_upsert only takes effect with the "update" action
          action => "update"
          doc_as_upsert => true
          user => "logstash_internal"
          password => "${LOGSTASH_INTERNAL_PASSWORD}"
        }
        stdout { codec => rubydebug }
      }
      
    • Impact:
      • Enhances monitoring and analysis capabilities by ensuring all product data changes are logged and indexed in Elasticsearch.
      • Facilitates advanced reporting and tracking of product-related activities.

Functional Status

  • Mode "Convenience":

    • The search service continues to function correctly in "convenience" mode, returning single stores that offer all requested products or the best combinations of stores.
  • Existence and Availability Endpoints:

    • Endpoints such as /product/exists and /product/in-shop operate as expected, verifying the existence and availability of products in specified stores.

Current Issue

  • Mode "Savings":
    • The "savings" mode in the search service is currently non-functional, returning empty responses even when valid combinations of stores that cover all requested products exist.

Expected Behavior

  • Mode "Savings":

    • Should identify and return combinations of up to two stores that offer all requested products at the lowest total price.
    • Ensure Elasticsearch queries correctly retrieve the necessary data to form these combinations.
  • Product Receiver Service:

    • Ensure that received product data is correctly indexed and validated, allowing the search service to operate effectively in both "savings" and "convenience" modes.

Steps to Reproduce the Issue

  1. Attempt to use the "savings" mode in the search service with a set of products that are available across multiple stores.
  2. Observe that the service returns empty responses despite the existence of valid store combinations.

Proposed Solution

  • Conduct a thorough review of the combination logic within the search service to ensure it correctly identifies and returns valid store combinations.
  • Verify that Elasticsearch queries are accurately retrieving and processing product and store data to support the "savings" mode functionality.

Impact of Changes

Database

  • Prisma Migrations Required:
    • Execute Prisma migrations to update the database schema (locally and remotely) in line with the new model definitions.
    • npx prisma migrate dev

Dependent Services

  • Alignment with New Changes:
    • Other services interacting with the database or the Product Receiver Service may require updates to accommodate the new unique keys and additional fields.

Testing

  • Integration and Unit Tests:
    • Perform comprehensive testing to ensure that the changes do not introduce regressions and that all functionalities operate as expected.

Conclusion

The recent updates to the Product Receiver Service and the Prisma schema aim to enhance data management for products and stores, ensuring data integrity and uniqueness. However, the "savings" mode in the search service remains non-functional, returning empty responses even when valid store combinations exist. A detailed review of the store combination logic and Elasticsearch query configurations is necessary to identify and resolve this malfunction, ensuring that all search modes operate correctly.
