Labels
area:config, gep:required, model:embedding, model:language, type:epic (a high-level issue that usually consists of smaller tasks and corresponds to some functionality)
Description
Requirements
- Mask PHI, PII, and PCI information in lang/embedding requests
- Unmask all masked values before returning LLM chat responses to the client
- Unmask masked tokens in streaming chat requests
- Allow enabling the functionality per router
- Consider a middleware-like approach to the architecture, since more pre/post-processing logic may need to be plugged in in the future (e.g. ✨ [Cost] Incoming Token Compression #259); see the sketch after this list
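A minimal sketch of what such a middleware-like pre/post-processing hook could look like, assuming a simplified `ChatRequest`/`ChatResponse` shape. The `Processor` interface, `PIIMasker` type, and the regex patterns are hypothetical illustrations, not the project's actual API:

```go
package masking

import (
	"fmt"
	"regexp"
	"strings"
)

// ChatRequest and ChatResponse are assumed, simplified message containers.
type ChatRequest struct{ Content string }
type ChatResponse struct{ Content string }

// Processor is the middleware-like hook: it can rewrite the outgoing request
// and undo its changes on the incoming response. More pre/post steps (e.g.
// token compression) could implement the same interface and be chained.
type Processor interface {
	PreProcess(req *ChatRequest) error
	PostProcess(resp *ChatResponse) error
}

// PIIMasker replaces detected sensitive values with placeholders such as
// "<PII_1>" before the request leaves the router and restores them afterwards.
type PIIMasker struct {
	patterns []*regexp.Regexp  // naive PHI/PII/PCI detectors
	values   map[string]string // placeholder -> original value
	counter  int
}

func NewPIIMasker() *PIIMasker {
	return &PIIMasker{
		patterns: []*regexp.Regexp{
			regexp.MustCompile(`[\w.+-]+@[\w-]+\.[\w.]+`), // email address
			regexp.MustCompile(`\b(?:\d[ -]?){13,16}\b`),  // naive payment card number
		},
		values: map[string]string{},
	}
}

// PreProcess masks matches in the outgoing request.
func (m *PIIMasker) PreProcess(req *ChatRequest) error {
	for _, p := range m.patterns {
		req.Content = p.ReplaceAllStringFunc(req.Content, func(match string) string {
			m.counter++
			placeholder := fmt.Sprintf("<PII_%d>", m.counter)
			m.values[placeholder] = match
			return placeholder
		})
	}
	return nil
}

// PostProcess restores the original values in the model response.
func (m *PIIMasker) PostProcess(resp *ChatResponse) error {
	for placeholder, original := range m.values {
		resp.Content = strings.ReplaceAll(resp.Content, placeholder, original)
	}
	return nil
}
```

With this shape, per-router enablement could simply mean attaching (or not attaching) such processors to a router's pipeline based on its configuration.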
Use Case
- I want to be able to mask sensitive information (PHI, PII, and PCI) before sending it to LLM providers, so that I'm not leaking it. If the model returns masked placeholders, I want them to be unmasked automatically and seamlessly, including in streaming responses (see the streaming sketch below).
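The streaming case is the trickiest part of this flow: a placeholder such as `<PII_3>` can be split across chunk boundaries, so the unmasker has to buffer a partially received placeholder before substituting. A minimal sketch, reusing the hypothetical placeholder format from above (the `StreamUnmasker` type and its `Next`/`Flush` methods are illustrative, not an existing API):

```go
package masking

import "strings"

// StreamUnmasker rewrites placeholders back to their original values as chunks
// arrive. It buffers a possibly incomplete placeholder at the end of a chunk.
type StreamUnmasker struct {
	values map[string]string // placeholder -> original value, filled during masking
	buf    string            // tail of the previous chunk that may start a placeholder
}

// Next takes the next raw chunk and returns the text that is safe to emit now.
func (u *StreamUnmasker) Next(chunk string) string {
	text := u.buf + chunk
	u.buf = ""

	// If the tail looks like the start of a placeholder ("<", "<PII_7", ...)
	// without a closing ">", hold it back until a later chunk completes it.
	if i := strings.LastIndex(text, "<"); i != -1 && !strings.Contains(text[i:], ">") {
		u.buf = text[i:]
		text = text[:i]
	}

	for placeholder, original := range u.values {
		text = strings.ReplaceAll(text, placeholder, original)
	}
	return text
}

// Flush returns whatever is still buffered once the stream has ended.
func (u *StreamUnmasker) Flush() string {
	out := u.buf
	u.buf = ""
	return out
}
```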