
Conversation

@ryanmeans

Description: (required)
This is effectively an update to the Pangea Guardrails plugin. First, it is a rebrand: Pangea has been acquired by CrowdStrike, and the product has a new name.

Second, it is updated for much newer APIs. The current APIs support blocking and redaction based on configured upstream rules. The new endpoint is designed to work with structured LLM inputs, e.g. you should pass it the entire messages array, but it also supports tools and other properties.
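
For reference, a minimal sketch of what a call to the new endpoint might look like, assuming the `/v1/guard_chat_completions` route shown in the diff and a bearer-token header; the auth scheme and response shape are assumptions, not the documented CrowdStrike schema. Only the idea of forwarding the structured chat-completions body (messages, tools, etc.) comes from this PR.

// Hypothetical sketch: forward the full chat-completions payload to the
// guardrails endpoint instead of a single flattened text string.
async function guardChatCompletions(
  baseUrl: string,
  apiKey: string,
  body: { messages: unknown[]; tools?: unknown[]; [key: string]: unknown },
): Promise<unknown> {
  const response = await fetch(`${baseUrl}/v1/guard_chat_completions`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Assumed auth header; check the CrowdStrike docs for the real scheme.
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  });
  return response.json();
}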

I think we should consider removing the Pangea guardrails as well.

Tests Run/Test cases added: (required)

  • Added tests for the new guardrails

Type of Change:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update
  • Refactoring (no functional changes)


// Build the guardrails URL and pick the request or response body to guard,
// depending on which hook is firing.
const url = `${parameters.credentials.baseUrl}/v1/guard_chat_completions`;
const target = eventType === 'beforeRequestHook' ? 'request' : 'response';
const json = context[target].json;
Collaborator

Since the whole request body is being passed without any transformation, this will only work for the chat completions API; it might fail for the OpenAI Responses API and the Anthropic Messages API formats if the CrowdStrike guardrails API does not support them.

Are the Responses and Messages formats supported? I do not see them in the documentation.
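
One possible mitigation, not part of this PR, would be to forward the body only when it looks like a chat-completions payload. A rough sketch; the shape check is an assumption about what the upstream API accepts, and what to do for other formats (skip the guard, fail open, or reject) would still need to be decided:

// Hypothetical guard: only treat payloads with a chat-completions style
// `messages` array as guardable; other formats fall through untouched.
function isChatCompletionsBody(json: unknown): json is { messages: unknown[] } {
  return (
    typeof json === 'object' &&
    json !== null &&
    Array.isArray((json as { messages?: unknown }).messages)
  );
}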

Collaborator


Marking this as a known limitation and merging this for now.

Author


@VisargD Thanks, I'll look into it on our end. Right now the endpoint accepts any kind of JSON, but it implicitly expects the OpenAI chat completions format, and we do have special logic for the messages array which might be lost.

@VisargD merged commit 7799507 into Portkey-AI:main Jan 23, 2026
1 check passed