@@ -95,6 +95,47 @@ var response = await chat.GetResponseAsync(messages, options);
 
 ```
 
+## Observing Request/Response
+
+The underlying HTTP pipeline provided by the Azure SDK allows setting up
+policies that observe requests and responses. This is useful for monitoring
+the traffic sent to and received from the AI service, regardless of how the
+chat pipeline is configured.
+
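+For comparison, this is roughly what a hand-rolled observing policy looks
+like with the `System.ClientModel` `PipelinePolicy` API. This is only a
+minimal sketch (the class name and console output are placeholders, not part
+of this library); the `Observe` method shown below packages the same idea
+into a single call:
+
+```csharp
+using System.ClientModel.Primitives;
+
+class ConsoleObservingPolicy : PipelinePolicy
+{
+    public override void Process(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
+    {
+        // Observe the outgoing request before it reaches the transport.
+        Console.WriteLine($"Request: {message.Request.Uri}");
+        ProcessNext(message, pipeline, currentIndex);
+        // Observe the response on its way back through the pipeline.
+        Console.WriteLine($"Response: {message.Response?.Status}");
+    }
+
+    public override async ValueTask ProcessAsync(PipelineMessage message, IReadOnlyList<PipelinePolicy> pipeline, int currentIndex)
+    {
+        Console.WriteLine($"Request: {message.Request.Uri}");
+        await ProcessNextAsync(message, pipeline, currentIndex);
+        Console.WriteLine($"Response: {message.Response?.Status}");
+    }
+}
+
+// Registered on any ClientPipelineOptions-derived options:
+// var options = new OpenAIClientOptions();
+// options.AddPolicy(new ConsoleObservingPolicy(), PipelinePosition.PerCall);
+```
+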
+Observation is set up on the `OpenAIClientOptions` (or, more generally, any
+`ClientPipelineOptions`-derived options) using the `Observe` method:
+
+```csharp
+var openai = new OpenAIClient(
+    Env.Get("OPENAI_API_KEY")!,
+    new OpenAIClientOptions().Observe(
+        onRequest: request => Console.WriteLine($"Request: {request}"),
+        onResponse: response => Console.WriteLine($"Response: {response}")));
+```
+
+For example, you can trivially collect both requests and responses for
+payload analysis in tests:
+
+```csharp
+var requests = new List<JsonNode>();
+var responses = new List<JsonNode>();
+var openai = new OpenAIClient(
+    Env.Get("OPENAI_API_KEY")!,
+    new OpenAIClientOptions().Observe(requests.Add, responses.Add));
+```
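+
+The captured payloads can then be inspected as plain JSON in the test body.
+The following is a minimal sketch, assuming xUnit and a chat request issued
+against a hypothetical `gpt-4o` model through a client built on this
+`openai` instance; adjust the asserted fields to your own payloads:
+
+```csharp
+using System.Text.Json.Nodes;
+using Xunit;
+
+// ... issue a chat request through a client created from `openai` ...
+
+// Exactly one request should have been captured.
+var request = Assert.Single(requests);
+
+// The body is the raw OpenAI REST payload, so well-known fields such as
+// "model" and "messages" can be read straight off the JsonNode.
+Assert.Equal("gpt-4o", request?["model"]?.GetValue<string>());
+Assert.True(request?["messages"]?.AsArray().Count > 0);
+```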
+
+We also provide a shorthand factory method that creates the options and sets
+up observation in a single call:
+
+```csharp
+var requests = new List<JsonNode>();
+var responses = new List<JsonNode>();
+var openai = new OpenAIClient(
+    Env.Get("OPENAI_API_KEY")!,
+    ClientOptions.Observe(requests.Add, responses.Add));
+```
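+
+The result is equivalent to creating the options yourself and calling
+`Observe` on them, as in the previous example.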
+
 
 ## Console Logging
 