**Describe the bug**
Invoking `AddOpenAIEmbeddingGenerator()` on an `IKernelBuilder` instance adds a `Microsoft.Extensions.AI.IEmbeddingGenerator` that does not respect the `BaseAddress` of its `httpClient` parameter. This means that when you eventually call `GenerateAsync()` on the embedding service, it attempts to connect to "https://api.openai.com/" instead of the URI specified in the client. The same `HttpClient` instance works correctly with the now-obsolete `AddOpenAITextEmbeddingGeneration()`, the generated `ITextEmbeddingGenerationService`, and the eventual call to `GenerateEmbeddingAsync()`. It also continues to work with `AddOpenAIChatCompletion()` and the eventual call to `GetChatMessageContentAsync()`.

**To Reproduce**
Create a kernel, add chat completion and embedding generation, then try to generate some embeddings. The call will hit the wrong server.

```
// NOTE: This is a fake URI, not the one we actually use, but it does not matter because
// GenerateAsync() will end up hitting "https://api.openai.com/" instead.
Uri openAiProxy = new Uri("https://SOME.PROXY.URL/openai/v1");
HttpClientHandler handler = new();
HttpClient httpClient = new(handler) { BaseAddress = openAiProxy };

var builder = Kernel.CreateBuilder();

// NOTE: _model and apiKey are strings set elsewhere. They do not matter for purposes of this issue.
builder = builder.AddOpenAIChatCompletion(modelId: _model, apiKey: apiKey, httpClient: httpClient);

#pragma warning disable SKEXP0010
// NOTE: _embeddingModel and apiKey are strings set elsewhere. They do not matter for purposes of this issue.
builder = builder.AddOpenAIEmbeddingGenerator(modelId: _embeddingModel, apiKey: apiKey, httpClient: httpClient);
#pragma warning restore SKEXP0010

var _kernel = builder.Build();
var _chatService = _kernel.GetRequiredService<IChatCompletionService>();
var _embeddingService = _kernel.GetRequiredService<IEmbeddingGenerator<string, Embedding<float>>>();

// THIS CALL WILL HIT THE WRONG SERVER, even though a call to
// _chatService.GetChatMessageContentAsync() would work correctly here.
var response = await _embeddingService.GenerateAsync("Some string");
```

**Expected behavior**
The call to `_embeddingService.GenerateAsync("Some string")` should invoke "https://SOME.PROXY.URL/openai/v1/embeddings". Instead, it invokes "https://api.openai.com/v1/embeddings".

**Screenshots**
N/A

**Platform**
- Language: C#
- Source: Microsoft.SemanticKernel 1.55.0 (NuGet)
- AI model: N/A
- IDE: Visual Studio
- OS: Windows

**Additional context**
I cannot share a true working reproduction of this issue because I cannot post credentials to log in to our proxy. The easiest way to see this issue is to implement a custom `HttpClientHandler`, override `SendAsync()`, and set a breakpoint. I was able to work around this issue by modifying `request.RequestUri` in `SendAsync()`.

Here is a screenshot of the breakpoint working for the /v1/chat/completions call. You can see the request URI is our custom target.

<img width="1250" height="638" alt="Image" src="https://github.com/user-attachments/assets/a21030e2-5014-4316-92d3-7bfb2f0fa847" />

Here is a screenshot for embeddings. You can see that it is still the default OpenAI endpoint:

<img width="1277" height="676" alt="Image" src="https://github.com/user-attachments/assets/4ccbe48b-a6d5-48a6-92ab-367823953c00" />

NOTE: The screenshot shows the code that works around the issue. The breakpoint is immediately before the workaround.
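For anyone hitting the same problem, here is a minimal sketch of the workaround described above: a custom handler that rewrites `request.RequestUri` in `SendAsync()`. The class name, the host rewrite logic, and the `SOME.PROXY.URL` target are illustrative assumptions (the host is the fake URI from the repro, not our real proxy); this is not the actual code from the screenshots.

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical workaround handler: redirects requests that the SDK sent to
// api.openai.com (ignoring BaseAddress) over to the proxy instead.
public sealed class RewriteOpenAIEndpointHandler : DelegatingHandler
{
    public RewriteOpenAIEndpointHandler() : base(new HttpClientHandler()) { }

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Only rewrite calls that went to the default OpenAI endpoint.
        if (request.RequestUri is { Host: "api.openai.com" } original)
        {
            // Maps e.g. https://api.openai.com/v1/embeddings
            //        to https://SOME.PROXY.URL/openai/v1/embeddings
            var rewritten = new UriBuilder(original)
            {
                Scheme = "https",
                Host = "SOME.PROXY.URL",
                Port = -1, // use the default port for the scheme
                Path = "/openai" + original.AbsolutePath,
            };
            request.RequestUri = rewritten.Uri;
        }
        return base.SendAsync(request, cancellationToken);
    }
}
```

Passing `new HttpClient(new RewriteOpenAIEndpointHandler())` as the `httpClient` argument to `AddOpenAIEmbeddingGenerator()` then routes the embeddings call to the proxy, but this should not be necessary; the generator should honor `BaseAddress` like the chat-completion service does.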