Changes from 3 commits
253 changes: 253 additions & 0 deletions src/Custom/ChatModelFactory.cs
@@ -0,0 +1,253 @@
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Diagnostics.CodeAnalysis;
using System.Linq;
using OpenAI.Chat;

namespace OpenAI;

/// <summary> Model factory for Chat models. </summary>
public static partial class ChatModelFactory
{
/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatCompletion"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatCompletion"/> instance for mocking. </returns>
public static ChatCompletion ChatCompletion(
Collaborator: @KrzysztofCwalina - I was wondering: why do we have to write the model factory methods manually? Shouldn't these be generated by the code gen?

Collaborator: I am not sure. Josh would know.

Collaborator: @JoshLove-msft / @jorgerangel-msft / @joseharriaga - Do you know why model factory methods aren't being generated for these types?

Collaborator: The implementation is calling into a different model factory overload? Isn't that one being generated? What is the purpose of this new overload?

Collaborator: Oh, this is a namespace-specific model factory. The generator doesn't support this currently. There is one model factory per library.

Collaborator: Wait, maybe the PR I was working on merged before the last GA? I thought we punted on it and this was the attempt to restart it. Let me investigate.

Collaborator: For reference, some of these custom methods are already being generated in the root namespace: https://github.com/openai/openai-dotnet/blob/main/src/Generated/OpenAIModelFactory.cs#L296

Collaborator: @copilot, don't add model factory methods (or classes) if they already exist. Only add them if they don't, as described in this PR description.

Contributor Author: You're correct. I've removed the entire ChatModelFactory class since OpenAIChatModelFactory already provides all the necessary factory methods for Chat-related types. Commit ea57bbc removes the redundant factory class and its associated tests.

string id = null,
ChatFinishReason finishReason = default,
ChatMessageContent content = null,
string refusal = null,
IEnumerable<ChatToolCall> toolCalls = null,
ChatMessageRole role = default,
IEnumerable<ChatTokenLogProbabilityDetails> contentTokenLogProbabilities = null,
IEnumerable<ChatTokenLogProbabilityDetails> refusalTokenLogProbabilities = null,
DateTimeOffset createdAt = default,
string model = null,
string systemFingerprint = null,
ChatTokenUsage usage = default)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatCompletion(
id: id,
finishReason: finishReason,
content: content,
refusal: refusal,
toolCalls: toolCalls,
role: role,
functionCall: default,
contentTokenLogProbabilities: contentTokenLogProbabilities,
refusalTokenLogProbabilities: refusalTokenLogProbabilities,
createdAt: createdAt,
model: model,
systemFingerprint: systemFingerprint,
usage: usage);
}
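
In a unit test, the factory above can stand in for a real service response without any network call. A minimal sketch (all argument values below are made-up placeholders, not values from this PR):

```csharp
using OpenAI.Chat;

// Sketch: construct a mocked ChatCompletion entirely in memory.
// Unspecified parameters fall back to the factory's defaults.
ChatCompletion completion = ChatModelFactory.ChatCompletion(
    id: "chatcmpl-123",
    finishReason: ChatFinishReason.Stop,
    model: "gpt-4o-mini",
    usage: ChatModelFactory.ChatTokenUsage(
        outputTokenCount: 5,
        inputTokenCount: 10,
        totalTokenCount: 15));
```

Because the method only delegates to OpenAIChatModelFactory, the sketch above is equivalent to calling that root-namespace factory directly.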

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.StreamingChatCompletionUpdate"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.StreamingChatCompletionUpdate"/> instance for mocking. </returns>
public static StreamingChatCompletionUpdate StreamingChatCompletionUpdate(
string completionId = null,
ChatMessageContent contentUpdate = null,
IEnumerable<StreamingChatToolCallUpdate> toolCallUpdates = null,
ChatMessageRole? role = default,
string refusalUpdate = null,
IEnumerable<ChatTokenLogProbabilityDetails> contentTokenLogProbabilities = null,
IEnumerable<ChatTokenLogProbabilityDetails> refusalTokenLogProbabilities = null,
ChatFinishReason? finishReason = default,
DateTimeOffset createdAt = default,
string model = null,
string systemFingerprint = null,
ChatTokenUsage usage = default)
{
return OpenAI.Chat.OpenAIChatModelFactory.StreamingChatCompletionUpdate(
completionId: completionId,
contentUpdate: contentUpdate,
functionCallUpdate: default,
toolCallUpdates: toolCallUpdates,
role: role,
refusalUpdate: refusalUpdate,
contentTokenLogProbabilities: contentTokenLogProbabilities,
refusalTokenLogProbabilities: refusalTokenLogProbabilities,
finishReason: finishReason,
createdAt: createdAt,
model: model,
systemFingerprint: systemFingerprint,
usage: usage);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatCompletionDeletionResult"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatCompletionDeletionResult"/> instance for mocking. </returns>
public static ChatCompletionDeletionResult ChatCompletionDeletionResult(
bool deleted = true,
string chatCompletionId = null)
{
return new ChatCompletionDeletionResult(
deleted: deleted,
@object: "chat.completion.deleted",
chatCompletionId: chatCompletionId,
additionalBinaryDataProperties: null);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.UserChatMessage"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.UserChatMessage"/> instance for mocking. </returns>
public static UserChatMessage UserChatMessage(
string content = null,
string participantName = null)
{
content ??= "User message content";
var message = new UserChatMessage(content);
if (participantName != null)
{
message.ParticipantName = participantName;
}
return message;
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.AssistantChatMessage"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.AssistantChatMessage"/> instance for mocking. </returns>
public static AssistantChatMessage AssistantChatMessage(
string content = null,
IEnumerable<ChatToolCall> toolCalls = null,
string participantName = null)
{
AssistantChatMessage message;
if (toolCalls != null)
{
message = new AssistantChatMessage(toolCalls);
}
else
{
content ??= "Assistant message content";
message = new AssistantChatMessage(content);
}

if (participantName != null)
{
message.ParticipantName = participantName;
}
return message;
}
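
The branching above means a caller supplies either text content or tool calls; when toolCalls is non-null, content is ignored. A sketch of both paths (the tool-call id, function name, and arguments are illustrative placeholders):

```csharp
using System;
using OpenAI.Chat;

// Text path: content is used, participantName is optional.
AssistantChatMessage textMessage = ChatModelFactory.AssistantChatMessage(
    content: "Sure, I can help.",
    participantName: "assistant-1");

// Tool-call path: toolCalls wins, so content is not set.
AssistantChatMessage toolMessage = ChatModelFactory.AssistantChatMessage(
    toolCalls: new[]
    {
        ChatToolCall.CreateFunctionToolCall(
            id: "call_abc",
            functionName: "get_weather",
            functionArguments: BinaryData.FromString("{\"city\":\"Oslo\"}")),
    });
```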

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.SystemChatMessage"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.SystemChatMessage"/> instance for mocking. </returns>
public static SystemChatMessage SystemChatMessage(
string content = null,
string participantName = null)
{
content ??= "System message content";
var message = new SystemChatMessage(content);
if (participantName != null)
{
message.ParticipantName = participantName;
}
return message;
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ToolChatMessage"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ToolChatMessage"/> instance for mocking. </returns>
public static ToolChatMessage ToolChatMessage(
string toolCallId = "tool_call_id",
string content = null)
{
content ??= "Tool message content";
return new ToolChatMessage(toolCallId, content);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatTokenUsage"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatTokenUsage"/> instance for mocking. </returns>
public static ChatTokenUsage ChatTokenUsage(
int outputTokenCount = default,
int inputTokenCount = default,
int totalTokenCount = default)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatTokenUsage(
outputTokenCount: outputTokenCount,
inputTokenCount: inputTokenCount,
totalTokenCount: totalTokenCount);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatTokenLogProbabilityDetails"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatTokenLogProbabilityDetails"/> instance for mocking. </returns>
public static ChatTokenLogProbabilityDetails ChatTokenLogProbabilityDetails(
string token = null,
float logProbability = default,
ReadOnlyMemory<byte>? utf8Bytes = null,
IEnumerable<ChatTokenTopLogProbabilityDetails> topLogProbabilities = null)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatTokenLogProbabilityDetails(
token: token,
logProbability: logProbability,
utf8Bytes: utf8Bytes,
topLogProbabilities: topLogProbabilities);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatTokenTopLogProbabilityDetails"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatTokenTopLogProbabilityDetails"/> instance for mocking. </returns>
public static ChatTokenTopLogProbabilityDetails ChatTokenTopLogProbabilityDetails(
string token = null,
float logProbability = default,
ReadOnlyMemory<byte>? utf8Bytes = null)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatTokenTopLogProbabilityDetails(
token: token,
logProbability: logProbability,
utf8Bytes: utf8Bytes);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.StreamingChatToolCallUpdate"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.StreamingChatToolCallUpdate"/> instance for mocking. </returns>
public static StreamingChatToolCallUpdate StreamingChatToolCallUpdate(
int index = default,
string toolCallId = null,
ChatToolCallKind kind = default,
string functionName = null,
BinaryData functionArgumentsUpdate = null)
{
return OpenAI.Chat.OpenAIChatModelFactory.StreamingChatToolCallUpdate(
index: index,
toolCallId: toolCallId,
kind: kind,
functionName: functionName,
functionArgumentsUpdate: functionArgumentsUpdate);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatMessageAnnotation"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatMessageAnnotation"/> instance for mocking. </returns>
public static ChatMessageAnnotation ChatMessageAnnotation(
int startIndex = default,
int endIndex = default,
Uri webResourceUri = default,
string webResourceTitle = default)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatMessageAnnotation(
startIndex: startIndex,
endIndex: endIndex,
webResourceUri: webResourceUri,
webResourceTitle: webResourceTitle);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatInputTokenUsageDetails"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatInputTokenUsageDetails"/> instance for mocking. </returns>
public static ChatInputTokenUsageDetails ChatInputTokenUsageDetails(
int audioTokenCount = default,
int cachedTokenCount = default)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatInputTokenUsageDetails(
audioTokenCount: audioTokenCount,
cachedTokenCount: cachedTokenCount);
}

/// <summary> Initializes a new instance of <see cref="OpenAI.Chat.ChatOutputTokenUsageDetails"/>. </summary>
/// <returns> A new <see cref="OpenAI.Chat.ChatOutputTokenUsageDetails"/> instance for mocking. </returns>
public static ChatOutputTokenUsageDetails ChatOutputTokenUsageDetails(
int reasoningTokenCount = default,
int audioTokenCount = default,
int acceptedPredictionTokenCount = default,
int rejectedPredictionTokenCount = default)
{
return OpenAI.Chat.OpenAIChatModelFactory.ChatOutputTokenUsageDetails(
reasoningTokenCount: reasoningTokenCount,
audioTokenCount: audioTokenCount,
acceptedPredictionTokenCount: acceptedPredictionTokenCount,
rejectedPredictionTokenCount: rejectedPredictionTokenCount);
}
}
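
Taken together, these factories would also let a test simulate a streamed response by handing a sequence of mocked updates to the code under test. A hedged sketch (the completion id and content chunks are placeholders; this assumes ChatMessageContent can be built from a string):

```csharp
using System.Collections.Generic;
using OpenAI.Chat;

// Sketch: two mocked streaming chunks, the second carrying the finish reason,
// mirroring how a real streamed completion arrives piecewise.
IEnumerable<StreamingChatCompletionUpdate> updates = new[]
{
    ChatModelFactory.StreamingChatCompletionUpdate(
        completionId: "chatcmpl-123",
        contentUpdate: new ChatMessageContent("Hel")),
    ChatModelFactory.StreamingChatCompletionUpdate(
        completionId: "chatcmpl-123",
        contentUpdate: new ChatMessageContent("lo"),
        finishReason: ChatFinishReason.Stop),
};
```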