Merged
28 changes: 14 additions & 14 deletions openapi.yaml
Expand Up @@ -10900,16 +10900,7 @@ components:
An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should match the audio language.
type: string
response_format:
description: |
The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`.
type: string
enum:
- json
- text
- srt
- verbose_json
- vtt
default: json
$ref: "#/components/schemas/AudioResponseFormat"
temperature:
description: |
The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit.
Expand Down Expand Up @@ -11042,6 +11033,18 @@ components:
group: audio
example: *verbose_transcription_response_example

AudioResponseFormat:
description: |
The format of the output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`.
type: string
enum:
- json
- text
- srt
- verbose_json
- vtt
default: json

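The extracted `AudioResponseFormat` schema above centralizes the enum and its default so both the transcription and translation requests reference one definition. As a rough sketch of the validation behavior the schema implies, the constants below mirror the enum values and the `json` default; the function and constant names are illustrative, not from any SDK:

```python
# Illustrative mirror of the AudioResponseFormat component schema:
# enum of allowed values plus the schema's declared default.
AUDIO_RESPONSE_FORMATS = {"json", "text", "srt", "verbose_json", "vtt"}
DEFAULT_AUDIO_RESPONSE_FORMAT = "json"


def resolve_response_format(value=None):
    """Return a valid response_format, applying the schema default when omitted."""
    if value is None:
        # OpenAPI `default: json` — used when the client sends no value.
        return DEFAULT_AUDIO_RESPONSE_FORMAT
    if value not in AUDIO_RESPONSE_FORMATS:
        raise ValueError(
            f"response_format must be one of {sorted(AUDIO_RESPONSE_FORMATS)}, got {value!r}"
        )
    return value
```

Because both `CreateTranscriptionRequest` and `CreateTranslationRequest` now `$ref` this schema, a check like this only has to exist once on the client side as well.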
CreateTranslationRequest:
type: object
additionalProperties: false
Expand All @@ -11066,10 +11069,7 @@ components:
An optional text to guide the model's style or continue a previous audio segment. The [prompt](/docs/guides/speech-to-text/prompting) should be in English.
type: string
response_format:
description: |
The format of the transcript output, in one of these options: `json`, `text`, `srt`, `verbose_json`, or `vtt`.
type: string
default: json
$ref: "#/components/schemas/AudioResponseFormat"
temperature:
description: |
The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use [log probability](https://en.wikipedia.org/wiki/Log_probability) to automatically increase the temperature until certain thresholds are hit.
Expand Down