2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "2.14.0"
".": "2.15.0"
}
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,25 @@
# Changelog

## 2.15.0 (2025-07-17)

Full Changelog: [v2.14.0...v2.15.0](https://github.com/openai/openai-java/compare/v2.14.0...v2.15.0)

### Features

* **client:** add `ResponseAccumulator` ([#391](https://github.com/openai/openai-java/issues/391)) ([77f54fd](https://github.com/openai/openai-java/commit/77f54fdea8bf0a609f90ec511977531bffc1a9b1))


### Bug Fixes

* **client:** ensure error handling always occurs ([a00c39b](https://github.com/openai/openai-java/commit/a00c39b9b1e06a15fa3a0b2b495adfff86cddd10))


### Chores

* **client:** remove non-existent method ([2d185ba](https://github.com/openai/openai-java/commit/2d185ba387569d90ffffa07adf3337ffce918e3e))
* **internal:** Add CONTRIBUTING.md for SDK developers ([608947c](https://github.com/openai/openai-java/commit/608947cd875167c5aa2beb65cf98f47050914b71))
* **internal:** allow running specific example from cli ([3239c2d](https://github.com/openai/openai-java/commit/3239c2de360864456786043a2a3ffb1a71ac0a45))

## 2.14.0 (2025-07-16)

Full Changelog: [v2.13.1...v2.14.0](https://github.com/openai/openai-java/compare/v2.13.1...v2.14.0)
217 changes: 217 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,217 @@
# Contributing to OpenAI Java SDK

## Setting up the environment

This repository uses [Gradle](https://gradle.org/) with Kotlin DSL for building and dependency management. The SDK requires Java 8, but development requires JDK 21 for the Kotlin toolchain.
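As a sketch of what the JDK 21 requirement looks like in practice, a Gradle Kotlin DSL toolchain pin resembles the following (illustrative only; the repository's actual convention plugins under `buildSrc/` define this, so you do not need to add it yourself):

```kotlin
// Hypothetical build.gradle.kts fragment: pin the Kotlin/JVM toolchain to JDK 21
// while the published artifacts still target Java 8 bytecode.
kotlin {
    jvmToolchain(21)
}
```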

## Project structure

The SDK consists of three artifacts:

- `openai-java-core`
- Contains core SDK logic
- Does not depend on [OkHttp](https://square.github.io/okhttp)
- Exposes [`OpenAIClient`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClient.kt), [`OpenAIClientAsync`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsync.kt), [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt), and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), all of which can work with any HTTP client
- `openai-java-client-okhttp`
- Depends on [OkHttp](https://square.github.io/okhttp)
- Exposes [`OpenAIOkHttpClient`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClient.kt) and [`OpenAIOkHttpClientAsync`](openai-java-client-okhttp/src/main/kotlin/com/openai/client/okhttp/OpenAIOkHttpClientAsync.kt), which provide a way to construct [`OpenAIClientImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientImpl.kt) and [`OpenAIClientAsyncImpl`](openai-java-core/src/main/kotlin/com/openai/client/OpenAIClientAsyncImpl.kt), respectively, using OkHttp
- `openai-java`
- Depends on and exposes the APIs of both `openai-java-core` and `openai-java-client-okhttp`
- Does not have its own logic
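In practice this split means a project that wants the default OkHttp transport depends on `openai-java` alone, while a project supplying its own HTTP client depends only on `openai-java-core`. A minimal Gradle sketch (the version number is illustrative):

```kotlin
dependencies {
    // Everything, including the OkHttp-based client:
    implementation("com.openai:openai-java:2.15.0")

    // Or, to bring your own HTTP client, depend on the core artifact only:
    // implementation("com.openai:openai-java-core:2.15.0")
}
```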

## Modifying or adding code

Most of the SDK is generated code. Manual modifications are persisted between generations, but may
result in merge conflicts between your patches and changes from the generator. The generator never
modifies the contents of the `openai-java-example/` directory.

## Adding and running examples

The generator never modifies files in the `openai-java-example/` directory, so you can freely edit existing examples or add new ones.

```java
// openai-java-example/src/main/java/com/openai/example/YourExample.java
package com.openai.example;

public class YourExample {
    public static void main(String[] args) {
        // ...
    }
}
```

```sh
$ ./gradlew :openai-java-example:run -PmainClass=com.openai.example.YourExample
```

## Using the repository from source

If you'd like to use the repository from source, you can either [install from git](https://jitpack.io/) or link to a cloned repository.

To use a local version of this library from source in another project, you can publish it to your local Maven repository:

```sh
$ ./gradlew publishToMavenLocal
```

> [!NOTE]
> For now, to publish locally, you'll need to comment out the `signAllPublications()` line in `buildSrc/src/main/kotlin/openai.publish.gradle.kts`.

Then in your project's `build.gradle.kts` or `pom.xml`, reference the locally published version:

<!-- x-release-please-start-version -->

```kotlin
implementation("com.openai:openai-java:2.9.1")
```

```xml
<dependency>
    <groupId>com.openai</groupId>
    <artifactId>openai-java</artifactId>
    <version>2.9.1</version>
</dependency>
```

<!-- x-release-please-end -->

Alternatively, you can build and install the JAR files directly:

```sh
$ ./gradlew build
```

JAR files will be available in each module's `build/libs/` directory.

## Running tests

Most tests require [our mock server](https://github.com/stoplightio/prism) to be running against the OpenAPI spec.

The test script will automatically start the mock server for you (if it's not already running) and run the tests against it:

```sh
$ ./scripts/test
```

You can also manually start the mock server if you want to run tests repeatedly:

```sh
$ ./scripts/mock
```

Then run the tests:

```sh
$ ./scripts/test
```

### Test configuration

- Tests run in parallel for better performance
- Mock server runs on `localhost:4010`
- You can disable mock server tests with `SKIP_MOCK_TESTS=true`
- You can target a custom API URL with `TEST_API_BASE_URL=<url>`
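The environment variables above can be combined with the test script like this (a sketch, assuming a POSIX shell; the URL is a placeholder):

```shell
# Skip the tests that need the Prism mock server:
SKIP_MOCK_TESTS=true ./scripts/test

# Point the tests at a custom API base URL (placeholder value shown):
TEST_API_BASE_URL=http://localhost:4010 ./scripts/test
```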

### Testing framework

The project uses:

- **JUnit 5** for test framework
- **Mockito** for mocking
- **AssertJ** for fluent assertions
- **WireMock** for HTTP service mocking
- **Custom TestServerExtension** for mock server management

## Linting and formatting

This repository uses [Spotless](https://github.com/diffplug/spotless) with Palantir Java Format for code formatting and various linting tools.

To check formatting and run lints:

```sh
$ ./scripts/lint
```

This will compile all modules and run static analysis checks.

To fix all formatting issues automatically:

```sh
$ ./scripts/format
```

You can also check formatting directly with Gradle:

```sh
$ ./gradlew spotlessCheck # Check formatting
```

## Building

To build all modules:

```sh
$ ./gradlew build
```

To build a specific module:

```sh
$ ./gradlew :openai-java-core:build
```


## Publishing and releases

Releases made through the automated release PR pipeline publish to Maven Central automatically. For
changes made outside that pipeline, you may need to release manually.

### Publish with a GitHub workflow

You can release to package managers by using [the `Publish Sonatype` GitHub action](https://www.github.com/openai/openai-java/actions/workflows/publish-sonatype.yml). This requires organization or repository secrets to be configured.

### Publish manually

If you need to manually release a package, you can run:

```sh
$ ./gradlew publishToSonatype closeAndReleaseSonatypeStagingRepository
```

This requires the following environment variables to be set:

- `SONATYPE_USER` - Your Sonatype Central Portal username
- `SONATYPE_PASSWORD` - Your Sonatype Central Portal password
- `GPG_SIGNING_KEY` - Your GPG private key for signing artifacts
- `GPG_SIGNING_PASSWORD` - Your GPG key passphrase
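A sketch of what a manual release session might look like (all values are placeholders; never commit real credentials to the repository or your shell history):

```shell
export SONATYPE_USER="your-username"             # placeholder
export SONATYPE_PASSWORD="your-password"         # placeholder
export GPG_SIGNING_KEY="$(cat private-key.asc)"  # placeholder key file
export GPG_SIGNING_PASSWORD="your-passphrase"    # placeholder

./gradlew publishToSonatype closeAndReleaseSonatypeStagingRepository
```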

## Development tools

### Available gradle tasks

Some useful Gradle tasks:

```sh
$ ./gradlew tasks # List all available tasks
$ ./gradlew build # Build all modules
$ ./gradlew test # Run all tests
$ ./gradlew spotlessApply # Format code
$ ./gradlew publishToMavenLocal # Publish to local Maven repository
$ ./gradlew dependencies # Show dependency tree
```
72 changes: 62 additions & 10 deletions README.md
@@ -2,16 +2,16 @@

<!-- x-release-please-start-version -->

[![Maven Central](https://img.shields.io/maven-central/v/com.openai/openai-java)](https://central.sonatype.com/artifact/com.openai/openai-java/2.14.0)
[![javadoc](https://javadoc.io/badge2/com.openai/openai-java/2.14.0/javadoc.svg)](https://javadoc.io/doc/com.openai/openai-java/2.14.0)
[![Maven Central](https://img.shields.io/maven-central/v/com.openai/openai-java)](https://central.sonatype.com/artifact/com.openai/openai-java/2.15.0)
[![javadoc](https://javadoc.io/badge2/com.openai/openai-java/2.15.0/javadoc.svg)](https://javadoc.io/doc/com.openai/openai-java/2.15.0)

<!-- x-release-please-end -->

The OpenAI Java SDK provides convenient access to the [OpenAI REST API](https://platform.openai.com/docs) from applications written in Java.

<!-- x-release-please-start-version -->

The REST API documentation can be found on [platform.openai.com](https://platform.openai.com/docs). Javadocs are available on [javadoc.io](https://javadoc.io/doc/com.openai/openai-java/2.14.0).
The REST API documentation can be found on [platform.openai.com](https://platform.openai.com/docs). Javadocs are available on [javadoc.io](https://javadoc.io/doc/com.openai/openai-java/2.15.0).

<!-- x-release-please-end -->

@@ -22,7 +22,7 @@ The REST API documentation can be found on [platform.openai.com](https://platfor
### Gradle

```kotlin
implementation("com.openai:openai-java:2.14.0")
implementation("com.openai:openai-java:2.15.0")
```

### Maven
@@ -31,7 +31,7 @@ implementation("com.openai:openai-java:2.14.0")
<dependency>
<groupId>com.openai</groupId>
<artifactId>openai-java</artifactId>
<version>2.14.0</version>
<version>2.15.0</version>
</dependency>
```

@@ -350,6 +350,53 @@ client.chat()
ChatCompletion chatCompletion = chatCompletionAccumulator.chatCompletion();
```

The SDK provides conveniences for streamed responses. A
[`ResponseAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt)
can record the stream of response events as they are processed and accumulate a
[`Response`](openai-java-core/src/main/kotlin/com/openai/models/responses/Response.kt)
object similar to that which would have been returned by the non-streaming API.

For a synchronous response, add a
[`Stream.peek()`](https://docs.oracle.com/javase/8/docs/api/java/util/stream/Stream.html#peek-java.util.function.Consumer-)
call to the stream pipeline to accumulate each event:

```java
import com.openai.core.http.StreamResponse;
import com.openai.helpers.ResponseAccumulator;
import com.openai.models.responses.Response;
import com.openai.models.responses.ResponseStreamEvent;

ResponseAccumulator responseAccumulator = ResponseAccumulator.create();

try (StreamResponse<ResponseStreamEvent> streamResponse =
        client.responses().createStreaming(createParams)) {
    streamResponse.stream()
            .peek(responseAccumulator::accumulate)
            .flatMap(event -> event.outputTextDelta().stream())
            .forEach(textEvent -> System.out.print(textEvent.delta()));
}

Response response = responseAccumulator.response();
```

For an asynchronous response, add the `ResponseAccumulator` to the `subscribe()` call:

```java
import com.openai.helpers.ResponseAccumulator;
import com.openai.models.responses.Response;

ResponseAccumulator responseAccumulator = ResponseAccumulator.create();

client.responses()
        .createStreaming(createParams)
        .subscribe(event -> responseAccumulator.accumulate(event)
                .outputTextDelta().ifPresent(textEvent -> System.out.print(textEvent.delta())))
        .onCompleteFuture()
        .join();

Response response = responseAccumulator.response();
```

## Structured outputs with JSON schemas

OpenAI [Structured Outputs](https://platform.openai.com/docs/guides/structured-outputs?api-mode=chat)
@@ -527,11 +574,16 @@ For a full example of the usage of _Structured Outputs_ with Streaming and the C
see
[`StructuredOutputsStreamingExample`](openai-java-example/src/main/java/com/openai/example/StructuredOutputsStreamingExample.java).

At present, there is no accumulator for streaming responses using the Responses API. It is still
possible to derive a JSON schema from a Java class and create a streaming response for a
[`StructuredResponseCreateParams`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponseCreateParams.kt)
object, but there is no helper for deserialization of the response to an instance of that Java
class.
With the Responses API, accumulate events while streaming using the
[`ResponseAccumulator`](openai-java-core/src/main/kotlin/com/openai/helpers/ResponseAccumulator.kt).
Once accumulated, use `ResponseAccumulator.response(Class<T>)` to convert the accumulated `Response`
into a
[`StructuredResponse`](openai-java-core/src/main/kotlin/com/openai/models/responses/StructuredResponse.kt).
The `StructuredResponse` can then automatically deserialize the JSON content into instances of
your Java class.

For a full example of the usage of _Structured Outputs_ with Streaming and the Responses API, see
[`ResponsesStructuredOutputsStreamingExample`](openai-java-example/src/main/java/com/openai/example/ResponsesStructuredOutputsStreamingExample.java).

### Defining JSON schema properties

2 changes: 1 addition & 1 deletion build.gradle.kts
Original file line number Diff line number Diff line change
@@ -8,7 +8,7 @@ repositories {

allprojects {
group = "com.openai"
version = "2.14.0" // x-release-please-version
version = "2.15.0" // x-release-please-version
}

subprojects {