@SamBarker (Contributor) commented Nov 6, 2024

What is the purpose of the change

Brief change log

Verifying this change

This change is already covered by existing tests: test_application_operations.sh should pass with each of the supported JDK versions.

Does this pull request potentially affect one of the following parts:

  • Dependencies (does it add or upgrade a dependency): no
  • The public API, i.e., any changes to the CustomResourceDescriptors: no
  • Core observer or reconciler logic that is regularly executed: no

Documentation

  • Does this pull request introduce a new feature? no
  • If yes, how is the feature documented? not applicable

@SamBarker (Contributor, Author):

cc: @tomncooper & @robobario

```yaml
      - "v1_18"
    uses: ./.github/workflows/e2e.yaml
    with:
      java-version: ${{ matrix.java-version }}
```


I'm wondering if it's a bit funky that we are using java-version to control both the JDK/JRE used to build/run the operator and the runtime of the flink image. Aren't they different dimensions of the matrix?

This would show that a JDK17 operator works with a JDK17 flink image, but not cover the current default operator runtime with a JDK17 flink image.
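One way to express the two dimensions separately would be an explicit matrix over both versions. This is only a sketch of the idea being discussed; `operator-java-version` and `flink-java-version` are hypothetical names, not inputs that exist in the current workflow:

```yaml
# Hypothetical sketch: vary the JDK of the Flink image independently of the
# operator's JDK, rather than driving both from a single java-version value.
jobs:
  e2e:
    strategy:
      matrix:
        operator-java-version: ["11", "17"]  # JDK used to build/run the operator
        flink-java-version: ["11", "17"]     # JDK inside the Flink image
    uses: ./.github/workflows/e2e.yaml
    with:
      java-version: ${{ matrix.operator-java-version }}
      flink-java-version: ${{ matrix.flink-java-version }}
```

This would cover all four combinations, including the default operator runtime against a JDK17 Flink image, at the cost of more CI jobs.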

@SamBarker (Author) replied:

Yeah, I had debated making a flink-tag parameter but didn't, as we would then have to manually encode the matrix.

> This would show that a JDK17 operator works with a JDK17 flink image, but not cover the current default operator runtime with a JDK17 flink image.

I'm not convinced that the operator JDK really matters, so I don't think it's particularly important to have a JDK11 operator deploying a JDK17 Flink. I can go in that direction if others see it as valuable.


Yeah, you're right, we aren't likely to see incompatibilities arising from different JRE combinations. Maybe it would be better to hold the operator JDK constant and vary only the Flink runtime, just to make it clearer which thing is being varied.

Contributor:

Yeah, I agree with @SamBarker. The issue here is actually not the interplay of JDK versions but rather the config supplied by the operator's logic not being compatible with (or, more accurately, not enabling) Java 17 support. So that's what we need to test.

```yaml
  create-namespace:
    type: boolean
    default: false
  append-java-version:
```


Maybe this should be an optional flink-java-version, and the matrix could contain the versions we want to test.
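A sketch of that suggestion, assuming e2e.yaml is a reusable workflow (the input name, description, and default below are illustrative, not part of the existing file): an optional input alongside the existing ones that falls back to the current behaviour when unset.

```yaml
# Hypothetical addition to the inputs of .github/workflows/e2e.yaml.
on:
  workflow_call:
    inputs:
      create-namespace:
        type: boolean
        default: false
      flink-java-version:
        description: "JDK version for the Flink image; empty keeps the image default"
        type: string
        required: false
        default: ""
```

The calling workflow's matrix could then list only the Flink image JDKs under test, without touching the operator's own runtime.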

@SamBarker changed the title "[FLINK-36646] Test different versions of the JDK" to "[FLINK-36646] Test different versions of the JDK in the Flink image" on Nov 6, 2024
@tomncooper (Contributor) left a comment:

LGTM, just need to update the changelog in the PR description.


@gyfora (Contributor) commented Nov 21, 2024

Closing this as it was merged in another PR

@gyfora closed this Nov 21, 2024