Fix hdfs-related IT tests for java24 #122044
Merged

Commits (19; the diff shown is from 13 commits)
3fc401d  Use hadoop patcher for hdfs-fixture too (ldematte)
d484793  Merge remote-tracking branch 'upstream/main' into fix-hdfs-tests-java24 (ldematte)
08f59d6  Merge remote-tracking branch 'upstream/main' into fix-hdfs-tests-java24 (ldematte)
6de0f88  Try to use gradle transformer, not working (ldematte)
537cdd7  Removing patcher project (ldematte)
1d389bf  [CI] Auto commit changes from spotless
bb497df  Patcher fix (ldematte)
473ee30  Merge branch 'fix-hdfs-tests-java24' of github.com:ldematte/elasticse… (ldematte)
f715051  Merge remote-tracking branch 'upstream/main' into fix-hdfs-tests-java24 (ldematte)
4674a0b  Fix repository-hdfs gradle to perform transformation directly (ldematte)
af8fbd9  Add test classpaths (ldematte)
e478f43  Merge remote-tracking branch 'upstream/main' into fix-hdfs-tests-java24 (ldematte)
c86d491  Unmuting (ldematte)
4646fdc  Reorder gradle file (ldematte)
affaf73  Regex? (ldematte)
ec9e749  Merge branch 'main' into fix-hdfs-tests-java24 (ldematte)
791c56b  Add hdfs3 fixture too (ldematte)
cecb60b  Use regex to match artifact; add check that we patch all classes in t… (ldematte)
256d104  Merge branch 'fix-hdfs-tests-java24' of github.com:ldematte/elasticse… (ldematte)
...in/java/org/elasticsearch/gradle/internal/dependencies/patches/hdfs/HdfsClassPatcher.java (126 additions, 0 deletions)
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the "Elastic License
 * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side
 * Public License v 1"; you may not use this file except in compliance with, at
 * your election, the "Elastic License 2.0", the "GNU Affero General Public
 * License v3.0 only", or the "Server Side Public License, v 1".
 */

package org.elasticsearch.gradle.internal.dependencies.patches.hdfs;

import org.gradle.api.artifacts.transform.CacheableTransform;
import org.gradle.api.artifacts.transform.InputArtifact;
import org.gradle.api.artifacts.transform.TransformAction;
import org.gradle.api.artifacts.transform.TransformOutputs;
import org.gradle.api.artifacts.transform.TransformParameters;
import org.gradle.api.file.FileSystemLocation;
import org.gradle.api.provider.Provider;
import org.gradle.api.tasks.Classpath;
import org.gradle.api.tasks.Input;
import org.gradle.api.tasks.Optional;
import org.jetbrains.annotations.NotNull;
import org.objectweb.asm.ClassReader;
import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.ClassWriter;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Enumeration;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;
import java.util.stream.Stream;

import static java.util.Map.entry;

@CacheableTransform
public abstract class HdfsClassPatcher implements TransformAction<HdfsClassPatcher.Parameters> {

    record JarPatchers(String artifactName, Map<String, Function<ClassWriter, ClassVisitor>> jarPatchers) {}

    static final List<JarPatchers> allPatchers = List.of(
        new JarPatchers(
            "hadoop-common",
            Map.ofEntries(
                entry("org/apache/hadoop/util/ShutdownHookManager.class", ShutdownHookManagerPatcher::new),
                entry("org/apache/hadoop/util/Shell.class", ShellPatcher::new),
                entry("org/apache/hadoop/security/UserGroupInformation.class", SubjectGetSubjectPatcher::new)
            )
        ),
        new JarPatchers(
            "hadoop-client-api",
            Map.ofEntries(
                entry("org/apache/hadoop/util/ShutdownHookManager.class", ShutdownHookManagerPatcher::new),
                entry("org/apache/hadoop/util/Shell.class", ShellPatcher::new),
                entry("org/apache/hadoop/security/UserGroupInformation.class", SubjectGetSubjectPatcher::new),
                entry("org/apache/hadoop/security/authentication/client/KerberosAuthenticator.class", SubjectGetSubjectPatcher::new)
            )
        )
    );

    interface Parameters extends TransformParameters {
        @Input
        @Optional
        List<String> getMatchingArtifacts();

        void setMatchingArtifacts(List<String> matchingArtifacts);
    }

    @Classpath
    @InputArtifact
    public abstract Provider<FileSystemLocation> getInputArtifact();

    @Override
    public void transform(@NotNull TransformOutputs outputs) {
        File inputFile = getInputArtifact().get().getAsFile();

        List<String> matchingArtifacts = getParameters().getMatchingArtifacts();
        if (matchingArtifacts.isEmpty() == false
            && matchingArtifacts.stream().noneMatch(supported -> inputFile.getName().contains(supported))) {
            outputs.file(getInputArtifact());
        } else {
            Stream<JarPatchers> patchersToApply = allPatchers.stream().filter(jp -> matchingArtifacts.contains(jp.artifactName()));
            patchersToApply.forEach(patchers -> {
                File outputFile = outputs.file(inputFile.getName().replace(".jar", "-patched.jar"));
                patchJar(inputFile, outputFile, patchers.jarPatchers());
            });
        }
    }

    private static void patchJar(File inputFile, File outputFile, Map<String, Function<ClassWriter, ClassVisitor>> jarPatchers) {
        try (JarFile jarFile = new JarFile(inputFile); JarOutputStream jos = new JarOutputStream(new FileOutputStream(outputFile))) {
            Enumeration<JarEntry> entries = jarFile.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                String entryName = entry.getName();
                // Add the entry to the new JAR file
                jos.putNextEntry(new JarEntry(entryName));

                Function<ClassWriter, ClassVisitor> classPatcher = jarPatchers.get(entryName);
                if (classPatcher != null) {
                    byte[] classToPatch = jarFile.getInputStream(entry).readAllBytes();

                    ClassReader classReader = new ClassReader(classToPatch);
                    ClassWriter classWriter = new ClassWriter(classReader, 0);
                    classReader.accept(classPatcher.apply(classWriter), 0);

                    jos.write(classWriter.toByteArray());
                } else {
                    // Read the entry's data and write it to the new JAR
                    try (InputStream is = jarFile.getInputStream(entry)) {
                        is.transferTo(jos);
                    }
                }
                jos.closeEntry();
            }
        } catch (IOException ex) {
            throw new RuntimeException(ex);
        }
    }
}
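The individual patchers referenced above (ShutdownHookManagerPatcher, ShellPatcher, SubjectGetSubjectPatcher) are defined elsewhere and are not part of this file. As a rough, hypothetical sketch of the shape such a patcher takes — an ASM ClassVisitor built from the supplied ClassWriter, matching the Function<ClassWriter, ClassVisitor> signature used in allPatchers — something like the following would fit; the class name and the method-level commentary are illustrative, not the PR's actual patch logic:

import org.objectweb.asm.ClassVisitor;
import org.objectweb.asm.ClassWriter;
import org.objectweb.asm.MethodVisitor;
import org.objectweb.asm.Opcodes;

// Hypothetical example patcher; the real patchers in this PR may differ.
class ExampleClassPatcher extends ClassVisitor {
    ExampleClassPatcher(ClassWriter classWriter) {
        // Delegate everything to the ClassWriter and override only what needs patching.
        super(Opcodes.ASM9, classWriter);
    }

    @Override
    public MethodVisitor visitMethod(int access, String name, String descriptor, String signature, String[] exceptions) {
        MethodVisitor mv = super.visitMethod(access, name, descriptor, signature, exceptions);
        // A real patcher would wrap 'mv' here to rewrite the bytecode of a
        // specific method (e.g. one that relies on APIs removed in newer JDKs).
        return mv;
    }
}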
I wonder if we should add a check here to verify that all expected patches are applied, and fail early if not, so that undetected changes don't sneak into this again, e.g. when updating Hadoop versions.
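(For illustration only: a minimal sketch of what such a fail-early check could look like, assuming patchJar seeds a set with jarPatchers.keySet() and removes each entry name as its patcher is applied; the helper name and wiring are hypothetical, not necessarily what this PR ended up with.)

// Hypothetical helper, called after the loop in patchJar; uses java.io.File and java.util.Set.
private static void ensureAllPatchesApplied(File inputFile, Set<String> unappliedPatches) {
    if (unappliedPatches.isEmpty() == false) {
        throw new IllegalStateException(
            "Expected patches were not applied to " + inputFile.getName() + ": " + unappliedPatches
        );
    }
}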
I tried to do that and it's a great idea; however, there is one big blocker: the hadoop-common artifact has an additional classifier named tests, which (of course) contains only tests but is brought in anyway. I tried to exclude it, but apparently that's not possible in Gradle (there is limited support for classifiers).
The problem is that the "hadoop-common" patchers are applied to both hadoop-common jars (the regular one and the "tests" one), and the check would fail on the second, because none of the expected patches are applied there.
TL;DR: if you have a good way to either exclude hadoop-common:tests or to recognize it and avoid trying to patch it, I'm all ears!
We could tweak the file-name matching to be stricter, using an expected-name-plus-version regex instead of a simple string.contains check, so that -tests artifacts don't match.
I thought about that; I discarded it because "2.8.5-tests" is a valid "version", isn't it? (like 1.0.0-beta).
I can allow only the numeric part; it's not super clean, but..
I would have preferred excluding the jar completely, since it contains only tests (there is no purpose in having it), but it seems this is very hard to do in Gradle.
Let me try the regex and see how it looks.
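(A rough sketch of what a stricter match could look like, assuming artifact file names such as hadoop-common-3.4.1.jar and allowing only numeric versions so that classifier jars like hadoop-common-3.4.1-tests.jar are skipped; this is illustrative, not necessarily the pattern that landed in the PR.)

import java.util.regex.Pattern;

// Hypothetical helper: true only for "<artifactName>-<numeric version>.jar",
// so "hadoop-common-3.4.1.jar" matches but "hadoop-common-3.4.1-tests.jar" does not.
static boolean matchesArtifact(String fileName, String artifactName) {
    Pattern pattern = Pattern.compile(Pattern.quote(artifactName) + "-\\d+(\\.\\d+)*\\.jar");
    return pattern.matcher(fileName).matches();
}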
I had to deal with that tests-classifier stuff in Hadoop before and suffered myself. Okay, let's just live with that for now; we can revisit later if we feel the need. I will use parts of this later on to fix other places in our codebase.