ci: Use Linkage Checker to test Binary Compatibility #3650
Conversation
| - uses: actions/setup-java@v4
|   with:
|     java-version: 17
| # Use Java 11 for this as Linkage Checker is only compatible with Java 11 or below
Curious why the Linkage Checker cannot be run with Java versions newer than 11?
Not sure yet. I suspect there may be older incompatible dependency versions, but there isn't much info given in the error message. GoogleCloudPlatform/cloud-opensource-java#2395
This is a slight concern for when we drop Java 11; maybe it's worthwhile to at least put the issue into our backlog.
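If pinning the whole job to an older JDK is undesirable, one option raised by this thread is to switch JDKs only for the step that invokes the checker. A minimal bash sketch, assuming the runner exposes a Java 11 installation whose path is available in a JAVA11_HOME variable (that name is an assumption, not something from this PR):

    # Point this step at Java 11 without touching the JDK used by the rest of the job.
    # JAVA11_HOME is an assumed placeholder for wherever the runner keeps a Java 11 install.
    export JAVA_HOME="${JAVA11_HOME:?set to a Java 11 installation}"
    export PATH="${JAVA_HOME}/bin:${PATH}"
    java -version   # should now report an 11.x runtime for this step only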
| # Create two mappings of possible API names (Key: Maven Artifact ID Prefix, Value: Maven Group ID)
| # for the libraries that should be tested.
| # 1. These are special handwritten libraries in google-cloud-java that should be tested
| declare -A monorepo_handwritten_libraries
We probably want to include all libraries with handwritten layers. AFAIR, java-translate has a heavy handwritten layer, java-dns as well. Not sure if there is a good way to find all of them though.
Adding Translate. IIRC, DNS is Apiary-based (or wraps an Apiary library).
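For concreteness, the mapping declared in the quoted snippet might end up looking roughly like the sketch below; the entries are examples drawn from this thread, and the exact keys (bare module names vs. full artifact IDs) are an assumption about the script, not its actual contents.

    # Sketch of the handwritten-library mapping discussed above.
    # Key: Maven artifact ID prefix, Value: Maven group ID. Entries are illustrative.
    declare -A monorepo_handwritten_libraries
    monorepo_handwritten_libraries["google-cloud-resourcemanager"]="com.google.cloud"
    monorepo_handwritten_libraries["google-cloud-translate"]="com.google.cloud"   # added per this thread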
| popd
|
| echo "Artifact List: ${artifact_list}"
| # Only run Linkage Checker if the repo has any relevant artifacts to test for
I guess this is not a realistic scenario; is it in case someone passes in an invalid repo name?
I meant to include a check to fail the CI if there are no artifacts found
Is there a scenario that could cause "no artifacts found"?
Currently, no. But adding new downstream tests for repos relies on the accuracy of versions.txt, and we have a few hardcoded values in the script that always assume cloud/, google-cloud/, or com.google.cloud.
Adding something like auth to be tested would lead to no artifacts being found, since it no longer follows the google-cloud prefix (it uses google-auth instead).
I think it would be harder to figure out the cause if the error message from the Linkage Checker is Exception in thread "main" java.lang.IllegalArgumentException: Bad artifact coordinates, rather than Unable to find any matching artifacts to test in ...
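A guard along the lines of the sketch below would make that failure mode explicit, failing the CI with a readable message instead of letting the Linkage Checker die later on malformed coordinates; the variable names are assumptions based on the surrounding discussion.

    # Fail fast with a descriptive error if versions.txt yielded no matching artifacts.
    if [[ -z "${artifact_list}" ]]; then
      echo "Unable to find any matching artifacts to test in ${repo}" >&2   # ${repo} is an assumed variable name
      exit 1
    fi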
| # Builds a string output to `artifact_list`. It contains a comma-separated list of Maven GAV coordinates. Parses
| # the `versions.txt` file by searching for the matching artifact_id_prefix to get the corresponding version.
| function build_artifact_list() {
For future enhancements: we could extract this function to a separate workflow, so that the artifact list can be passed to the binary check in a matrix, which can then be run in parallel. Similar to how we generate Apiary libraries.
This function only builds a string of packages for the Linkage Checker to test against. The artifactList corresponds to modules found in a repo, and every repo is already built using a matrix.
i.e. bigtable -> artifactList: google-cloud-bigtable
google-cloud-java -> artifactList: translate,resourcemanager,vertexai
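As a concrete (and simplified) picture of what the comments describe, a version of build_artifact_list could look like the sketch below. The versions.txt line format (artifactId:released:current), which version column is used, and the variable names are all assumptions rather than the exact script in this PR.

    # Sketch: append "group:artifact:version" for every artifact in versions.txt whose
    # ID starts with the given prefix, building a comma-separated list in artifact_list.
    function build_artifact_list() {
      local group_id=$1              # e.g. com.google.cloud
      local artifact_id_prefix=$2    # e.g. google-cloud-translate
      local artifact_id released current
      while IFS=: read -r artifact_id released current; do
        [[ "${artifact_id}" == "${artifact_id_prefix}"* ]] || continue
        # Which column (released vs. current) the real script uses is an assumption.
        artifact_list="${artifact_list:+${artifact_list},}${group_id}:${artifact_id}:${released}"
      done < <(grep -v '^#' versions.txt)
    }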
Maybe I'm misunderstanding something; I thought there is still a for loop that runs each repo one by one?
Maybe I was not clear: when I mentioned "we could extract this function to a separate workflow", I meant that we could extract the functionality of building a list of artifacts for all repos into a separate workflow.
i.e.
google-cloud-java, java-bigtable, java-bigquery, java-bigquerystorage etc.
->
google-cloud-bigtable, translate, resourcemanager, vertexai etc.
Then we put all of the artifacts into a GitHub matrix so that they can run in parallel.
This job is already running in parallel, and each downstream repo has its own job in the matrix. The for loop is only building a list of source artifacts for the Linkage Checker to use, since each repo can have multiple relevant artifacts to test.
Discussed offline that we are going to maintain a hardcoded repo -> list of artifacts mapping. This makes the shell script much easier to maintain.
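Given that decision, the maintained mapping could be as small as an associative array keyed by repository name, roughly as sketched below; the entries are examples taken from this thread, not the final list.

    # Hardcoded repo -> comma-separated artifact list, maintained by hand.
    declare -A repo_artifacts
    repo_artifacts["java-bigtable"]="google-cloud-bigtable"
    repo_artifacts["google-cloud-java"]="translate,resourcemanager,vertexai"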
Create a Binary Compatibility Check for Protobuf versions for all downstream libraries. Binary compatibility is checked via the Linkage Checker tool.
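For orientation, the end-to-end check might boil down to something like the sketch below; the jar name, the -a flag, and the protobuf version shown are assumptions for illustration rather than the exact invocation in this workflow.

    # Sketch: ask the Linkage Checker to analyse the repo's artifacts alongside a
    # candidate protobuf-java version and report binary-incompatible references.
    PROTOBUF_VERSION="4.26.1"   # assumed example version, not pinned by this PR
    java -jar linkage-checker-all-deps.jar \
      -a "${artifact_list},com.google.protobuf:protobuf-java:${PROTOBUF_VERSION}"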