Merged

35 commits
a3ea5da
ci: Use Linkage Checker to test Binary Compatibility
lqiu96 Feb 20, 2025
b55c8ad
ci: Fix generating non-cloud api list
lqiu96 Feb 21, 2025
20a654c
ci: Fix exit code 1 for non-matching elements
lqiu96 Feb 21, 2025
35b4324
ci: Fix exit code 1 for non-matching elements
lqiu96 Feb 21, 2025
de0b889
ci: Create a map of non-cloud APIs
lqiu96 Feb 24, 2025
fe89f01
ci: Fix string interpolation issue with single quotes
lqiu96 Feb 24, 2025
cd0a03b
ci: Fix awk command
lqiu96 Feb 24, 2025
9d125ac
ch: Fix CI
lqiu96 Feb 24, 2025
dbe3051
chore: Clean up script
lqiu96 Feb 24, 2025
fe491b6
chore: Clean up script
lqiu96 Feb 24, 2025
002868d
chore: Clean up script
lqiu96 Feb 24, 2025
c20ca05
chore: Clean up script
lqiu96 Feb 24, 2025
b1b4d21
chore: Add mappings for the handwritten libraries
lqiu96 Mar 6, 2025
8a1aa88
chore: Fix typo
lqiu96 Mar 6, 2025
fa542c7
chore: Skip installing if repo is already built
lqiu96 Mar 6, 2025
7f295aa
ci: cd to downstream repo and build it
lqiu96 Mar 6, 2025
5237b8e
ci: cd back into root directory
lqiu96 Mar 6, 2025
c7a4b06
ci: use nested directories
lqiu96 Mar 6, 2025
1c07195
ci: use nested directories
lqiu96 Mar 6, 2025
49f658c
ci: use nested directories
lqiu96 Mar 6, 2025
dffa8fa
ci: Re-enable this for all handwritten libraries
lqiu96 Mar 6, 2025
edde57d
chore: Use function to build artifact list
lqiu96 Mar 7, 2025
594e5a8
chore: Use exec-linkage-checker maven profile
lqiu96 Mar 7, 2025
2c5e527
chore: Remove local install repo from source test
lqiu96 Mar 7, 2025
eaf2389
chore: Install artifacts for binary
lqiu96 Mar 7, 2025
e11cb7b
chore: Revert back to original source changes
lqiu96 Mar 7, 2025
ab4bd15
chore: Address PR comments
lqiu96 Mar 10, 2025
e58dc76
chore: Address PR comments
lqiu96 Mar 10, 2025
ee4177b
chore: Address PR comments
lqiu96 Mar 10, 2025
3b79741
chore: Run if there are either handwritten modules or gRPC modules
lqiu96 Mar 10, 2025
ad8544a
chore: Fix issue with gRPC modules to test
lqiu96 Mar 10, 2025
6b46510
chore: Add a few grpc modules for google-cloud-java
lqiu96 Mar 10, 2025
2fb338f
chore: Address PR comments
lqiu96 Mar 12, 2025
806fece
chore: update google-cloud-java deps
lqiu96 Mar 13, 2025
b746ed0
Merge branch 'main' into downstream-protobuf-binary-test
lqiu96 Mar 13, 2025
@@ -43,12 +43,17 @@ jobs:
# which values to use and would resolve to ''.
protobuf-version: ${{ fromJSON(format('[{0}]', inputs.protobuf_runtime_versions || '"3.25.5","4.28.3"')) }}
steps:
- uses: actions/checkout@v4
- name: Checkout sdk-platform-java repo
uses: actions/checkout@v4
- uses: actions/setup-java@v4
with:
java-version: 17
# Use Java 11 for this as Linkage Checker is only compatible with Java 11 or below
Contributor:

Curious why Linkage Checker cannot be run with Java versions newer than 11?

Member Author:

Not sure yet. I suspect there may be older incompatible dependency versions, but there isn't much info given in the error message. GoogleCloudPlatform/cloud-opensource-java#2395

Contributor:

This is a slight concern for when we drop Java 11; it may be worthwhile to at least put the issue into our backlog.

java-version: 11
distribution: temurin
- name: Print Protobuf-Java testing version
run: echo "Testing with Protobuf-Java v${{ matrix.protobuf-version }}"
- name: Perform downstream source compatibility testing
run: REPOS_UNDER_TEST="${{ matrix.repo }}" PROTOBUF_RUNTIME_VERSION="${{ matrix.protobuf-version}}" ./.kokoro/nightly/downstream-protobuf-source-compatibility.sh
run: REPOS_UNDER_TEST="${{ matrix.repo }}" PROTOBUF_RUNTIME_VERSION="${{ matrix.protobuf-version }}" ./.kokoro/nightly/downstream-protobuf-source-compatibility.sh
- name: Perform downstream binary compatibility testing
run: REPOS_UNDER_TEST="${{ matrix.repo }}" PROTOBUF_RUNTIME_VERSION="${{ matrix.protobuf-version }}" ./.kokoro/nightly/downstream-protobuf-binary-compatibility.sh
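For local debugging, the same checks the workflow runs can be reproduced by setting the two environment variables the script expects; a minimal sketch (the repo list and Protobuf-Java version below are illustrative):

REPOS_UNDER_TEST="java-bigtable,java-bigquery" \
  PROTOBUF_RUNTIME_VERSION="4.28.3" \
  ./.kokoro/nightly/downstream-protobuf-binary-compatibility.sh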

107 changes: 107 additions & 0 deletions .kokoro/nightly/downstream-protobuf-binary-compatibility.sh
@@ -0,0 +1,107 @@
#!/bin/bash
# Copyright 2025 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

set -eo pipefail

# Comma-delimited list of repos to test with the local java-shared-dependencies
if [ -z "${REPOS_UNDER_TEST}" ]; then
echo "REPOS_UNDER_TEST must be set to run downstream-protobuf-binary-compatibility.sh"
echo "Expects a comma-delimited list: i.e REPOS_UNDER_TEST=\"java-bigtable,java-bigquery\""
exit 1
fi

# Version of Protobuf-Java runtime to compile with
if [ -z "${PROTOBUF_RUNTIME_VERSION}" ]; then
echo "PROTOBUF_RUNTIME_VERSION must be set to run downstream-protobuf-binary-compatibility.sh"
echo "Expects a single Protobuf-Java runtime version i.e. PROTOBUF_RUNTIME_VERSION=\"4.28.3\""
exit 1
fi

# Create two mappings of possible API names (Key: Maven Artifact ID Prefix, Value: Maven Group ID)
# for the libraries that should be tested.
# 1. These are special handwritten libraries in google-cloud-java that should be tested
declare -A monorepo_handwritten_libraries
Contributor:

We probably want to include all libraries with handwritten layers. AFAIR, java-translate has a heavy handwritten layer, java-dns as well. Not sure if there is a good way to find all of them though.

Member Author:

Adding Translate. IIRC, DNS is Apiary-based (or wraps an Apiary library).

monorepo_handwritten_libraries["grafeas"]="io.grafeas"
monorepo_handwritten_libraries["google-cloud-vertexai"]="com.google.cloud"
monorepo_handwritten_libraries["google-cloud-resourcemanager"]="com.google.cloud"

# 2. These are the mappings of all the downstream handwritten libraries' artifacts
declare -A downstream_handwritten_libraries
downstream_handwritten_libraries["google-cloud"]="com.google.cloud"

# Builds a string output in `artifact_list`. It contains a comma-separated list of Maven GAV coordinates. Parses
# the `versions.txt` file by searching for the matching artifact_id_prefix to get the corresponding version.
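# For example, given hypothetical versions.txt rows such as:
#   google-cloud-vertexai:1.14.0:1.15.0-SNAPSHOT
#   google-cloud-vertexai-bom:1.14.0:1.15.0-SNAPSHOT
# and the mapping monorepo_handwritten_libraries["google-cloud-vertexai"]="com.google.cloud",
# the prefix grep keeps both rows, the `grep -vE` below drops the BOM row, and the awk/sed
# rewrite yields: artifact_list="com.google.cloud:google-cloud-vertexai:1.15.0-SNAPSHOT"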
function build_artifact_list() {
Contributor:

For future enhancements: we could extract this function to a separate workflow, so that the artifact list can be passed to the binary check in a matrix, which can then be run in parallel. Similar to how we generate Apiary libraries.

Member Author:

This function only builds a string of packages for the Linkage Checker to test against. The artifactList corresponds to modules found in a repo, and every repo is already built using a matrix.

i.e. bigtable -> artifactList: google-cloud-bigtable
google-cloud-java -> artifactList: translate,resourcemanager,vertexai

Contributor:

Maybe I'm misunderstanding something; I thought there is still a for loop that runs each repo one by one?
Maybe I was not clear: when I mentioned "we could extract this function to a separate workflow", I meant "we could extract the functionality of building a list of artifacts for all repos into a separate workflow".

i.e.

google-cloud-java, java-bigtable, java-bigquery, java-bigquerystorage etc.

->

google-cloud-bigtable, translate,resourcemanager,vertexai etc.

Then we put all of the artifacts into a GitHub matrix so that they can run in parallel.

Member Author:

This job is already running in parallel, and each downstream repo has its own job in the matrix. The for loop is only building a list of source artifacts for the Linkage Checker to use, since each repo can have multiple relevant artifacts to test.

Contributor:

Discussed offline that we are going to maintain a hardcoded list of repo -> list-of-artifacts mappings. This makes the shell script much easier to maintain.

local -n api_maven_mapping=$1
for artifact_id_prefix in "${!api_maven_mapping[@]}"; do
group_id="${api_maven_mapping[${artifact_id_prefix}]}"

# Match all artifacts that start with the artifact_id_prefix to exclude any proto and grpc modules.
repo_artifact_list=$(cat "versions.txt" | grep "^${artifact_id_prefix}" || true)

# Only proceed if there are matching elements
if [ -n "${repo_artifact_list}" ]; then
# Exclude any matches to BOM artifacts or emulators. The repo artifact list will look like:
# "com.google.cloud:google-cloud-accessapproval:2.60.0-SNAPSHOT,com.google.cloud:google-cloud-aiplatform:3.60.0-SNAPSHOT,"
repo_artifact_list=$(echo "${repo_artifact_list}" | grep -vE "(bom|emulator|google-cloud-java)" | awk -F: "{\$1=\"${group_id}:\"\$1; \$2=\"\"; print}" OFS=: | sed 's/::/:/' | tr '\n' ',')
# Remove the trailing comma after the last entry
repo_artifact_list=${repo_artifact_list%,}

# The first entry added is not separated with a comma. Avoids generating `,{ARTIFACT_LIST}`
if [ -z "${artifact_list}" ]; then
artifact_list="${repo_artifact_list}"
else
artifact_list="${artifact_list},${repo_artifact_list}"
fi
fi
done
}

# cloud-opensource-java contains the Linkage Checker tool
git clone https://github.com/GoogleCloudPlatform/cloud-opensource-java.git
pushd cloud-opensource-java
mvn -B -ntp clean compile -T 1C
# Linkage Checker tool resides in the /dependencies subfolder
pushd dependencies

for repo in ${REPOS_UNDER_TEST//,/ }; do # Split on comma
# Perform testing on main (with latest changes). Shallow copy as history is not important
git clone "https://github.com/googleapis/${repo}.git" --depth=1
pushd "${repo}"
# Install all repo modules to ~/.m2 (there can be multiple relevant artifacts to test i.e. core, admin, control)
mvn -B -ntp install -T 1C -DskipTests -Dclirr.skip -Denforcer.skip

artifact_list=""
if [ "${repo}" == "google-cloud-java" ]; then
build_artifact_list monorepo_handwritten_libraries
else
build_artifact_list downstream_handwritten_libraries
fi

# Linkage Checker /dependencies
popd

echo "Artifact List: ${artifact_list}"
# Only run Linkage Checker if the repo has any relevant artifacts to test for
Contributor:

I guess this is not a realistic scenario; is it in case someone passes in an invalid repo name?

Member Author:

I meant to include a check to fail the CI if there are no artifacts found

Contributor:

Is there a scenario that could cause "no artifacts found"?

Member Author:

Currently, no. But adding new downstream tests for repos relies on versions.txt's accuracy. We have a few hardcoded values in the script that always assume cloud/, google-cloud/, or com.google.cloud.

Adding something like auth to be tested would lead to no artifacts being found, as it doesn't follow the google-cloud prefix (it uses google-auth instead).

I think it would be harder to figure out the cause if the error message from the Linkage Checker is "Exception in thread "main" java.lang.IllegalArgumentException: Bad artifact coordinates", rather than "Unable to find any matching artifacts to test in ..."

if [ -n "${artifact_list}" ]; then
# The `-s` argument filters the linkage check problems that stem from the artifact
program_args="-r --artifacts ${artifact_list},com.google.protobuf:protobuf-java:${PROTOBUF_RUNTIME_VERSION},com.google.protobuf:protobuf-java-util:${PROTOBUF_RUNTIME_VERSION} -s ${artifact_list}"
echo "Linkage Checker Program Arguments: ${program_args}"
mvn -B -ntp exec:java -Dexec.args="${program_args}" -P exec-linkage-checker
fi
echo "done"
done
popd
popd
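For reference, a sketch of the Linkage Checker invocation that the loop above produces for a single repo such as java-bigtable; the google-cloud-bigtable version shown is hypothetical:

mvn -B -ntp exec:java \
  -Dexec.args="-r --artifacts com.google.cloud:google-cloud-bigtable:2.50.1-SNAPSHOT,com.google.protobuf:protobuf-java:4.28.3,com.google.protobuf:protobuf-java-util:4.28.3 -s com.google.cloud:google-cloud-bigtable:2.50.1-SNAPSHOT" \
  -P exec-linkage-checker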