Commit 57ea620
Reduce amount of data downloaded from GitHub in resources/rebuild.sh
Use a tree-less git checkout [1], which reduces the amount of data downloaded by ~60% (from >3GiB to ~1GiB currently) when cloning the upstream Amazon Linux repository from GitHub. It will now download some objects during checkout, but this is significantly less than a full clone, as we are only checking out a handful of tags.

We sadly cannot use a `--depth=1` checkout because we need access to all tags, in chronological order, and for this we need at least the commit metadata itself fetched locally (even `git ls-remote --tags` will not sort tags without first fetching these).

[1]: https://github.blog/open-source/git/get-up-to-speed-with-partial-clone-and-shallow-clone/

Signed-off-by: Patrick Roy <[email protected]>
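The tree-less-clone-plus-local-tag-sort approach described above can be sketched against a throwaway local repository (the repository path, tag names, and dates below are made up for illustration; the real script clones https://github.com/amazonlinux/linux):

```shell
#!/usr/bin/env bash
# Sketch only: demonstrates a tree-less partial clone and chronological
# tag sorting using a tiny local "upstream" repo instead of GitHub.
set -eu

tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT

# Build a tiny upstream with two tags created at known, distinct times.
git init -q "$tmp/upstream"
git -C "$tmp/upstream" config uploadpack.allowFilter true  # permit --filter clones
GIT_COMMITTER_DATE="2020-01-01T00:00:00 +0000" \
    git -C "$tmp/upstream" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m "first"
git -C "$tmp/upstream" tag v1.0
GIT_COMMITTER_DATE="2020-06-01T00:00:00 +0000" \
    git -C "$tmp/upstream" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m "second"
git -C "$tmp/upstream" tag v2.0

# Tree-less clone: commit metadata (and therefore commit dates) is fetched,
# but trees and blobs are deferred until something is actually checked out.
git clone -q --no-checkout --filter=tree:0 "file://$tmp/upstream" "$tmp/clone"

# Chronological sorting works because the commit objects are local;
# `git ls-remote --tags` alone could not order tags by date.
tags=$(git -C "$tmp/clone" tag --sort=creatordate)
echo "$tags"
```

A `--depth=1` clone would have fetched only the tip commit, leaving `--sort=creatordate` without the commit dates it needs, which is why the commit message rules it out.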
1 parent 3fb9a2a commit 57ea620

File tree

1 file changed: +3 additions, -2 deletions
resources/rebuild.sh

Lines changed: 3 additions & 2 deletions
@@ -120,7 +120,7 @@ EOF
 }
 
 function clone_amazon_linux_repo {
-    [ -d linux ] || git clone https://github.com/amazonlinux/linux linux
+    [ -d linux ] || git clone --no-checkout --filter=tree:0 https://github.com/amazonlinux/linux
 }
 
 # prints the git tag corresponding to the newest and best matching the provided kernel version $1
@@ -145,7 +145,8 @@ function build_al_kernel {
     local KERNEL_VERSION=$(echo $KERNEL_CFG | grep -Po "microvm-kernel-ci-$ARCH-\K(\d+\.\d+)")
 
     pushd linux
-    make distclean
+    # fails immediately after clone because nothing is checked out
+    make distclean || true
 
     git checkout $(get_tag $KERNEL_VERSION)
