Changes from all commits (21 commits)
6d872c2
chore(spark): version 4.0.0-preview
razvan Aug 16, 2024
818b9b7
Merge branch 'main' into feat/spark-4
razvan Jul 4, 2025
c1842b0
update spark 4.0.0 deps
razvan Jul 4, 2025
023006d
init spark 4.0.0 patches
razvan Jul 4, 2025
9c23d6f
successful spark-4 build
razvan Jul 8, 2025
595e9c8
successful spark-3 build
razvan Jul 8, 2025
a5d9781
a bit of cleanup and backwards comp
razvan Jul 8, 2025
348b957
spark-connect-client 4.0.0
razvan Jul 9, 2025
973d1fb
Various update: log4shell removal, testing-tools uid/gid (#1192)
lfrancke Jul 8, 2025
cdb5cb9
feat: HBase resolvable endpoints (#1159)
adwk67 Jul 8, 2025
8240fc7
fix: image bloat due to https://github.com/moby/moby/issues/5419 (#1196)
lfrancke Jul 9, 2025
419e3fa
Remove Hello World operator (#1194)
lfrancke Jul 9, 2025
272a83c
chore: Bump UBI9-minimal base image, cargo-auditable and protoc (#1197)
sbernauer Jul 10, 2025
a88d38d
fix: selectively copy items from hadoop-builder (#1201)
lfrancke Jul 11, 2025
79e6173
Change UID & GID to 1000/0 (#1193)
lfrancke Jul 11, 2025
8489250
fix: Change GID to 1000 (#1202)
Techassi Jul 11, 2025
6d51b30
fix: Add missing patched-libs (#1204)
lfrancke Jul 11, 2025
a0d5dd5
fix: Update nipyapi to 0.22.0 to work with custom nifi versions (#1205)
lfrancke Jul 14, 2025
28fd27f
feat: separate Dockerfile for Hadoop (#1186)
dervoeti Jul 14, 2025
aff4ad3
fix: spark connect client spark version (#1206)
razvan Jul 16, 2025
be28658
fix(hadoop): Backport HADOOP-18583 & fix OpenSSL native library (#1209)
lfrancke Jul 18, 2025
31 changes: 0 additions & 31 deletions .github/workflows/build_hello-world.yaml

This file was deleted.

15 changes: 9 additions & 6 deletions .scripts/update_readme_badges.sh
@@ -54,12 +54,15 @@ for BUILD_WORKFLOW_FILE in .github/workflows/build_*.yaml; do
     echo >> "$BADGES_TMP"
   fi
 done
-# This needs to add the remaning empty columns of the last row in the table
-# This is a hack to fix the status quo and make markdownlint happy.
-for _ in $(seq 0 $((COLS - 1))); do
-  echo -n "| " >> "$BADGES_TMP"
-done
-echo "|" >> "$BADGES_TMP"
+
+# Add remaining empty columns to complete the last row if needed
+# "if needed" is the first if here: It'll only run when we're NOT on the last column (0 indexed)
+if [ ${CURRENT_COLUMN} -ne $((COLS - 1)) ]; then
+  for _ in $(seq $((CURRENT_COLUMN + 1)) $((COLS - 1))); do
+    echo -n "| " >> "$BADGES_TMP"
+  done
+  echo "|" >> "$BADGES_TMP"
+fi
 echo -n "<!-- end:badges -->" >> "$BADGES_TMP"
 
 # Print the image and link shortcuts. Eg:
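The new guard above only pads when the last badge did not land in the final column. A Python sketch of the same logic (a hypothetical translation of the shell loop, with `current_column` and `cols` standing in for the script's `CURRENT_COLUMN` and `COLS`):

```python
def pad_last_row(current_column: int, cols: int) -> str:
    """Return the trailing cells needed to close the last badge-table row.

    Mirrors the shell logic: pad only when the last badge did NOT land in
    the final column (0-indexed), emitting one empty cell per remaining
    column plus the closing pipe.
    """
    if current_column == cols - 1:
        return ""  # row is already complete, nothing to pad
    # one "| " per column from current_column + 1 through cols - 1
    return "| " * (cols - 1 - current_column) + "|"

# Example: 4 columns, last badge in column 1 -> two empty cells remain
print(pad_last_row(1, 4))  # | | |
```

The old code always appended a full row of empty cells regardless of position; the fix makes the padding conditional and proportional.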
12 changes: 10 additions & 2 deletions CHANGELOG.md
@@ -17,6 +17,7 @@ All notable changes to this project will be documented in this file.
   `check-permissions-ownership.sh` provided in stackable-base image ([#1029]).
 - hbase: check for correct permissions and ownerships in /stackable folder via
   `check-permissions-ownership.sh` provided in stackable-base image ([#1028]).
+- hbase: provide patches to implement listener endpoints ([#1159]).
 - hive: check for correct permissions and ownerships in /stackable folder via
   `check-permissions-ownership.sh` provided in stackable-base image ([#1040]).
 - spark-connect-client: A new image for Spark connect tests and demos ([#1034])
@@ -58,6 +59,7 @@ All notable changes to this project will be documented in this file.
 - zookeeper: bump jetty version for CVE-2024-13009 in 3.9.3 ([#1179])
 - zookeeper: bump netty version for CVE-2025-24970 in 3.9.3 ([#1180])
 - hadoop: backport HADOOP-19352, HADOOP-19335, HADOOP-19465, HADOOP-19456 and HADOOP-19225 to fix vulnerabilities in Hadoop `3.4.1` ([#1184])
+- hadoop: Backport HADOOP-18583 to make OpenSSL 3.x work with the native hadoop libraries ([#1209]).
 
 ### Changed
 
@@ -91,7 +93,10 @@ All notable changes to this project will be documented in this file.
 - opa: Enable custom versions ([#1170]).
 - use custom product versions for Hadoop, HBase, Phoenix, hbase-operator-tools, Druid, Hive and Spark ([#1173]).
 - hbase: Bump dependencies to the latest patch level for HBase `2.6.1` and `2.6.2` ([#1185]).
-- Changed default user & group IDs from 1000/1000 to 782252253/574654813 ([#1164])
+- hadoop: Separate Dockerfiles for Hadoop build and HDFS image ([#1186]).
+- ubi-rust-builder: Bump Rust toolchain to 1.87.0, cargo-auditable to 0.7.0 and protoc to 31.1 ([#1197]).
+- stackable-base, stackable-devel, ubi-rust-builder: Update `ubi-minimal` base image ([#1197]).
+- testing-tools: Update `python` 3.12-slim-bullseye base image ([#1197]).
 
 ### Fixed
 
@@ -199,8 +204,8 @@ All notable changes to this project will be documented in this file.
 [#1151]: https://github.com/stackabletech/docker-images/pull/1151
 [#1152]: https://github.com/stackabletech/docker-images/pull/1152
 [#1156]: https://github.com/stackabletech/docker-images/pull/1156
+[#1159]: https://github.com/stackabletech/docker-images/pull/1159
 [#1163]: https://github.com/stackabletech/docker-images/pull/1163
-[#1164]: https://github.com/stackabletech/docker-images/pull/1164
 [#1165]: https://github.com/stackabletech/docker-images/pull/1165
 [#1168]: https://github.com/stackabletech/docker-images/pull/1168
 [#1169]: https://github.com/stackabletech/docker-images/pull/1169
@@ -213,8 +218,11 @@ All notable changes to this project will be documented in this file.
 [#1180]: https://github.com/stackabletech/docker-images/pull/1180
 [#1184]: https://github.com/stackabletech/docker-images/pull/1184
 [#1185]: https://github.com/stackabletech/docker-images/pull/1185
+[#1186]: https://github.com/stackabletech/docker-images/pull/1186
 [#1188]: https://github.com/stackabletech/docker-images/pull/1188
 [#1189]: https://github.com/stackabletech/docker-images/pull/1189
+[#1197]: https://github.com/stackabletech/docker-images/pull/1197
+[#1209]: https://github.com/stackabletech/docker-images/pull/1209
 
 ## [25.3.0] - 2025-03-21
 
13 changes: 5 additions & 8 deletions README.md
@@ -6,12 +6,11 @@ This repository contains Dockerfiles and scripts to build base images for use wi
 | | | | |
 | -: | -: | -: | -: |
 | [![Build Airflow]][build_airflow.yaml] | [![Build Druid]][build_druid.yaml] | [![Build Hadoop]][build_hadoop.yaml] | [![Build HBase]][build_hbase.yaml] |
-| [![Build Hello-World]][build_hello-world.yaml] | [![Build Hive]][build_hive.yaml] | [![Build Java Base]][build_java-base.yaml] | [![Build Java Development]][build_java-devel.yaml] |
-| [![Build Kafka Testing Tools]][build_kafka-testing-tools.yaml] | [![Build Kafka]][build_kafka.yaml] | [![Build Krb5]][build_krb5.yaml] | [![Build NiFi]][build_nifi.yaml] |
-| [![Build Omid]][build_omid.yaml] | [![Build OPA]][build_opa.yaml] | [![Build Spark Connect Client]][build_spark-connect-client.yaml] | [![Build Spark K8s]][build_spark-k8s.yaml] |
-| [![Build Stackable Base]][build_stackable-base.yaml] | [![Build Superset]][build_superset.yaml] | [![Build Testing Tools]][build_testing-tools.yaml] | [![Build Tools]][build_tools.yaml] |
-| [![Build Trino CLI]][build_trino-cli.yaml] | [![Build Trino]][build_trino.yaml] | [![Build Vector]][build_vector.yaml] | [![Build ZooKeeper]][build_zookeeper.yaml] |
-| | | | |
+| [![Build Hive]][build_hive.yaml] | [![Build Java Base]][build_java-base.yaml] | [![Build Java Development]][build_java-devel.yaml] | [![Build Kafka Testing Tools]][build_kafka-testing-tools.yaml] |
+| [![Build Kafka]][build_kafka.yaml] | [![Build Krb5]][build_krb5.yaml] | [![Build NiFi]][build_nifi.yaml] | [![Build Omid]][build_omid.yaml] |
+| [![Build OPA]][build_opa.yaml] | [![Build Spark Connect Client]][build_spark-connect-client.yaml] | [![Build Spark K8s]][build_spark-k8s.yaml] | [![Build Stackable Base]][build_stackable-base.yaml] |
+| [![Build Superset]][build_superset.yaml] | [![Build Testing Tools]][build_testing-tools.yaml] | [![Build Tools]][build_tools.yaml] | [![Build Trino CLI]][build_trino-cli.yaml] |
+| [![Build Trino]][build_trino.yaml] | [![Build Vector]][build_vector.yaml] | [![Build ZooKeeper]][build_zookeeper.yaml] | |
 <!-- end:badges -->
 
 ## Prerequisites
@@ -222,8 +221,6 @@ ENTRYPOINT ["/stackable-zookeeper-operator"]
 [build_hadoop.yaml]: https://github.com/stackabletech/docker-images/actions/workflows/build_hadoop.yaml
 [Build HBase]: https://github.com/stackabletech/docker-images/actions/workflows/build_hbase.yaml/badge.svg
 [build_hbase.yaml]: https://github.com/stackabletech/docker-images/actions/workflows/build_hbase.yaml
-[Build Hello-World]: https://github.com/stackabletech/docker-images/actions/workflows/build_hello-world.yaml/badge.svg
-[build_hello-world.yaml]: https://github.com/stackabletech/docker-images/actions/workflows/build_hello-world.yaml
 [Build Hive]: https://github.com/stackabletech/docker-images/actions/workflows/build_hive.yaml/badge.svg
 [build_hive.yaml]: https://github.com/stackabletech/docker-images/actions/workflows/build_hive.yaml
 [Build Java Base]: https://github.com/stackabletech/docker-images/actions/workflows/build_java-base.yaml/badge.svg
8 changes: 4 additions & 4 deletions conf.py
@@ -13,12 +13,12 @@
 airflow = importlib.import_module("airflow.versions")
 druid = importlib.import_module("druid.versions")
 hadoop = importlib.import_module("hadoop.versions")
+hadoop_jars = importlib.import_module("hadoop.hadoop.versions")
 hbase = importlib.import_module("hbase.versions")
 hbase_jars = importlib.import_module("hbase.hbase.versions")
 hbase_phoenix = importlib.import_module("hbase.phoenix.versions")
 hbase_opa_authorizer = importlib.import_module("hbase.hbase-opa-authorizer.versions")
 hbase_operator_tools = importlib.import_module("hbase.hbase-operator-tools.versions")
-hello_world = importlib.import_module("hello-world.versions")
 hive = importlib.import_module("hive.versions")
 java_base = importlib.import_module("java-base.versions")
 java_devel = importlib.import_module("java-devel.versions")
@@ -49,12 +49,12 @@
     {"name": "airflow", "versions": airflow.versions},
     {"name": "druid", "versions": druid.versions},
     {"name": "hadoop", "versions": hadoop.versions},
+    {"name": "hadoop/hadoop", "versions": hadoop_jars.versions},
     {"name": "hbase", "versions": hbase.versions},
     {"name": "hbase/hbase", "versions": hbase_jars.versions},
     {"name": "hbase/phoenix", "versions": hbase_phoenix.versions},
     {"name": "hbase/hbase-opa-authorizer", "versions": hbase_opa_authorizer.versions},
     {"name": "hbase/hbase-operator-tools", "versions": hbase_operator_tools.versions},
-    {"name": "hello-world", "versions": hello_world.versions},
     {"name": "hive", "versions": hive.versions},
     {"name": "java-base", "versions": java_base.versions},
     {"name": "java-devel", "versions": java_devel.versions},
@@ -110,7 +110,7 @@
 
 args = {
     "STACKABLE_USER_NAME": "stackable",
-    "STACKABLE_USER_UID": "782252253", # This is a random high id to not conflict with any existing user
-    "STACKABLE_USER_GID": "574654813", # This is a random high id to not conflict with any existing group
+    "STACKABLE_USER_UID": "1000",
+    "STACKABLE_USER_GID": "1000",
     "DELETE_CACHES": "true",
 }
10 changes: 6 additions & 4 deletions druid/Dockerfile
@@ -1,7 +1,7 @@
 # syntax=docker/dockerfile:1.16.0@sha256:e2dd261f92e4b763d789984f6eab84be66ab4f5f08052316d8eb8f173593acf7
 # check=error=true
 
-FROM stackable/image/hadoop AS hadoop-builder
+FROM stackable/image/hadoop/hadoop AS hadoop-builder
 
 FROM stackable/image/java-devel AS druid-builder
 
@@ -12,7 +12,9 @@ ARG STAX2_API
 ARG WOODSTOX_CORE
 ARG AUTHORIZER
 ARG STACKABLE_USER_UID
-ARG HADOOP
+ARG HADOOP_HADOOP
+# Reassign the arg to `HADOOP_VERSION` for better readability.
+ENV HADOOP_VERSION=${HADOOP_HADOOP}
 
 # Setting this to anything other than "true" will keep the cache folders around (e.g. for Maven, NPM etc.)
 # This can be used to speed up builds when disk space is of no concern.
@@ -41,7 +43,7 @@ COPY --chown=${STACKABLE_USER_UID}:0 druid/stackable/patches/${PRODUCT} /stackab
 
 COPY --from=hadoop-builder --chown=${STACKABLE_USER_UID}:0 /stackable/patched-libs /stackable/patched-libs
 # Cache mounts are owned by root by default
-# We need to explicitly give the uid to use which is hardcoded to "1000" in stackable-base
+# We need to explicitly give the uid to use.
 # The cache id has to include the product version that we are building because otherwise
 # docker encounters race conditions when building multiple versions in parallel, as all
 # builder containers will share the same cache and the `rm -rf` commands will fail
@@ -75,7 +77,7 @@ mvn \
   --no-transfer-progress \
   clean install \
   -Pdist,stackable-bundle-contrib-exts \
-  -Dhadoop.compile.version=${HADOOP}-stackable${RELEASE} \
+  -Dhadoop.compile.version=${HADOOP_VERSION}-stackable${RELEASE} \
   -DskipTests `# Skip test execution` \
   -Dcheckstyle.skip `# Skip checkstyle checks. We dont care if the code is properly formatted, it just wastes time` \
   -Dmaven.javadoc.skip=true `# Dont generate javadoc` \
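The cache-mount comment in the hunk above states the key constraint: cache ids must be unique per product version, or parallel builds of different versions race on the same cache directory. A hypothetical helper illustrating the naming scheme (the flag layout, target path, and helper name are assumptions for illustration, not the Dockerfile's exact text):

```python
def cache_mount_flag(tool: str, product_version: str, uid: int) -> str:
    """Build a --mount flag for a Dockerfile RUN instruction.

    Embedding the product version in the cache id gives each version its
    own cache, so concurrent builds never share (and never `rm -rf` out
    from under) one another's cache directory.
    """
    return (
        f"--mount=type=cache,id={tool}-{product_version},"
        f"target=/stackable/.m2,uid={uid}"
    )


print(cache_mount_flag("maven-druid", "33.0.0", 1000))
# --mount=type=cache,id=maven-druid-33.0.0,target=/stackable/.m2,uid=1000
```

With a shared id instead, BuildKit would hand every concurrent builder the same directory, which is exactly the race the comment describes.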
6 changes: 3 additions & 3 deletions druid/versions.py
@@ -4,23 +4,23 @@
         # https://druid.apache.org/docs/30.0.1/operations/java/
         "java-base": "17",
         "java-devel": "17",
-        "hadoop": "3.3.6",
+        "hadoop/hadoop": "3.3.6",
         "authorizer": "0.7.0",
     },
     {
         "product": "31.0.1",
         # https://druid.apache.org/docs/31.0.1/operations/java/
         "java-base": "17",
         "java-devel": "17",
-        "hadoop": "3.3.6",
+        "hadoop/hadoop": "3.3.6",
         "authorizer": "0.7.0",
     },
     {
         "product": "33.0.0",
         # https://druid.apache.org/docs/33.0.0/operations/java/
         "java-base": "17",
         "java-devel": "17",
-        "hadoop": "3.3.6",
+        "hadoop/hadoop": "3.3.6",
         "authorizer": "0.7.0",
     },
 ]
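The renamed keys above make the dependency keys mirror image paths ("hadoop" becomes "hadoop/hadoop"). A sketch of how a build script might resolve such a pin from this list-of-dicts layout (the lookup helper is hypothetical; the sample data is abbreviated from the diff):

```python
# Abbreviated shape of druid/versions.py after the rename: each dict pins
# one product build, and dependency keys mirror image paths.
versions = [
    {"product": "33.0.0", "java-devel": "17", "hadoop/hadoop": "3.3.6"},
]


def dependency_pin(product: str, dep: str) -> str:
    """Return the pinned version of `dep` for a given product build."""
    for entry in versions:
        if entry["product"] == product:
            return entry[dep]
    raise KeyError(f"no versions entry for product {product}")


print(dependency_pin("33.0.0", "hadoop/hadoop"))  # 3.3.6
```

Because the keys are plain strings, the slash-containing names cost nothing here; only dotted *module* paths (as in conf.py) needed the `importlib` treatment.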