Commit be78111
Merge branch 'main' into docs/expand-end-to-end-security-demo-documentation
2 parents: b38272c + 7f59015

4 files changed: +31 -9 lines

demos/hbase-hdfs-load-cycling-data/create-hfile-and-import-to-hbase.yaml

Lines changed: 12 additions & 4 deletions
@@ -28,16 +28,24 @@ spec:
             - mountPath: /stackable/conf/hbase-env.sh
               name: config-volume-hbase
               subPath: hbase-env.sh
-          command: [ "bash", "-c", "/stackable/hbase/bin/hbase \
+          command:
+            - bash
+            - -euo
+            - pipefail
+            - -c
+            - |
+              # https://hbase.apache.org/book.html#tools
+              /stackable/hbase/bin/hbase \
                 org.apache.hadoop.hbase.mapreduce.ImportTsv \
                 -Dimporttsv.separator=, \
                 -Dimporttsv.columns=HBASE_ROW_KEY,rideable_type,started_at,ended_at,start_station_name,start_station_id,end_station_name,end_station_id,start_lat,start_lng,end_lat,end_lng,member_casual \
                 -Dimporttsv.bulk.output=hdfs://hdfs/data/hfile \
-                cycling-tripdata hdfs://hdfs/data/raw/demo-cycling-tripdata.csv.gz \
-                && /stackable/hbase/bin/hbase \
+                cycling-tripdata hdfs://hdfs/data/raw/demo-cycling-tripdata.csv.gz
+
+              /stackable/hbase/bin/hbase \
                 org.apache.hadoop.hbase.tool.LoadIncrementalHFiles \
                 hdfs://hdfs/data/hfile \
-                cycling-tripdata" ] # https://hbase.apache.org/book.html#tools
+                cycling-tripdata
           volumes:
             - name: config-volume-hbase
               configMap:
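As a minimal sketch (not from the commit; the function names are stand-ins for the two hbase invocations), this shows why splitting the old `cmd1 && cmd2` chain onto separate lines of a literal block is safe once `set -euo pipefail` is in effect: the shell aborts on the first failing command, so the second step still only runs if the first succeeds.

```shell
#!/usr/bin/env bash
# With -e, the script exits as soon as any command fails, so two commands
# on separate lines behave like the old "cmd1 && cmd2" one-liner.
set -euo pipefail

import_tsv() { echo "ImportTsv done"; }   # stand-in for the ImportTsv step
bulk_load()  { echo "bulk load done"; }   # stand-in for LoadIncrementalHFiles

import_tsv
bulk_load
```

If `import_tsv` returned non-zero, `-e` would stop the script before `bulk_load`, matching the short-circuit behaviour of the removed `&&`.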

demos/hbase-hdfs-load-cycling-data/distcp-cycling-data.yaml

Lines changed: 12 additions & 3 deletions
@@ -9,9 +9,9 @@ spec:
       containers:
         - name: distcp-cycling-data
           # We use 24.3.0 here which contains the distcp MapReduce components
-          # This is not included in the 24.7 images and will fail.
+          # This is not included in the 24.7 and 24.11 images and will fail.
           # See: https://github.com/stackabletech/docker-images/issues/793
-          image: docker.stackable.tech/stackable/hadoop:3.4.0-stackable0.0.0-dev
+          image: docker.stackable.tech/stackable/hadoop:3.3.6-stackable24.3.0
           env:
             - name: HADOOP_USER_NAME
               value: stackable
@@ -20,7 +20,16 @@ spec:
             - name: HADOOP_CLASSPATH
               value: "/stackable/hadoop/share/hadoop/tools/lib/*.jar"
           # yamllint disable-line rule:line-length
-          command: ["bash", "-c", "bin/hdfs dfs -mkdir -p /data/raw && bin/hadoop distcp -D fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider s3a://public-backup-nyc-tlc/cycling-tripdata/demo-cycling-tripdata.csv.gz hdfs://hdfs/data/raw"]
+          command:
+            - bash
+            - -euo
+            - pipefail
+            - -c
+            - |
+              bin/hdfs dfs -mkdir -p /data/raw
+              bin/hadoop distcp -D fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider \
+                s3a://public-backup-nyc-tlc/cycling-tripdata/demo-cycling-tripdata.csv.gz \
+                hdfs://hdfs/data/raw
           volumeMounts:
            - name: config-volume-hdfs
              mountPath: /stackable/conf/hdfs
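A local stand-in (assumption: a plain filesystem instead of HDFS, with a temp directory in place of `/data/raw`) illustrates why the diff uses `-mkdir -p`: the flag makes directory creation idempotent, so re-running the Job does not fail when the target path already exists.

```shell
#!/usr/bin/env bash
set -euo pipefail

tmp="$(mktemp -d)"
mkdir -p "$tmp/data/raw"   # first run creates the path
mkdir -p "$tmp/data/raw"   # a re-run succeeds, like `hdfs dfs -mkdir -p`
ls -d "$tmp/data/raw"
```

Without `-p`, the second `mkdir` would fail and, under `set -e`, abort the script before the copy step.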

docs/modules/demos/pages/hbase-hdfs-load-cycling-data.adoc

Lines changed: 6 additions & 1 deletion
@@ -206,7 +206,7 @@ image::hbase-hdfs-load-cycling-data/hbase-table-ui.png[]
 
 == Accessing the HDFS web interface
 
-You can also see HDFS details via a UI by running `stackablectl stacklet list` and following the link next to one of the namenodes.
+You can also see HDFS details via a UI by running `stackablectl stacklet list` and following the http links next to the namenodes.
 
 Below you will see the overview of your HDFS cluster.
 
@@ -218,6 +218,11 @@ image::hbase-hdfs-load-cycling-data/hdfs-datanode.png[]
 
 You can also browse the file system by clicking on the `Utilities` tab and selecting `Browse the file system`.
 
+[TIP]
+====
+Check that the namenode you browse to is the _active_ namenode in the Overview page. Otherwise you will not be able to browse files.
+====
+
 image::hbase-hdfs-load-cycling-data/hdfs-data.png[]
 
 Navigate in the file system to the folder `data` and then the `raw` folder.
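A sketch with canned data (no live cluster is queried; the JSON payload below is a sample, not fetched output): in an HA setup each namenode reports its role through the standard JMX `NameNodeStatus` bean, which the namenode web UI serves at `/jmx?qry=Hadoop:service=NameNode,name=NameNodeStatus`. Only the node whose `State` is `active` can serve "Browse the file system", which is what the new TIP in the docs warns about.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sample payload shaped like the real NameNodeStatus bean (values invented).
payload='{"beans":[{"name":"Hadoop:service=NameNode,name=NameNodeStatus","State":"active"}]}'

# Extract the State attribute; "active" means this namenode's UI can browse files.
state="$(printf '%s' "$payload" | grep -o '"State":"[a-z]*"' | cut -d '"' -f 4)"
echo "$state"
```

Against a real cluster one would `curl` each namenode's `/jmx` endpoint and pick the node that reports `active`.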

docs/modules/demos/pages/nifi-kafka-druid-water-level-data.adoc

Lines changed: 1 addition & 1 deletion
@@ -286,7 +286,7 @@ keeps streaming near-real-time updates for every available measuring station.
 You can look at the ingestion job running in NiFi by opening the endpoint `https` from your `stackablectl stacklet list`
 command output. You have to use the endpoint from your command output. In this case, it is https://172.18.0.2:30198.
 Open it with your favourite browser. Suppose you get a warning regarding the self-signed certificate generated by the
-xref:secret-operator::index.adoc[Secret Operator] (e.g. Warning: Potential Security Risk Ahead). In that case, you must
+xref:home:secret-operator:index.adoc[Secret Operator] (e.g. Warning: Potential Security Risk Ahead). In that case, you must
 tell your browser to trust the website and continue.
 
 image::nifi-kafka-druid-water-level-data/nifi_1.png[]
