Commit 67f0eba (1 parent: 053eb6a)

add step of configuring hive-site.xml (#56)

* update docs
* rename
* fix bug
* update doc
* revert
* add docs
* refactor
* refactor
* refactor
* modify cluster installation document
* add cluster_node.md
* fix
* add png
* change png
* change host name
* modify
* add cluster_code ref
* modify
* modify
* modify
* modify
* modify
* modify
* refactor
* add cluster_api for doc-cn
* update png url
* fix url
* change
* fix bug
* add hive-site.xml step

File tree: 1 file changed, +14 −1 lines

doc-cn/source/install/install_arctern_on_spark_cn.md

Lines changed: 14 additions & 1 deletion
@@ -44,12 +44,13 @@ export SPARK_LOCAL_HOSTNAME=localhost
 
 > **Note:** You need to restart your terminal for the settings above to take effect.
 
-Create the **spark-defaults.conf** and **spark-env.sh** files:
+Create the **spark-defaults.conf**, **spark-env.sh**, and **hive-site.xml** files:
 
 ```bash
 $ cd spark-3.0.0-bin-hadoop2.7/conf
 $ cp spark-defaults.conf.template spark-defaults.conf
 $ cp spark-env.sh.template spark-env.sh
+$ touch hive-site.xml
 ```
 
 Add the following at the end of the spark-defaults.conf file:

@@ -70,6 +71,18 @@ spark.executor.extraClassPath <conda_prefix>/jars/arctern_scala-assembly-0.3.0.j
 $ export PYSPARK_PYTHON=<conda_prefix>/bin/python
 ```
 
+Add the following to the **hive-site.xml** file:
+
+```xml
+<configuration>
+  <property>
+    <name>hive.metastore.warehouse.dir</name>
+    <value>${user.home}/hive/warehouse</value>
+    <description>location of default database for the warehouse</description>
+  </property>
+</configuration>
+```
+
 ### Build and install the pyspark package
 
 Enter the Conda environment:
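As an aside, the `touch hive-site.xml` step in the diff above creates an empty file that you then edit by hand. The same minimal file can be written in one step with a heredoc; this is only a convenience sketch, assuming a POSIX shell and that the current directory is `spark-3.0.0-bin-hadoop2.7/conf`:

```shell
# One-step alternative to `touch hive-site.xml` plus a manual edit:
# write the minimal hive-site.xml shown in the diff with a quoted heredoc.
# The quoted 'EOF' delimiter keeps the shell from expanding ${user.home},
# which must reach Hive literally (it is a Java system-property reference).
cat > hive-site.xml <<'EOF'
<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>${user.home}/hive/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
</configuration>
EOF
```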
