Commit 15fabb1

Merge remote-tracking branch 'origin/main' into main

# Conflicts:
#	zh_CN/安装部署/DSS单机部署文档.md

2 parents: 9a2bdb1 + bc77240

File tree

2 files changed: 11 additions, 9 deletions

en_US/Installation_and_Deployment/DSS_Single-Server_Deployment_Documentation.md

Lines changed: 6 additions & 6 deletions
@@ -57,16 +57,16 @@ sudo useradd hadoop
 
 ```bash
 vi /etc/sudoers
-````
+```
 
-````properties
+```properties
 hadoop ALL=(ALL) NOPASSWD: NOPASSWD: ALL
-````
+```
 
 4. Make sure that the server where DSS and Linkis are deployed can execute commands such as `hdfs`, `hive -e` and `spark-sql -e` normally. In the one-click install script, the components are checked.
 
 5. **If your Pyspark wants to have the drawing function, you also need to install the drawing module on all installation nodes**. The command is as follows:
-
+
 
 ```bash
 python -m pip install matplotlib
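The two prerequisites touched by this hunk (a NOPASSWD sudoers rule and the matplotlib drawing module) can be spot-checked after the fact. A minimal POSIX-shell sketch, assuming `sudo` and `python` are on the PATH; everything else here is illustrative, not part of the DSS installer:

```shell
# Sketch: verify the prerequisites configured above. Both checks only
# report status; they change nothing.

# `sudo -n` refuses to prompt for a password, so it succeeds only when
# a NOPASSWD rule is in effect for the current user.
sudo_status="NOT configured"
if sudo -n true 2>/dev/null; then
  sudo_status="OK"
fi

# Importing matplotlib proves `python -m pip install matplotlib` worked.
matplotlib_status="missing"
if python -c "import matplotlib" 2>/dev/null; then
  matplotlib_status="OK"
fi

echo "passwordless sudo: $sudo_status"
echo "matplotlib: $matplotlib_status"
```

Run this as the `hadoop` user created above, since the sudoers rule is scoped to that account.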
@@ -84,7 +84,7 @@ Compile by yourself or go to the component release page to download the installa
 
 3. Download the DSS & LINKIS one-click installation deployment package, and unzip it. The following is the hierarchical directory structure of the one-click installation deployment package:
 
-````text
+```text
 ├── dss_linkis # One-click deployment home directory
 ├── bin # for one-click installation, and one-click to start DSS + Linkis
 ├── conf # Parameter configuration directory for one-click deployment
@@ -99,7 +99,7 @@ Compile by yourself or go to the component release page to download the installa
 
 ```bash
 vi conf/config.sh
-````
+```
 
 The parameter description is as follows:
 
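Before running the installer it can help to see which settings in `conf/config.sh` are actually in effect (most sample lines ship commented out). A small sketch; the file path comes from the step above, the rest is an assumption about running from the unpacked package root:

```shell
# Sketch: count the effective (non-blank, non-comment) lines in
# conf/config.sh, so they can be reviewed before installing.
CONFIG=conf/config.sh
if [ -f "$CONFIG" ]; then
  # -v inverts the match, -c counts the remaining (effective) lines.
  effective=$(grep -cvE '^[[:space:]]*(#|$)' "$CONFIG")
else
  effective=0  # file not found; run this from the package root
fi
echo "effective settings in $CONFIG: $effective"
```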

zh_CN/安装部署/DSS单机部署文档.md

Lines changed: 5 additions & 3 deletions
@@ -14,6 +14,7 @@ Command-line tools required by Linkis (before the formal installation, the script automatically checks for these
 - yum
 - java
 - unzip
+- zip
 - expect
 
 Software that needs to be installed:
@@ -67,7 +68,6 @@ Tips:
 4. Make sure the server where DSS and Linkis are deployed can execute commands such as `hdfs dfs -ls`, `hive -e`, and `spark-sql -e` normally. The one-click install script checks these components.
 
 5. **If you want your Pyspark to have the drawing function, you also need to install the drawing module on all installation nodes**. The command is as follows:
-
 
 ```shell script
 python -m pip install matplotlib
@@ -117,7 +117,7 @@ The hierarchical directory structure of the DSS & LINKIS one-click installation deployment package is as follows:
 # Do not modify unless necessary
 LINKIS_VERSION=1.1.1
 
-### DSS Web, no need to modify for a local installation
+### DSS Web, usually no need to modify for a local installation, but confirm whether this port is already occupied; if it is, change it to an available port.
 #DSS_NGINX_IP=127.0.0.1
 #DSS_WEB_PORT=8088

@@ -155,7 +155,7 @@ SPARK_CONF_DIR=/appcom/config/spark-config
 #LINKIS_PUBLIC_MODULE=lib/linkis-commons/public-module
 
 
-## YARN REST URL
+## YARN REST URL, ensure this port does not conflict with DSS_WEB_PORT
 YARN_RESTFUL_URL=http://127.0.0.1:8088
 
 ## Engine version configuration; the defaults are used if not configured
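The comment added in this hunk warns that YARN's REST port must not collide with `DSS_WEB_PORT` (both default to 8088 in the sample values). A hedged pre-flight sketch, assuming `ss` from iproute2 is available; nothing here comes from the DSS scripts themselves:

```shell
# Sketch: warn when the intended DSS_WEB_PORT is already bound locally,
# e.g. by a YARN ResourceManager answering on the same 8088 default.
PORT=8088
port_state="free (or ss unavailable)"
if command -v ss >/dev/null 2>&1 && ss -lnt 2>/dev/null | grep -q ":$PORT "; then
  port_state="in use: pick another DSS_WEB_PORT"
fi
echo "port $PORT is $port_state"
```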
@@ -298,6 +298,8 @@ HIVE_PASSWORD=xxx
 
 ### 4. Start the services
 
+If DSS and Linkis services are already running on this machine, please stop all related services first.
+
 #### (1) Start the services:
 
     Execute the following command in the installation directory to start all services:
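The sentence added in this hunk asks you to stop any running DSS or Linkis services before starting new ones. A rough way to spot leftovers; this is a heuristic sketch, and the `dss|linkis` process-name pattern is an assumption to adjust for your deployment:

```shell
# Sketch: count processes whose command line mentions dss or linkis,
# as candidates to stop before re-running the start script.
leftover=$(ps -ef | grep -iE 'dss|linkis' | grep -v grep | wc -l | tr -d ' ')
echo "candidate DSS/Linkis processes still running: $leftover"
```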
