Commit 90097a6

Merge branch 'develop' into hdp
2 parents: 3d4f997 + 4f06ce8

File tree

2 files changed: +10 −6 lines


CHANGELOG.md

Lines changed: 9 additions & 4 deletions
```diff
@@ -1,13 +1,18 @@
 # Change Log
 All notable changes to this project will be documented in this file.

-## Unreleased
+## [Unreleased]
+### Added:
+- PNDA-2445: Support for Hortonworks HDP
+
+## [0.4.0] 2017-05-23
 ### Added
-- PNDA-2729: Added support for spark streaming jobs written in python (pyspark). Use `main_py` instead of `main_jar` in properties.json and specify additional files using `py_files`.
-- PNDA-2445: Support for Hortonworks HDP hadoop distro.
+- PNDA-2729: Added support for spark streaming jobs written in python (pyspark). Use `main_py` instead of `main_jar` in properties.json and specify additional files using `py_files`.
+- PNDA-2784: Make tests pass on RedHat

 ### Changed
-- PNDA-2700: Spark streaming jobs no longer require upstart.conf or yarn-kill.py files, default ones are supplied by the deployment manager.
+- PNDA-2700: Spark streaming jobs no longer require upstart.conf or yarn-kill.py files, default ones are supplied by the deployment manager.
+- PNDA-2782: Disabled Ubuntu-only test


 ## [0.3.0] 2017-01-20
```
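The PNDA-2729 changelog entry above says pyspark jobs should set `main_py` instead of `main_jar` in properties.json and list additional files via `py_files`. As a sketch only — the full properties.json schema is not shown in this commit, and the values below are hypothetical — such a file might look like:

```json
{
    "main_py": "app.py",
    "py_files": "helpers.py,models.py"
}
```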

api/src/main/resources/plugins/upstart.conf.py.tpl

Lines changed: 1 addition & 2 deletions
```diff
@@ -4,6 +4,5 @@ respawn
 respawn limit unlimited
 pre-start exec /opt/${environment_namespace}/${component_application}/${component_name}/yarn-kill.py
 pre-stop exec /opt/${environment_namespace}/${component_application}/${component_name}/yarn-kill.py
-env programDir=/opt/${environment_namespace}/${component_application}/${component_name}/
 chdir /opt/${environment_namespace}/${component_application}/${component_name}/
-exec sudo -u hdfs spark-submit --driver-java-options "-Dlog4j.configuration=file:///${programDir}log4j.properties" --conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///${programDir}log4j.properties' --name '${component_job_name}' --master yarn-cluster --py-files application.properties,${component_py_files} ${component_spark_submit_args} ${component_main_py}
+exec sudo -u hdfs spark-submit --driver-java-options "-Dlog4j.configuration=file:///opt/${environment_namespace}/${component_application}/${component_name}/log4j.properties" --conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///opt/${environment_namespace}/${component_application}/${component_name}/log4j.properties' --name '${component_job_name}' --master yarn-cluster --py-files application.properties,${component_py_files} ${component_spark_submit_args} ${component_main_py}
```
