
Commit 7df6d21 (parent: 2c34001)

Misc build tweaks.

5 files changed: +31 additions, -37 deletions

.travis.yml

Lines changed: 0 additions & 36 deletions
This file was deleted.

core/src/it/resources/log4j.properties

Lines changed: 2 additions & 0 deletions
@@ -40,6 +40,8 @@ log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
 log4j.logger.org.locationtech.rasterframes=WARN
 log4j.logger.org.locationtech.rasterframes.ref=WARN
 log4j.logger.org.apache.parquet.hadoop.ParquetRecordReader=OFF
+log4j.logger.geotrellis.spark=INFO
+log4j.logger.geotrellis.raster.gdal=ERROR
 
 # SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
 log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
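These logger levels only take effect if Spark actually loads this log4j.properties. As a hypothetical illustration (not part of this commit), a local run could point the driver at the file through standard Spark configuration; the path shown is just an example and would need to match your checkout:

# Hypothetical illustration: direct the Spark driver to a custom log4j.properties.
# "spark.driver.extraJavaOptions" and the log4j 1.x "-Dlog4j.configuration" switch
# are standard, but the file path below is an example only.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[*]")
         .config("spark.driver.extraJavaOptions",
                 "-Dlog4j.configuration=file:core/src/it/resources/log4j.properties")
         .getOrCreate())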

docs/src/main/paradox/release-notes.md

Lines changed: 2 additions & 1 deletion
@@ -23,14 +23,15 @@
 - Revisit use of `Tile` equality since [it's more strict](https://github.com/locationtech/geotrellis/pull/2991)
 - Update `reference.conf` to use `geotrellis.raster.gdal` namespace.
 - Replace all uses of `TileDimensions` with `geotrellis.raster.Dimensions[Int]`.
+* Upgraded to `gdal-warp-bindings` 1.0.0.
+* Upgraded to Spark 2.4.5
 * Formally abandoned support for Python 2. Python 2 is dead. Long live Python 2.
 * Introduction of type hints in Python API.
 * Add functions for changing cell values based on either conditions or to achieve a distribution of values. ([#449](https://github.com/locationtech/rasterframes/pull/449))
 * Add `rf_local_min`, `rf_local_max`, and `rf_local_clip` functions.
 * Add cell value scaling functions `rf_rescale` and `rf_standardize`.
 * Add `rf_where` function, similar in spirit to numpy's `where`, or a cell-wise version of Spark SQL's `when` and `otherwise`.
 * Add `rf_sqrt` function to compute cell-wise square root.
-* Upgraded to Spark 2.4.5
 
 ## 0.8.x
 
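The cell-value functions named in these release notes are easiest to see in a tiny example. A minimal sketch (not from this commit), assuming a DataFrame `df` with a tile column `tile` and the signatures `rf_where(condition, x, y)`, `rf_local_greater(tile, rhs)`, and `rf_sqrt(tile)`:

# Minimal sketch, assuming `df` has a tile column named "tile" and that the
# functions carry the signatures shown; verify names against the installed API.
from pyrasterframes.rasterfunctions import rf_where, rf_local_greater, rf_sqrt

result = df.select(
    # cell-wise: keep the cell where it exceeds 100, otherwise use its square root
    rf_where(rf_local_greater(df.tile, 100), df.tile, rf_sqrt(df.tile)).alias("result")
)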

pyrasterframes/src/main/python/docs/__init__.py

Lines changed: 19 additions & 0 deletions
@@ -20,6 +20,25 @@
 
 from pweave import PwebPandocFormatter
 
+# Setuptools/easy_install doesn't properly set the execute bit on the Spark scripts,
+# So this preemptively attempts to do it.
+def _chmodit():
+    try:
+        from importlib.util import find_spec
+        import os
+        module_home = find_spec("pyspark").origin
+        print(module_home)
+        bin_dir = os.path.join(os.path.dirname(module_home), 'bin')
+        for filename in os.listdir(bin_dir):
+            try:
+                os.chmod(os.path.join(bin_dir, filename), mode=0o555, follow_symlinks=True)
+            except OSError:
+                pass
+    except ImportError:
+        pass
+
+
+_chmodit()
+
 
 class PegdownMarkdownFormatter(PwebPandocFormatter):
     def __init__(self, *args, **kwargs):
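A quick way to confirm the effect of `_chmodit()` is to list the permissions on pyspark's bin/ scripts. The check below is a hypothetical illustration (not part of this commit) using only the standard library:

# Hypothetical check: report whether the execute bit is set on each script in
# pyspark's bin/ directory after _chmodit() has run.
import os
import stat
from importlib.util import find_spec

bin_dir = os.path.join(os.path.dirname(find_spec("pyspark").origin), "bin")
for name in sorted(os.listdir(bin_dir)):
    mode = os.stat(os.path.join(bin_dir, name)).st_mode
    print(name, "executable" if mode & stat.S_IXUSR else "not executable")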

pyrasterframes/src/main/python/pyrasterframes/utils.py

Lines changed: 8 additions & 0 deletions
@@ -68,6 +68,12 @@ def find_pyrasterframes_assembly() -> Union[bytes, str]:
     return jarpath[0]
 
 
+def quiet_logs(sc):
+    logger = sc._jvm.org.apache.log4j
+    logger.LogManager.getLogger("geotrellis.raster.gdal").setLevel(logger.Level.ERROR)
+    logger.LogManager.getLogger("akka").setLevel(logger.Level.ERROR)
+
+
 def create_rf_spark_session(master="local[*]", **kwargs: str) -> SparkSession:
     """ Create a SparkSession with pyrasterframes enabled and configured. """
     jar_path = find_pyrasterframes_assembly()

@@ -86,6 +92,8 @@ def create_rf_spark_session(master="local[*]", **kwargs: str) -> SparkSession:
             .config(conf=conf)  # user can override the defaults
             .getOrCreate())
 
+    quiet_logs(spark)
+
     try:
         spark.withRasterFrames()
         return spark
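Because `quiet_logs(spark)` now runs inside `create_rf_spark_session`, callers get the reduced GDAL and akka log noise without any extra setup. A minimal usage sketch, assuming pyrasterframes is installed:

# Minimal usage sketch: the new quiet_logs call fires inside create_rf_spark_session,
# so the GDAL and akka loggers are set to ERROR before the session is returned.
from pyrasterframes.utils import create_rf_spark_session

spark = create_rf_spark_session()  # defaults to master="local[*]"
print(spark.version)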
