Commit a46b3a5

Merge remote-tracking branch 'lt/develop' into feature/mask_by_bits

2 parents: ea8974c + 9de8ebc

2 files changed (+15 −16 lines)


pyrasterframes/src/main/python/docs/masking.pymd

Lines changed: 8 additions & 7 deletions

````diff
@@ -183,18 +183,19 @@ to_clip = to_rasterize.withColumn('clip_raster',
     rf_rasterize('geom_native', rf_geometry('blue_masked'), lit(1), rf_dimensions('blue_masked').cols, rf_dimensions('blue_masked').rows))
 
 # visualize some of the edges of our circle
-to_clip.select('clip_raster', 'blue_masked') \
+to_clip.select('blue_masked', 'clip_raster') \
     .filter(rf_data_cells('clip_raster') > 20) \
     .orderBy(rf_data_cells('clip_raster'))
 ```
 
 Finally, we create a new _tile_ column with the blue band clipped to our circle. Again we will use the `rf_mask` function to pass the NoData regions along from the rasterized geometry.
 
-clipped = to_clip.select('blue_masked',
-                         'clip_raster',
-                         rf_mask('blue_masked', 'clip_raster').alias('blue_clipped')) \
-    .filter(rf_data_cells('clip_raster') > 20) \
-    .orderBy(rf_data_cells('clip_raster'))
-
+```python, clip
+to_clip.select('blue_masked',
+               'clip_raster',
+               rf_mask('blue_masked', 'clip_raster').alias('blue_clipped')) \
+    .filter(rf_data_cells('clip_raster') > 20) \
+    .orderBy(rf_data_cells('clip_raster'))
+```
 
 This kind of clipping technique is further used in @ref:[zonal statistics](zonal-algebra.md).
````
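The `rf_mask` call in the masking.pymd diff above keeps a cell of `blue_masked` only where the corresponding cell of `clip_raster` holds data, and emits NoData everywhere else. As a rough illustration of that cell-wise rule (this is not the pyrasterframes API; the `mask_tile` helper below is hypothetical, with nested lists standing in for tiles and `None` standing in for NoData):

```python
def mask_tile(tile, mask, nodata=None):
    """Return a copy of `tile` with cells set to `nodata`
    wherever the corresponding `mask` cell is NoData."""
    return [
        [cell if m is not nodata else nodata
         for cell, m in zip(t_row, m_row)]
        for t_row, m_row in zip(tile, mask)
    ]

# A 2x2 "blue band" and a rasterized circle (1 inside, NoData outside).
blue = [[10, 20], [30, 40]]
clip = [[1, None], [None, 1]]

print(mask_tile(blue, clip))  # [[10, None], [None, 40]]
```

The real operation runs tile-by-tile over Spark DataFrame columns, but the per-cell semantics are the same: the mask's NoData regions are passed along to the output.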

rf-notebook/src/main/docker/Dockerfile

Lines changed: 7 additions & 9 deletions

````diff
@@ -34,21 +34,19 @@ RUN cd /usr/local && ln -s spark-${APACHE_SPARK_VERSION}-bin-hadoop${HADOOP_VERS
 ENV SPARK_HOME /usr/local/spark
 ENV PYTHONPATH $SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip
 ENV SPARK_OPTS --driver-java-options=-Xms1024M --driver-java-options=-Xmx4096M --driver-java-options=-Dlog4j.logLevel=info
+ENV RF_LIB_LOC=/usr/local/rasterframes
 
-COPY conda_cleanup.sh .
-RUN chmod u+x conda_cleanup.sh
+COPY conda_cleanup.sh $RF_LIB_LOC/
+RUN chmod u+x $RF_LIB_LOC/conda_cleanup.sh
 
 ENV LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/opt/conda/lib"
 # Sphinx (for Notebook->html) and pyarrow (from pyspark build)
 RUN \
-    conda install --quiet --yes pyarrow \
-    anaconda sphinx nbsphinx shapely numpy folium geopandas geojsonio rasterio descartes && \
-    ./conda_cleanup.sh $NB_USER $CONDA_DIR
-
-ENV RF_LIB_LOC=/usr/local/rasterframes
-RUN mkdir $RF_LIB_LOC
+    conda install --quiet --yes --channel conda-forge \
+    pyarrow anaconda sphinx nbsphinx shapely numpy folium geopandas geojsonio rasterio descartes && \
+    $RF_LIB_LOC/conda_cleanup.sh $NB_USER $CONDA_DIR
 
-COPY *.whl $RF_LIB_LOC
+COPY *.whl $RF_LIB_LOC/
 COPY jupyter_notebook_config.py $HOME/.jupyter
 COPY examples $HOME/examples
 
````
