
Commit df7f161

Update README with the new Pipeline/Stage API
1 parent 06ce411 commit df7f161

File tree

1 file changed (+76, -55 lines)

1 file changed

+76
-55
lines changed

README.rst

Lines changed: 76 additions & 55 deletions
@@ -62,6 +62,60 @@ sorts it by the ``X`` dimension:
     metadata = pipeline.metadata
     log = pipeline.log

+Programmatic Pipeline Construction
+................................................................................
+
+The previous example specified the pipeline as a JSON string. Alternatively, a
+pipeline can be constructed by creating ``Stage`` instances and piping them
+together. For example, the previous pipeline can be specified as:
+
+.. code-block:: python
+
+    pipeline = pdal.Reader("1.2-with-color.las") | pdal.Filter.sort(dimension="X")
+
+Stage Objects
+=============
+
+- A stage is an instance of ``pdal.Reader``, ``pdal.Filter`` or ``pdal.Writer``.
+- A stage can be instantiated by passing as keyword arguments the options
+  applicable to the respective PDAL stage. For more on PDAL stages and their
+  options, check the PDAL documentation on `Stage Objects <https://pdal.io/pipeline.html#stage-objects>`__.
+- The ``filename`` option of ``Readers`` and ``Writers`` as well as the ``type``
+  option of ``Filters`` can be passed positionally as the first argument.
+- The ``inputs`` option specifies a sequence of stages to be set as input to the
+  current stage. Each input can be either the string tag of another stage, or
+  the ``Stage`` instance itself.
+- The ``Reader``, ``Filter`` and ``Writer`` classes come with classmethods for
+  all the respective PDAL drivers. For example, ``pdal.Filter.head()`` is a
+  shortcut for ``pdal.Filter(type="filters.head")``.
+
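As a quick illustration of these constructors, here is a minimal sketch; the
file name, the ``tag`` value and the option values are placeholders chosen for
the example:

.. code-block:: python

    import pdal

    # Options are passed as keyword arguments and map to the options of the
    # underlying PDAL stage.
    reader = pdal.Reader(filename="input.las", tag="input")

    # The filename of a Reader/Writer can instead be passed positionally.
    reader = pdal.Reader("input.las", tag="input")

    # Driver classmethods fill in the stage type (filters.head here), and
    # ``inputs`` accepts other stages by their string tag...
    head = pdal.Filter.head(count=100, inputs=["input"])

    # ...or as Stage instances directly.
    sorted_head = pdal.Filter.sort(dimension="X", inputs=[head])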
+Pipeline Objects
+================
+
+A ``pdal.Pipeline`` instance can be created from:
+
+- a JSON string: ``Pipeline(json_string)``
+- a sequence of ``Stage`` instances: ``Pipeline([stage1, stage2])``
+- a single ``Stage`` with the ``Stage.pipeline`` method: ``stage.pipeline()``
+- nothing: ``Pipeline()`` creates a pipeline with no stages.
+- joining ``Stage`` and/or other ``Pipeline`` instances together with the pipe
+  operator (``|``):
+
+  - ``stage1 | stage2``
+  - ``stage1 | pipeline1``
+  - ``pipeline1 | stage1``
+  - ``pipeline1 | pipeline2``
+
+Every application of the pipe operator creates a new ``Pipeline`` instance. To
+update an existing ``Pipeline`` use the respective in-place pipe operator (``|=``):
+
+.. code-block:: python
+
+    # update pipeline in-place
+    pipeline = pdal.Pipeline()
+    pipeline |= stage
+    pipeline |= pipeline2
+
 Reading using Numpy Arrays
 ................................................................................

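The construction forms listed under ``Pipeline Objects`` above compose freely;
a minimal sketch, where the file names and stage options are placeholders:

.. code-block:: python

    import pdal

    reader = pdal.Reader("input.las")
    sorter = pdal.Filter.sort(dimension="X")
    writer = pdal.Writer.las("sorted.las")

    # Equivalent ways to build the same three-stage pipeline:
    p1 = pdal.Pipeline([reader, sorter, writer])  # from a sequence of stages
    p2 = reader.pipeline() | sorter | writer      # from a single stage, then piped
    p3 = reader | (sorter | writer)               # a stage piped with a pipeline

    # Or grow an empty pipeline in place with |=
    p4 = pdal.Pipeline()
    p4 |= reader
    p4 |= sorter | writer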

@@ -75,77 +129,44 @@ PDAL and Python:

 .. code-block:: python

-    data = "https://github.com/PDAL/PDAL/blob/master/test/data/las/1.2-with-color.las?raw=true"
-
-
-    json = """
-    {
-      "pipeline": [
-        {
-          "type": "readers.las",
-          "filename": "%s"
-        }
-      ]
-    }"""
-
     import pdal
     import numpy as np
-    pipeline = pdal.Pipeline(json % data)
-    count = pipeline.execute()

-    # get the data from the first array
+    data = "https://github.com/PDAL/PDAL/blob/master/test/data/las/1.2-with-color.las?raw=true"
+
+    pipeline = pdal.Reader.las(filename=data).pipeline()
+    print(pipeline.execute()) # 1065 points
+
+    # Get the data from the first array
     # [array([(637012.24, 849028.31, 431.66, 143, 1,
     # 1, 1, 0, 1, -9., 132, 7326, 245380.78254963, 68, 77, 88),
     # dtype=[('X', '<f8'), ('Y', '<f8'), ('Z', '<f8'), ('Intensity', '<u2'),
     # ('ReturnNumber', 'u1'), ('NumberOfReturns', 'u1'), ('ScanDirectionFlag', 'u1'),
     # ('EdgeOfFlightLine', 'u1'), ('Classification', 'u1'), ('ScanAngleRank', '<f4'),
     # ('UserData', 'u1'), ('PointSourceId', '<u2'),
     # ('GpsTime', '<f8'), ('Red', '<u2'), ('Green', '<u2'), ('Blue', '<u2')])
-
     arr = pipeline.arrays[0]
-    print (len(arr)) # 1065 points
-

     # Filter out entries that have intensity < 50
-    intensity = arr[arr['Intensity'] > 30]
-    print (len(intensity)) # 704 points
+    intensity = arr[arr["Intensity"] > 30]
+    print(len(intensity)) # 704 points

-
-    # Now use pdal to clamp points that have intensity
-    # 100 <= v < 300, and there are 387
-    clamp =u"""{
-      "pipeline":[
-        {
-          "type":"filters.range",
-          "limits":"Intensity[100:300)"
-        }
-      ]
-    }"""
-
-    p = pdal.Pipeline(clamp, [intensity])
-    count = p.execute()
-    clamped = p.arrays[0]
-    print (count)
+    # Now use pdal to clamp points that have intensity 100 <= v < 300
+    pipeline = pdal.Filter.range(limits="Intensity[100:300)").pipeline(intensity)
+    print(pipeline.execute()) # 387 points
+    clamped = pipeline.arrays[0]

     # Write our intensity data to an LAS file
-    output =u"""{
-      "pipeline":[
-        {
-          "type":"writers.las",
-          "filename":"clamped.las",
-          "offset_x":"auto",
-          "offset_y":"auto",
-          "offset_z":"auto",
-          "scale_x":0.01,
-          "scale_y":0.01,
-          "scale_z":0.01
-        }
-      ]
-    }"""
-
-    p = pdal.Pipeline(output, [clamped])
-    count = p.execute()
-    print (count)
+    pipeline = pdal.Writer.las(
+        filename="clamped2.las",
+        offset_x="auto",
+        offset_y="auto",
+        offset_z="auto",
+        scale_x=0.01,
+        scale_y=0.01,
+        scale_z=0.01,
+    ).pipeline(clamped)
+    print(pipeline.execute()) # 387 points


 Accessing Mesh Data
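The example above passes NumPy arrays between separate pipelines; the same
read/clamp/write flow can also be expressed as one chained pipeline. A sketch
that skips the intermediate NumPy filtering step; the output file name is a
placeholder:

.. code-block:: python

    import pdal

    data = "https://github.com/PDAL/PDAL/blob/master/test/data/las/1.2-with-color.las?raw=true"

    pipeline = (
        pdal.Reader.las(filename=data)
        | pdal.Filter.range(limits="Intensity[100:300)")
        | pdal.Writer.las(filename="clamped_chained.las", scale_x=0.01, scale_y=0.01, scale_z=0.01)
    )
    count = pipeline.execute()      # point count after execution
    clamped = pipeline.arrays[0]    # the filtered points are also available as arrays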
