
Commit df40a3e

Merge pull request #35 from fugro-oss/RenameFinal
Rename LasDatasets to LASDatasets
2 parents 94c35c7 + b51b99d commit df40a3e

27 files changed: +236 -236 lines changed

CONTRIBUTING.md

Lines changed: 2 additions & 2 deletions
@@ -1,6 +1,6 @@
-# Contributing to LasDatasets.jl
+# Contributing to LASDatasets.jl

-**Thank you for your interest in LasDatasets.jl. Your contributions are highly welcome.**
+**Thank you for your interest in LASDatasets.jl. Your contributions are highly welcome.**

There are multiple ways of getting involved:

Project.toml

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-name = "LasDatasets"
+name = "LASDatasets"
uuid = "cc498e2a-d443-4943-8f26-2a8a0f3c7cdb"
authors = ["BenCurran98 <b.curran@fugro.com>"]
version = "0.2.2"

README.md

Lines changed: 7 additions & 7 deletions
@@ -1,9 +1,9 @@
-# LasDatasets.jl
+# LASDatasets.jl

-[![CI](https://github.com/fugro-oss/LasDatasets.jl/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/fugro-oss/LasDatasets.jl/actions/workflows/ci.yml)
-[![](https://img.shields.io/badge/docs-latest-blue.svg)](https://fugro-oss.github.io/LasDatasets.jl/dev)
+[![CI](https://github.com/fugro-oss/LASDatasets.jl/actions/workflows/ci.yml/badge.svg?branch=main)](https://github.com/fugro-oss/LASDatasets.jl/actions/workflows/ci.yml)
+[![](https://img.shields.io/badge/docs-latest-blue.svg)](https://fugro-oss.github.io/LASDatasets.jl/dev)

-You can find the latest documentation [here](https://fugro-oss.github.io/LasDatasets.jl/dev/)
+You can find the latest documentation [here](https://fugro-oss.github.io/LASDatasets.jl/dev/)

A Julia package for reading and writing *LAS* data. *LAS* is a public file format for saving and loading 3D point cloud data, and its source repository can be found [here](https://github.com/ASPRSorg/LAS). This package currently supports *LAS* specifications 1.1-1.4 (see [here](https://www.asprs.org/wp-content/uploads/2019/03/LAS_1_4_r14.pdf) for the 1.4 spec.)

@@ -20,8 +20,8 @@ These instructions will get you a copy of the project up and running on your loc

```
using Pkg
-Pkg.add("git@github.com:fugro-oss/LasDatasets.jl.git")
-using LasDatasets
+Pkg.add("git@github.com:fugro-oss/LASDatasets.jl.git")
+using LASDatasets
```

And you're ready to go!
@@ -33,7 +33,7 @@ you follow the [CODE_OF_CONDUCT.md](CODE_OF_CONDUCT.md).

## Versioning

-We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/fugro-oss/LasDatasets.jl/tags).
+We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/fugro-oss/LASDatasets.jl/tags).

## Authors

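The installation hunk above keeps the SSH-style address verbatim; as a hedged alternative sketch, the same install can be written with Pkg's `url` keyword (this assumes the repository stays at `github.com/fugro-oss/LASDatasets.jl` and nothing more than a working Julia environment):

```julia
# Hedged installation sketch: add the package straight from its Git URL
# under the new LASDatasets name, then load it.
using Pkg

Pkg.add(url = "https://github.com/fugro-oss/LASDatasets.jl.git")

using LASDatasets
```
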
docs/Project.toml

Lines changed: 1 addition & 1 deletion
@@ -1,3 +1,3 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
-LasDatasets = "cc498e2a-d443-4943-8f26-2a8a0f3c7cdb"
+LASDatasets = "cc498e2a-d443-4943-8f26-2a8a0f3c7cdb"

docs/make.jl

Lines changed: 4 additions & 4 deletions
@@ -1,10 +1,10 @@
-using Documenter, LasDatasets
+using Documenter, LASDatasets

push!(LOAD_PATH,"../src/")

makedocs(
-    modules = [LasDatasets],
-    sitename = "LasDatasets.jl",
+    modules = [LASDatasets],
+    sitename = "LASDatasets.jl",
    pages = [
        "Home" => "index.md",
        "Interface" => "interface.md",
@@ -19,7 +19,7 @@ makedocs(
)

deploydocs(
-    repo = "github.com/fugro-oss/LasDatasets.jl.git",
+    repo = "github.com/fugro-oss/LASDatasets.jl.git",
    versions = ["stable" => "v^", "v#.#", "dev" => "main"],
    push_preview=true
)

docs/src/api.md

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
# API

```@autodocs
-Modules = [LasDatasets]
+Modules = [LASDatasets]
```

docs/src/header.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
# Header

-Each *LAS* file starts with a block of header information that contains metadata for the whole file. *LasDatasets.jl* uses the `LasHeader` struct to wrap around this data and defines a user-friendly interface to modify certain aspects of it.
+Each *LAS* file starts with a block of header information that contains metadata for the whole file. *LASDatasets.jl* uses the `LasHeader` struct to wrap around this data and defines a user-friendly interface to modify certain aspects of it.

```@docs; canonical = false
LasHeader

docs/src/index.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-# LasDatasets.jl
+# LASDatasets.jl

A Julia package for reading and writing *LAS* data. *LAS* is a public file format for saving and loading 3D point cloud data, and its source repository can be found [here](https://github.com/ASPRSorg/LAS). This package currently supports *LAS* specifications 1.1-1.4 (see [here](https://www.asprs.org/wp-content/uploads/2019/03/LAS_1_4_r14.pdf) for the 1.4 spec.)

docs/src/interface.md

Lines changed: 9 additions & 9 deletions
@@ -1,22 +1,22 @@
# High-Level Interface

-*LasDatasets.jl* provides a number of high-level functions to easily manipulate your *LAS* data.
+*LASDatasets.jl* provides a number of high-level functions to easily manipulate your *LAS* data.

## *LAS* Datasets

-A `LasDataset` is a wrapper around data from a *LAS* file that acts as an interface to read, write and modify your *LAS* data. In general, a *LAS* file will have the following contents:
+A `LASDataset` is a wrapper around data from a *LAS* file that acts as an interface to read, write and modify your *LAS* data. In general, a *LAS* file will have the following contents:
* File Header: contains metadata about the file contents and byte layout
* *VLRs*: Variable length records of data that appear before the point records
* User-defined bytes: Additional bytes included after the last *VLR* and before the first point record
* *LAS* point records: data assigned to each point in the point cloud (following a specific format specified in the header)
* *EVLRs* : Extended *VLRs* that come after the point records (allows larger data payloads)

-These are contained in a `LasDataset` as follows
+These are contained in a `LASDataset` as follows
```@docs; canonical = false
-LasDataset
+LASDataset
```

-You can query the contents of your `LasDataset` by using the following functions:
+You can query the contents of your `LASDataset` by using the following functions:
```@docs; canonical = false
get_header
get_pointcloud
@@ -26,7 +26,7 @@ get_user_defined_bytes
```

## Reading
-To read the entire contents of a *LAS* or *LAZ* file, you can use the `load_las` function. This returns a `LasDataset` with all the properties listed above. You also have the option of only loading certain point fields.
+To read the entire contents of a *LAS* or *LAZ* file, you can use the `load_las` function. This returns a `LASDataset` with all the properties listed above. You also have the option of only loading certain point fields.

```julia
# read the full dataset
@@ -54,7 +54,7 @@ load_vlrs
```

## Writing
-You can write the contents of your `LasDataset` to a file by using the `save_las` function. Note that this takes either a `LasDataset` on its own or a tabular point cloud with *(E)VLRs* and user-defined bytes supplied separately.
+You can write the contents of your `LASDataset` to a file by using the `save_las` function. Note that this takes either a `LASDataset` on its own or a tabular point cloud with *(E)VLRs* and user-defined bytes supplied separately.

```@docs; canonical = false
save_las
@@ -74,10 +74,10 @@ pc = Table(position = rand(SVector{3, Float64}, 10), classification = rand(UIn8,
save_las("my_las.las", pc)
```

-Note that when you supply just the point cloud outside of a `LasDataset`, *LasDatasets.jl* will automatically construct the appropriate header for you so you don't need to worry about the specifics of appropriate point formats etc.
+Note that when you supply just the point cloud outside of a `LASDataset`, *LASDatasets.jl* will automatically construct the appropriate header for you so you don't need to worry about the specifics of appropriate point formats etc.

## Modifying LAS Contents
-You can modify point fields in your `LasDataset` by adding new columns or merging in values from an existing vector.
+You can modify point fields in your `LASDataset` by adding new columns or merging in values from an existing vector.

```@docs; canonical = false
add_column!

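Taken together, the renamed calls in the interface hunks above can be sketched end to end as follows (a hedged example, not part of the diff: it assumes `Table` comes from TypedTables.jl, `SVector` from StaticArrays.jl, and the `add_column!` argument order shown is an assumption for illustration only):

```julia
using LASDatasets
using StaticArrays: SVector
using TypedTables: Table

# Build a small tabular point cloud, mirroring the save_las example in the diff
pc = Table(
    position = rand(SVector{3, Float64}, 10),
    classification = rand(UInt8, 10),
)

# Write it out; the appropriate header is constructed automatically
save_las("my_las.las", pc)

# Read the whole file back as a LASDataset and query its contents
ds = load_las("my_las.las")
header = get_header(ds)
points = get_pointcloud(ds)

# Attach a new per-point column (argument order assumed here for illustration)
add_column!(ds, :intensity, rand(UInt16, 10))
```
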
docs/src/internals.md

Lines changed: 13 additions & 13 deletions
@@ -2,11 +2,11 @@

## Data Consistency

-When creating a `LasDataset` or writing a tabular point cloud out to a file, we need to make sure that the header information we provide is consistent with that of the point cloud and any *VLRs* and user bytes. Internally, this is done using the function `make_consistent_header!`, which compares a `LasHeader` and some *LAS* data and makes sure the header has the appropriate data offsets, flags and other metadata. This will, for example, make sure that the numbers of points, *VLRs* and *EVLRs* are consistent with the data we've provided, so your `LasDataset` is guaranteed to be consistent.
+When creating a `LASDataset` or writing a tabular point cloud out to a file, we need to make sure that the header information we provide is consistent with that of the point cloud and any *VLRs* and user bytes. Internally, this is done using the function `make_consistent_header!`, which compares a `LasHeader` and some *LAS* data and makes sure the header has the appropriate data offsets, flags and other metadata. This will, for example, make sure that the numbers of points, *VLRs* and *EVLRs* are consistent with the data we've provided, so your `LASDataset` is guaranteed to be consistent.

```@docs; canonical = false
-LasDatasets.make_consistent_header!
-LasDatasets.make_consistent_header
+LASDatasets.make_consistent_header!
+LASDatasets.make_consistent_header
```

## Third Party Packages
@@ -20,21 +20,21 @@ We also use [BufferedStreams.jl](https://github.com/JuliaIO/BufferedStreams.jl)
As outlined in the [User Fields Section](./user_fields.md), in order to offer full support of "extra point data" in our *LAS* files, we treat *LAS* point records as having a point, extra user fields and a set of undocumented bytes. Internally, however, this is broken up into 4 separate classes each implementing the `LasRecord` abstract type. These correspond to each combination of a point with/without user fields/undocumented bytes.

```@docs; canonical = false
-LasDatasets.LasRecord
-LasDatasets.PointRecord
-LasDatasets.ExtendedPointRecord
-LasDatasets.UndocPointRecord
-LasDatasets.FullRecord
+LASDatasets.LasRecord
+LASDatasets.PointRecord
+LASDatasets.ExtendedPointRecord
+LASDatasets.UndocPointRecord
+LASDatasets.FullRecord
```

This was done largely to increase performance of reading point records, since having one single type for point records would require more conditional checks to see if certain extra fields need to be read from a file which ends up congesting the read process. Instead, we use *Julia*'s multiple dispatch and define `Base.read` and `Base.write` methods for each record type and avoid these checks and also decrease the type inference time when reading these into a vector.

## Reading Points Iterator

-When reading, we also wrap our IO stream in an iterator, `LasDatasets.ReadPointsIterator`, to reduce the overhead of reading point records sequentially. It turns out that calling `map(r -> r, iter)` where `iter` is a `LasDatasets.ReadPointsIterator` is much faster than calling `map(_ -> read(io, TRecord), 1:num_points)`
+When reading, we also wrap our IO stream in an iterator, `LASDatasets.ReadPointsIterator`, to reduce the overhead of reading point records sequentially. It turns out that calling `map(r -> r, iter)` where `iter` is a `LASDatasets.ReadPointsIterator` is much faster than calling `map(_ -> read(io, TRecord), 1:num_points)`

```@docs; canonical = false
-LasDatasets.ReadPointsIterator
+LASDatasets.ReadPointsIterator
```

## Writing Optimisations
@@ -43,12 +43,12 @@ Typically, *Julia* is slower at performing multiple consecutive smaller writes t
* How many user fields in this record and their data size in bytes and
* How many undocumented bytes there are.

-This is done using `LasDatasets.get_record_bytes`, which takes a collection of *LAS* records and writes each *LAS* field, user field and extra bytes collection into its correct location in the final byte vector.
+This is done using `LASDatasets.get_record_bytes`, which takes a collection of *LAS* records and writes each *LAS* field, user field and extra bytes collection into its correct location in the final byte vector.

In order to do this, we need to frequently access each field in a (potentially huge) list of records, which in normal circumstances is slow. We instead first pass our records into a `StructVector` using [StructArrays.jl](https://github.com/JuliaArrays/StructArrays.jl) which vastly increases the speed at which we can access these fields and broadcast over them.

```@docs; canonical = false
-LasDatasets.get_record_bytes
+LASDatasets.get_record_bytes
```

## Automatic Support for User Fields
@@ -57,6 +57,6 @@ In order for the system to automatically handle a user supplying their own custo

Firstly, the *LAS* 1.4 spec officially supports the following data types directly: `UInt8`, `Int8`, `UInt16`, `Int16`, `UInt32`, `Int32`, `UInt64`, `Int64`, `Float32` and `Float64`

-This means that every `ExtraBytes` *VLR* **must** have a data type among these values (note that vectors are not directly supported). *LasDatasets.jl* supports static vectors (static sizing is essential) as user fields as well by internally separating out vector components and adding an `ExtraBytes` *VLR* for each component following the naming convention in the spec. That is, for a user field with `N` entries, the individual component names that are documented in the *VLRs* are "col [0]", "col [1]", ..., "col [N - 1]".
+This means that every `ExtraBytes` *VLR* **must** have a data type among these values (note that vectors are not directly supported). *LASDatasets.jl* supports static vectors (static sizing is essential) as user fields as well by internally separating out vector components and adding an `ExtraBytes` *VLR* for each component following the naming convention in the spec. That is, for a user field with `N` entries, the individual component names that are documented in the *VLRs* are "col [0]", "col [1]", ..., "col [N - 1]".

When a user passes a custom field to the system, it will firstly check that the data type for this field is either one of the above types or an `SVector` of one. If it is a vector, it will construct a list of the component element field names as above. Then, it will extract all `ExtraBytes` *VLRs* and check if any of them have matching names and update them iff they exist so their data type matches the new type supplied. If these are new fields, a new `ExtraBytes` *VLR* will be added per field name. Finally, the header is updated to reflect the new number of *VLRs*, the new data offsets and the new point record lengths.

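To make the component-naming convention in the user-fields hunk above concrete, here is a hypothetical helper (not part of the package) that expands a static-vector field name into its per-component `ExtraBytes` names:

```julia
# Hypothetical helper, for illustration only: a field "col" with N entries is
# documented in the ExtraBytes VLRs as "col [0]", "col [1]", ..., "col [N - 1]".
component_names(field_name::AbstractString, N::Integer) = ["$(field_name) [$(i)]" for i in 0:N - 1]

component_names("col", 3)  # returns ["col [0]", "col [1]", "col [2]"]
```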