Merged
3 changes: 3 additions & 0 deletions .codespellrc
Original file line number Diff line number Diff line change
@@ -0,0 +1,3 @@
[codespell]
skip = tst_wrf_reads.c,tst_utf8_phrases.c,t_ncio.in,env,*.dods,*.dmp
ignore-words-list = berkley,boths,casesensistive,ccompiler,certificat,clen,clos,conly,datas,equil,fave,filetests,firs,fpr,gage,ges,hace,hieght,indx,inout,installin,intergers,lod,lond,makin,mapp,nam,nce,nd,ndefined,nmme,oce,offsetp,oint,olt,ot,pass-thru,pres,relm,restructed,rin,sav,shs,statics,stopp,testng,tje,udo,varian,varn,vas
2 changes: 1 addition & 1 deletion CMakeLists.txt
@@ -732,7 +732,7 @@ if(NOT WIN32)
set(USE_STDIO ON CACHE BOOL "")
endif()

# FFIO insteaad of PosixIO
# FFIO instead of PosixIO
option(NETCDF_ENABLE_FFIO "If true, use ffio instead of posixio" OFF)
if(NETCDF_ENABLE_FFIO)
set(USE_FFIO ON CACHE BOOL "")
18 changes: 9 additions & 9 deletions RELEASE_NOTES.md
@@ -16,7 +16,7 @@ This file contains a high-level description of this package's evolution. Release
* Add new environmental logging variable for `netCDF4` related logging subsystem, `NC4LOGGING`. If `libnetcdf` is compiled with logging enabled, logs can be enabled at runtime by setting this environmental variable to the desired log level.
* Update ncjson.[ch] and ncproplist.[ch]. Also fix references to old API. Also fix include/netcdf_ncjson.h and include/netcdf_proplist.h builds. See [Github #3086](https://github.com/Unidata/netcdf-c/pull/3086) for more information.
* Refactor drc.c to move many of its purely utility functions into dutil.c. Also change the NC_mktmp signature. Change other files to match. See [Github #3094](https://github.com/Unidata/netcdf-c/pull/3094) for more information.
* Provide an auxilliary function, `ncaux_parse_provenance()`, that allows users to parse the _NCProperties attribute into a collection of character pointers. See [Github #3088](https://github.com/Unidata/netcdf-c/pull/3088) for more information.
* Provide an auxiliary function, `ncaux_parse_provenance()`, that allows users to parse the _NCProperties attribute into a collection of character pointers. See [Github #3088](https://github.com/Unidata/netcdf-c/pull/3088) for more information.
* Fix a namespace problem in tinyxml2.cpp. Note that this is a visual studio problem hence use of _MSC_VER. Also turn off DAP4 tests against Hyrax server until DAP4 spec problems are fixed. See [Github #3089](https://github.com/Unidata/netcdf-c/pull/3089) for more information.


@@ -165,7 +165,7 @@ This file contains a high-level description of this package's evolution. Release
* Modernize CURL in netCDF cmake by \@WardF in https://github.com/Unidata/netcdf-c/pull/2904
* Misc clang-tidy fixes, and added a .clang-tidy config file by \@seanm in https://github.com/Unidata/netcdf-c/pull/2875
* Rename the vendored strlcat symbol by \@weiznich in https://github.com/Unidata/netcdf-c/pull/2906
* Remove superflous check for libcurl by \@WardF in https://github.com/Unidata/netcdf-c/pull/2907
* Remove superfluous check for libcurl by \@WardF in https://github.com/Unidata/netcdf-c/pull/2907
* Fix warnings in `ncgen3` by \@ZedThree in https://github.com/Unidata/netcdf-c/pull/2900
* Add CI for a Windows Runner on Github Actions. by \@WardF in https://github.com/Unidata/netcdf-c/pull/2910
* Fix conversion warnings in libdispatch by \@ZedThree in https://github.com/Unidata/netcdf-c/pull/2905
@@ -192,8 +192,8 @@ This file contains a high-level description of this package's evolution. Release
* Fix cmake-based libnetcdf.settings.in by \@WardF in https://github.com/Unidata/netcdf-c/pull/2944
* CI: Add Cygwin CMake run by \@DWesl in https://github.com/Unidata/netcdf-c/pull/2930
* Convert NCzarr meta-data to use only Zarr attributes by \@DennisHeimbigner in https://github.com/Unidata/netcdf-c/pull/2936
* Fix: CMAKE_MODULE_PATH contents is being overriden with -D contents, not merged with by \@gorloffslava in https://github.com/Unidata/netcdf-c/pull/2946
* Propegate change to metadata and use of anonymous dimensions to NCZarr test by \@WardF in https://github.com/Unidata/netcdf-c/pull/2949
* Fix: CMAKE_MODULE_PATH contents is being overridden with -D contents, not merged with by \@gorloffslava in https://github.com/Unidata/netcdf-c/pull/2946
* Propagate change to metadata and use of anonymous dimensions to NCZarr test by \@WardF in https://github.com/Unidata/netcdf-c/pull/2949
* S3 Mode url reconstruction defaults to wrong server type by \@mannreis in https://github.com/Unidata/netcdf-c/pull/2947
* Fix most warnings in libdap2 by \@ZedThree in https://github.com/Unidata/netcdf-c/pull/2887
* Check if HDF5 "file" is a DAOS object by \@brtnfld in https://github.com/Unidata/netcdf-c/pull/2021
@@ -384,7 +384,7 @@ This is the maintenance release which adds support for HDF5 version 1.14.0, in a
* Adding all-error-codes.md back in to distribution documentation. by \@WardF in https://github.com/Unidata/netcdf-c/pull/2501
* Update hdf5 version in github actions. by \@WardF in https://github.com/Unidata/netcdf-c/pull/2504
* Minor update to doxygen function documentation by \@gsjaardema in https://github.com/Unidata/netcdf-c/pull/2451
* Fix some addtional errors in NCZarr by \@DennisHeimbigner in https://github.com/Unidata/netcdf-c/pull/2503
* Fix some additional errors in NCZarr by \@DennisHeimbigner in https://github.com/Unidata/netcdf-c/pull/2503
* Cleanup szip handling some more by \@DennisHeimbigner in https://github.com/Unidata/netcdf-c/pull/2421
* Check for zstd development headers in autotools by \@WardF in https://github.com/Unidata/netcdf-c/pull/2507
* Add new options to nc-config by \@WardF in https://github.com/Unidata/netcdf-c/pull/2509
@@ -494,7 +494,7 @@ for domain specific info such as used by GDAL/Zarr. See [Github #2278](https://g
* [Enhancement] Improve support for msys2+mingw platform. See [Github #2171](https://github.com/Unidata/netcdf-c/pull/2171).
* [Bug Fix] Clean up the various inter-test dependencies in ncdump for CMake. See [Github #2168](https://github.com/Unidata/netcdf-c/pull/2168).
* [Bug Fix] Fix use of non-aws appliances. See [Github #2152](https://github.com/Unidata/netcdf-c/pull/2152).
* [Enhancement] Added options to suppress the new behavior from [Github #2135](https://github.com/Unidata/netcdf-c/pull/2135). The options for `cmake` and `configure` are, respectively `-DENABLE_LIBXML2` and `--(enable/disable)-libxml2`. Both of these options defaul to 'on/enabled'. When disabled, the bundled `ezxml` XML interpreter is used regardless of whether `libxml2` is present on the system.
* [Enhancement] Added options to suppress the new behavior from [Github #2135](https://github.com/Unidata/netcdf-c/pull/2135). The options for `cmake` and `configure` are, respectively `-DENABLE_LIBXML2` and `--(enable/disable)-libxml2`. Both of these options default to 'on/enabled'. When disabled, the bundled `ezxml` XML interpreter is used regardless of whether `libxml2` is present on the system.
* [Enhancement] Support optional use of libxml2, otherwise default to ezxml. See [Github #2135](https://github.com/Unidata/netcdf-c/pull/2135) -- H/T to [Egbert Eich](https://github.com/e4t).
* [Bug Fix] Fix several os related errors. See [Github #2138](https://github.com/Unidata/netcdf-c/pull/2138).
* [Enhancement] Support byte-range reading of netcdf-3 files stored in private buckets in S3. See [Github #2134](https://github.com/Unidata/netcdf-c/pull/2134)
@@ -580,7 +580,7 @@ nc_inq_var_filter [https://github.com/Unidata/netcdf-c/issues/1693].
* [Bug Fix] Use proper CURLOPT values for VERIFYHOST and VERIFYPEER; the semantics for VERIFYHOST in particular changed. Documented in NUG/DAP2.md. See [https://github.com/Unidata/netcdf-c/issues/1684].
* [Bug Fix][cmake] Correct an issue with parallel filter test logic in CMake-based builds.
* [Bug Fix] Now allow nc_inq_var_deflate()/nc_inq_var_szip() to be called for all formats, not just HDF5. Non-HDF5 files return NC_NOERR and report no compression in use. This reverts behavior that was changed in the 4.7.4 release. See [https://github.com/Unidata/netcdf-c/issues/1691].
* [Bug Fix] Compiling on a big-endian machine exposes some missing forward delcarations in dfilter.c.
* [Bug Fix] Compiling on a big-endian machine exposes some missing forward declarations in dfilter.c.
* [File Change] Change from HDF5 v1.6 format compatibility, back to v1.8 compatibility, for newly created files. The superblock changes from version 0 back to version 2. An exception is when using libhdf5 deprecated versions 1.10.0 and 1.10.1, which can only create v1.6 compatible format. Full backward read/write compatibility for netCDF-4 is maintained in all cases. See [Github #951](https://github.com/Unidata/netcdf-c/issues/951).

## 4.7.4 - March 27, 2020
@@ -590,7 +590,7 @@ nc_inq_var_filter [https://github.com/Unidata/netcdf-c/issues/1693].
* [Bug Fix] Correct behavior for the command line utilities when directly accessing a directory using utf8 characters. See [Github #1669](https://github.com/Unidata/netcdf-c/issues/1669), [Github #1668](https://github.com/Unidata/netcdf-c/issues/1668) and [Github #1666](https://github.com/Unidata/netcdf-c/issues/1666) for more information.
* [Bug Fix] Attempts to set filters or chunked storage on scalar vars will now return NC_EINVAL. Scalar vars cannot be chunked, and only chunked vars can have filters. Previously the library ignored these attempts, and always storing scalars as contiguous storage. See [https://github.com/Unidata/netcdf-c/issues/1644].
* [Enhancement] Support has been added for multiple filters per variable. See [https://github.com/Unidata/netcdf-c/issues/1584].
* [Enhancement] Now nc_inq_var_szip retuns 0 for parameter values if szip is not in use for var. See [https://github.com/Unidata/netcdf-c/issues/1618].
* [Enhancement] Now nc_inq_var_szip returns 0 for parameter values if szip is not in use for var. See [https://github.com/Unidata/netcdf-c/issues/1618].
* [Enhancement] Now allow parallel I/O with filters, for HDF5-1.10.3 and later. See [https://github.com/Unidata/netcdf-c/issues/1473].
* [Enhancement] Increased default size of cache buffer to 16 MB, from 4 MB. Increased number of slots to 4133. See [https://github.com/Unidata/netcdf-c/issues/1541].
* [Enhancement] Allow zlib compression to be used with parallel I/O writes, if HDF5 version is 1.10.3 or greater. See [https://github.com/Unidata/netcdf-c/issues/1580].
@@ -683,7 +683,7 @@ See [GitHub #1251](https://github.com/Unidata/netcdf-c/issues/1251).
## 4.6.1 - March 19, 2018

* [Bug Fix] Corrected an issue which could result in a dap4 failure. See [Github #888](https://github.com/Unidata/netcdf-c/pull/888) for more information.
* [Bug Fix][Enhancement] Allow `nccopy` to control output filter suppresion. See [Github #894](https://github.com/Unidata/netcdf-c/pull/894) for more information.
* [Bug Fix][Enhancement] Allow `nccopy` to control output filter suppression. See [Github #894](https://github.com/Unidata/netcdf-c/pull/894) for more information.
* [Enhancement] Reverted some new behaviors that, while in line with the netCDF specification, broke existing workflows. See [Github #843](https://github.com/Unidata/netcdf-c/issues/843) for more information.
* [Bug Fix] Improved support for CRT builds with Visual Studio, improves zlib detection in hdf5 library. See [Github #853](https://github.com/Unidata/netcdf-c/pull/853) for more information.
* [Enhancement][Internal] Moved HDF4 into a distinct dispatch layer. See [Github #849](https://github.com/Unidata/netcdf-c/pull/849) for more information.
2 changes: 1 addition & 1 deletion configure.ac
@@ -1800,7 +1800,7 @@ if test "x$enable_hdf5" = xyes ; then
if test "x$hdf5_parallel" = xno; then
# If user specifically asked for parallel4, then error out.
if test "x$user_set_parallel4" = xyes; then
AC_MSG_ERROR([Paralllel IO in netCDF-4 requested, but HDF5 does not provide parallel IO.])
AC_MSG_ERROR([Parallel IO in netCDF-4 requested, but HDF5 does not provide parallel IO.])
fi
# User didn't specify, so disable parallel4
enable_parallel4=no
2 changes: 1 addition & 1 deletion dap4_test/test_earthdata.sh
@@ -10,7 +10,7 @@ THISDIR=`pwd`
cd $ISOPATH

# This shell script tests reading of
# publically accessible DAP4 datasets on earthdata
# publicly accessible DAP4 datasets on earthdata

# Setup
# Assume environment variable USERPWD="username:password"
6 changes: 3 additions & 3 deletions docs/cloud.md
@@ -104,7 +104,7 @@ These are as follows.

1. _--enable-s3_ -- Enable S3 support.
2. _--enable-s3-internal_ -- Force use of the *nch5s3comms* SDK instead of the *aws-cpp-sdk* (assuming it is available).
3. _--with-s3-testing_=yes|no|public -- "yes" means do all S3 tests, "no" means do no S3 testing, "public" means do only those tests that involve publically accessible S3 data.
3. _--with-s3-testing_=yes|no|public -- "yes" means do all S3 tests, "no" means do no S3 testing, "public" means do only those tests that involve publicly accessible S3 data.

__A note about using S3 with Automake.__
If S3 support is desired, and using the Amazon "aws-sdk-cpp" SDK, and using Automake, then LDFLAGS must be properly set, namely to this.
@@ -119,9 +119,9 @@ Note also that if S3 support is enabled, then you need to have a C++ compiler in

The necessary CMake flags are as follows (with defaults)

1. *-DNETCDF_ENABLE_S3* -- Controll S3 support
1. *-DNETCDF_ENABLE_S3* -- Control S3 support
2. *-DNETCDF_ENABLE_S3_INTERNAL* -- Force use of the *nch5s3comms* SDK instead of the *aws-cpp-sdk*.
3. *-DWITH-S3-TESTING_=ON|OFF|PUBLIC -- "ON" means do all S3 tests, "OFF" means do no S3 testing, "PUBLIC" means do only those tests that involve publically accessible S3 data.
3. *-DWITH-S3-TESTING_=ON|OFF|PUBLIC -- "ON" means do all S3 tests, "OFF" means do no S3 testing, "PUBLIC" means do only those tests that involve publicly accessible S3 data.

Note that unlike Automake, CMake can properly locate C++ libraries, so it should not be necessary to specify _-laws-cpp-sdk-s3_ assuming that the aws s3 libraries are installed in the default location.
For CMake with Visual Studio, the default location is here:
4 changes: 2 additions & 2 deletions docs/file_format_specifications.md
@@ -534,7 +534,7 @@ In the case where a coordinate variable is defined for a dimension,
the HDF5 dimscale matches the type of the netCDF coordinate variable,
and contains the coordinate data.

A further difficulty arrises when an n-dimensional coordinate variable
A further difficulty arises when an n-dimensional coordinate variable
is defined, where n is greater than one. NetCDF allows such coordinate
variables, but the HDF5 model does not allow dimension scales to be
attached to other dimension scales, making it impossible to completely
@@ -645,7 +645,7 @@ NC_STRING).
An odd case may arise in which the user defines a variable with the
same name as a dimension, but which is not intended to be the
coordinate variable for that dimension. In this case the string
"_nc4_non_coord_" is pre-pended to the name of the HDF5 dataset, and
"_nc4_non_coord_" is prepended to the name of the HDF5 dataset, and
stripped from the name for the netCDF API.

\subsection atts_spec Attributes
12 changes: 6 additions & 6 deletions docs/filters.md
@@ -81,7 +81,7 @@ A user may encounter an incompatibility if any of the following appears in user
* The function *\_nc\_inq\_var\_filter* was returning the error value NC\_ENOFILTER if a variable had no associated filters.
It has been reverted to the previous case where it returns NC\_NOERR and the returned filter id was set to zero if the variable had no filters.
* The function *nc\_inq\_var\_filterids* was renamed to *nc\_inq\_var\_filter\_ids*.
* Some auxilliary functions for parsing textual filter specifications have been moved to the file *netcdf\_aux.h*. See [Appendix A](#filters_appendixa).
* Some auxiliary functions for parsing textual filter specifications have been moved to the file *netcdf\_aux.h*. See [Appendix A](#filters_appendixa).
* All of the "filterx" functions have been removed. This is unlikely to cause problems because they had limited visibility.

For additional information, see [Appendix B](#filters_appendixb).
@@ -355,7 +355,7 @@ Some things to note.
2. For an untagged positive integer, the constant is treated as of the smallest type into which it fits (i.e. 8,16,32, or 64 bit).
3. For signed byte and short, the value is sign extended to 32 bits and then treated as an unsigned int value, but maintaining the bit-pattern.
4. For double, and signed|unsigned long long, they are converted as specified in the section on <a href="#filters_paramcoding">parameter encode/decode</a>.
5. In order to support mutiple filters, the argument to *\_Filter* may be a pipeline separated (using '|') to specify a list of filters specs.
5. In order to support multiple filters, the argument to *\_Filter* may be a pipeline separated (using '|') to specify a list of filters specs.

## Dynamic Loading Process {#filters_Process}

@@ -423,7 +423,7 @@ So it has three parameters:
NCZarr has four constraints that must be met.

1. It must store its filter information in its metadata in the above JSON dictionary format.
2. It is required to re-use the HDF5 filter implementations.
2. It is required to reuse the HDF5 filter implementations.
This is to avoid having to rewrite the filter implementations
This means that some mechanism is needed to translate between the HDF5 id+parameter model and the Zarr JSON dictionary model.
3. It must be possible to modify the set of visible parameters in response to environment information such as the type of the associated variable; this is required to mimic the corresponding HDF5 capability.
@@ -744,7 +744,7 @@ When reading a file the order of application is of necessity the reverse.
There are some special cases.

1. The fletcher32 filter is always applied first, if enabled.
2. If *nc\_def\_var\_filter*or *nc\_def\_var\_deflate*or *nc\_def\_var\_szip*is called multiple times with the same filter id, but possibly with different sets of parameters, then the position of that filter in the sequence of applictions does not change.
2. If *nc\_def\_var\_filter*or *nc\_def\_var\_deflate*or *nc\_def\_var\_szip*is called multiple times with the same filter id, but possibly with different sets of parameters, then the position of that filter in the sequence of applications does not change.
However the last set of parameters specified is used when actually writing the dataset.
3. Deflate and shuffle &mdash; these two are inextricably linked in the current API, but have quite different semantics.
If you call *nc\_def\_var\_deflate*multiple times, then the previous rule applies with respect to deflate.
@@ -1131,7 +1131,7 @@ So aside from those four, the current set of standard filters is as follows.
<tr><td>bzip2<td>307<td>https://sourceware.org/bzip2/
</table>

It is important to note that in order to use each standard filter, several additonal libraries must be installed.
It is important to note that in order to use each standard filter, several additional libraries must be installed.
Consider the zstandard compressor, which is one of the supported standard filters.
When installing the netcdf library, the following other libraries must be installed.

@@ -1202,7 +1202,7 @@ A user may encounter an incompatibility if any of the following appears in user
* The function *nc\_inq\_var\_filter* was returning the error value NC\_ENOFILTER if a variable had no associated filters.
It has been reverted to the previous case where it returns NC\_NOERR and the returned filter id was set to zero if the variable had no filters.
* The function *nc\_inq\_var\_filterids* was renamed to *nc\_inq\_var\_filter\_ids*.
* Some auxilliary functions for parsing textual filter specifications have been moved to the file *netcdf\_aux.h*. See [Appendix A](#filters_appendixa).
* Some auxiliary functions for parsing textual filter specifications have been moved to the file *netcdf\_aux.h*. See [Appendix A](#filters_appendixa).
* All of the "filterx" functions have been removed. This is unlikely to cause problems because they had limited visibility.

For additional information, see [Appendix B](#filters_appendixb).