Commit 9b3b340

Merge pull request #21 from NeurodataWithoutBorders/add-core-tools

Add core tools section and update tool categories in documentation

2 parents e99b06b + bdc12cf

32 files changed: +745 −3 lines

assets/jsconfig.json

Lines changed: 4 additions & 1 deletion

```diff
@@ -2,7 +2,10 @@
   "compilerOptions": {
     "baseUrl": ".",
     "paths": {
-      "*": null
+      "*": [
+        "../../../Library/Caches/hugo_cache/modules/filecache/modules/pkg/mod/github.com/gohugoio/hugo-mod-jslibs-dist/popperjs/[email protected]/package/dist/cjs/*",
+        "../../../Library/Caches/hugo_cache/modules/filecache/modules/pkg/mod/github.com/twbs/[email protected]+incompatible/js/*"
+      ]
     }
   }
 }
```

config/_default/menus.yaml

Lines changed: 6 additions & 2 deletions

```diff
@@ -29,12 +29,16 @@ main:
     weight: 2
     url: "tools/"
     parent: "Software & Tools"
-  - name: "Acquisition Tools"
+  - name: "Core Tools"
     weight: 3
+    url: "tools/core/"
+    parent: "Software & Tools"
+  - name: "Acquisition Tools"
+    weight: 4
     url: "tools/acquisition/"
     parent: "Software & Tools"
   - name: "Analysis Tools"
-    weight: 4
+    weight: 5
     url: "tools/analysis/"
     parent: "Software & Tools"
```

content/tools/_index.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -7,6 +7,7 @@ This page is a collection of tools we are cataloging as a convenience reference
 
 ## Tool Categories
 
+- [Core NWB Tools]({{< ref "tools/core" >}}) - Key software packages of the core NWB software stack
 - [Acquisition and Control Tools]({{< ref "tools/acquisition" >}}) - Tools for data acquisition and experimental control
 - [Analysis and Visualization Tools]({{< ref "tools/analysis" >}}) - Tools for data analysis, visualization, and exploration
```

content/tools/core/_index.md

Lines changed: 26 additions & 0 deletions

---
title: "Core NWB Tools"
description: "Glossary of Core NWB Tools"
---

The glossary shown here provides a quick overview of the key software packages of the core NWB software stack. For a more general discussion of the overall organization of the core NWB software stack, see the [NWB Software Ecosystem](https://www.nwb.org/nwb-software) page on the main NWB website.

## Read/Write NWB File APIs

{{< tool-grid category="read-write-api" >}}

## Converting Data to NWB

{{< tool-grid category="data-conversion" >}}

## Validating NWB Files

{{< tool-grid category="validation" >}}

## Extending NWB

{{< tool-grid category="extension" >}}

## Core Development

{{< tool-grid category="core-development" >}}
Lines changed: 45 additions & 0 deletions

---
title: "HDMF Common Schema"
description: "Schema of common, general data structures used throughout the NWB Standard Schema"
category: "extension"
image: "/images/hdmf_common_schema_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-common-schema"
docs_url: "https://hdmf-common-schema.readthedocs.io/en/stable/"
weight: 120
---

## Description

The HDMF Common Schema defines the schema of common, general data structures that are used throughout the NWB Standard Schema but are not specific to neurophysiology.

Example types defined in the HDMF Common Schema include all types related to `DynamicTable` for defining data tables, along with other common data structures that can be reused across different domains.

## Installation

The HDMF Common Schema is a documentation resource, not a software package to install. The documentation can be accessed at:

```
https://hdmf-common-schema.readthedocs.io/
```

## Usage

The HDMF Common Schema is primarily used by the NWB schema and by extension developers. For example, when creating an extension that needs to store tabular data, you would typically extend or use the `DynamicTable` type from the HDMF Common Schema:

```yaml
groups:
- neurodata_type_def: MyCustomTable
  neurodata_type_inc: DynamicTable
  doc: A custom table for storing specific experiment data.
  datasets:
  - name: custom_column
    neurodata_type_inc: VectorData
    doc: Custom column for my specific data.
    dtype: text
```

## Additional Information

The HDMF Common Schema is particularly important because it defines general-purpose data structures that enable efficient and flexible representation of complex data. For example, the `DynamicTable` type provides a way to represent tabular data with columns of different data types and potentially varying numbers of rows, making it suitable for storing experimental metadata, subject information, and other tabular data.

Understanding this schema is helpful for developers creating extensions, especially when they need to represent structured data like tables, which are common in scientific datasets.
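The ragged-table idea behind `DynamicTable` can be illustrated without any NWB dependencies. The following is a minimal pure-Python sketch of the `VectorData`/`VectorIndex` pattern from the HDMF Common Schema (a flat data array plus an index of end offsets); it is a conceptual model only, not the HDMF API:

```python
# Conceptual sketch of the DynamicTable + VectorIndex pattern from
# hdmf-common-schema: a ragged (variable-length) column stores a flat
# data array (like VectorData) plus cumulative end offsets into it
# (like VectorIndex). This is NOT the HDMF API, just the data model.

class RaggedColumn:
    def __init__(self):
        self.data = []    # flattened values across all rows (VectorData)
        self.index = []   # cumulative end offset of each row (VectorIndex)

    def add(self, values):
        """Append one row of variable length."""
        self.data.extend(values)
        self.index.append(len(self.data))

    def __getitem__(self, row):
        """Recover row `row` by slicing between consecutive offsets."""
        start = self.index[row - 1] if row > 0 else 0
        return self.data[start:self.index[row]]

col = RaggedColumn()
col.add([1, 2, 3])
col.add([4, 5])
print(col[0])  # [1, 2, 3]
print(col[1])  # [4, 5]
```

The same two-array layout is what makes ragged columns efficient on disk: the flat data array can be stored contiguously regardless of row lengths.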
Lines changed: 41 additions & 0 deletions

---
title: "HDMF Documentation Utilities"
description: "Utility tools for creating documentation for extension schema"
category: "extension"
image: "/images/documenting_ndx_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-docutils"
docs_url: ""
weight: 90
---

## Description

The HDMF Documentation Utilities (hdmf-docutils) provide utility tools for creating documentation for extension schema defined using the NWB Schema Language. These tools help developers create clear, comprehensive documentation for their extensions.

The NDX Template automatically sets up the documentation for extensions via hdmf-docutils, making the tools part of most NDX code repositories without requiring direct interaction with them.

## Installation

```bash
pip install hdmf-docutils
```

## Usage

The HDMF Documentation Utilities are typically used behind the scenes by the NDX Template, but they can also be used directly for more customized documentation generation:

```python
from hdmf_docutils.doctools import rst_to_html, set_figure_dirpath

# Set the directory path for figures
set_figure_dirpath('path/to/figures')

# Convert RST documentation to HTML
html_content = rst_to_html('my_documentation.rst')
```

## Additional Information

The HDMF Documentation Utilities help ensure that extension documentation follows consistent formatting and includes all necessary information. This consistency improves the usability of extensions and helps users understand how to properly use them in their workflows.

The tools can generate documentation from YAML schema files, create diagrams of data structures, and produce HTML and PDF documentation outputs.
Lines changed: 48 additions & 0 deletions

---
title: "HDMF Specification Language"
description: "Formal structures for describing the organization of complex data"
category: "extension"
image: "/images/specification_language_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-schema-language"
docs_url: "https://hdmf-schema-language.readthedocs.io/en/latest/"
weight: 100
---

## Description

The HDMF Specification Language defines formal structures for describing the organization of complex data using basic concepts, e.g., Groups, Datasets, Attributes, and Links. It provides a standardized way to define data structures and relationships.

The HDMF Specification Language is defined by the Hierarchical Data Modeling Framework (HDMF). The NWB Specification Language is a derivative of the HDMF Specification Language with minor modifications for NWB (e.g., to use the term `neurodata_type`).

## Installation

The HDMF Specification Language is a documentation resource, not a software package to install. The documentation can be accessed at:

```
https://hdmf-schema-language.readthedocs.io/
```

## Usage

When creating NWB extensions, you will use the NWB Specification Language (which builds on HDMF) to define new data types. Here is an example type definition in YAML format (modeled on the core `ElectricalSeries` type):

```yaml
groups:
- neurodata_type_def: ElectricalSeries
  neurodata_type_inc: TimeSeries
  doc: A time series of electrical measurements.
  datasets:
  - name: data
    dtype: numeric
    shape:
    - null
    - null
    doc: The recorded voltage data.
  - name: electrodes
    neurodata_type_inc: DynamicTableRegion
    doc: The electrodes that this electrical series was recorded from.
```

## Additional Information

Understanding the HDMF Specification Language is essential for developers who want to create extensions to the NWB format. It provides the foundation for defining structured, self-describing data models that can be used across different programming languages and storage backends.
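To make the structure of such a specification concrete, here is a small pure-Python sketch that walks a type definition after it has been loaded into dictionaries (e.g., by a YAML parser). The helper functions `defined_types` and `missing_docs` are hypothetical illustrations, not part of the HDMF API:

```python
# Hypothetical sketch: walking a specification-language type definition
# once loaded into plain Python dicts (the shape a YAML parser produces).
# NOT the HDMF API; the helpers below are illustrative only.

spec = {
    "groups": [{
        "neurodata_type_def": "ElectricalSeries",
        "neurodata_type_inc": "TimeSeries",
        "doc": "A time series of electrical measurements.",
        "datasets": [
            {"name": "data", "dtype": "numeric",
             "doc": "The recorded voltage data."},
            {"name": "electrodes",
             "neurodata_type_inc": "DynamicTableRegion",
             "doc": "The electrodes that this electrical series was recorded from."},
        ],
    }]
}

def defined_types(spec):
    """Collect every new type name (neurodata_type_def) in the spec."""
    return [g["neurodata_type_def"] for g in spec.get("groups", [])
            if "neurodata_type_def" in g]

def missing_docs(spec):
    """The spec language requires a doc string on every object; report gaps."""
    missing = []
    for g in spec.get("groups", []):
        if "doc" not in g:
            missing.append(g.get("neurodata_type_def", "?"))
        for d in g.get("datasets", []):
            if "doc" not in d:
                missing.append(d.get("name", "?"))
    return missing

print(defined_types(spec))  # ['ElectricalSeries']
print(missing_docs(spec))   # []
```

This is essentially what schema tooling does at a larger scale: traverse the nested Group/Dataset structures and enforce the rules of the language (required `doc` fields, valid type includes, and so on).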

content/tools/core/hdmf-zarr.md

Lines changed: 56 additions & 0 deletions

---
title: "HDMF-Zarr"
description: "Zarr backend for HDMF and PyNWB"
category: "core-development"
image: "/images/hdmf_zarr_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-zarr"
docs_url: "https://hdmf-zarr.readthedocs.io/en/stable/"
weight: 140
---

## Description

The HDMF-Zarr library implements a Zarr backend for HDMF. Zarr is a format for the storage of chunked, compressed, N-dimensional arrays. HDMF-Zarr also provides convenience classes for integrating Zarr with the PyNWB Python API for NWB, to support writing NWB files to Zarr.

Using Zarr as a storage backend for NWB can provide benefits such as:

- Cloud-friendly data access
- Parallel read/write operations
- Efficient access to data chunks
- Flexible compression options

## Installation

```bash
pip install hdmf-zarr
```

## Usage

```python
from datetime import datetime

from dateutil.tz import tzlocal
from pynwb import NWBFile
from hdmf_zarr import NWBZarrIO

# Create a new NWB file
nwbfile = NWBFile(
    session_description='my first recording',
    identifier='EXAMPLE_ID',
    session_start_time=datetime.now(tzlocal())
)

# Write the file to Zarr format
with NWBZarrIO('example.zarr', mode='w') as io:
    io.write(nwbfile)

# Read the file from Zarr format
with NWBZarrIO('example.zarr', mode='r') as io:
    nwbfile = io.read()
```

## Additional Information

HDMF-Zarr is particularly useful for working with very large datasets, especially in cloud environments or when parallel data access is needed. It provides an alternative storage format to the traditional HDF5 backend, offering different performance characteristics that may be beneficial for certain use cases.

The library is designed to be a drop-in replacement for the HDF5 backend, making it easy to integrate into existing workflows that use PyNWB.
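The chunked-storage idea that underpins Zarr can be illustrated without the library itself. The following is a toy pure-Python sketch (not the zarr or hdmf-zarr API), assuming fixed-size chunks of a flat array, showing why a reader only needs to fetch the chunks overlapping a requested slice:

```python
# Toy illustration of chunked array storage, the core idea behind Zarr:
# data is split into fixed-size chunks so a reader can fetch only the
# chunks that overlap a requested slice. NOT the zarr/hdmf-zarr API.

CHUNK = 4

def to_chunks(values, chunk=CHUNK):
    """Split a flat list into fixed-size chunks (the last may be short)."""
    return [values[i:i + chunk] for i in range(0, len(values), chunk)]

def read_slice(chunks, start, stop, chunk=CHUNK):
    """Read values[start:stop], touching only the overlapping chunks."""
    out = []
    for ci in range(start // chunk, (stop - 1) // chunk + 1):
        piece = chunks[ci]
        lo = max(start - ci * chunk, 0)
        hi = min(stop - ci * chunk, len(piece))
        out.extend(piece[lo:hi])
    return out

chunks = to_chunks(list(range(10)))
print(read_slice(chunks, 3, 7))  # [3, 4, 5, 6]
```

In cloud storage, each chunk is typically an independent (optionally compressed) object, which is what enables the partial and parallel access described above.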

content/tools/core/hdmf.md

Lines changed: 72 additions & 0 deletions

---
title: "HDMF"
description: "Hierarchical Data Modeling Framework for working with hierarchical data"
category: "core-development"
image: "/images/hdmf_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf"
docs_url: "https://hdmf.readthedocs.io/en/stable/"
weight: 130
---

## Description

The Hierarchical Data Modeling Framework (HDMF) is a Python package for working with hierarchical data. It provides APIs for specifying data models, reading and writing data to different storage backends, and representing data with Python objects.

HDMF builds the foundation for the PyNWB Python API for NWB. It offers a flexible, extensible approach to data modeling that allows for the creation of self-describing, structured data formats like NWB.

## Installation

```bash
pip install hdmf
```

## Usage

HDMF provides low-level functionality for working with hierarchical data, typically used by developers creating APIs like PyNWB:

```python
from hdmf.spec import GroupSpec, DatasetSpec, NamespaceBuilder
from hdmf.common import DynamicTable

# Define a new data type specification
spec = GroupSpec(
    doc='A custom data type',
    data_type_def='MyType',
    datasets=[
        DatasetSpec(
            doc='An example dataset',
            name='data',
            dtype='float'
        )
    ]
)

# Create a namespace for your specification
namespace_builder = NamespaceBuilder(
    doc='My extension',
    name='my_extension',
    full_name='My Custom Extension',
    version='0.1.0'
)
namespace_builder.add_spec('my_extension.yaml', spec)

# Working with DynamicTable
table = DynamicTable(
    name='example_table',
    description='An example table'
)
table.add_column(name='column1', description='A string column')
table.add_row(column1='example data')
```

## Additional Information

HDMF is primarily of interest to developers who need to:

1. Create or extend data APIs like PyNWB
2. Implement new storage backends
3. Support new serialization formats
4. Develop data conversion tools
5. Create validation tools

Understanding HDMF is helpful for advanced NWB users who want to contribute to the core NWB ecosystem or develop sophisticated extensions and tools.

content/tools/core/matnwb.md

Lines changed: 45 additions & 0 deletions

---
title: "MatNWB"
description: "A MATLAB library for reading and writing NWB files"
category: "read-write-api"
image: "/images/matnwb_logo_framed.png"
source_url: "https://github.com/NeurodataWithoutBorders/matnwb/"
docs_url: "https://matnwb.readthedocs.io/"
weight: 20
---

## Description

MatNWB is a MATLAB library for reading and writing NWB files. It provides full support for all components of the NWB standard, including support for extensions. The API is interoperable with PyNWB, i.e., files created with MatNWB can be read in PyNWB and vice versa.

MatNWB supports advanced read/write for efficient interaction with very large data files (i.e., data too large for main memory) via lazy data loading, iterative data write, and data compression, among other features.

## Installation

```bash
# Clone the repository
git clone https://github.com/NeurodataWithoutBorders/matnwb.git
```

```matlab
% Add the library to the MATLAB path
addpath(genpath('/path/to/matnwb'));
```

## Usage

```matlab
% Create a new NWB file
nwb = NwbFile( ...
    'session_description', 'a test NWB File', ...
    'identifier', 'mouse001', ...
    'session_start_time', datetime('now'));

% Write the file
nwbExport(nwb, 'test_file.nwb');

% Read the file
nwb = nwbRead('test_file.nwb');
```

## Additional Information

MatNWB allows MATLAB users to work with NWB data in their preferred environment, using familiar MATLAB data structures and methods.
