Merged
5 changes: 4 additions & 1 deletion assets/jsconfig.json
```diff
@@ -2,7 +2,10 @@
   "compilerOptions": {
     "baseUrl": ".",
     "paths": {
-      "*": null
+      "*": [
+        "../../../Library/Caches/hugo_cache/modules/filecache/modules/pkg/mod/github.com/gohugoio/hugo-mod-jslibs-dist/popperjs/[email protected]/package/dist/cjs/*",
+        "../../../Library/Caches/hugo_cache/modules/filecache/modules/pkg/mod/github.com/twbs/[email protected]+incompatible/js/*"
+      ]
     }
   }
 }
```
8 changes: 6 additions & 2 deletions config/_default/menus.yaml
```diff
@@ -29,12 +29,16 @@ main:
     weight: 2
     url: "tools/"
     parent: "Software & Tools"
-  - name: "Acquisition Tools"
+  - name: "Core Tools"
     weight: 3
+    url: "tools/core/"
+    parent: "Software & Tools"
+  - name: "Acquisition Tools"
+    weight: 4
     url: "tools/acquisition/"
     parent: "Software & Tools"
   - name: "Analysis Tools"
-    weight: 4
+    weight: 5
     url: "tools/analysis/"
     parent: "Software & Tools"
 
```
1 change: 1 addition & 0 deletions content/tools/_index.md
```diff
@@ -7,6 +7,7 @@ This page is a collection of tools we are cataloging as a convenience reference
 
 ## Tool Categories
 
+- [Core NWB Tools]({{< ref "tools/core" >}}) - Key software packages of the core NWB software stack
 - [Acquisition and Control Tools]({{< ref "tools/acquisition" >}}) - Tools for data acquisition and experimental control
 - [Analysis and Visualization Tools]({{< ref "tools/analysis" >}}) - Tools for data analysis, visualization, and exploration
 
```
26 changes: 26 additions & 0 deletions content/tools/core/_index.md
@@ -0,0 +1,26 @@
---
title: "Core NWB Tools"
description: "Glossary of Core NWB Tools"
---

The glossary shown here provides a quick overview of the key software packages of the core NWB software stack. For a more general discussion of the overall organization of the core NWB software stack, see the [NWB Software Ecosystem](https://www.nwb.org/nwb-software) page on the main NWB website.

## Read/Write NWB File APIs

{{< tool-grid category="read-write-api" >}}

## Converting Data to NWB

{{< tool-grid category="data-conversion" >}}

## Validating NWB Files

{{< tool-grid category="validation" >}}

## Extending NWB

{{< tool-grid category="extension" >}}

## Core Development

{{< tool-grid category="core-development" >}}
45 changes: 45 additions & 0 deletions content/tools/core/hdmf-common-schema.md
@@ -0,0 +1,45 @@
---
title: "HDMF Common Schema"
description: "Schema of common, general data structures used throughout the NWB Standard Schema"
category: "extension"
image: "/images/hdmf_common_schema_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-common-schema"
docs_url: "https://hdmf-common-schema.readthedocs.io/en/stable/"
weight: 120
---

## Description

The HDMF Common Schema defines the schema of common, general data structures, which are used throughout the NWB Standard Schema but which are not specific to neurophysiology.

Example types defined in the HDMF common schema include all types related to `DynamicTable` for defining data tables, along with other common data structures that can be reused across different domains.

## Installation

The HDMF Common Schema is a schema specification, not a software package to install. Its documentation can be accessed at:

```
https://hdmf-common-schema.readthedocs.io/
```

## Usage

The HDMF Common Schema is primarily used by the NWB schema and by extension developers. For example, when creating an extension that needs to store tabular data, you'd typically extend or use the `DynamicTable` type from HDMF Common Schema:

```yaml
groups:
- neurodata_type_def: MyCustomTable
neurodata_type_inc: DynamicTable
doc: A custom table for storing specific experiment data.
datasets:
- name: custom_column
neurodata_type_inc: VectorData
doc: Custom column for my specific data.
dtype: text
```

## Additional Information

The HDMF Common Schema is particularly important because it defines general-purpose data structures that enable efficient and flexible representation of complex data. For example, the `DynamicTable` type provides a way to represent tabular data with columns of different data types and potentially varying numbers of rows, making it suitable for storing experimental metadata, subject information, and other tabular data.

Understanding this schema is helpful for developers creating extensions, especially when they need to represent structured data like tables, which are common in scientific datasets.
41 changes: 41 additions & 0 deletions content/tools/core/hdmf-docutils.md
@@ -0,0 +1,41 @@
---
title: "HDMF Documentation Utilities"
description: "Utility tools for creating documentation for extension schema"
category: "extension"
image: "/images/documenting_ndx_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-docutils"
docs_url: ""
weight: 90
---

## Description

The HDMF Documentation Utilities (hdmf-docutils) provide utility tools for creating documentation for extension schema defined using the NWB Schema Language. These tools help developers create clear, comprehensive documentation for their extensions.

The NDX Template automatically sets up extension documentation via hdmf-docutils, so the utilities are part of most NDX code repositories without developers needing to interact with them directly.

## Installation

```bash
pip install hdmf-docutils
```

## Usage

The HDMF Documentation Utilities are typically used behind the scenes by the NDX Template, but the command-line tools can also be run directly for more customized documentation generation. A sketch (entry-point names taken from the hdmf-docutils repository; check the exact set in your installed version with `pip show -f hdmf-docutils`):

```bash
# Create a Sphinx documentation skeleton for an extension
hdmf_init_sphinx_extension_doc --help

# Generate reStructuredText sources and figures from YAML schema files
hdmf_generate_format_docs --help
```

## Additional Information

HDMF Documentation Utilities help ensure that extension documentation follows consistent formatting and includes all necessary information. This consistency improves the usability of extensions and helps users understand how to properly use them in their workflows.

The tools can generate documentation from YAML schema files, create diagrams of data structures, and produce HTML and PDF documentation outputs.
48 changes: 48 additions & 0 deletions content/tools/core/hdmf-specification-language.md
@@ -0,0 +1,48 @@
---
title: "HDMF Specification Language"
description: "Formal structures for describing the organization of complex data"
category: "extension"
image: "/images/specification_language_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-schema-language"
docs_url: "https://hdmf-schema-language.readthedocs.io/en/latest/"
weight: 100
---

## Description

The HDMF Specification Language defines formal structures for describing the organization of complex data using basic concepts, e.g., Groups, Datasets, Attributes, and Links. It provides a standardized way to define data structures and relationships.

The HDMF specification language is defined by the Hierarchical Data Modeling Framework (HDMF). The NWB Specification Language is a derivative of the HDMF Specification Language with minor modifications for NWB (e.g., it uses the term `neurodata_type` in place of `data_type`).

## Installation

The HDMF Specification Language is a specification document, not a software package to install. The documentation can be accessed at:

```
https://hdmf-schema-language.readthedocs.io/
```

## Usage

When creating NWB extensions, you'll use the NWB Specification Language (which builds on HDMF) to define new data types. Here's a simplified example of a type definition in YAML format, modeled on the core `ElectricalSeries` type:

```yaml
groups:
- neurodata_type_def: ElectricalSeries
neurodata_type_inc: TimeSeries
doc: A time series of electrical measurements.
datasets:
- name: data
dtype: numeric
shape:
- null
- null
doc: The recorded voltage data.
- name: electrodes
neurodata_type_inc: DynamicTableRegion
doc: The electrodes that this electrical series was recorded from.
```

## Additional Information

Understanding the HDMF Specification Language is essential for developers who want to create extensions to the NWB format. It provides the foundation for defining structured, self-describing data models that can be used across different programming languages and storage backends.
56 changes: 56 additions & 0 deletions content/tools/core/hdmf-zarr.md
@@ -0,0 +1,56 @@
---
title: "HDMF-Zarr"
description: "Zarr backend for HDMF and PyNWB"
category: "core-development"
image: "/images/hdmf_zarr_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf-zarr"
docs_url: "https://hdmf-zarr.readthedocs.io/en/stable/"
weight: 140
---

## Description

The HDMF-Zarr (hdmf-zarr) library implements a Zarr backend for HDMF. Zarr is a format for the storage of chunked, compressed, N-dimensional arrays. hdmf-zarr also provides convenience classes for integrating Zarr with the PyNWB Python API for NWB, to support writing NWB files to Zarr.

Using Zarr as a storage backend for NWB can provide benefits like:

- Cloud-friendly data access
- Parallel read/write operations
- Efficient access to data chunks
- Flexible compression options

## Installation

```bash
pip install hdmf-zarr
```

## Usage

```python
from datetime import datetime
from dateutil.tz import tzlocal

from pynwb import NWBFile
from hdmf_zarr.nwb import NWBZarrIO

# Create a new NWB file
nwbfile = NWBFile(
    session_description='my first recording',
    identifier='EXAMPLE_ID',
    session_start_time=datetime.now(tzlocal())
)

# Write the file to Zarr format
with NWBZarrIO('example.zarr', mode='w') as io:
    io.write(nwbfile)

# Read the file back from Zarr format
with NWBZarrIO('example.zarr', mode='r') as io:
    nwbfile = io.read()
```

## Additional Information

HDMF-Zarr is particularly useful for working with very large datasets, especially in cloud environments or when parallel data access is needed. It provides an alternative storage format to the traditional HDF5 backend, offering different performance characteristics that may be beneficial for certain use cases.

The library is designed to be a drop-in replacement for the HDF5 backend, making it easy to integrate into existing workflows that use PyNWB.
72 changes: 72 additions & 0 deletions content/tools/core/hdmf.md
@@ -0,0 +1,72 @@
---
title: "HDMF"
description: "Hierarchical Data Modeling Framework for working with hierarchical data"
category: "core-development"
image: "/images/hdmf_logo_framed.png"
source_url: "https://github.com/hdmf-dev/hdmf"
docs_url: "https://hdmf.readthedocs.io/en/stable/"
weight: 130
---

## Description

The Hierarchical Data Modeling Framework (HDMF) is a Python package for working with hierarchical data. It provides APIs for specifying data models, reading and writing data to different storage backends, and representing data with Python objects.

HDMF builds the foundation for the PyNWB Python API for NWB. It offers a flexible, extensible approach to data modeling that allows for the creation of self-describing, structured data formats like NWB.

## Installation

```bash
pip install hdmf
```

## Usage

HDMF provides low-level functionality for working with hierarchical data, typically used by developers creating APIs like PyNWB:

```python
from hdmf.spec import GroupSpec, DatasetSpec, NamespaceBuilder
from hdmf.common import DynamicTable

# Define a new data type specification
spec = GroupSpec(
doc='A custom data type',
data_type_def='MyType',
datasets=[
DatasetSpec(
doc='An example dataset',
name='data',
dtype='float'
)
]
)

# Create a namespace for your specification
namespace_builder = NamespaceBuilder(
doc='My extension',
name='my_extension',
full_name='My Custom Extension',
version='0.1.0'
)
namespace_builder.add_spec('my_extension.yaml', spec)

# Working with DynamicTable
table = DynamicTable(
name='example_table',
description='An example table'
)
table.add_column(name='column1', description='A string column')  # column dtype is inferred from the data
table.add_row(column1='example data')
```

## Additional Information

HDMF is primarily of interest to developers who need to:

1. Create or extend data APIs like PyNWB
2. Implement new storage backends
3. Support new serialization formats
4. Develop data conversion tools
5. Create validation tools

Understanding HDMF is helpful for advanced NWB users who want to contribute to the core NWB ecosystem or develop sophisticated extensions and tools.
45 changes: 45 additions & 0 deletions content/tools/core/matnwb.md
@@ -0,0 +1,45 @@
---
title: "MatNWB"
description: "A MATLAB library for reading and writing NWB files"
category: "read-write-api"
image: "/images/matnwb_logo_framed.png"
source_url: "https://github.com/NeurodataWithoutBorders/matnwb/"
docs_url: "https://matnwb.readthedocs.io/"
weight: 20
---

## Description

MatNWB is a MATLAB library for reading and writing NWB files. It provides full support for all components of the NWB standard, including support for extensions. The API is interoperable with PyNWB, i.e., files created with MatNWB can be read in PyNWB and vice versa.

MatNWB supports advanced read/write operations for efficient interaction with very large data files (i.e., data too large for main memory) via lazy data loading, iterative data write, and data compression, among other features.
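A minimal sketch of the lazy-loading behavior (assumes a previously written file containing an acquisition series; the series name `my_series` is hypothetical):

```matlab
% Open a file; large datasets come back as DataStub objects
% and are not loaded into memory yet
nwb = nwbRead('test_file.nwb');

% Retrieve a TimeSeries from the acquisition group (name is hypothetical)
ts = nwb.acquisition.get('my_series');

% Indexing the DataStub reads only the requested slice from disk
firstSamples = ts.data(1:100);
```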

## Installation

```bash
# Clone the repository
git clone https://github.com/NeurodataWithoutBorders/matnwb.git
```

Then, in MATLAB, add MatNWB to your path:

```matlab
% Add MatNWB and its subfolders to the MATLAB path
addpath(genpath('/path/to/matnwb'));
```

## Usage

```matlab
% Create a new NWB file
nwb = NwbFile( ...
'session_description', 'a test NWB File', ...
'identifier', 'mouse001', ...
'session_start_time', datetime('now', 'TimeZone', 'local'));

% Write the file
nwbExport(nwb, 'test_file.nwb');

% Read the file
nwb = nwbRead('test_file.nwb');
```

## Additional Information

MatNWB allows MATLAB users to work with NWB data in their preferred environment, using familiar MATLAB data structures and methods.