Merged
5 changes: 5 additions & 0 deletions content/tools/analysis/deeplabcut.md
@@ -28,3 +28,8 @@ While ndx-pose was developed initially to store the output of DeepLabCut in NWB,
- Versatile tracking of body parts across species
- NWB export support through DLC2NWB
- Integration with ndx-pose extension for standardized data storage

## Resources

* [Source](https://github.com/DeepLabCut/DeepLabCut)
* [Docs](http://www.mackenziemathislab.org/deeplabcut)
4 changes: 4 additions & 0 deletions content/tools/analysis/ecogvis.md
@@ -20,3 +20,7 @@ EcogVIS is a Python-based, time series visualizer for Electrocorticography (ECoG
- Create annotations
- Mark intervals of interest
- Built-in support for NWB files

## Resources

* [Source](https://github.com/catalystneuro/ecogVIS)
5 changes: 5 additions & 0 deletions content/tools/analysis/movement.md
@@ -18,3 +18,8 @@ weight: 200
## NWB Integration

`movement` pose datasets can be loaded from and saved to NWB files. The pose tracks are formatted according to the [ndx-pose](https://github.com/rly/ndx-pose) extension. See the NWB tab in the [movement I/O docs](https://movement.neuroinformatics.dev/user_guide/input_output.html).

## Resources

* [Source](https://github.com/neuroinformatics-unit/movement)
* [Docs](https://movement.neuroinformatics.dev/)
4 changes: 4 additions & 0 deletions content/tools/analysis/neurosift.md
@@ -30,3 +30,7 @@ https://flatironinstitute.github.io/neurosift/#/nwb?url=https://dandiarchive.s3.
```

Replace the `url` query parameter with the URL of your NWB file on DANDI or anywhere else on the web.
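Building such a link is plain string concatenation; a minimal sketch (the helper name and the example file URL are hypothetical):

```python
def neurosift_link(nwb_url: str) -> str:
    """Return a Neurosift viewer link for an NWB file at a public URL."""
    # Neurosift reads the file location from the `url` query parameter.
    return f"https://flatironinstitute.github.io/neurosift/#/nwb?url={nwb_url}"

link = neurosift_link("https://example.com/my_session.nwb")
```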

## Resources

* [Source](https://github.com/flatironinstitute/neurosift)
4 changes: 4 additions & 0 deletions content/tools/analysis/nwbexplorer.md
@@ -16,3 +16,7 @@ NWB Explorer is a web application developed by MetaCell for reading, visualizing
![NWB Explorer Demo](/images/tools/nwbexplorer/nwbexplorer.gif)

You can try NWB Explorer online at [https://nwbexplorer.v2.opensourcebrain.org](https://nwbexplorer.v2.opensourcebrain.org).

## Resources

* [Docs](https://nwbexplorer.v2.opensourcebrain.org)
5 changes: 5 additions & 0 deletions content/tools/analysis/nwbview.md
@@ -59,3 +59,8 @@ cargo run --release
```

The release flag builds the artifacts with optimizations. Do not specify it when you need to debug.

## Resources

* [Source](https://github.com/brainhack-ch/nwbview)
* [Docs](https://crates.io/crates/nwbview)
5 changes: 5 additions & 0 deletions content/tools/analysis/nwbwidgets.md
@@ -47,3 +47,8 @@ Instead of supplying a function for the value of the neurodata_vis_spec dict, yo
## Extending

To extend NWBWidgets, all you need is a function that takes as input an instance of a specific neurodata_type class and outputs a matplotlib figure or a Jupyter widget.
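As a structural sketch of that dispatch pattern (all names here are stand-ins, and a string takes the place of the returned figure/widget to keep the example dependency-free):

```python
# Stand-in for a specific neurodata_type class (hypothetical).
class MyType:
    def __init__(self, data):
        self.data = data

def show_my_type(node: MyType):
    # A real visualization function would build a matplotlib figure
    # or Jupyter widget from `node`; a string stands in here.
    return f"figure with {len(node.data)} points"

# NWBWidgets looks up the visualization function by neurodata type,
# in the style of its neurodata_vis_spec mapping.
vis_spec = {MyType: show_my_type}

widget = vis_spec[MyType](MyType(data=[0.1, 0.2, 0.3]))
```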

## Resources

* [Source](https://github.com/NeurodataWithoutBorders/nwbwidgets)
* [Docs](https://nwb-widgets.readthedocs.io)
5 changes: 5 additions & 0 deletions content/tools/analysis/sleap.md
@@ -31,3 +31,8 @@ This adaptor uses the NWB extension [ndx-pose](https://github.com/rly/ndx-pose),
- Active learning capabilities
- Proofreading tools
- NWB import/export support

## Resources

* [Source](https://github.com/talmolab/sleap)
* [Docs](https://sleap.ai)
5 changes: 5 additions & 0 deletions content/tools/core/hdmf-common-schema.md
@@ -43,3 +43,8 @@ groups:
The HDMF Common Schema is particularly important because it defines general-purpose data structures that enable efficient and flexible representation of complex data. For example, the `DynamicTable` type provides a way to represent tabular data with columns of different data types and potentially varying numbers of rows, making it suitable for storing experimental metadata, subject information, and other tabular data.

Understanding this schema is helpful for developers creating extensions, especially when they need to represent structured data like tables, which are common in scientific datasets.

## Resources

* [Source](https://github.com/hdmf-dev/hdmf-common-schema)
* [Docs](https://hdmf-common-schema.readthedocs.io/en/stable/)
4 changes: 4 additions & 0 deletions content/tools/core/hdmf-docutils.md
@@ -39,3 +39,7 @@ html_content = rst_to_html('my_documentation.rst')
HDMF Documentation Utilities help ensure that extension documentation follows consistent formatting and includes all necessary information. This consistency improves the usability of extensions and helps users understand how to properly use them in their workflows.

The tools can generate documentation from YAML schema files, create diagrams of data structures, and produce HTML and PDF documentation outputs.

## Resources

* [Source](https://github.com/hdmf-dev/hdmf-docutils)
5 changes: 5 additions & 0 deletions content/tools/core/hdmf-specification-language.md
@@ -46,3 +46,8 @@ groups:
## Additional Information

Understanding the HDMF Specification Language is essential for developers who want to create extensions to the NWB format. It provides the foundation for defining structured, self-describing data models that can be used across different programming languages and storage backends.

## Resources

* [Source](https://github.com/hdmf-dev/hdmf-schema-language)
* [Docs](https://hdmf-schema-language.readthedocs.io/en/latest/)
5 changes: 5 additions & 0 deletions content/tools/core/hdmf-zarr.md
@@ -54,3 +54,8 @@ with NWBZarrIO('example.zarr', mode='r') as io:
HDMF-Zarr is particularly useful for working with very large datasets, especially in cloud environments or when parallel data access is needed. It provides an alternative storage format to the traditional HDF5 backend, offering different performance characteristics that may be beneficial for certain use cases.

The library is designed to be a drop-in replacement for the HDF5 backend, making it easy to integrate into existing workflows that use PyNWB.

## Resources

* [Source](https://github.com/hdmf-dev/hdmf-zarr)
* [Docs](https://hdmf-zarr.readthedocs.io/en/stable/)
5 changes: 5 additions & 0 deletions content/tools/core/hdmf.md
@@ -70,3 +70,8 @@ HDMF is primarily of interest to developers who need to:
5. Create validation tools

Understanding HDMF is helpful for advanced NWB users who want to contribute to the core NWB ecosystem or develop sophisticated extensions and tools.

## Resources

* [Source](https://github.com/hdmf-dev/hdmf)
* [Docs](https://hdmf.readthedocs.io/en/stable/)
5 changes: 5 additions & 0 deletions content/tools/core/matnwb.md
@@ -43,3 +43,8 @@ nwb = nwbRead('test_file.nwb');
## Additional Information

MatNWB allows MATLAB users to work with NWB data in their preferred environment, using familiar MATLAB data structures and methods.

## Resources

* [Source](https://github.com/NeurodataWithoutBorders/matnwb/)
* [Docs](https://matnwb.readthedocs.io/)
5 changes: 5 additions & 0 deletions content/tools/core/ndx-catalog.md
@@ -37,3 +37,8 @@ To use a specific extension from the catalog, follow the installation instructio
## Additional Information

The NDX Catalog is an important resource for the NWB community, facilitating the discovery and adoption of extensions that enhance the capabilities of the NWB standard for specific use cases and data types.

## Resources

* [Source](https://github.com/nwb-extensions/)
* [Docs](https://nwb-extensions.github.io/)
4 changes: 4 additions & 0 deletions content/tools/core/ndx-template.md
@@ -58,3 +58,7 @@ After creating your extension using the template, follow the instructions in the
5. Publish your extension to the NDX Catalog

The NDX Template is essential for maintaining consistency across the NWB extension ecosystem and making it easier for researchers to develop and share extensions.

## Resources

* [Source](https://github.com/nwb-extensions/ndx-template)
5 changes: 5 additions & 0 deletions content/tools/core/neuroconv.md
@@ -55,3 +55,8 @@ converter.run_conversion(
## Additional Information

NeuroConv is developed by CatalystNeuro and the NWB community to streamline the process of standardizing neurophysiology data to NWB format. It is designed to be modular and extensible, allowing for easy integration of new data formats.

## Resources

* [Source](https://github.com/catalystneuro/neuroconv)
* [Docs](https://neuroconv.readthedocs.io/en/main/index.html)
5 changes: 5 additions & 0 deletions content/tools/core/nwb-guide.md
@@ -42,3 +42,8 @@ The NWB GUIDE provides a step-by-step workflow for converting data to NWB:
## Additional Information

The NWB GUIDE leverages NeuroConv behind the scenes, making it easier for researchers to adopt the NWB standard without writing code. It provides a user-friendly alternative for data conversion while maintaining the robustness and flexibility of the underlying conversion tools.

## Resources

* [Source](https://github.com/NeurodataWithoutBorders/nwb-guide)
* [Docs](https://nwb-guide.readthedocs.io/en/stable/)
5 changes: 5 additions & 0 deletions content/tools/core/nwb-schema.md
@@ -52,3 +52,8 @@ groups:
The NWB Schema is a living document that evolves through community contributions and feedback. Extensions build on this foundation, allowing researchers to add specialized data types while maintaining compatibility with the core NWB standard.

Understanding the NWB Schema is essential for anyone developing tools for NWB data, creating extensions, or seeking to understand the detailed structure of NWB files.

## Resources

* [Source](https://github.com/NeurodataWithoutBorders/nwb-schema)
* [Docs](https://nwb-schema.readthedocs.io/en/latest/)
5 changes: 5 additions & 0 deletions content/tools/core/nwbinspector.md
@@ -52,3 +52,8 @@ for issue in inspect_nwbfile(nwbfile):
NWB Inspector is particularly useful for ensuring that your NWB files not only comply with the schema but also follow best practices that make them more useful and reusable. It helps improve data quality and increases the utility of shared data.

In practice, most users should use the NWB Inspector to validate NWB files, as it helps to check for compliance with both the schema and best practices and provides greater flexibility. Direct use of PyNWB's validator is primarily useful for use cases where schema compliance and performance are of primary concern.

## Resources

* [Source](https://github.com/NeurodataWithoutBorders/nwbinspector)
* [Docs](https://nwbinspector.readthedocs.io/)
5 changes: 5 additions & 0 deletions content/tools/core/pynwb.md
@@ -46,3 +46,8 @@ with NWBHDF5IO('example.nwb', 'r') as io:
## Additional Information

PyNWB also includes classes and command line tools for validating compliance of files with the core NWB schema and the schema of NWB Neurodata Extensions (NDX).

## Resources

* [Source](https://github.com/NeurodataWithoutBorders/pynwb/)
* [Docs](https://pynwb.readthedocs.io/en/stable/)
4 changes: 4 additions & 0 deletions content/tools/core/staged-extensions.md
@@ -43,3 +43,7 @@ description: Brief description of your extension
## Additional Information

The staged-extensions repository is a key part of the NWB extension ecosystem, ensuring that all extensions in the NDX Catalog meet community standards for quality, documentation, and usability. This review process helps maintain the integrity and usefulness of the extension ecosystem.

## Resources

* [Source](https://github.com/nwb-extensions/staged-extensions)