hi @jayd1860 and @dboas, as more groups are working on adding support for snirf (see mne-tools/mne-python#7057, brainstorm-tools/brainstorm3#283, fNIRS/snirf-samples#6), we need to make sure the sample files are consistent, not only across our own matlab parsers, but also across parsers written in other languages.
@jayd1860, according to our previous conversation, I know you transpose the data in the loading phase of a .snirf file instead of the saving phase. This is not an issue if one uses snirf_homer3 for both saving and loading, but it can cause issues for other parsers.
I just did a quick test using python, and the snirf_homer3-generated sample data shows incorrect data dimensions. Here is the script:
first, installing python hdf5 support (and numpy) via
sudo apt-get install python-h5py python-numpy
then start python, and load the neuro_run01.snirf file that I previously created using homer3's snirf parser
import h5py
import numpy as np
dat=h5py.File('neuro_run01.snirf','r')
d1 = np.array(dat.get('/nirs/data1/dataTimeSeries'))
d1.shape   # prints (18, 8000)
This is inconsistent with our specification, where the dimensions are defined as #time points x #channels, so the shape should be (8000, 18).
The transposed data is a result of directly saving a column-major matlab array to a row-major HDF5 dataset.
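To illustrate with plain numpy (a sketch, not the actual Homer3 code): dumping the column-major memory of a #time x #channels matrix into a row-major dataset with swapped dimensions yields exactly the transpose, which is what the (18, 8000) shape above shows.

```python
import numpy as np

# a toy "time x channels" matrix: 3 time points, 2 channels
a = np.arange(6).reshape(3, 2)

# matlab stores this column-major; writing that memory directly into
# a row-major HDF5 dataset is equivalent to reinterpreting the buffer:
buf = a.flatten(order='F')            # column-major memory order
seen_by_reader = buf.reshape(2, 3)    # what a row-major reader hands back

# the reader ends up with the transpose, i.e. channels x time
assert np.array_equal(seen_by_reader, a.T)
```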
I expect that @huppertt's C# parser will also experience similar transpose issues.
In comparison, my easyh5 toolbox transposes the data before saving to hdf5, so the file is consistent when opened in other programming languages.
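The writer-side fix can be sketched the same way (this mirrors the transpose-before-saving idea, not the actual easyh5 code): transposing the matlab array before saving makes its column-major memory identical to the row-major layout other languages expect.

```python
import numpy as np

a = np.arange(6).reshape(3, 2)        # time x channels, as the spec requires

# transpose before saving: the column-major dump of a.T is exactly
# the row-major layout of a, so readers in C/python/etc. recover the
# correct (#time points, #channels) shape
fixed_buf = a.T.flatten(order='F')
assert np.array_equal(fixed_buf.reshape(a.shape), a)
```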
Please let me know if you are able to fix this, and I will be happy to recreate the sample file in the snirf-samples repo so that others can test.