Commit 3f78992

Merge remote-tracking branch 'refs/remotes/origin/fix_openephys_stream' into fix_openephys_stream
2 parents 44c6b4b + 297abf9

16 files changed: +100 −100 lines

.github/workflows/caches_cron_job.yml

Lines changed: 51 additions & 51 deletions
@@ -10,57 +10,57 @@ on:
 
 jobs:
 
-  create-conda-env-cache-if-missing:
-    name: Caching conda env
-    runs-on: "ubuntu-latest"
-    strategy:
-      fail-fast: true
-    defaults:
-      # by default run in bash mode (required for conda usage)
-      run:
-        shell: bash -l {0}
-    steps:
-      - uses: actions/checkout@v3
-
-      - name: Get current year-month
-        id: date
-        run: |
-          echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT
-
-      - name: Get current dependencies hash
-        id: dependencies
-        run: |
-          echo "hash=${{hashFiles('**/pyproject.toml', '**/environment_testing.yml')}}" >> $GITHUB_OUTPUT
-
-      - uses: actions/cache@v3
-        # the cache for python package is reset:
-        # * every month
-        # * when package dependencies change
-        id: cache-conda-env
-        with:
-          path: /usr/share/miniconda/envs/neo-test-env
-          key: ${{ runner.os }}-conda-env-${{ steps.dependencies.outputs.hash }}-${{ steps.date.outputs.date }}
-
-      - name: Cache found?
-        run: echo "Cache-hit == ${{steps.cache-conda-env.outputs.cache-hit == 'true'}}"
-
-      # activate environment if not restored from cache
-      - uses: conda-incubator/setup-miniconda@v2
-        if: steps.cache-conda-env.outputs.cache-hit != 'true'
-        with:
-          activate-environment: neo-test-env
-          environment-file: environment_testing.yml
-          python-version: 3.9
-
-      - name: Create the conda environment to be cached
-        if: steps.cache-conda-env.outputs.cache-hit != 'true'
-        # create conda env, configure git and install pip, neo and test dependencies from master
-        # for PRs that change dependencies, this environment will be updated in the test workflow
-        run: |
-          git config --global user.email "neo_ci@fake_mail.com"
-          git config --global user.name "neo CI"
-          python -m pip install -U pip  # Official recommended way
-          pip install --upgrade -e .[test]
+  # create-conda-env-cache-if-missing:
+  #   name: Caching conda env
+  #   runs-on: "ubuntu-latest"
+  #   strategy:
+  #     fail-fast: true
+  #   defaults:
+  #     # by default run in bash mode (required for conda usage)
+  #     run:
+  #       shell: bash -l {0}
+  #   steps:
+  #     - uses: actions/checkout@v3
+
+  #     - name: Get current year-month
+  #       id: date
+  #       run: |
+  #         echo "date=$(date +'%Y-%m')" >> $GITHUB_OUTPUT
+
+  #     - name: Get current dependencies hash
+  #       id: dependencies
+  #       run: |
+  #         echo "hash=${{hashFiles('**/pyproject.toml', '**/environment_testing.yml')}}" >> $GITHUB_OUTPUT
+
+  #     - uses: actions/cache@v3
+  #       # the cache for python package is reset:
+  #       # * every month
+  #       # * when package dependencies change
+  #       id: cache-conda-env
+  #       with:
+  #         path: /usr/share/miniconda/envs/neo-test-env
+  #         key: ${{ runner.os }}-conda-env-${{ steps.dependencies.outputs.hash }}-${{ steps.date.outputs.date }}
+
+  #     - name: Cache found?
+  #       run: echo "Cache-hit == ${{steps.cache-conda-env.outputs.cache-hit == 'true'}}"
+
+  #     # activate environment if not restored from cache
+  #     - uses: conda-incubator/setup-miniconda@v2
+  #       if: steps.cache-conda-env.outputs.cache-hit != 'true'
+  #       with:
+  #         activate-environment: neo-test-env
+  #         environment-file: environment_testing.yml
+  #         python-version: 3.9
+
+  #     - name: Create the conda environment to be cached
+  #       if: steps.cache-conda-env.outputs.cache-hit != 'true'
+  #       # create conda env, configure git and install pip, neo and test dependencies from master
+  #       # for PRs that change dependencies, this environment will be updated in the test workflow
+  #       run: |
+  #         git config --global user.email "neo_ci@fake_mail.com"
+  #         git config --global user.name "neo CI"
+  #         python -m pip install -U pip  # Official recommended way
+  #         pip install --upgrade -e .[test]
 
   create-data-cache-if-missing:
     name: Caching data env
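The (now commented-out) cache key combines three parts: the runner OS, a hash of the dependency files, and the current year-month, so the conda-env cache expires monthly as well as on dependency changes. A minimal sketch of the dated component, runnable locally outside Actions:

```shell
# Reproduce the workflow's "Get current year-month" step locally.
# Because this value is baked into the cache key, a cached environment
# is rebuilt at most one month after it was last created.
key_date=$(date +'%Y-%m')
echo "date=${key_date}"   # same "key=value" line the workflow appends to $GITHUB_OUTPUT
```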

doc/source/authors.rst

Lines changed: 4 additions & 1 deletion
@@ -90,8 +90,10 @@ and may not be the current affiliation of a contributor.
 * Xin Niu
 * Nikhil Chandra [40]
 * Luigi Petrucco [42]
+* Tommaso Lambresa [43]
+* Nina Kudryashova [37]
 
-1. Centre de Recherche en Neuroscience de Lyon, CNRS UMR5292 - INSERM U1028 - Universite Claude Bernard Lyon 1
+1. Centre de Recherche en Neuroscience de Lyon, CNRS UMR5292 - INSERM U1028 - Université Claude Bernard Lyon 1
 2. Unité de Neuroscience, Information et Complexité, CNRS UPR 3293, Gif-sur-Yvette, France
 3. University of California, Berkeley
 4. Laboratoire de Neurosciences Intégratives et Adaptatives, CNRS UMR 6149 - Université de Provence, Marseille, France
@@ -133,6 +135,7 @@ and may not be the current affiliation of a contributor.
 40. Plexon Inc.
 41. Paris Brain Institute
 42. Istituto Italiano di Tecnologia (IIT), Italy
+43. University of Genoa, Italy
 
 
 

neo/io/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -167,7 +167,7 @@
 .. autoclass:: neo.io.MaxwellIO
 
     .. autoattribute:: extensions
-
+
 .. autoclass:: neo.io.MedIO
 
     .. autoattribute:: extensions

neo/io/neomatlabio.py

Lines changed: 3 additions & 1 deletion
@@ -281,6 +281,8 @@ def write_block(self, bl, **kargs):
         """
         Arguments:
             bl: the block to be saved
+            kargs: extra keyword arguments broadcasted to scipy.io.savemat
+
         """
         import scipy.io
 
@@ -307,7 +309,7 @@ def write_block(self, bl, **kargs):
             else:
                 group_structure[container_name].append(id(child_obj))
 
-        scipy.io.savemat(self.filename, {"block": bl_struct}, oned_as="row")
+        scipy.io.savemat(self.filename, {"block": bl_struct}, oned_as="row", **kargs)
 
     def _get_matlab_value(self, ob, attrname):
        units = None
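The second hunk threads `**kargs` from `write_block` through to `scipy.io.savemat`, so callers can now pass savemat options (such as `do_compression`) when writing a block. A minimal sketch of the forwarding pattern, with a stand-in `backend_save` instead of the real `scipy.io.savemat` (all names here are illustrative, not from the commit):

```python
# Sketch of the **kargs forwarding introduced in this hunk: extra keyword
# arguments given to write_block flow through to the backend call unchanged.
def backend_save(filename, mdict, oned_as="col", do_compression=False):
    # Stand-in for scipy.io.savemat; returns what it received for inspection.
    return {"filename": filename, "oned_as": oned_as, "do_compression": do_compression}

def write_block(filename, bl_struct, **kargs):
    # Fixed arguments first, caller-supplied options forwarded last.
    return backend_save(filename, {"block": bl_struct}, oned_as="row", **kargs)

result = write_block("block.mat", {"name": "b0"}, do_compression=True)
print(result["do_compression"])  # True: the option reached the backend
```

Note that `oned_as="row"` stays hard-coded before `**kargs`, so a caller passing `oned_as` again would raise a `TypeError` for a duplicate keyword.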

neo/rawio/axonrawio.py

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@
 strings section:
 [uModifierNameIndex, uCreatorNameIndex, uProtocolPathIndex, lFileComment, lADCCChannelNames, lADCUnitsIndex
 lDACChannelNameIndex, lDACUnitIndex, lDACFilePath, nLeakSubtractADC]
-['', 'Clampex', '', 'C:/path/protocol.pro', 'some comment', 'IN 0', 'mV', 'IN 1', 'mV', 'Cmd 0', 'pA',
+['', 'Clampex', '', 'C:/path/protocol.pro', 'some comment', 'IN 0', 'mV', 'IN 1', 'mV', 'Cmd 0', 'pA',
 'Cmd 1', 'pA', 'Cmd 2', 'mV', 'Cmd 3', 'mV']
 
 Information on abf 1 and 2 formats is available here:

neo/rawio/blackrockrawio.py

Lines changed: 1 addition & 1 deletion
@@ -979,7 +979,7 @@ def __read_nsx_dataheader_variant_b(
             # use of `int` avoids overflow problem
             data_size = int(dh["nb_data_points"]) * int(self.__nsx_basic_header[nsx_nb]["channel_count"]) * 2
             # define new offset (to possible next data block)
-            offset = data_header[index]["offset_to_data_block"] + data_size
+            offset = int(data_header[index]["offset_to_data_block"]) + data_size
 
             index += 1
 
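The comment in the hunk notes that the `int` cast "avoids overflow problem": the offset read from the header is a fixed-width NumPy integer, and adding a large `data_size` to it can exceed the int32 range, while Python `int` arithmetic is arbitrary precision. A stdlib-only sketch of the failure mode (the `wrap_int32` helper and the byte counts are illustrative, not from the reader):

```python
# Fixed-width 32-bit arithmetic wraps around; Python ints do not.
def wrap_int32(x):
    # Map an integer onto the signed 32-bit range, as int32 math would.
    return (x + 2**31) % 2**32 - 2**31

offset_from_header = 2_000_000_000   # still fits in int32
data_size = 400_000_000              # pushes the sum past 2**31 - 1

wrapped = wrap_int32(offset_from_header + data_size)  # what int32 math would yield
exact = int(offset_from_header) + data_size           # what the patched line computes
print(wrapped)  # -1894967296: a wrapped, negative file offset
print(exact)    # 2400000000: the correct offset to the next data block
```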

neo/rawio/edfrawio.py

Lines changed: 3 additions & 1 deletion
@@ -100,7 +100,8 @@ def _parse_header(self):
         for ch_idx, sig_dict in enumerate(self.signal_headers):
             ch_name = sig_dict["label"]
             chan_id = ch_idx
-            sr = sig_dict["sample_rate"]  # Hz
+            # pyedf >= 0.1.39 uses sample_frequency, pyedf < 0.1.39 uses sample_rate
+            sr = sig_dict.get("sample_frequency") or sig_dict.get("sample_rate")  # Hz
             dtype = "int16"  # assume general int16 based on edf documentation
             units = sig_dict["dimension"]
             physical_range = sig_dict["physical_max"] - sig_dict["physical_min"]
@@ -160,6 +161,7 @@ def _parse_header(self):
                 "label",
                 "dimension",
                 "sample_rate",
+                "sample_frequency",
                 "physical_min",
                 "physical_max",
                 "digital_min",
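The `.get(...) or .get(...)` fallback keeps the reader working across pyedflib versions that renamed the signal-header key. A small sketch of that lookup (the header dicts here are illustrative):

```python
# Version-tolerant header lookup: newer pyedflib exposes "sample_frequency",
# older versions exposed "sample_rate". .get() plus `or` accepts either,
# preferring the newer key when both are present.
def sampling_rate(sig_dict):
    return sig_dict.get("sample_frequency") or sig_dict.get("sample_rate")

new_header = {"label": "EEG Fpz", "sample_frequency": 512}
old_header = {"label": "EEG Fpz", "sample_rate": 256}
print(sampling_rate(new_header))  # 512
print(sampling_rate(old_header))  # 256
```

One subtlety of the `or` idiom: it also falls through when the newer key maps to `0` or `None`, which is harmless here since a zero sampling rate is not meaningful.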

neo/rawio/intanrawio.py

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@
 * http://intantech.com/files/Intan_RHD2000_data_file_formats.pdf
 * http://intantech.com/files/Intan_RHS2000_data_file_formats.pdf
 
-
+
 Author: Samuel Garcia (Initial), Zach McKenzie & Heberto Mayorquin (Updates)
 
 """

neo/rawio/medrawio.py

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 """
 Class for reading MED (Multiscale Electrophysiology Data) Format.
-
+
 Uses the dhn-med-py python package, created by Dark Horse Neuro, Inc.
 
 Authors: Dan Crepeau, Matt Stead

neo/rawio/neuronexusrawio.py

Lines changed: 2 additions & 2 deletions
@@ -6,13 +6,13 @@
 * The *.xdat.json metadata file
 * The *_data.xdat binary file of all raw data
 * The *_timestamps.xdat binary file of the timestamp data
-
+
 Based on sample data is appears that the binary file is always a float32 format
 Other information can be found within the metadata json file
 
 
 The metadata file has a pretty complicated structure as far as I can tell
-a lot of which is dedicated to probe information, which won't be handle at the
+a lot of which is dedicated to probe information, which won't be handle at the
 the Neo level.
 
 It appears that the metadata['status'] provides most of the information necessary
