
Commit bffa54f

[DATALAD RUNCMD] run codespell throughout fixing typo automagically
=== Do not change lines below === { "chain": [], "cmd": "codespell -w", "exit": 0, "extra_inputs": [], "inputs": [], "outputs": [], "pwd": "." } ^^^ Do not change lines above ^^^
1 parent: 2236a8c

10 files changed (+25 −25 lines)

completed_tutorials/02-Calcium Imaging Imported Tables.ipynb

Lines changed: 1 addition & 1 deletion
@@ -732,7 +732,7 @@
 "source": [
 "In DataJoint, the tier of the table indicates **the nature of the data and the data source for the table**. So far we have encountered two table tiers: `Manual` and `Imported`, and we will encounter the two other major tiers in this session. \n",
 "\n",
-"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beggining of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
+"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beginning of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
 "\n",
 "On the other hand, **Imported tables** are understood to pull data (or *import* data) from external data files, and come equipped with functionalities to perform this importing process automatically, as we will see shortly! In the Diagram, `Imported` tables are depicted by blue ellipses."
 ]

completed_tutorials/03-Calcium Imaging Computed Tables.ipynb

Lines changed: 6 additions & 6 deletions
@@ -770,7 +770,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"There are multiple ways to perform the segementation. To keep it simple, we just detect the cells by setting up the threshold on the average image."
+"There are multiple ways to perform the segmentation. To keep it simple, we just detect the cells by setting up the threshold on the average image."
 ]
 },
 {
@@ -1027,7 +1027,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We would like to perform the segmentation for a **combination** of `AverageFrame`s and different set of paremeters of `threshold` and `size_cutoff` values. To do this while still taking advantage of the `make` and `populate` logic, you would want to define a table to house parameters for segmentation in a `Lookup` table!"
+"We would like to perform the segmentation for a **combination** of `AverageFrame`s and different set of parameters of `threshold` and `size_cutoff` values. To do this while still taking advantage of the `make` and `populate` logic, you would want to define a table to house parameters for segmentation in a `Lookup` table!"
 ]
 },
 {
@@ -1160,7 +1160,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"The `Computed` table is labeled as a pink oval and the `Part` table is bare text. We see that `Segmentation` is a `Computed` table that depends on **both AverageFrame and SegmentationParam**. Finally, let's go ahead and implement the `make` method for the `Segmenation` table. "
+"The `Computed` table is labeled as a pink oval and the `Part` table is bare text. We see that `Segmentation` is a `Computed` table that depends on **both AverageFrame and SegmentationParam**. Finally, let's go ahead and implement the `make` method for the `Segmentation` table. "
 ]
 },
 {
@@ -1342,7 +1342,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"And for the part table `Segmenation.Roi`, there was an additional primary key attribute `roi_idx`:`"
+"And for the part table `Segmentation.Roi`, there was an additional primary key attribute `roi_idx`:`"
 ]
 },
 {
@@ -1721,7 +1721,7 @@
 }
 ],
 "source": [
-"# ENTER YOUR CODE! - populate the Segmenation table for real!\n",
+"# ENTER YOUR CODE! - populate the Segmentation table for real!\n",
 "Segmentation.populate()"
 ]
 },
@@ -2177,7 +2177,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We can simply delete the unwanted paramter from the `SegmentationParam` table, and let DataJoint cascade the deletion:"
+"We can simply delete the unwanted parameter from the `SegmentationParam` table, and let DataJoint cascade the deletion:"
 ]
 },
 {
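The notebook text above describes running segmentation for every **combination** of `AverageFrame` and `SegmentationParam` entries via `make` and `populate`. As a minimal pure-Python sketch of that pairing logic (the sample keys and parameter values below are hypothetical; real DataJoint derives the key list from the join of the parent tables):

```python
from itertools import product

# Hypothetical stand-ins for the tutorial's AverageFrame and SegmentationParam
# key sources. DataJoint's `populate` calls `make` once per key in the join of
# the parent tables, minus keys already present in the Computed table.
average_frames = [{"mouse_id": 0, "session_date": "2017-05-15"},
                  {"mouse_id": 0, "session_date": "2017-05-19"}]
seg_params = [{"seg_param_id": 0, "threshold": 50, "size_cutoff": 50},
              {"seg_param_id": 1, "threshold": 60, "size_cutoff": 25}]

def keys_to_populate(frames, params, already_computed):
    """Return every (frame, parameter-set) combination not yet computed."""
    todo = []
    for frame, param in product(frames, params):
        key = {**frame, "seg_param_id": param["seg_param_id"]}
        if key not in already_computed:
            todo.append(key)
    return todo

todo = keys_to_populate(average_frames, seg_params, already_computed=[])
print(len(todo))  # 2 frames x 2 parameter sets -> 4 keys to compute
```

Because each parameter set is its own row in the `Lookup` table, adding a new parameter combination later simply enlarges this key list, and `populate` picks up only the missing entries.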

completed_tutorials/04-Electrophysiology Imported Tables.ipynb

Lines changed: 3 additions & 3 deletions
@@ -523,7 +523,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Let's take the first key, and generate the file name that corresponds to this session. Remember the `data_{mouse_id}_{session_date}.npy` filename convetion!"
+"Let's take the first key, and generate the file name that corresponds to this session. Remember the `data_{mouse_id}_{session_date}.npy` filename convention!"
 ]
 },
 {
@@ -977,7 +977,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"So this particular file contains a NumPy array of size 1 x 1000. This represents a (simulated) recording of raw electric activity from neuron(s) (1st dimension) over 1000 time bins (2nd dimesion)."
+"So this particular file contains a NumPy array of size 1 x 1000. This represents a (simulated) recording of raw electric activity from neuron(s) (1st dimension) over 1000 time bins (2nd dimension)."
 ]
 },
 {
@@ -1068,7 +1068,7 @@
 "source": [
 "In DataJoint, the tier of the table indicates **the nature of the data and the data source for the table**. So far we have encountered two table tiers: `Manual` and `Imported`, and we will encounter the two other major tiers in this session. \n",
 "\n",
-"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beggining of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
+"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beginning of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
 "\n",
 "On the other hand, **Imported tables** are understood to pull data (or *import* data) from external data files, and come equipped with functionalities to perform this importing process automatically, as we will see shortly! In the Diagram, `Imported` tables are depicted by blue ellipses."
 ]
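The first hunk above references the tutorial's `data_{mouse_id}_{session_date}.npy` filename convention. A minimal sketch of building such a filename from a session key (the sample key values are hypothetical):

```python
# Build the data filename for a session key, following the tutorial's
# `data_{mouse_id}_{session_date}.npy` naming convention.
# The sample key below is hypothetical.
key = {"mouse_id": 0, "session_date": "2017-05-15"}

filename = "data_{mouse_id}_{session_date}.npy".format(**key)
print(filename)  # data_0_2017-05-15.npy
```

In an `Imported` table's `make` method, a filename assembled this way from the key would be handed to a loader such as `numpy.load` to pull the raw recording in.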

completed_tutorials/05-Electrophysiology Computed Tables.ipynb

Lines changed: 1 addition & 1 deletion
@@ -3333,7 +3333,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We can simply delete the unwanted paramter from the `SpikeDetectionParam` table, and let DataJoint cascade the deletion:"
+"We can simply delete the unwanted parameter from the `SpikeDetectionParam` table, and let DataJoint cascade the deletion:"
 ]
 },
 {

short_tutorials/University.ipynb

Lines changed: 2 additions & 2 deletions
@@ -490,7 +490,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# Millenials\n",
+"# Millennials\n",
 "millennials = Student & 'date_of_birth between \"1981-01-01\" and \"1996-12-31\"'"
 ]
 },
@@ -519,7 +519,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# Millenials who have never enrolled\n",
+"# Millennials who have never enrolled\n",
 "millennials - Enroll"
 ]
 },
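The restriction in the first hunk above selects students born in the millennial range, and `millennials - Enroll` in the second is an antijoin (rows with no match in `Enroll`). A plain-Python sketch of both operations (the sample students and the empty enrollment set are hypothetical):

```python
from datetime import date

# Equivalent of the notebook's restriction
# 'date_of_birth between "1981-01-01" and "1996-12-31"' in plain Python.
def is_millennial(date_of_birth):
    return date(1981, 1, 1) <= date_of_birth <= date(1996, 12, 31)

# Hypothetical sample students
students = {"alice": date(1985, 6, 1), "bob": date(1999, 3, 2)}
millennials = {name for name, dob in students.items() if is_millennial(dob)}
print(millennials)  # {'alice'}

# `millennials - Enroll` is an antijoin; with sets:
enrolled = set()  # hypothetical: nobody has enrolled yet
print(millennials - enrolled)  # {'alice'}
```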

tutorials/01-DataJoint Basics.ipynb

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@
 "If you visit the [documentation for DataJoint](https://docs.datajoint.io/introduction/Data-pipelines.html), we define a data pipeline as follows:\n",
 "> A data pipeline is a sequence of steps (more generally a directed acyclic graph) with integrated storage at each step. These steps may be thought of as nodes in a graph.\n",
 "\n",
-"While this is an accurate description, it may not be the most intuitive definition. Put succinctly, a data pipeline is a listing or a \"map\" of various \"things\" that you work with in a project, with line connecting things to each other to indicate their dependecies. The \"things\" in a data pipeline tends to be the *nouns* you find when describing a project. The \"things\" may include anything from mouse, experimenter, equipment, to experiment session, trial, two-photon scans, electric activities, to receptive fields, neuronal spikes, to figures for a publication! A data pipeline gives you a framework to:\n",
+"While this is an accurate description, it may not be the most intuitive definition. Put succinctly, a data pipeline is a listing or a \"map\" of various \"things\" that you work with in a project, with line connecting things to each other to indicate their dependencies. The \"things\" in a data pipeline tends to be the *nouns* you find when describing a project. The \"things\" may include anything from mouse, experimenter, equipment, to experiment session, trial, two-photon scans, electric activities, to receptive fields, neuronal spikes, to figures for a publication! A data pipeline gives you a framework to:\n",
 "\n",
 "1. define these \"things\" as tables in which you can store the information about them\n",
 "2. define the relationships (in particular the dependencies) between the \"things\"\n",

tutorials/02-Calcium Imaging Imported Tables.ipynb

Lines changed: 1 addition & 1 deletion
@@ -342,7 +342,7 @@
 "source": [
 "In DataJoint, the tier of the table indicates **the nature of the data and the data source for the table**. So far we have encountered two table tiers: `Manual` and `Imported`, and we will encounter the two other major tiers in this session. \n",
 "\n",
-"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beggining of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
+"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beginning of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
 "\n",
 "On the other hand, **Imported tables** are understood to pull data (or *import* data) from external data files, and come equipped with functionalities to perform this importing process automatically, as we will see shortly! In the Diagram, `Imported` tables are depicted by blue ellipses."
 ]

tutorials/03-Calcium Imaging Computed Tables.ipynb

Lines changed: 6 additions & 6 deletions
@@ -250,7 +250,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"There are multiple ways to perform the segementation. To keep it simple, we just detect the cells by setting up the threshold on the average image."
+"There are multiple ways to perform the segmentation. To keep it simple, we just detect the cells by setting up the threshold on the average image."
 ]
 },
 {
@@ -397,7 +397,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We would like to perform the segmentation for a **combination** of `AverageFrame`s and different set of paremeters of `threshold` and `size_cutoff` values. To do this while still taking advantage of the `make` and `populate` logic, you would want to define a table to house parameters for segmentation in a `Lookup` table!"
+"We would like to perform the segmentation for a **combination** of `AverageFrame`s and different set of parameters of `threshold` and `size_cutoff` values. To do this while still taking advantage of the `make` and `populate` logic, you would want to define a table to house parameters for segmentation in a `Lookup` table!"
 ]
 },
 {
@@ -506,7 +506,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"The `Computed` table is labeled as a pink oval and the `Part` table is bare text. We see that `Segmentation` is a `Computed` table that depends on **both AverageFrame and SegmentationParam**. Finally, let's go ahead and implement the `make` method for the `Segmenation` table. "
+"The `Computed` table is labeled as a pink oval and the `Part` table is bare text. We see that `Segmentation` is a `Computed` table that depends on **both AverageFrame and SegmentationParam**. Finally, let's go ahead and implement the `make` method for the `Segmentation` table. "
 ]
 },
 {
@@ -597,7 +597,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"And for the part table `Segmenation.Roi`, there was an additional primary key attribute `roi_idx`:`"
+"And for the part table `Segmentation.Roi`, there was an additional primary key attribute `roi_idx`:`"
 ]
 },
 {
@@ -693,7 +693,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# ENTER YOUR CODE! - populate the Segmenation table for real!\n"
+"# ENTER YOUR CODE! - populate the Segmentation table for real!\n"
 ]
 },
 {
@@ -804,7 +804,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We can simply delete the unwanted paramter from the `SegmentationParam` table, and let DataJoint cascade the deletion:"
+"We can simply delete the unwanted parameter from the `SegmentationParam` table, and let DataJoint cascade the deletion:"
 ]
 },
 {
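The first hunk above describes detecting cells by thresholding the average image; with `threshold` and `size_cutoff` parameters, that amounts to keeping connected bright regions above a minimum size. A toy pure-Python sketch of that idea (the tiny image and parameter values are made up; the tutorial operates on real imaging frames, typically with NumPy/SciPy):

```python
# Threshold a tiny "average image", find connected regions (4-connectivity),
# and keep only regions larger than size_cutoff -- a toy version of the
# segmentation step described above. Image and parameters are made up.
image = [
    [0, 9, 9, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 0, 0, 0, 7],
]
threshold, size_cutoff = 5, 2

mask = [[pix > threshold for pix in row] for row in image]

def regions(mask):
    """Yield connected components of True pixels as sets of (row, col)."""
    seen = set()
    for r, row in enumerate(mask):
        for c, on in enumerate(row):
            if on and (r, c) not in seen:
                comp, stack = set(), [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if ((y, x) in seen or not (0 <= y < len(mask))
                            or not (0 <= x < len(mask[0])) or not mask[y][x]):
                        continue
                    seen.add((y, x))
                    comp.add((y, x))
                    stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
                yield comp

rois = [roi for roi in regions(mask) if len(roi) > size_cutoff]
print(len(rois))  # the 2x2 blob survives; the lone bright pixel is cut off
```

Each surviving region would become one row of the `Segmentation.Roi` part table, indexed by `roi_idx`.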

tutorials/04-Electrophysiology Imported Tables.ipynb

Lines changed: 3 additions & 3 deletions
@@ -190,7 +190,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Let's take the first key, and generate the file name that corresponds to this session. Remember the `data_{mouse_id}_{session_date}.npy` filename convetion!"
+"Let's take the first key, and generate the file name that corresponds to this session. Remember the `data_{mouse_id}_{session_date}.npy` filename convention!"
 ]
 },
 {
@@ -267,7 +267,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"So this particular file contains a NumPy array of size 1 x 1000. This represents a (simulated) recording of raw electric activity from neuron(s) (1st dimension) over 1000 time bins (2nd dimesion)."
+"So this particular file contains a NumPy array of size 1 x 1000. This represents a (simulated) recording of raw electric activity from neuron(s) (1st dimension) over 1000 time bins (2nd dimension)."
 ]
 },
 {
@@ -345,7 +345,7 @@
 "source": [
 "In DataJoint, the tier of the table indicates **the nature of the data and the data source for the table**. So far we have encountered two table tiers: `Manual` and `Imported`, and we will encounter the two other major tiers in this session. \n",
 "\n",
-"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beggining of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
+"DataJoint tables in `Manual` tier, or simply **Manual tables** indicate that its contents are **manually** entered by either experimenters or a recording system, and its content **do not depend on external data files or other tables**. This is the most basic table type you will encounter, especially as the tables at the beginning of the pipeline. In the Diagram, `Manual` tables are depicted by green rectangles.\n",
 "\n",
 "On the other hand, **Imported tables** are understood to pull data (or *import* data) from external data files, and come equipped with functionalities to perform this importing process automatically, as we will see shortly! In the Diagram, `Imported` tables are depicted by blue ellipses."
 ]

tutorials/05-Electrophysiology Computed Tables.ipynb

Lines changed: 1 addition & 1 deletion
@@ -1030,7 +1030,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"We can simply delete the unwanted paramter from the `SpikeDetectionParam` table, and let DataJoint cascade the deletion:"
+"We can simply delete the unwanted parameter from the `SpikeDetectionParam` table, and let DataJoint cascade the deletion:"
 ]
 },
 {
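The hunk above describes deleting an unwanted parameter row and letting DataJoint cascade the deletion to every dependent computed entry. A toy dictionary-based illustration of what that cascade does (the table contents below are hypothetical; in DataJoint the deletion and cascade are handled by the database's foreign-key machinery, not by hand):

```python
# Toy illustration of cascading deletion: removing a parameter row also
# removes every downstream result keyed by it. Contents are hypothetical.
params = {0: {"threshold": 0.5}, 1: {"threshold": 0.9}}
results = {("sess_A", 0): "spikes...",
           ("sess_A", 1): "spikes...",
           ("sess_B", 1): "spikes..."}

def delete_param(param_id, params, results):
    """Delete a parameter and cascade to results that reference it."""
    params.pop(param_id, None)
    for key in [k for k in results if k[1] == param_id]:
        del results[key]

delete_param(1, params, results)
print(sorted(params))   # only parameter 0 remains
print(sorted(results))  # only results computed with parameter 0 remain
```

This is why housing parameters in a `Lookup` table is convenient: dropping one row cleanly retracts every computation derived from it.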
