
Commit f450a0c

Merge branch 'main' into zarr_pin

2 parents: 34dc919 + 4a30df0

17 files changed: +845 −67 lines changed

.editorconfig

Lines changed: 15 additions & 0 deletions

@@ -0,0 +1,15 @@
+root = true
+
+[*]
+charset = utf-8
+end_of_line = lf
+insert_final_newline = true
+trim_trailing_whitespace = true
+
+[*.{py,toml}]
+indent_style = space
+indent_size = 4
+
+[*.{yml,yaml,json}]
+indent_style = space
+indent_size = 2

.readthedocs.yml

Lines changed: 2 additions & 2 deletions

@@ -1,8 +1,8 @@
 version: 2
 build:
-  os: ubuntu-20.04
+  os: ubuntu-22.04
   tools:
-    python: "3.11"
+    python: "3.13"
 sphinx:
   configuration: docs/conf.py
 formats: all

CONTRIBUTING.md

Lines changed: 2 additions & 2 deletions

@@ -37,7 +37,7 @@ Request features on the [Issue Tracker].
 
 ## How to set up your development environment
 
-You need Python 3.10+ and the following tools:
+You need Python 3.11+ and the following tools:
 
 - [uv]
 - [Nox]
@@ -106,7 +106,7 @@ Open a [pull request] to submit changes to this project.
 Your pull request needs to meet the following guidelines for acceptance:
 
 - The Nox test suite must pass without errors and warnings.
-- Include unit tests. This project maintains 100% code coverage.
+- Include unit tests. This project currently maintains 90%+ code coverage.
 - If your changes add functionality, update the documentation accordingly.
 
 Feel free to submit early, though—we can always iterate on this.

docs/reference.md renamed to docs/api_reference.md

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-# Reference
+# API Reference
 
 ## Readers / Writers
 

docs/usage.md renamed to docs/cli_usage.md

Lines changed: 9 additions & 9 deletions (whitespace-only cleanup in most hunks)

@@ -1,4 +1,4 @@
-# Usage
+# CLI Usage
 
 ## Ingestion and Export
 
@@ -68,11 +68,11 @@ Credentials can be automatically fetched from pre-authenticated AWS CLI.
 See [here](https://s3fs.readthedocs.io/en/latest/index.html#credentials) for the order `s3fs`
 checks them. If it is not pre-authenticated, you need to pass `--storage-options-{input,output}`.
 
-**Prefix:**
+**Prefix:**
 `s3://`
 
-**Storage Options:**
-`key`: The auth key from AWS
+**Storage Options:**
+`key`: The auth key from AWS
 `secret`: The auth secret from AWS
 
 Using UNIX:
@@ -104,10 +104,10 @@ checks them. If it is not pre-authenticated, you need to pass `--storage-options
 GCP uses [service accounts](https://cloud.google.com/iam/docs/service-accounts) to pass
 authentication information to APIs.
 
-**Prefix:**
+**Prefix:**
 `gs://` or `gcs://`
 
-**Storage Options:**
+**Storage Options:**
 `token`: The service account JSON value as string, or local path to JSON
 
 Using a service account:
@@ -136,11 +136,11 @@ There are various ways to authenticate with Azure Data Lake (ADL).
 See [here](https://github.com/fsspec/adlfs#details) for some details.
 If ADL is not pre-authenticated, you need to pass `--storage-options-{input,output}`.
 
-**Prefix:**
+**Prefix:**
 `az://` or `abfs://`
 
-**Storage Options:**
-`account_name`: Azure Data Lake storage account name
+**Storage Options:**
+`account_name`: Azure Data Lake storage account name
 `account_key`: Azure Data Lake storage account access key
 
 ```shell
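The `key`/`secret` (S3), `token` (GCS), and `account_name`/`account_key` (Azure) options above are passed via `--storage-options-{input,output}`, commonly as a JSON string. The exact CLI invocation is not shown in this diff, but building the argument in Python makes the quoting explicit (the credential values below are placeholders, not real keys):

```python
import json

# Placeholder AWS credentials; substitute real values from your environment.
storage_options = {"key": "MY_AWS_KEY", "secret": "MY_AWS_SECRET"}

# json.dumps yields the string you would quote on the command line,
# e.g. as the value of --storage-options-input.
cli_argument = json.dumps(storage_options)
print(cli_argument)  # {"key": "MY_AWS_KEY", "secret": "MY_AWS_SECRET"}
```

The same shape applies to the other backends: swap in `{"token": ...}` for GCS or `{"account_name": ..., "account_key": ...}` for Azure.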

docs/conf.py

Lines changed: 36 additions & 1 deletion

@@ -1,31 +1,66 @@
 """Sphinx configuration."""
 
+# -- Project information -----------------------------------------------------
+
 project = "MDIO"
 author = "TGS"
 copyright = "2023, TGS"
+
+# -- General configuration ---------------------------------------------------
+
+# Add any Sphinx extension module names here, as strings. They can be
+# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# ones.
+
 extensions = [
     "sphinx.ext.autodoc",
     "sphinx.ext.napoleon",
+    "sphinx.ext.intersphinx",
+    "sphinx.ext.autosummary",
     "sphinx.ext.autosectionlabel",
     "sphinx_click",
     "sphinx_copybutton",
     "myst_nb",
+    "sphinx_design",
 ]
 
+# List of patterns, relative to source directory, that match files and
+# directories to ignore when looking for source files.
+# This pattern also affects html_static_path and html_extra_path.
+exclude_patterns = [
+    "_build",
+    "Thumbs.db",
+    "jupyter_execute",
+    ".DS_Store",
+    "**.ipynb_checkpoints",
+]
+
+intersphinx_mapping = {
+    "python": ("https://docs.python.org/3", None),
+    "numpy": ("https://numpy.org/doc/stable/", None),
+    "zarr": ("https://zarr.readthedocs.io/en/stable/", None),
+}
+
+pygments_style = "vs"
+pygments_dark_style = "material"
+
 autodoc_typehints = "description"
 autodoc_typehints_format = "short"
 autodoc_member_order = "groupwise"
-autoclass_content = "both"
+autoclass_content = "class"
 autosectionlabel_prefix_document = True
 
 html_theme = "furo"
 
 myst_number_code_blocks = ["python"]
 myst_heading_anchors = 2
+myst_words_per_minute = 80
 myst_enable_extensions = [
+    "colon_fence",
     "linkify",
     "replacements",
     "smartquotes",
+    "attrs_inline",
 ]
 
 # sphinx-copybutton configurations
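One behavioral note on the configuration above: with `autosectionlabel_prefix_document = True`, auto-generated section labels are prefixed with the document name, so identical headings in different pages do not collide. A minimal sketch of the resulting label shape (an illustration, not Sphinx's actual implementation):

```python
def section_label(docname: str, title: str, prefix_document: bool = True) -> str:
    """Mimic the label shape sphinx.ext.autosectionlabel produces."""
    # With prefixing on, a reference looks like "cli_usage:Ingestion and Export"
    # rather than the collision-prone bare "Ingestion and Export".
    return f"{docname}:{title}" if prefix_document else title

print(section_label("cli_usage", "Ingestion and Export"))  # cli_usage:Ingestion and Export
```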

docs/index.md

Lines changed: 26 additions & 12 deletions

@@ -6,24 +6,38 @@ end-before: <!-- github-only -->
 
 [apache 2.0 license]: license
 [contributor guide]: contributing
-[command-line usage]: usage
-[api reference]: reference
+[command-line usage]: cli_usage
+[api reference]: api_reference
 [installation instructions]: installation
 
 ```{toctree}
----
-hidden:
-maxdepth: 1
----
+:hidden:
+:caption: Getting Started
+
 installation
-notebooks/quickstart
-notebooks/creation
-notebooks/compression
-notebooks/rechunking
-usage
-reference
+cli_usage
+```
+
+```{toctree}
+:hidden:
+:caption: Learning and Support
+
+tutorials/index
+api_reference
+```
+
+```{toctree}
+:hidden:
+:caption: Community and Contribution
+
 contributing
 Code of Conduct <codeofconduct>
+```
+
+```{toctree}
+:hidden:
+:caption: Additional Resources
+
 License <license>
 Changelog <https://github.com/TGSAI/mdio-python/releases>
 ```

docs/installation.md

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-# Install Instructions
+# Installation
 
 There are different ways to install MDIO:
 

docs/requirements.txt

Lines changed: 4 additions & 3 deletions

@@ -1,6 +1,7 @@
 furo==2024.8.6
-sphinx==8.2.1
+linkify-it-py==2.0.3
+myst-nb==1.2.0
+sphinx==8.2.3
 sphinx-click==6.0.0
 sphinx-copybutton==0.5.2
-myst-nb==1.2.0
-linkify-it-py==2.0.3
+sphinx-design==0.6.1

docs/notebooks/compression.ipynb renamed to docs/tutorials/compression.ipynb

Lines changed: 7 additions & 0 deletions

@@ -6,6 +6,13 @@
 "source": [
 "# Seismic Data Compression\n",
 "\n",
+"```{article-info}\n",
+":author: Altay Sansal\n",
+":date: \"{sub-ref}`today`\"\n",
+":read-time: \"{sub-ref}`wordcount-minutes` min read\"\n",
+":class-container: sd-p-0 sd-outline-muted sd-rounded-3 sd-font-weight-light\n",
+"```\n",
+"\n",
 "In this page we will be showing compression performance of _MDIO_.\n",
 "\n",
 "For demonstration purposes, we will use one of the Volve dataset stacks.\n",
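The `wordcount-minutes` substitution in the notebook's new `article-info` block pairs with the `myst_words_per_minute = 80` setting added to `docs/conf.py` in this commit: the displayed read time is essentially the page's word count divided by that rate. A rough sketch of the arithmetic (an illustration, not MyST's actual implementation):

```python
import math

def read_time_minutes(text: str, words_per_minute: int = 80) -> int:
    """Estimate reading time in whole minutes, with a floor of one minute."""
    word_count = len(text.split())
    return max(1, math.ceil(word_count / words_per_minute))

print(read_time_minutes("word " * 200))  # 200 words at 80 wpm -> 3
```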
