# This source code is licensed under the BSD-style license found in the
# LICENSE file in the root directory of this source tree.

+ """
+ You can store scheduler run configs (:py:class:`torchx.specs.RunConfig`) for your
+ project in a ``.torchxconfig`` file. Currently this file is only read
+ and honored when running the component from the CLI.
+
+ CLI Usage
+ ~~~~~~~~~~~
+
+ #. ``cd`` into the directory where you want the ``.torchxconfig`` file to be dropped.
+    The CLI only picks up ``.torchxconfig`` files from the current working directory (CWD),
+    so choose a directory you typically run ``torchx`` from, usually the root of
+    your project directory.
+
+ #. Generate the config file by running:
+
+    .. code-block:: shell-session
+
+       $ torchx configure -s <comma,delimited,scheduler,names>
+
+       # -- or for all registered schedulers --
+       $ torchx configure
+
+ #. If you specified ``-s local_cwd,kubernetes``, you should see a ``.torchxconfig``
+    file as shown below:
+
+    .. code-block:: shell-session
+
+       $ cat .torchxconfig
+       [local_cwd]
+
+       [kubernetes]
+       queue = #FIXME:(str) Volcano queue to schedule job in
+
+ #. ``.torchxconfig`` is in INI format and the section names map to the scheduler names.
+    Each section contains the run configs for the scheduler as ``$key = $value`` pairs.
+    You may find that certain schedulers have empty sections; this means that
+    the scheduler defines sensible defaults for all of its run configs, hence no run
+    configs are required at runtime. If you'd like to override a default, add it
+    explicitly (see the example after this item).
+    **TIP:** To see all the run options for a scheduler use ``torchx runopts <scheduler_name>``.
+
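+    For illustration only, an override might look like the snippet below. This
+    assumes ``log_dir`` is one of the run configs reported by
+    ``torchx runopts local_cwd``; substitute whichever run config you actually
+    want to pin:
+
+    .. code-block:: shell-session
+
+       $ cat .torchxconfig
+       [local_cwd]
+       log_dir = /tmp/torchx_logs
+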
+ #. Run configs with a ``FIXME`` placeholder are required by the scheduler.
+    Replace these with the values that apply to you, as in the example below.
+
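+    For example, after filling in the required ``queue`` for the ``kubernetes``
+    section generated above (``torchx`` here is just an illustrative queue name;
+    use a Volcano queue that exists in your cluster):
+
+    .. code-block:: shell-session
+
+       $ cat .torchxconfig
+       [local_cwd]
+
+       [kubernetes]
+       queue = torchx
+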
+ #. **IMPORTANT:** If you are happy with the scheduler-provided default for a particular
+    run config, you **should not** redundantly specify it in ``.torchxconfig`` with the
+    same value. The scheduler may change the default at a later date, which would
+    leave you with a stale value in your config file.
+
+ #. Now you can run your component without having to specify the scheduler run configs
+    each time. Just make sure the directory you are running the ``torchx`` CLI from
+    actually has a ``.torchxconfig``!
+
+    .. code-block:: shell-session
+
+       $ ls .torchxconfig
+       .torchxconfig
+
+       $ torchx run -s local_cwd ./my_component.py:train
+
+ Programmatic Usage
+ ~~~~~~~~~~~~~~~~~~~
+
+ Unlike the CLI, the ``.torchxconfig`` file **is not** picked up automatically from the
+ ``CWD`` when you run your component programmatically with :py:class:`torchx.runner.Runner`.
+ You'll have to manually specify the directory containing ``.torchxconfig``.
+
+ Below is an example:
+
+ .. doctest:: [runner_config_example]
+
+    from torchx.runner import get_runner
+    from torchx.runner.config import apply
+    import torchx.specs as specs
+
+    def my_component(a: int) -> specs.AppDef:
+        # <... component body omitted for brevity ...>
+        pass
+
+    scheduler = "local_cwd"
+    cfg = specs.RunConfig()
+    cfg.set("log_dir", "/these/take/outmost/precedence")
+
+    apply(scheduler, cfg, dirs=["/home/bob"])  # looks for /home/bob/.torchxconfig
+    get_runner().run(my_component(1), scheduler, cfg)
+
+ You may also specify multiple directories (listed in order of precedence), which is
+ useful when you want to keep personal config overrides on top of a project-defined
+ default, as sketched below.
+
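+ The sketch below uses hypothetical paths purely for illustration: keys found in
+ ``/home/bob/.torchxconfig`` (listed first) win over the same keys in
+ ``/project/root/.torchxconfig``.
+
+ .. doctest:: [runner_config_multiple_dirs_example]
+
+    from torchx.runner.config import apply
+    import torchx.specs as specs
+
+    cfg = specs.RunConfig()
+    # earlier entries in ``dirs`` take precedence; anything already set on ``cfg``
+    # takes precedence over both files
+    apply("local_cwd", cfg, dirs=["/home/bob", "/project/root"])
+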
+ """
import configparser as configparser
import logging
from pathlib import Path
@@ -66,7 +155,7 @@ def dump(
To only dump required runopts pass ``required_only=True``.

Each scheduler's runopts are written in the section called
- ``[default.{scheduler_name}.cfg]``.
+ ``[{scheduler_name}]``.

For example:

@@ -77,7 +166,7 @@ def dump(
queue = #FIXME (str)Volcano queue to schedule job in

Raises:
- ``ValueError`` - if given a scheduler name that is not known
+ ValueError: if given a scheduler name that is not known
"""

if schedulers:
@@ -128,7 +217,7 @@ def apply(scheduler: str, cfg: RunConfig, dirs: Optional[List[str]] = None) -> N
over the ones in the config file and only new configs are added. The same holds
true for the configs loaded in list order.

- For instance if ``cfg = {"foo": "bar"}`` and the config file is:
+ For instance if ``cfg = {"foo": "bar"}`` and the config file is:

::

@@ -137,12 +226,12 @@ def apply(scheduler: str, cfg: RunConfig, dirs: Optional[List[str]] = None) -> N
foo = baz
hello = world

- # dir_2/.torchxconfig
- [local_cwd]
- hello = bob
+ # dir_2/.torchxconfig
+ [local_cwd]
+ hello = bob


- Then after the method call, ``cfg = {"foo": "bar", "hello": "world"}``.
+ Then after the method call, ``cfg = {"foo": "bar", "hello": "world"}``.
"""

if not dirs: