Commit a2bf54f

initial scaffold - add jsonargparse
Signed-off-by: Jack Luar <[email protected]>
1 parent 2b0d8be commit a2bf54f

5 files changed: 108 additions & 3 deletions


docs/user/CLIGuideAutotuner.md

Lines changed: 69 additions & 0 deletions
@@ -0,0 +1,69 @@

# Autotuner CLI Guide

AutoTuner can be triggered by specifying a configuration file path instead of
typing every single argument one by one.

## Motivation for configuration files

Why configuration files?
- Improve the CLI user experience: less typing, fewer mistakes.
- Greatly simplify the reproducibility of experiments: share your configuration
  file for easier experiment tracking!

Previously, for a command like `tune`, you might have had to run:

```shell
openroad_autotuner --design gcd --platform sky130hd --experiment abcdef \
    --verbose \
    --jobs 4 --openroad_threads 16 \
    tune \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --samples 5 --iterations 1 --algorithm hyperopt \
    --resources_per_trial 1.0 --seed 42
```

With our new approach, all you have to do is write a YAML document such as
`test.yaml`:

```yaml
---
# Common
design: gcd
platform: sky130hd
experiment: test
verbose: 0
config: ../../flow/designs/sky130hd/gcd/autotuner.json

# Workload
jobs: 4
openroad_threads: 16

# Tune-specific (set these if mode is tune)
tune:
  algorithm: hyperopt
  eval: default
  samples: 10
  iterations: 1
  resources_per_trial: 1.0
  reference: null
  perturbation: 25
  seed: 42
```

and run:

```bash
openroad_autotuner --yaml test.yaml
```

## How to generate new config files

```bash
openroad_autotuner --design gcd --platform sky130hd --experiment abcdef \
    --verbose \
    --jobs 4 --openroad_threads 16 \
    --print_config \
    tune \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --samples 5 --iterations 1 --algorithm hyperopt \
    --resources_per_trial 1.0 --seed 42 > new_test.yaml
```
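
The configuration-file workflow described above is enabled by `jsonargparse`, which this commit adds to `requirements.txt` below. As a hypothetical, minimal sketch of how such a parser could be wired: the option names are copied from this guide, while the wiring itself is an assumption, not the code added to `distributed.py` in this commit.

```python
# Hypothetical sketch only: how jsonargparse could express a CLI of this shape.
from jsonargparse import ActionConfigFile, ArgumentParser


def build_parser() -> ArgumentParser:
    parser = ArgumentParser(prog="openroad_autotuner")
    # Common options.
    parser.add_argument("--design", type=str)
    parser.add_argument("--platform", type=str)
    parser.add_argument("--experiment", type=str)
    parser.add_argument("--verbose", type=int, default=0)
    # Workload options.
    parser.add_argument("--jobs", type=int, default=1)
    parser.add_argument("--openroad_threads", type=int, default=16)
    # --yaml accepts a config file; with a config-file action in place,
    # jsonargparse can also expose --print_config, as used in this guide.
    parser.add_argument("--yaml", action=ActionConfigFile)

    # Mode-specific options live on a subcommand, matching the nested
    # "tune:" block in the YAML document above.
    tune = ArgumentParser()
    tune.add_argument("--config", type=str)
    tune.add_argument("--algorithm", type=str, default="hyperopt")
    tune.add_argument("--samples", type=int, default=10)
    tune.add_argument("--iterations", type=int, default=1)
    tune.add_argument("--resources_per_trial", type=float, default=1.0)
    tune.add_argument("--seed", type=int, default=42)

    subcommands = parser.add_subcommands(dest="mode")
    subcommands.add_subcommand("tune", tune)
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args.mode, args)
```

With subcommands set up this way, jsonargparse takes the mode-specific keys from a nested `tune:` section of the config file, which is why the YAML above groups its settings that way.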

docs/user/InstructionsForAutoTuner.md

Lines changed: 3 additions & 0 deletions
@@ -104,6 +104,9 @@ The `autotuner.distributed` module uses [Ray's](https://docs.ray.io/en/latest/in
 fully utilize available hardware resources from a single server
 configuration, on-premise or over the cloud with multiple CPUs.
 
+For advanced users, we have provided a configuration file; please
+refer to [this guide](./CLIGuideAutotuner.md).
+
 The two modes of operation:
 - `sweep`, where every possible parameter combination in the search space is tested
 - `tune`, where we use Ray's Tune feature to intelligently search the space and optimize hyperparameters using one of the algorithms listed above.

tools/AutoTuner/cli.yaml

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@

---
# Common
design: gcd
platform: sky130hd
experiment: test
timeout: null
verbose: 0

# Workload
jobs: 4
openroad_threads: 16
server: null
port: 10001

# Tune-specific
tune:
  config: ../../flow/designs/sky130hd/gcd/autotuner.json
  algorithm: hyperopt
  eval: default
  samples: 10
  iterations: 1
  resources_per_trial: 1.0
  reference: null
  perturbation: 25
  seed: 42

# Sweep-specific
sweep:
  config: ./src/autotuner/distributed-sweep-example.json
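
To see how the nested layout of this file groups the options, a quick sanity check with `pyyaml` (already pinned in `requirements.txt`); the relative path is an assumption and supposes the snippet is run from `tools/AutoTuner`.

```python
import yaml  # pyyaml, already listed in tools/AutoTuner/requirements.txt

# Assumes the script is run from tools/AutoTuner, where cli.yaml lives.
with open("cli.yaml") as f:
    cfg = yaml.safe_load(f)

# Common and workload settings sit at the top level of the document.
print(cfg["design"], cfg["platform"], cfg["jobs"], cfg["openroad_threads"])
# Mode-specific settings live under their own nested blocks.
print(cfg["tune"]["algorithm"], cfg["tune"]["samples"], cfg["tune"]["seed"])
print(cfg["sweep"]["config"])
```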

tools/AutoTuner/requirements.txt

Lines changed: 1 addition & 0 deletions
@@ -11,3 +11,4 @@ SQLAlchemy==1.4.17
 urllib3>=1.26.17
 matplotlib==3.10.0
 pyyaml==6.0.1
+jsonargparse==4.3.8

tools/AutoTuner/src/autotuner/distributed.py

Lines changed: 6 additions & 3 deletions
@@ -106,6 +106,9 @@
 )
 # Global variable for args
 args = None
+design = None
+platform = None
+config = None
 
 
 class AutoTunerBase(tune.Trainable):
@@ -124,7 +127,7 @@ def setup(self, config):
         self.parameters = parse_config(
             config=config,
             base_dir=self.repo_dir,
-            platform=args.platform,
+            platform=platform,
             sdc_original=SDC_ORIGINAL,
             constraints_sdc=CONSTRAINTS_SDC,
             fr_original=FR_ORIGINAL,
@@ -579,13 +582,13 @@ def main():
     # Read config and original files before handling where to run in case we
     # need to upload the files.
     config_dict, SDC_ORIGINAL, FR_ORIGINAL = read_config(
-        os.path.abspath(args.config), args.mode, getattr(args, "algorithm", None)
+        os.path.abspath(config), args.mode, getattr(args, "algorithm", None)
     )
 
     LOCAL_DIR, ORFS_FLOW_DIR, INSTALL_PATH = prepare_ray_server(args)
 
     if args.mode == "tune":
-        best_params = set_best_params(args.platform, args.design)
+        best_params = set_best_params(platform, design)
         search_algo = set_algorithm(
             args.algorithm,
             args.experiment,
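
The new module-level `design`, `platform`, and `config` globals replace direct `args.*` lookups in `setup()` and `main()`, but this scaffold does not yet show where they are assigned. Purely as an illustration of one way they could be populated from a parsed, possibly nested namespace; the helper below is hypothetical and not part of this commit.

```python
from types import SimpleNamespace

# Mirrors the globals introduced in this commit's diff of distributed.py.
design = None
platform = None
config = None


def _publish_globals(parsed_args):
    """Copy values that setup() and main() now read from module-level globals."""
    global design, platform, config
    design = parsed_args.design
    platform = parsed_args.platform
    # With subcommands, mode-specific options such as --config typically end up
    # in a nested namespace (e.g. parsed_args.tune.config when mode is "tune").
    mode_ns = getattr(parsed_args, parsed_args.mode, None)
    config = getattr(mode_ns, "config", None)


# Tiny smoke test with a stand-in namespace.
_publish_globals(
    SimpleNamespace(
        design="gcd",
        platform="sky130hd",
        mode="tune",
        tune=SimpleNamespace(config="autotuner.json"),
    )
)
print(design, platform, config)  # -> gcd sky130hd autotuner.json
```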
