Commit 682f87e

Merge pull request #2503 from jeffng-or/at-doc-typos
doc fixes
2 parents 9f67f4a + 83a3c2a commit 682f87e

File tree

1 file changed: +13 -12 lines changed

docs/user/InstructionsForAutoTuner.md

Lines changed: 13 additions & 12 deletions
@@ -23,17 +23,17 @@ User-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) o
 
 ## Setting up AutoTuner
 
-We have provided two convenience scripts, `./install.sh` and `./setup.sh`
+We have provided two convenience scripts, `./installer.sh` and `./setup.sh`
 that works in Python3.8 for installation and configuration of AutoTuner,
 as shown below:
 
 ```{note}
-Make sure you run the following commands in `./tools/AutoTuner/src/autotuner`.
+Make sure you run the following commands in the ORFS root directory.
 ```
 
 ```shell
 # Install prerequisites
-./tools/AutoTuner/install.sh
+./tools/AutoTuner/installer.sh
 
 # Start virtual environment
 ./tools/AutoTuner/setup.sh
@@ -104,14 +104,15 @@ For Global Routing parameters that are set on `fastroute.tcl` you can use:
 
 ### General Information
 
-The `distributed.py` script uses Ray's job scheduling and management to
+The `distributed.py` script located in `./tools/AutoTuner/src/autotuner` uses [Ray's](https://docs.ray.io/en/latest/index.html) job scheduling and management to
 fully utilize available hardware resources from a single server
-configuration, on-premies or over the cloud with multiple CPUs.
-The two modes of operation: `sweep`, where every possible parameter
-combination in the search space is tested; and `tune`, where we use
-Ray's Tune feature to intelligently search the space and optimize
-hyperparameters using one of the algorithms listed above. The `sweep`
-mode is useful when we want to isolate or test a single or very few
+configuration, on-premise or over the cloud with multiple CPUs.
+
+The two modes of operation:
+- `sweep`, where every possible parameter combination in the search space is tested
+- `tune`, where we use Ray's Tune feature to intelligently search the space and optimize hyperparameters using one of the algorithms listed above.
+
+The `sweep` mode is useful when we want to isolate or test a single or very few
 parameters. On the other hand, `tune` is more suitable for finding
 the best combination of a complex and large number of flow
 parameters. Both modes rely on user-specified search space that is
@@ -120,7 +121,7 @@ though some features may not be available for sweeping.
 
 ```{note}
 The order of the parameters matter. Arguments `--design`, `--platform` and
-`--config` are always required and should precede <mode>.
+`--config` are always required and should precede *mode*.
 ```
 
 #### Tune only
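
To make the argument ordering described in the note above concrete, a hypothetical `tune` invocation (run from the ORFS root, as the updated note suggests) might look like the sketch below; the design name, platform, and config path are illustrative placeholders, not values taken from this commit:

```shell
# Hypothetical sketch: --design, --platform and --config come before the mode.
# The gcd/sky130hd names and the config path are assumptions for illustration.
python3 ./tools/AutoTuner/src/autotuner/distributed.py \
    --design gcd \
    --platform sky130hd \
    --config ./flow/designs/sky130hd/gcd/autotuner.json \
    tune
```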
@@ -169,7 +170,7 @@ GCP Setup Tutorial coming soon.
 | `--git_url` | OpenROAD-flow-scripts repo URL to use. |
 | `--build_args` | Additional arguments given to ./build_openroad.sh |
 | `--algorithm` | Search algorithm to use for Autotuning. |
-| `--eval` | Evalaute function to use with search algorithm. \ |
+| `--eval` | Evaluate function to use with search algorithm. |
 | `--samples` | Number of samples for tuning. |
 | `--iterations` | Number of iterations for tuning. |
 | `--resources_per_trial` | Number of CPUs to request for each tuning job. |
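
A few of the arguments in this table could plausibly be combined with the `tune` mode as sketched below; the flag values and their placement after the mode are assumptions for illustration, not something this commit documents:

```shell
# Hypothetical sketch: tuning with an explicit sample count and per-trial CPU budget.
# Flag values and their position relative to the mode are assumed, not taken from the docs.
python3 ./tools/AutoTuner/src/autotuner/distributed.py \
    --design gcd \
    --platform sky130hd \
    --config ./flow/designs/sky130hd/gcd/autotuner.json \
    tune \
    --samples 5 \
    --resources_per_trial 4
```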

0 commit comments
