
Commit b301bf1

Update readme.

1 parent 8a80da6 commit b301bf1
File tree: 1 file changed (+77, -47 lines)


README.md

Lines changed: 77 additions & 47 deletions
@@ -8,6 +8,7 @@ section below.
 
 ## Requirements
 
+- Supported operating systems: Linux, Mac
 - [OCaml 4.07.1](https://ocaml.org)
 - [SerAPI 0.7.1](https://github.com/ejgallego/coq-serapi)
 - [Coq 8.10.2](https://coq.inria.fr/download)
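
As a quick sanity check of the pinned versions in the Requirements hunk above, something like the following can be run in a shell (a sketch; `opam show` assumes Coq and SerAPI were installed through OPAM, as the installation steps further down recommend):

```
# Sketch: verify the required tool versions from the Requirements list.
ocaml -version          # expect 4.07.1
coqc --version          # expect 8.10.2
opam show coq-serapi    # expect 8.10.0+0.7.1 (assumes an OPAM-based install)
```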
@@ -16,7 +17,19 @@ section below.
 
 ## Installation
 
-### Installation of OCaml, Coq, and SerAPI
+You can install Roosterize from source code by cloning this GitHub
+repository and setting up the dependencies following steps 1 & 2.
+(Alternatively, you can download a [binary
+distribution](https://github.com/EngineeringSoftware/roosterize/releases)
+which already contains the Python dependencies, and then you only need
+step 1.)
+
+```
+git clone https://github.com/EngineeringSoftware/roosterize.git
+cd roosterize
+```
+
+### 1. Installation of OCaml, Coq, and SerAPI
 
 We strongly recommend installing the required versions of OCaml, Coq,
 and SerAPI via the [OPAM package manager](https://opam.ocaml.org),
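
The context lines opening the next hunk show the OPAM pins (`opam pin add coq 8.10.2`, `opam pin add coq-serapi 8.10.0+0.7.1`); a typical OPAM session around those pins might look like the following (a sketch assuming a fresh OPAM 2.x switch; the switch name is illustrative and not part of the README):

```
# Sketch: typical OPAM setup leading up to the pins shown in the next hunk.
opam init                              # one-time OPAM setup, if needed
opam switch create roosterize 4.07.1   # illustrative switch name, OCaml 4.07.1
eval $(opam env)
opam pin add coq 8.10.2
opam pin add coq-serapi 8.10.0+0.7.1
```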
@@ -35,38 +48,42 @@ opam pin add coq 8.10.2
 opam pin add coq-serapi 8.10.0+0.7.1
 ```
 
-### Installation of PyTorch and Python libraries
+### 2. Installation of PyTorch and Python libraries
+
+We strongly recommend installing the required versions of Python and
+PyTorch using [Conda](https://docs.conda.io/en/latest/miniconda.html).
 
-We strongly recommend installing the required versions of Python and PyTorch
-using [Conda](https://docs.conda.io/en/latest/miniconda.html).
+To set up the Conda environment, use the command below that matches
+your operating system and whether you want to use PyTorch on a
+CPU or GPU.
 
-First, create a Python 3.7 environment using Conda:
+- Linux, CPU:
 ```
-conda create --name roosterize python=3.7 pip
-conda activate roosterize
+conda env create --name roosterize --file conda-envs/cpu.yml
 ```
 
-Then, install PyTorch 1.1.0 by following the
-[official instructions](https://pytorch.org/get-started/previous-versions/#conda-5),
-using the command suitable for your operating system and whether you want to use
-it on a CPU or GPU. For example, the following command installs PyTorch for
-CPU-only use on Linux or Windows:
+- Linux, GPU w/ CUDA 10.0:
 ```
-conda install pytorch-cpu==1.1.0 torchvision-cpu==0.3.0 cpuonly -c pytorch
+conda env create --name roosterize --file conda-envs/gpu-cuda10.yml
 ```
-Finally, install the required Python libraries:
+
+- Linux, GPU w/ CUDA 9.0:
 ```
-conda install configargparse nltk future seutil==0.4.12 six torchtext==0.4.0 tqdm==4.30.*
+conda env create --name roosterize --file conda-envs/gpu-cuda9.yml
 ```
 
-### Installation of Roosterize and trained models
+- Mac, CPU:
+```
+conda env create --name roosterize --file conda-envs/mac-cpu.yml
+```
 
-To install Roosterize itself, clone the GitHub repository and enter the root directory:
+Finally, activate the Conda environment before using Roosterize:
 ```
-git clone https://github.com/EngineeringSoftware/roosterize.git
-cd roosterize
+conda activate roosterize
 ```
 
+### Installation of Roosterize and trained models
+
 Next, you need to obtain pre-trained models that capture
 naming conventions. We have trained several models using our
 [corpus][math-comp-corpus], which follows the conventions
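
After one of the `conda env create` commands in the hunk above finishes, the environment can be sanity-checked before moving on (a sketch; assumes the environment was created under the name `roosterize` as in the diff):

```
# Sketch: confirm the Conda environment exists and PyTorch is importable.
conda env list                                        # "roosterize" should be listed
conda activate roosterize
python -c "import torch; print(torch.__version__)"    # PyTorch from the environment spec
```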
@@ -85,43 +102,56 @@ tar xzf models.tgz
 
 ## Usage
 
+You need to obtain a pre-trained model that captures naming
+conventions. The default pre-trained model, which was trained using
+our [corpus][math-comp-corpus] and follows the conventions used in the
+[Mathematical Components][math-comp-website] family of projects, can
+be obtained by running the command:
+
+```
+./bin/roosterize download_global_model
+```
+
+The model will be downloaded to `$HOME/.roosterize/`. To use a
+different model (that we [released][latest-release] or you trained),
+simply put it in `$HOME/.roosterize/`.
+
 To use Roosterize on a Coq verification project, you first need to
-build the Coq project using a command provided by the project
-(usually `make`). Then, at the root directory of
-the Roosterize project repository, run the command
-```
-python -m roosterize.main suggest_lemmas \
-  --project=$PATH_TO_PROJECT \
-  --serapi-options=$SERAPI_OPTIONS \
-  --model-dir=./models/roosterize-ta \
-  --output=./output
-```
-where `$PATH_TO_PROJECT` should be replaced with the path to the
-Coq project, and `$SERAPI_OPTIONS` should be replaced with the SerAPI
-command line options for mapping logical paths to directories
-(see [SerAPI's documentation][serapi-faq-link]). For example,
-if the logical path (inside Coq) for the project is `Verified`,
-you should set `SERAPI_OPTIONS="-R $PATH_TO_PROJECT,Verified"`.
-
-The above command extracts all lemmas from the project, uses Roosterize's
-pre-trained model (at `./models/roosterize-ta`) to predict a lemma name
-for each lemma, and finally prints the lemma name update suggestions,
-i.e., the predicted lemma names that are different from to the existing ones.
-Below is an example of printed suggestions:
-```
->>>>> Suggestions:
-infotheo/ecc_classic/bch.v: infotheo.ecc_classic.bch.BCH.BCH_PCM_altP1 -> inde_F2
-infotheo/ecc_classic/bch.v: infotheo.ecc_classic.bch.BCH.BCH_PCM_altP2 -> inde_mul
-infotheo/ecc_classic/bch.v: infotheo.ecc_classic.bch.BCH.PCM_altP -> F2_eq0
-...
+build the Coq project using a command provided by the project (usually
+`make`). Then, run this command to get the lemma name suggestions for
+the lemmas in a Coq document (.v file):
+
+```
+python -m roosterize.main suggest_naming --file=PATH_TO_FILE
 ```
 
+Roosterize automatically infers the root directory of the project by
+finding `_CoqProject`, and reads `_CoqProject` to infer the SerAPI
+command line options (for mapping logical paths to directories, see
+[SerAPI's documentation][serapi-faq-link]). If you don't have a
+`_CoqProject` file, you need to provide an additional argument
+`--project_root=PATH_TO_PROJECT_ROOT`.
+
+<!-- where `$PATH_TO_PROJECT` should be replaced with the path to the -->
+<!-- Coq project, and `$SERAPI_OPTIONS` should be replaced with the SerAPI -->
+<!-- command line options for mapping logical paths to directories -->
+<!-- (see [SerAPI's documentation][serapi-faq-link]). For example, -->
+<!-- if the logical path (inside Coq) for the project is `Verified`, -->
+<!-- you should set `SERAPI_OPTIONS="-R $PATH_TO_PROJECT,Verified"`. -->
+
+The above command extracts all lemmas from the file, uses the
+pre-trained model to generate likely lemma names for each lemma, and
+finally prints the lemma name update suggestions, i.e., the generated
+lemma names that are different from the existing ones. See an
+[example suggestion report](./docs/example-suggestion.txt).
+
 For other usages and command line interfaces of Roosterize, please
 check the help:
 ```
 python -m roosterize.main help
 ```
 
+[latest-release]: https://github.com/EngineeringSoftware/roosterize/releases/latest
 [serapi-faq-link]: https://github.com/ejgallego/coq-serapi/blob/v8.10/FAQ.md#does-serapi-support-coqs-command-line-flags
 
 ## Technique
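
Taken together, the updated instructions amount to a workflow roughly like the following (a sketch; the project path and .v file are illustrative placeholders, and running from the Roosterize repository root is an assumption carried over from the old instructions):

```
# Sketch: end-to-end use of the updated commands; paths are placeholders.
conda activate roosterize                      # environment from installation step 2
cd /path/to/roosterize
./bin/roosterize download_global_model         # default model goes to $HOME/.roosterize/
(cd /path/to/my-coq-project && make)           # build the target Coq project first
python -m roosterize.main suggest_naming \
  --file=/path/to/my-coq-project/theories/main.v
```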
