| File | Description |
|------|-------------|
|`args.txt`| Contains the arguments fed into the `energize.py` script to produce the output. |
|`job.csv`| Contains information about the HTCondor job that produced the output, such as the cluster, hostname, and start time. |
|`hparams.csv`| Contains the Rosetta hyperparameters used to compute the energies. |
|`energies.csv`| Contains the computed energies for each variant in the job. |

After the HTCondor run, transfer these log directories to your local machine for processing.
I recommend compressing them before transferring:
```commandline
tar -czf output.tar.gz output
```
Then untar the file on your local machine:
```commandline
tar -xf output.tar.gz
```
Make sure to extract the output into the same condor run directory that you produced with `condor.py` in the [Prepare an HTCondor run](#prepare-an-htcondor-run) section.
#### Parse the results files
The [process_run.py](code/process_run.py) script can be used to parse the results files and combine them into a single dataframe.
Run it with the following command, specifying the mode `stats` and the main run directory of the HTCondor run:
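As a hedged sketch of that invocation (the flag names `--mode` and `--main_run_dir` and the run directory path are assumptions; check `python code/process_run.py --help` for the script's actual interface):

```commandline
# Assumed flag names; the run directory is the one created by condor.py
# into which you extracted the output above.
python code/process_run.py --mode stats --main_run_dir <path-to-condor-run-dir>
```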
This database can now be used with the [metl](https://github.com/gitter-lab/metl) repository to create a processed Rosetta dataset and pretrain METL models.