@@ -29,7 +29,7 @@ location `/tmp/tfhub_modules` (or whatever `os.path.join(tempfile.gettempdir(),
Users who prefer persistent caching across system reboots can instead set
`TFHUB_CACHE_DIR` to a location in their home directory. For example, a user of
the bash shell on a Linux system can add a line like the following to
-`~/.bashrc`
+`~/.bashrc`:

```bash
export TFHUB_CACHE_DIR=$HOME/.cache/tfhub_modules
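# A minimal follow-up sketch (not part of the diffed file): after adding the
# export line, reload the shell configuration and confirm the variable is set
# before running any tensorflow_hub code; downloaded models then persist under
# this directory across reboots.
source ~/.bashrc
echo "$TFHUB_CACHE_DIR"    # prints e.g. /home/<user>/.cache/tfhub_modules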
@@ -41,7 +41,7 @@ persistent location, be aware that there is no automatic cleanup.
### Reading from remote storage

Users can instruct the `tensorflow_hub` library to directly read models from
-remote storage (GCS) instead of downloading the models locally with
+remote storage (GCS) instead of downloading the models locally with:

```shell
os.environ["TFHUB_MODEL_LOAD_FORMAT"] = "UNCOMPRESSED"
@@ -64,7 +64,7 @@ location by default. There are two workarounds for this situation:
The easiest solution is to instruct the `tensorflow_hub` library to read the
models from TF Hub's GCS bucket as explained above. Users with their own GCS
bucket can instead specify a directory in their bucket as the cache location
-with code like
+with code like:

```python
import os
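# A minimal sketch (not part of the diffed file): the block continues past this
# hunk; the usual pattern is to point the cache at a directory in your own
# bucket before the first load. The bucket path below is a hypothetical
# placeholder, and the handle is the placeholder used elsewhere in this doc.
os.environ["TFHUB_CACHE_DIR"] = "gs://my-bucket/tfhub-module-cache"

import tensorflow_hub as hub
model = hub.load("https://tfhub.dev/...")  # cached under the GCS directory above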
@@ -83,4 +83,4 @@ load_options =
tf.saved_model.LoadOptions(experimental_io_device='/job:localhost')
reloaded_model = hub.load("https://tfhub.dev/...", options=load_options)
```
-**Note:** See more information regarding valid handles [here](tf2_saved_model.md#model_handles).
+**Note:** See more information regarding valid handles [here](tf2_saved_model.md#model_handles).