Commit bbeb31a

Merge pull request #1964 from RenuPatelGoogle:patch-5
PiperOrigin-RevId: 407797233
2 parents 9c485ed + 310021b commit bbeb31a

File tree: 5 files changed (+9 additions, -18 deletions)

site/en/guide/extension_type.ipynb

Lines changed: 1 addition & 6 deletions
@@ -350,8 +350,7 @@
  "source": [
  "### Printable representation\n",
  "\n",
- "`ExtensionType` adds a default printable representation method (`__repr__`) that includes the class name and the value for each field:\n",
- "\n"
+ "`ExtensionType` adds a default printable representation method (`__repr__`) that includes the class name and the value for each field:\n"
  ]
  },
  {
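
The hunk above edits the cell documenting `ExtensionType`'s default `__repr__`. As context only (not part of this commit), a minimal sketch of that behavior, using a `MaskedTensor` type modeled on the guide's own example:

import tensorflow as tf

class MaskedTensor(tf.experimental.ExtensionType):
  values: tf.Tensor
  mask: tf.Tensor

# The default __repr__ prints the class name and the value of each field.
mt = MaskedTensor(values=[1, 2, 3], mask=[True, True, False])
print(mt)  # MaskedTensor(values=<tf.Tensor: ...>, mask=<tf.Tensor: ...>)
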
@@ -2115,15 +2114,11 @@
  "colab": {
  "collapsed_sections": [],
  "name": "extension_type.ipynb",
- "provenance": [],
  "toc_visible": true
  },
  "kernelspec": {
  "display_name": "Python 3",
  "name": "python3"
- },
- "language_info": {
- "name": "python"
  }
  },
  "nbformat": 4,

site/en/guide/function.ipynb

Lines changed: 3 additions & 4 deletions
@@ -272,7 +272,7 @@
  "id": "K7scSzLx662f"
  },
  "source": [
- "When we pass arguments of different types into a `Function`, both stages are run:\n"
+ "When you pass arguments of different types into a `Function`, both stages are run:\n"
  ]
  },
  {
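
The cell touched above refers to the two stages of a `Function`: tracing and execution. A small illustrative sketch (assumed example, not taken from the notebook) of how a new argument type triggers both stages again:

import tensorflow as tf

@tf.function
def double(a):
  print("Tracing with", a)  # runs only while tracing, not on later calls
  return a + a

double(tf.constant(1))    # traces for int32, then executes
double(tf.constant(1.5))  # new dtype: traces again, then executes
double(tf.constant(2))    # same dtype as the first call: executes only
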
@@ -1301,7 +1301,7 @@
  "\n",
  "`Function` creates a new `ConcreteFunction` when called with a new value of a Python argument. However, it does not do that for the Python closure, globals, or nonlocals of that `Function`. If their value changes in between calls to the `Function`, the `Function` will still use the values they had when it was traced. This is different from how regular Python functions work.\n",
  "\n",
- "For that reason, we recommend a functional programming style that uses arguments instead of closing over outer names."
+ "For that reason, you should follow a functional programming style that uses arguments instead of closing over outer names."
  ]
  },
  {
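
The recommendation in the hunk above, passing changing values as arguments rather than closing over them, can be sketched as follows (hypothetical names, assuming the traced-value behavior described in that cell):

import tensorflow as tf

scale = 2  # captured at trace time; later changes are not seen by the traced graph

@tf.function
def scale_by_closure(x):
  return x * scale

@tf.function
def scale_by_arg(x, scale):
  return x * scale

print(scale_by_closure(tf.constant(3)))  # 6
scale = 10
print(scale_by_closure(tf.constant(3)))  # still 6: uses the value captured during tracing
print(scale_by_arg(tf.constant(3), 10))  # 30: the argument is always current
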
@@ -1439,7 +1439,7 @@
  "source": [
  "Using the same `Function` to evaluate the updated instance of the model will be buggy since the updated model has the [same cache key](#rules_of_tracing) as the original model.\n",
  "\n",
- "For that reason, we recommend that you write your `Function` to avoid depending on mutable object attributes or create new objects.\n",
+ "For that reason, you're recommended to write your `Function` to avoid depending on mutable object attributes or create new objects.\n",
  "\n",
  "If that is not possible, one workaround is to make new `Function`s each time you modify your object to force retracing:"
  ]
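
The workaround mentioned at the end of that cell, building a new `Function` after mutating the object so that it is retraced, might look roughly like this (hypothetical `SimpleModel`, not the notebook's code):

import tensorflow as tf

class SimpleModel(tf.Module):
  def __init__(self):
    self.weight = tf.constant(2.0)  # a plain attribute, not a tf.Variable

def evaluate(model, x):
  return model.weight * x

model = SimpleModel()
evaluate_fn = tf.function(evaluate)
print(evaluate_fn(model, tf.constant(3.0)))  # 6.0

model.weight = tf.constant(4.0)      # mutate the attribute...
evaluate_fn = tf.function(evaluate)  # ...and build a new Function to force retracing
print(evaluate_fn(model, tf.constant(3.0)))  # 12.0
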
@@ -1690,7 +1690,6 @@
  "colab": {
  "collapsed_sections": [],
  "name": "function.ipynb",
- "provenance": [],
  "toc_visible": true
  },
  "kernelspec": {

site/en/guide/migrate/saved_model.ipynb

Lines changed: 0 additions & 1 deletion
@@ -748,7 +748,6 @@
  "colab": {
  "collapsed_sections": [],
  "name": "saved_model.ipynb",
- "provenance": [],
  "toc_visible": true
  },
  "kernelspec": {

site/en/guide/tensor.ipynb

Lines changed: 0 additions & 1 deletion
@@ -1491,7 +1491,6 @@
  "Tce3stUlHN0L"
  ],
  "name": "tensor.ipynb",
- "provenance": [],
  "toc_visible": true
  },
  "kernelspec": {

site/en/tutorials/keras/text_classification.ipynb

Lines changed: 5 additions & 6 deletions
@@ -119,8 +119,7 @@
  "import tensorflow as tf\n",
  "\n",
  "from tensorflow.keras import layers\n",
- "from tensorflow.keras import losses\n",
- "from tensorflow.keras import preprocessing"
+ "from tensorflow.keras import losses\n"
  ]
  },
  {
@@ -286,7 +285,7 @@
  "batch_size = 32\n",
  "seed = 42\n",
  "\n",
- "raw_train_ds = tf.keras.preprocessing.text_dataset_from_directory(\n",
+ "raw_train_ds = tf.keras.utils.text_dataset_from_directory(\n",
  " 'aclImdb/train', \n",
  " batch_size=batch_size, \n",
  " validation_split=0.2, \n",
@@ -366,7 +365,7 @@
  },
  "outputs": [],
  "source": [
- "raw_val_ds = tf.keras.preprocessing.text_dataset_from_directory(\n",
+ "raw_val_ds = tf.keras.utils.text_dataset_from_directory(\n",
  " 'aclImdb/train', \n",
  " batch_size=batch_size, \n",
  " validation_split=0.2, \n",
@@ -382,7 +381,7 @@
  },
  "outputs": [],
  "source": [
- "raw_test_ds = tf.keras.preprocessing.text_dataset_from_directory(\n",
+ "raw_test_ds = tf.keras.utils.text_dataset_from_directory(\n",
  " 'aclImdb/test', \n",
  " batch_size=batch_size)"
  ]
@@ -395,7 +394,7 @@
  "source": [
  "### Prepare the dataset for training\n",
  "\n",
- "Next, you will standardize, tokenize, and vectorize the data using the helpful `preprocessing.TextVectorization` layer. \n",
+ "Next, you will standardize, tokenize, and vectorize the data using the helpful `tf.keras.layers.TextVectorization` layer. \n",
  "\n",
  "Standardization refers to preprocessing the text, typically to remove punctuation or HTML elements to simplify the dataset. Tokenization refers to splitting strings into tokens (for example, splitting a sentence into individual words, by splitting on whitespace). Vectorization refers to converting tokens into numbers so they can be fed into a neural network. All of these tasks can be accomplished with this layer.\n",
  "\n",
