|
37 | 37 | "id": "77z2OchJTk0l"
|
38 | 38 | },
|
39 | 39 | "source": [
|
40 |
| - "# Migration Examples: Canned Estimators\n", |
| 40 | + "# Migration examples: Canned Estimators\n", |
41 | 41 | "\n",
|
42 | 42 | "<table class=\"tfo-notebook-buttons\" align=\"left\">\n",
|
43 | 43 | " <td>\n",
|
|
67 | 67 | "source": [
|
68 | 68 | "Canned (or Premade) Estimators have traditionally been used in TensorFlow 1 as quick and easy ways to train models for a variety of typical use cases. TensorFlow 2 provides straightforward approximate substitutes for a number of them by way of Keras models. For those canned estimators that do not have built-in TensorFlow 2 substitutes, you can still build your own replacement fairly easily.\n",
|
69 | 69 | "\n",
|
70 |
| - "This guide walks through a few examples of direct equivalents and custom substitutions to demonstrate how TensorFlow 1's `tf.estimator`-derived models can be migrated to TF2 with Keras.\n", |
| 70 | + "This guide will walk you through a few examples of direct equivalents and custom substitutions to demonstrate how TensorFlow 1's `tf.estimator`-derived models can be migrated to TensorFlow 2 with Keras.\n", |
71 | 71 | "\n",
|
72 | 72 | "Namely, this guide includes examples for migrating:\n",
|
73 | 73 | "* From `tf.estimator`'s `LinearEstimator`, `Classifier` or `Regressor` in TensorFlow 1 to Keras `tf.compat.v1.keras.models.LinearModel` in TensorFlow 2\n",
|
|
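For the first migration listed above (from `LinearEstimator`/`Classifier`/`Regressor` to the Keras `LinearModel`), a minimal sketch of the TensorFlow 2 side might look like the following. The tiny in-memory tensors and the `'sgd'` optimizer are placeholders, not the notebook's actual data or optimizer setup:

```python
import tensorflow as tf

# Placeholder data; the notebook prepares its own dataset for the real example.
x_train = tf.constant([[1.0], [2.0], [3.0], [4.0]])
y_train = tf.constant([[0.0], [0.0], [1.0], [1.0]])

# Keras substitute for tf.estimator.LinearEstimator / Classifier / Regressor.
linear_model = tf.compat.v1.keras.models.LinearModel()
linear_model.compile(optimizer='sgd', loss='mse')
linear_model.fit(x_train, y_train, epochs=5, verbose=0)
```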
183 | 183 | "id": "bYSgoezeMrpI"
|
184 | 184 | },
|
185 | 185 | "source": [
|
186 |
| - "and create a method to instantiate a simplistic sample optimizer to use with our various TensorFlow 1 Estimator and TensorFlow 2 Keras models." |
| 186 | + "and create a method to instantiate a simplistic sample optimizer to use with various TensorFlow 1 Estimator and TensorFlow 2 Keras models." |
187 | 187 | ]
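One possible shape for such a helper is sketched below; the function name and the Ftrl settings are illustrative rather than the notebook's exact code. Estimators can take a zero-argument callable that builds the optimizer inside their graph, while Keras models take an optimizer instance directly:

```python
import tensorflow as tf

def create_sample_optimizer(tf_version):
  """Hypothetical helper returning a simple Ftrl optimizer for either API."""
  if tf_version == 'tf1':
    # Estimators also accept a callable that constructs the optimizer lazily.
    return lambda: tf.keras.optimizers.Ftrl(
        learning_rate=0.1, l1_regularization_strength=0.001)
  elif tf_version == 'tf2':
    # Keras models take the optimizer instance directly in compile().
    return tf.keras.optimizers.Ftrl(
        learning_rate=0.1, l1_regularization_strength=0.001)
  raise ValueError("tf_version must be 'tf1' or 'tf2'")
```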
|
188 | 188 | },
|
189 | 189 | {
|
|
226 | 226 | "id": "_O7fyhCnpvED"
|
227 | 227 | },
|
228 | 228 | "source": [
|
229 |
| - "### TF1: Using LinearEstimator" |
| 229 | + "### TensorFlow 1: Using LinearEstimator" |
230 | 230 | ]
|
231 | 231 | },
|
232 | 232 | {
|
|
270 | 270 | "id": "KEmzBjfnsxwT"
|
271 | 271 | },
|
272 | 272 | "source": [
|
273 |
| - "### TF2: Using Keras LinearModel" |
| 273 | + "### TensorFlow 2: Using Keras LinearModel" |
274 | 274 | ]
|
275 | 275 | },
|
276 | 276 | {
|
|
311 | 311 | "id": "YKl6XZ7Bp1t5"
|
312 | 312 | },
|
313 | 313 | "source": [
|
314 |
| - "### TF1: Using DNNEstimator" |
| 314 | + "### TensorFlow 1: Using DNNEstimator" |
315 | 315 | ]
|
316 | 316 | },
|
317 | 317 | {
|
|
320 | 320 | "id": "J7wJUmgypln8"
|
321 | 321 | },
|
322 | 322 | "source": [
|
323 |
| - "In TensorFlow 1, you can use `tf.estimator.DNNEstimator` to create a baseline DNN model for regression and classification problems." |
| 323 | + "In TensorFlow 1, you can use `tf.estimator.DNNEstimator` to create a baseline deep neural network (DNN) model for regression and classification problems." |
324 | 324 | ]
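As a rough illustration, a `DNNEstimator` for binary classification might be wired up as below. The feature columns, input function, and hyperparameters are placeholders rather than the notebook's actual setup:

```python
import tensorflow as tf

feature_columns = [tf.feature_column.numeric_column('age'),
                   tf.feature_column.numeric_column('fare')]

def train_input_fn():
  # Tiny in-memory dataset standing in for the notebook's real data.
  features = {'age': [22.0, 38.0, 26.0, 35.0],
              'fare': [7.25, 71.28, 7.92, 53.10]}
  labels = [0, 1, 1, 1]
  return tf.data.Dataset.from_tensor_slices((features, labels)).batch(2)

dnn_estimator = tf.estimator.DNNEstimator(
    head=tf.estimator.BinaryClassHead(),
    feature_columns=feature_columns,
    hidden_units=[128, 64],
    optimizer='Adagrad')

dnn_estimator.train(input_fn=train_input_fn, steps=10)
# Evaluating on the training input here only to keep the sketch self-contained.
dnn_estimator.evaluate(input_fn=train_input_fn, steps=1)
```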
|
325 | 325 | },
|
326 | 326 | {
|
|
357 | 357 | "id": "6xJz6px6pln-"
|
358 | 358 | },
|
359 | 359 | "source": [
|
360 |
| - "### TF2: Using Keras to Create a Custom DNN Model" |
| 360 | + "### TensorFlow 2: Using Keras to create a custom DNN model" |
361 | 361 | ]
|
362 | 362 | },
|
363 | 363 | {
|
|
368 | 368 | "source": [
|
369 | 369 | "In TensorFlow 2, you can create a custom DNN model to substitute for one generated by `tf.estimator.DNNEstimator`, with similar levels of user-specified customization (for instance, as in the previous example, the ability to customize a chosen model optimizer).\n",
|
370 | 370 | "\n",
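A minimal sketch of such a substitute, assuming flat numeric inputs with `num_features` columns and binary labels (both placeholders for whatever the notebook's input pipeline produces):

```python
import tensorflow as tf

num_features = 10  # placeholder; should match the input pipeline's feature count

dnn_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu', input_shape=(num_features,)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
dnn_model.compile(
    optimizer=tf.keras.optimizers.Adagrad(learning_rate=0.05),  # any optimizer can be swapped in here
    loss='binary_crossentropy',
    metrics=['accuracy'])
# dnn_model.fit(x_train, y_train, epochs=10)  # data comes from the notebook's pipeline
```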
|
371 |
| - "A similar workflow can be used to replace `tf.estimator.experimental.RNNEstimator` with a Keras RNN Model. Keras provides a number of built-in, customizable choices by way of `tf.keras.layers.RNN`, `tf.keras.layers.LSTM`, and `tf.keras.layers.GRU` - see [here](https://www.tensorflow.org/guide/keras/rnn#built-in_rnn_layers_a_simple_example) for more details." |
| 371 | + "A similar workflow can be used to replace `tf.estimator.experimental.RNNEstimator` with a Keras recurrent neural network (RNN) model. Keras provides a number of built-in, customizable choices by way of `tf.keras.layers.RNN`, `tf.keras.layers.LSTM`, and `tf.keras.layers.GRU`. To learn more, check out the _Built-in RNN layers: a simple example_ section of [RNN with Keras guide](https://www.tensorflow.org/guide/keras/rnn)." |
372 | 372 | ]
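In the same spirit, an `RNNEstimator` replacement could be sketched as a small Keras LSTM model; the sequence length and feature count below are arbitrary placeholders:

```python
import tensorflow as tf

timesteps, num_features = 20, 8  # placeholders for the real sequence shape

rnn_model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(timesteps, num_features)),
    tf.keras.layers.Dense(1),
])
rnn_model.compile(optimizer='adam', loss='mse')
```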
|
373 | 373 | },
|
374 | 374 | {
|
|
413 | 413 | "id": "GfRaObf5g4TU"
|
414 | 414 | },
|
415 | 415 | "source": [
|
416 |
| - "### TF1: Using DNNLinearCombinedEstimator" |
| 416 | + "### TensorFlow 1: Using DNNLinearCombinedEstimator" |
417 | 417 | ]
|
418 | 418 | },
|
419 | 419 | {
|
|
464 | 464 | "id": "BeMikL5ug4TX"
|
465 | 465 | },
|
466 | 466 | "source": [
|
467 |
| - "### TF2: Using Keras WideDeepModel" |
| 467 | + "### TensorFlow 2: Using Keras WideDeepModel" |
468 | 468 | ]
|
469 | 469 | },
|
470 | 470 | {
|
|
477 | 477 | "\n",
|
478 | 478 | "This `WideDeepModel` is constructed on the basis of a constituent `LinearModel` and a custom DNN Model, both of which are discussed in the preceding two examples. A custom linear model can also be used in place of the built-in Keras `LinearModel` if desired.\n",
|
479 | 479 | "\n",
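A minimal sketch of that composition is shown below, assuming the `tf.keras.experimental.WideDeepModel` export path and placeholder optimizers. `WideDeepModel.compile` accepts a list of two optimizers, the first for the wide (linear) component and the second for the deep (DNN) component:

```python
import tensorflow as tf

linear_model = tf.compat.v1.keras.models.LinearModel()
dnn_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(1),
])

wide_deep_model = tf.keras.experimental.WideDeepModel(linear_model, dnn_model)
wide_deep_model.compile(
    # First optimizer drives the linear component, the second drives the DNN.
    optimizer=[tf.keras.optimizers.Ftrl(learning_rate=0.1),
               tf.keras.optimizers.Adagrad(learning_rate=0.05)],
    loss='mse',
    metrics=['accuracy'])
# wide_deep_model.fit(x_train, y_train, epochs=10)  # data from the notebook's pipeline
```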
|
480 |
| - "If you would like to build your own model instead of a canned estimator, check out [how to build a `keras.Sequential` model](https://www.tensorflow.org/guide/keras/sequential_model). For more information on custom training and optimizers you can also checkout [this guide](https://www.tensorflow.org/tutorials/customization/custom_training_walkthrough)." |
| 480 | + "If you would like to build your own model instead of using a canned estimator, check out the [Keras Sequential model](https://www.tensorflow.org/guide/keras/sequential_model) guide. For more information on custom training and optimizers, check out the [Custom training: walkthrough](https://www.tensorflow.org/tutorials/customization/custom_training_walkthrough) guide." |
481 | 481 | ]
|
482 | 482 | },
|
483 | 483 | {
|
|
532 | 532 | "id": "_3mCQVDSeOKD"
|
533 | 533 | },
|
534 | 534 | "source": [
|
535 |
| - "### TF1: Using BoostedTreesEstimator" |
| 535 | + "### TensorFlow 1: Using BoostedTreesEstimator" |
536 | 536 | ]
|
537 | 537 | },
|
538 | 538 | {
|
|
578 | 578 | "id": "eNuLP6BeeOKF"
|
579 | 579 | },
|
580 | 580 | "source": [
|
581 |
| - "### TF2: Using TensorFlow Decision Forests" |
| 581 | + "### TensorFlow 2: Using TensorFlow Decision Forests" |
582 | 582 | ]
|
583 | 583 | },
|
584 | 584 | {
|
|
689 | 689 | "id": "Z22UJ5SUqToQ"
|
690 | 690 | },
|
691 | 691 | "source": [
|
692 |
| - "Gradient Boosted Trees is just one of the many decision forests algorithms available in TensorFlow Decision Forests. For example, Random Forests (available as [tfdf.keras.GradientBoostedTreesModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/RandomForestModel) is very resistant to overfitting) while CART (available as [tfdf.keras.CartModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/CartModel)) is great for model interpretation.\n", |
| 692 | + "Gradient Boosted Trees is just one of the many decision forest algorithms available in TensorFlow Decision Forests. For example, Random Forests (available as [tfdf.keras.GradientBoostedTreesModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/RandomForestModel) is very resistant to overfitting) while CART (available as [tfdf.keras.CartModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/CartModel)) is great for model interpretation.\n", |
693 | 693 | "\n",
|
694 |
| - "In the next example, we train and plot a Random Forest model." |
| 694 | + "In the next example, train and plot a Random Forest model." |
695 | 695 | ]
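A rough sketch of what that could look like with TensorFlow Decision Forests; the tiny pandas DataFrame and column names are placeholders for the notebook's real dataset:

```python
import pandas as pd
import tensorflow_decision_forests as tfdf

# Placeholder data; the notebook builds its dataset from a real source.
train_df = pd.DataFrame({'feature_a': [1.0, 2.0, 3.0, 4.0],
                         'feature_b': [0.1, 0.4, 0.2, 0.8],
                         'label':     [0,   1,   0,   1]})
train_ds = tfdf.keras.pd_dataframe_to_tf_dataset(train_df, label='label')

rf_model = tfdf.keras.RandomForestModel()
rf_model.fit(train_ds)

# Plot the first tree of the trained forest (renders in Colab/Jupyter).
tfdf.model_plotter.plot_model_in_colab(rf_model, tree_idx=0, max_depth=3)
```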
|
696 | 696 | },
|
697 | 697 | {
|
|
718 | 718 | "id": "Z0QYolhoZb_k"
|
719 | 719 | },
|
720 | 720 | "source": [
|
721 |
| - "Finally, in the next example, we train and evaluate a CART model." |
| 721 | + "In the final example, train and evaluate a CART model." |
722 | 722 | ]
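The CART version follows the same pattern; `train_ds` here is assumed to be a TensorFlow dataset like the one built in the previous sketch:

```python
import tensorflow_decision_forests as tfdf

cart_model = tfdf.keras.CartModel()
cart_model.fit(train_ds)  # train_ds assumed from the previous sketch
cart_model.compile(metrics=['accuracy'])
# Evaluating on the training data only to keep the sketch short; use a held-out split in practice.
print(cart_model.evaluate(train_ds, return_dict=True))
```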
|
723 | 723 | },
|
724 | 724 | {
|
|