|
37 | 37 | "id": "77z2OchJTk0l"
|
38 | 38 | },
|
39 | 39 | "source": [
|
40 |
| - "# Migration Examples: Canned Estimators\n", |
| 40 | + "# Migration examples: Canned Estimators\n", |
41 | 41 | "\n",
|
42 | 42 | "<table class=\"tfo-notebook-buttons\" align=\"left\">\n",
|
43 | 43 | " <td>\n",
|
|
75 | 75 | "* From `tf.estimator`'s `DNNLinearCombinedEstimator`, `Classifier` or `Regressor` in TensorFlow 1 to `tf.compat.v1.keras.models.WideDeepModel` in TensorFlow 2\n",
|
76 | 76 | "* From `tf.estimator`'s `BoostedTreesEstimator`, `Classifier` or `Regressor` in TensorFlow 1 to `tfdf.keras.GradientBoostedTreesModel` in TensorFlow 2\n",
|
77 | 77 | "\n",
|
78 |
| - "A common precursor to the training of a model is feature preprocessing, which is done for TensorFlow 1 Estimator models with `tf.feature_column`. For more information on feature preprocessing in TensorFlow 2, see [this guide on migrating from feature columns to the Keras preprocessing layers API](migrating_feature_columns.ipynb)." |
| 78 | + "A common precursor to the training of a model is feature preprocessing, which is done for TensorFlow 1 Estimator models with `tf.feature_column`. For more information on feature preprocessing in TensorFlow 2, check out [this guide on migrating from feature columns to the Keras preprocessing layers API](migrating_feature_columns.ipynb)." |
79 | 79 | ]
|
80 | 80 | },
|
81 | 81 | {
|
|
357 | 357 | "id": "6xJz6px6pln-"
|
358 | 358 | },
|
359 | 359 | "source": [
|
360 |
| - "### TF2: Using Keras to Create a Custom DNN Model" |
| 360 | + "### TF2: Using Keras to create a custom DNN model" |
361 | 361 | ]
|
362 | 362 | },
|
363 | 363 | {
|
|
368 | 368 | "source": [
|
369 | 369 | "In TensorFlow 2, you can create a custom DNN model to substitute for one generated by `tf.estimator.DNNEstimator`, with similar levels of user-specified customization (for instance, as in the previous example, the ability to customize a chosen model optimizer).\n",
|
370 | 370 | "\n",
|
371 |
| - "A similar workflow can be used to replace `tf.estimator.experimental.RNNEstimator` with a Keras RNN Model. Keras provides a number of built-in, customizable choices by way of `tf.keras.layers.RNN`, `tf.keras.layers.LSTM`, and `tf.keras.layers.GRU` - see [here](https://www.tensorflow.org/guide/keras/rnn#built-in_rnn_layers_a_simple_example) for more details." |
| 371 | + "A similar workflow can be used to replace `tf.estimator.experimental.RNNEstimator` with a Keras RNN model. Keras provides a number of built-in, customizable choices by way of `tf.keras.layers.RNN`, `tf.keras.layers.LSTM`, and `tf.keras.layers.GRU`. Refer to _Built-in RNN layers: a simple example_ in the [RNN with Keras guide](https://www.tensorflow.org/guide/keras/rnn#built-in_rnn_layers_a_simple_example) for more details." |
372 | 372 | ]
|
373 | 373 | },
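The custom DNN described in this cell can be sketched in a few lines of Keras. The layer widths, activations, and optimizer settings below are illustrative assumptions, not values taken from the notebook:

```python
import tensorflow as tf

# A minimal sketch of a custom Keras DNN standing in for
# tf.estimator.DNNEstimator. Two hidden layers feed a single
# regression output; swap the loss/output layer for classification.
dnn_model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1),
])

# As in the earlier examples, the optimizer is fully user-specified.
dnn_model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
                  loss='mse')
```

From here, `dnn_model.fit(...)` and `dnn_model.evaluate(...)` replace the Estimator's `train` and `evaluate` calls.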
|
374 | 374 | {
|
|
488 | 488 | },
|
489 | 489 | "outputs": [],
|
490 | 490 | "source": [
|
491 |
| - "# Create LinearModel and DNN Model as in Examples 1 and 2\n", |
| 491 | + "# Create a LinearModel and a DNN model as in Examples 1 and 2\n", |
492 | 492 | "optimizer = create_sample_optimizer('tf2')\n",
|
493 | 493 | "\n",
|
494 | 494 | "linear_model = tf.compat.v1.keras.experimental.LinearModel()\n",
|
|
620 | 620 | "id": "B1qTdAS-VpXk"
|
621 | 621 | },
|
622 | 622 | "source": [
|
623 |
| - "Create a TensorFlow dataset. Note that Decision Forests support natively many types of features and do not need pre-processing." |
| 623 | + "Create a TensorFlow dataset. Note that Decision Forests natively support many types of features and do not need pre-processing." |
624 | 624 | ]
|
625 | 625 | },
|
626 | 626 | {
|
|
634 | 634 | "train_dataframe = pd.read_csv('https://storage.googleapis.com/tf-datasets/titanic/train.csv')\n",
|
635 | 635 | "eval_dataframe = pd.read_csv('https://storage.googleapis.com/tf-datasets/titanic/eval.csv')\n",
|
636 | 636 | "\n",
|
637 |
| - "# Convert the Pandas Dataframes into TensorFlow datasets.\n", |
| 637 | + "# Convert the pandas DataFrames into TensorFlow datasets.\n", |
638 | 638 | "train_dataset = tfdf.keras.pd_dataframe_to_tf_dataset(train_dataframe, label=\"survived\")\n",
|
639 | 639 | "eval_dataset = tfdf.keras.pd_dataframe_to_tf_dataset(eval_dataframe, label=\"survived\")"
|
640 | 640 | ]
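For readers without `tensorflow_decision_forests` installed, the conversion above can be approximated with plain `tf.data`; the tiny in-memory DataFrame below is a hypothetical stand-in for the Titanic CSV, and the pipeline is roughly what `tfdf.keras.pd_dataframe_to_tf_dataset` does under the hood:

```python
import pandas as pd
import tensorflow as tf

# Hypothetical stand-in for the Titanic training CSV.
train_dataframe = pd.DataFrame({
    'age': [22.0, 38.0, 26.0, 35.0],
    'fare': [7.25, 71.28, 7.92, 53.10],
    'survived': [0, 1, 1, 1],
})

# Pop the label column, then build a (features, label) dataset in which
# each DataFrame column becomes a named tensor.
labels = train_dataframe.pop('survived')
train_dataset = (tf.data.Dataset
                 .from_tensor_slices((dict(train_dataframe), labels))
                 .batch(2))

features, label = next(iter(train_dataset))
print(sorted(features.keys()))  # column names carried through as feature keys
```

A Keras model (including the decision forest models shown here) can then consume `train_dataset` directly via `model.fit(train_dataset)`.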
|
|
689 | 689 | "id": "Z22UJ5SUqToQ"
|
690 | 690 | },
|
691 | 691 | "source": [
|
692 |
| - "Gradient Boosted Trees is just one of the many decision forests algorithms avaiable in TensorFlow Decision Forests. For example, Random Forests (available as [tfdf.keras.GradientBoostedTreesModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/RandomForestModel) is very resistant to overfitting) while CART (available as [tfdf.keras.CartModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/CartModel)) is great for model interpretation.\n", |
| 692 | + "Gradient Boosted Trees is just one of the many decision forest algorithms available in TensorFlow Decision Forests. For example, Random Forests (available as [tfdf.keras.RandomForestModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/RandomForestModel)) is very resistant to overfitting, while CART (available as [tfdf.keras.CartModel](https://www.tensorflow.org/decision_forests/api_docs/python/tfdf/keras/CartModel)) is great for model interpretation.\n", |
693 | 693 | "\n",
|
694 |
| - "In the next example, we train and plot a Random Forest model." |
| 694 | + "In the next example, you'll train and plot a Random Forest model." |
695 | 695 | ]
|
696 | 696 | },
|
697 | 697 | {
|
|
718 | 718 | "id": "Z0QYolhoZb_k"
|
719 | 719 | },
|
720 | 720 | "source": [
|
721 |
| - "Finally, in the next example, we train and evaluate a CART model." |
| 721 | + "In the final example, you'll train and evaluate a CART model." |
722 | 722 | ]
|
723 | 723 | },
|
724 | 724 | {
|
|