|
14 | 14 | },
|
15 | 15 | "source": [
|
16 | 16 | "<h1>Table of Contents<span class=\"tocSkip\"></span></h1>\n",
|
17 |
| - "<div class=\"toc\"><ul class=\"toc-item\"><li><span><a href=\"#Prerequisites\" data-toc-modified-id=\"Prerequisites-1\">Prerequisites</a></span></li><li><span><a href=\"#Introduction\" data-toc-modified-id=\"Introduction-2\">Introduction</a></span></li><li><span><a href=\"#Export-training-data-for-deep-learning\" data-toc-modified-id=\"Export-training-data-for-deep-learning-3\">Export training data for deep learning</a></span><ul class=\"toc-item\"><li><span><a href=\"#Import-ArcGIS-API-for-Python-and-get-connected-to-your-GIS\" data-toc-modified-id=\"Import-ArcGIS-API-for-Python-and-get-connected-to-your-GIS-3.1\">Import ArcGIS API for Python and get connected to your GIS</a></span></li><li><span><a href=\"#Prepare-data-that-will-be-used-for-training-data-export\" data-toc-modified-id=\"Prepare-data-that-will-be-used-for-training-data-export-3.2\">Prepare data that will be used for training data export</a></span></li><li><span><a href=\"#Specify-a-folder-name-in-raster-store-that-will-be-used-to-store-our-training-data\" data-toc-modified-id=\"Specify-a-folder-name-in-raster-store-that-will-be-used-to-store-our-training-data-3.3\">Specify a folder name in raster store that will be used to store our training data</a></span></li><li><span><a href=\"#Export-training-data-using-arcgis.learn\" data-toc-modified-id=\"Export-training-data-using-arcgis.learn-3.4\">Export training data using <code>arcgis.learn</code></a></span></li></ul></li><li><span><a href=\"#Model-training\" data-toc-modified-id=\"Model-training-4\">Model training</a></span><ul class=\"toc-item\"><li><span><a href=\"#Visualize-training-data\" data-toc-modified-id=\"Visualize-training-data-4.1\">Visualize training data</a></span></li><li><span><a href=\"#Load-model-architecture\" data-toc-modified-id=\"Load-model-architecture-4.2\">Load model architecture</a></span></li><li><span><a href=\"#Train-a-model-through-learning-rate-tuning-and-transfer-learning\" 
data-toc-modified-id=\"Train-a-model-through-learning-rate-tuning-and-transfer-learning-4.3\">Train a model through learning rate tuning and transfer learning</a></span></li><li><span><a href=\"#Visualize-classification-results-in-validation-set\" data-toc-modified-id=\"Visualize-classification-results-in-validation-set-4.4\">Visualize classification results in validation set</a></span></li></ul></li><li><span><a href=\"#Deployment-and-inference\" data-toc-modified-id=\"Deployment-and-inference-5\">Deployment and inference</a></span><ul class=\"toc-item\"><li><span><a href=\"#Locate-model-package\" data-toc-modified-id=\"Locate-model-package-5.1\">Locate model package</a></span></li><li><span><a href=\"#Model-inference\" data-toc-modified-id=\"Model-inference-5.2\">Model inference</a></span></li></ul></li><li><span><a href=\"#Visualize-land-cover-classification-on-map\" data-toc-modified-id=\"Visualize-land-cover-classification-on-map-6\">Visualize land cover classification on map</a></span></li></ul></div>" |
| 17 | + "<div class=\"toc\"><ul class=\"toc-item\"><li><span><a href=\"#Prerequisites\" data-toc-modified-id=\"Prerequisites-1\">Prerequisites</a></span></li><li><span><a href=\"#Introduction\" data-toc-modified-id=\"Introduction-2\">Introduction</a></span></li><li><span><a href=\"#Export-training-data-for-deep-learning\" data-toc-modified-id=\"Export-training-data-for-deep-learning-3\">Export training data for deep learning</a></span><ul class=\"toc-item\"><li><span><a href=\"#Import-ArcGIS-API-for-Python-and-get-connected-to-your-GIS\" data-toc-modified-id=\"Import-ArcGIS-API-for-Python-and-get-connected-to-your-GIS-3.1\">Import ArcGIS API for Python and get connected to your GIS</a></span></li><li><span><a href=\"#Preprocess-Training-data\" data-toc-modified-id=\"Preprocess-Training-data-3.2\">Preprocess Training data</a></span></li><li><span><a href=\"#Specify-a-folder-name-in-raster-store-that-will-be-used-to-store-our-training-data\" data-toc-modified-id=\"Specify-a-folder-name-in-raster-store-that-will-be-used-to-store-our-training-data-3.3\">Specify a folder name in raster store that will be used to store our training data</a></span></li><li><span><a href=\"#Export-training-data-using-arcgis.learn\" data-toc-modified-id=\"Export-training-data-using-arcgis.learn-3.4\">Export training data using <code>arcgis.learn</code></a></span></li></ul></li><li><span><a href=\"#Model-training\" data-toc-modified-id=\"Model-training-4\">Model training</a></span><ul class=\"toc-item\"><li><span><a href=\"#Visualize-training-data\" data-toc-modified-id=\"Visualize-training-data-4.1\">Visualize training data</a></span></li><li><span><a href=\"#Load-model-architecture\" data-toc-modified-id=\"Load-model-architecture-4.2\">Load model architecture</a></span></li><li><span><a href=\"#Train-a-model-through-learning-rate-tuning-and-transfer-learning\" data-toc-modified-id=\"Train-a-model-through-learning-rate-tuning-and-transfer-learning-4.3\">Train a model through learning rate 
tuning and transfer learning</a></span></li><li><span><a href=\"#Visualize-classification-results-in-validation-set\" data-toc-modified-id=\"Visualize-classification-results-in-validation-set-4.4\">Visualize classification results in validation set</a></span></li></ul></li><li><span><a href=\"#Deployment-and-inference\" data-toc-modified-id=\"Deployment-and-inference-5\">Deployment and inference</a></span><ul class=\"toc-item\"><li><span><a href=\"#Locate-model-package\" data-toc-modified-id=\"Locate-model-package-5.1\">Locate model package</a></span></li><li><span><a href=\"#Model-inference\" data-toc-modified-id=\"Model-inference-5.2\">Model inference</a></span></li></ul></li><li><span><a href=\"#Visualize-land-cover-classification-on-map\" data-toc-modified-id=\"Visualize-land-cover-classification-on-map-6\">Visualize land cover classification on map</a></span></li></ul></div>" |
18 | 18 | ]
|
19 | 19 | },
|
20 | 20 | {
|
|
79 | 79 | "cell_type": "markdown",
|
80 | 80 | "metadata": {},
|
81 | 81 | "source": [
|
82 |
| - "### Prepare data that will be used for training data export" |
| 82 | + "### Preprocess Training data" |
83 | 83 | ]
|
84 | 84 | },
|
85 | 85 | {
|
86 | 86 | "cell_type": "markdown",
|
87 | 87 | "metadata": {},
|
88 | 88 | "source": [
|
89 |
| - "To export training data, we need a _labeled imagery layer_ that contains the class label for each location, and a _raster input_ that contains all the original pixels and band information. In this land cover classification case, we will be using a subset of the one-meter resolution Kent county, Delaware, dataset as the labeled imagery layer and World Imagery: Color Infrared as the raster input." |
| 89 | + "To export training data, we need a _labeled imagery layer_ that contains the class label for each location, and a _raster input_ that contains all the original pixels and band information. In this land cover classification case, we will be using a subset of the one-meter resolution Kent county, Delaware, dataset as the labeled imagery layer and World Imagery: Color Infrared as the raster input.\n", |
| 90 | + "\n", |
| 91 | + "Notes: \n", |
| 92 | + "- Both the raster and the labeled imagery layer are required to be '8 bit Unsigned' when the raster is an RGB layer such as World Imagery.\n", |
| 93 | + "- The labeled imagery layer should be a thematic raster with pixel values corresponding to the label class values.\n", |
| 94 | + "- The pixel values should range from 1 to n, where n is the total number of classes.\n", |
| 95 | + "- Any NoData value should be mapped to 0, as portions of the image with NoData values are ignored while exporting training data." |
90 | 96 | ]
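The remapping convention in the notes above can be sketched with plain NumPy (the class codes 10/20/30 and the NoData value 255 are hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical label raster: thematic class codes 10, 20, 30 plus a
# NoData value of 255. Remap to the convention described above:
# NoData -> 0, classes -> 1..n.
labels = np.array([[10, 20, 255],
                   [30, 10, 20]])

class_codes = [10, 20, 30]                          # original thematic values
remap = {255: 0, **{c: i + 1 for i, c in enumerate(class_codes)}}
clean = np.vectorize(remap.get)(labels)
print(clean)
# [[1 2 0]
#  [3 1 2]]
```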
|
91 | 97 | },
|
92 | 98 | {
|
|
227 | 233 | "cell_type": "markdown",
|
228 | 234 | "metadata": {},
|
229 | 235 | "source": [
|
230 |
| - "Make sure a raster store is ready on your raster analytics image server. This is where where the output subimages, also called chips, labels and metadata files are going to be stored." |
| 236 | + "Make sure a raster store is ready on your raster analytics image server. This is where the output sub-images (also called chips), labels, and metadata files will be stored." |
231 | 237 | ]
|
232 | 238 | },
|
233 | 239 | {
|
|
336 | 342 | "cell_type": "markdown",
|
337 | 343 | "metadata": {},
|
338 | 344 | "source": [
|
339 |
| - "With the feature class and raster layer, we are now ready to export training data using the export_training_data() method in arcgis.learn module. In addtion to feature class, raster layer, and output folder, we also need to speficy a few other parameters such as tile_size (size of the image chips), strid_size (distance to move each time when creating the next image chip), chip_format (TIFF, PNG, or JPEG), metadata format (how we are going to store those training labels). More detail can be found [here](https://pro.arcgis.com/en/pro-app/tool-reference/image-analyst/export-training-data-for-deep-learning.htm). \n", |
| 345 | + "With the feature class and raster layer, we are now ready to export training data using the export_training_data() method in the arcgis.learn module. In addition to the feature class, raster layer, and output folder, we also need to specify a few other parameters such as tile_size (size of the image chips), stride_size (distance to move each time when creating the next image chip), chip_format (TIFF, PNG, or JPEG), and metadata format (how we are going to store those training labels). More details can be found [here](https://pro.arcgis.com/en/pro-app/tool-reference/image-analyst/export-training-data-for-deep-learning.htm). \n", |
340 | 346 | "\n",
|
341 | 347 | "Depending on the size of your data, tile and stride size, and computing resources, this operation can take 15 minutes to 2 hours in our experiment. Also, do not re-run it if you have already run it once, unless you would like to update the settings."
|
342 | 348 | ]
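To get a feel for how tile_size and stride_size interact, here is a back-of-the-envelope chip count for a hypothetical raster extent (illustrative numbers only; the actual tool also handles edge padding and skips no-data regions):

```python
# Hypothetical numbers: how tile_size and stride_size control the
# number of exported image chips over a raster extent.
raster_w, raster_h = 5000, 5000   # raster size in pixels, illustrative only
tile_size = 400                   # side length of each square chip
stride_size = 200                 # step between consecutive chips

# A chip starts every stride_size pixels, as long as a full tile fits.
chips_x = (raster_w - tile_size) // stride_size + 1
chips_y = (raster_h - tile_size) // stride_size + 1
print(chips_x * chips_y)          # 24 * 24 = 576 chips
```

Halving the stride roughly quadruples the chip count (and export time), which is why this step can take a while.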
|
|
543 | 549 | "metadata": {},
|
544 | 550 | "source": [
|
545 | 551 | "### Load model architecture\n",
|
546 |
| - "We will be using U-net, one of the well-recogonized image segmentation algorithm, for our land cover classification. U-Net is designed like an auto-encoder. It has an encoding path (“contracting”) paired with a decoding path (“expanding”) which gives it the “U” shape. However, in contrast to the autoencoder, U-Net predicts a pixelwise segmentation map of the input image rather than classifying the input image as a whole. For each pixel in the original image, it asks the question: “To which class does this pixel belong?”. U-Net passes the feature maps from each level of the contracting path over to the analogous level in the expanding path. These are similar to residual connections in a ResNet type model, and allow the classifier to consider features at various scales and complexities to make its decision." |
| 552 | + "We will be using U-Net, one of the well-recognized image segmentation algorithms, for our land cover classification. U-Net is designed like an autoencoder. It has an encoding path (“contracting”) paired with a decoding path (“expanding”) which gives it the “U” shape. However, in contrast to the autoencoder, U-Net predicts a pixelwise segmentation map of the input image rather than classifying the input image as a whole. For each pixel in the original image, it asks the question: “To which class does this pixel belong?” U-Net passes the feature maps from each level of the contracting path over to the analogous level in the expanding path. These are similar to residual connections in a ResNet-type model, and allow the classifier to consider features at various scales and complexities to make its decision." |
547 | 553 | ]
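The U shape described above can be illustrated at the level of array shapes. This is a toy NumPy sketch of how tensors flow through such a network, not the actual arcgis.learn implementation; the channel counts and sizes are made up:

```python
import numpy as np

# Input image: batch, bands, height, width.
x = np.zeros((1, 3, 64, 64))

# Contracting path: each level halves H and W and increases channels.
enc1 = np.zeros((1, 16, 64, 64))
enc2 = np.zeros((1, 32, 32, 32))
enc3 = np.zeros((1, 64, 16, 16))          # bottom of the "U"

# Expanding path: upsample, then concatenate the encoder feature map
# from the same level -- the skip connections described above.
up2 = np.zeros((1, 64, 32, 32))           # enc3 upsampled
dec2 = np.concatenate([up2, enc2], axis=1)            # channels: 64 + 32
up1 = np.zeros((1, dec2.shape[1] // 2, 64, 64))       # dec2 upsampled
dec1 = np.concatenate([up1, enc1], axis=1)            # channels: 48 + 16

# Final 1x1 convolution maps to one score per class per pixel,
# then argmax gives the pixelwise segmentation map.
n_classes = 5
logits = np.zeros((1, n_classes, 64, 64))
pred = logits.argmax(axis=1)              # shape: (1, 64, 64)
print(dec2.shape, dec1.shape, pred.shape)
```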
|
548 | 554 | },
|
549 | 555 | {
|
|
725 | 731 | "cell_type": "markdown",
|
726 | 732 | "metadata": {},
|
727 | 733 | "source": [
|
728 |
| - "As we can see, with only 10 epochs, we are already seeing reasonable results. Further improvment can be acheived through more sophisticated hyperparameter tuning. Let's save the model for further training or inference later. The model should be saved into a models folder in your folder. By default, it will be saved into your `data_path` that you specified in the very beginning of this notebook." |
| 734 | + "As we can see, with only 10 epochs, we are already seeing reasonable results. Further improvement can be achieved through more sophisticated hyperparameter tuning. Let's save the model for further training or inference later. The model is saved into a models folder; by default, it goes under the `data_path` that you specified at the very beginning of this notebook." |
729 | 735 | ]
|
730 | 736 | },
|
731 | 737 | {
|
|
1024 | 1030 | "name": "python",
|
1025 | 1031 | "nbconvert_exporter": "python",
|
1026 | 1032 | "pygments_lexer": "ipython3",
|
1027 |
| - "version": "3.7.2" |
| 1033 | + "version": "3.6.9" |
1028 | 1034 | },
|
1029 | 1035 | "toc": {
|
1030 | 1036 | "base_numbering": 1,
|
|