|
7 | 7 | "# Azure Cognitive Search sample \n", |
8 | 8 | "## Passing Images as Binary File References\n", |
9 | 9 | "\n", |
10 | | - "Cognitive Search skillsets that need to pass images to custom skills use a binary file reference to serialize the images to pass them to and from skills. This sample demosntrates an example of how skills can be configured to accept an image as an input from the skillset and return images as outputs to the skillset. This example does nothing more than segment an image based on the layout from OCR. The sole purpose of this sample is to demosntrate how you pass images to skills and how skills can return images.\n", |
| 10 | + "Cognitive Search skillsets that need to pass images to custom skills use a binary file reference to serialize the images to pass them to and from skills. This sample demonstrates an example of how skills can be configured to accept an image as an input from the skillset and return images as outputs to the skillset. This example does nothing more than segment an image based on the layout from OCR. The sole purpose of this sample is to demonstrate how you pass images to skills and how skills can return images.\n", |
11 | 11 | "\n" |
12 | 12 | ] |
13 | 13 | }, |
14 | 14 | { |
15 | 15 | "cell_type": "markdown", |
16 | 16 | "metadata": {}, |
17 | 17 | "source": [ |
18 | | - "### Configure service \n", |
| 18 | + "### Prerequisites \n", |
19 | 19 | "\n", |
20 | 20 | "Provision the required services:\n", |
21 | | - "1. Cognitive Search\n", |
22 | | - "3. Azure Functions (Or any other compute environment you plan to host your API endpoint on). \n", |
23 | | - "\n" |
| 21 | + "1. [Azure Cognitive Search](https://docs.microsoft.com/azure/search/search-create-service-portal)\n", |
| 22 | + "2. [Azure Functions](https://docs.microsoft.com/azure/azure-functions/) used for hosting an API endpoint.\n", |
| 23 | + "3. [Storage Account](https://docs.microsoft.com/azure/storage/blobs/)\n" |
24 | 24 | ] |
25 | 25 | }, |
26 | 26 | { |
27 | 27 | "cell_type": "markdown", |
28 | 28 | "metadata": {}, |
29 | 29 | "source": [ |
30 | 30 | "### Deploy the Azure functions app \n", |
31 | | - "The ```SplitImage``` folder contains an Azure function that will accept an input in the [custom skill format](https://docs.microsoft.com/en-us/azure/search/cognitive-search-custom-skill-web-api#skill-inputs). \n", |
| 31 | + "The ```SplitImage``` folder contains an Azure function that will accept an input in the [custom skill format](https://docs.microsoft.com/azure/search/cognitive-search-custom-skill-web-api#skill-inputs). \n", |
32 | 32 | "Each input record contains an image that is serialized as a ```Base64``` encoded string and the layout text returned from the OCR skill.\n", |
33 | 33 | "The skill then segments the image into smaller images based on the coordinates of the layout text. It then returns a list of images, each ```Base64``` encoded back to the skillset. While this is not very useful, you could build a [Custom Vision](https://github.com/Azure-Samples/azure-search-power-skills/tree/master/Vision/CustomVision) skill to perform a useful inference on your images.\n", |
34 | 34 | "\n", |
35 | | - "Follow the [Azure Functions tutorial](https://docs.microsoft.com/en-us/azure/developer/python/tutorial-vs-code-serverless-python-05) to deploy the function. Once the deployment completes, navigate to the function app in the portal, select the function (SplitImage) and click the Get Function Url button. Save the function url as we will use it in the next step." |
| 35 | + "Follow the [Azure Functions tutorial](https://docs.microsoft.com/azure/developer/python/tutorial-vs-code-serverless-python-05) to deploy the function. Once the deployment completes, navigate to the function app in the portal, select the function (SplitImage) and click the Get Function Url button. Save the function url as we will use it in the next step." |
36 | 36 | ] |
37 | 37 | }, |
38 | 38 | { |
|
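For orientation, the payload exchanged between the skillset and a Web API custom skill such as SplitImage follows the custom skill format linked above: a `values` array of records, each carrying a `recordId` and a `data` object, with `errors` and `warnings` on the way back. The field names inside `data` below (`image`, `layoutText`, `slices`) are illustrative placeholders, not necessarily the names the SplitImage function uses; treat this as a minimal sketch of the shape only.

```python
# Sketch of a custom-skill request/response pair. Field names inside "data"
# are assumptions; the surrounding values/recordId/errors/warnings envelope
# is the documented custom skill Web API format.
sample_request = {
    "values": [
        {
            "recordId": "0",
            "data": {
                # A binary file reference arrives as Base64 image data plus
                # metadata such as width and height.
                "image": {"data": "<Base64 encoded image bytes>", "width": 1000, "height": 800},
                "layoutText": {"text": "...", "lines": ["..."]}
            }
        }
    ]
}

sample_response = {
    "values": [
        {
            "recordId": "0",
            "data": {
                # Each image slice is returned to the skillset Base64 encoded.
                "slices": ["<Base64 slice 1>", "<Base64 slice 2>"]
            },
            "errors": [],
            "warnings": []
        }
    ]
}
```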
66 | 66 | "api_key = 'your search service API key'\n", |
67 | 67 | "\n", |
68 | 68 | "# Leave the API version and content_type as they are listed here.\n", |
69 | | - "api_version = '2020-06-30-Preview'\n", |
| 69 | + "api_version = '2020-06-30'\n", |
70 | 70 | "content_type = 'application/json'\n", |
71 | 71 | "\n", |
72 | | - "# Replace with a cognitive services key.\n", |
| 72 | + "# Replace with a Cognitive Services all in one key.\n", |
73 | 73 | "cog_svcs_key = '' #Required only if processing more than 20 documents\n", |
74 | 74 | "cog_svcs_acct = 'your cog services account name'\n", |
75 | 75 | "\n", |
|
79 | 79 | "datasource_container = 'bfrsample' # Replace with the container containging your files\n", |
80 | 80 | "# This sample assumes you will use the same storage account for the datasource, knowledge store and indexer cache. The knowledge store will contain the projected images\n", |
81 | 81 | "know_store_cache = STORAGECONNSTRING\n", |
82 | | - "# Contaimer where the sliced images will be projected to\n", |
| 82 | + "# Container where the sliced images will be projected to\n", |
83 | 83 | "know_store_container = \"obfuscated\"\n", |
84 | 84 | "skill_uri = \"https://<skillname>.azurewebsites.net/api/SplitImage?code=CODE\"" |
85 | 85 | ] |
|
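The skillset definition created later in the notebook is what actually points at `skill_uri`. A custom skill that receives images is typically declared as a `WebApiSkill` whose inputs draw from `/document/normalized_images/*`; the sketch below shows that general shape only. The input and output names (`image`, `layoutText`, `slices`) are assumptions, and the notebook's real skillset definition may differ.

```python
# Sketch of a WebApiSkill definition wired to the function deployed above.
# Input and output names are illustrative placeholders.
split_image_skill = {
    "@odata.type": "#Microsoft.Skills.Custom.WebApiSkill",
    "name": "split-image",
    "description": "Segments an image using the OCR layout text",
    "uri": skill_uri,
    "batchSize": 1,
    "context": "/document/normalized_images/*",
    "inputs": [
        # The normalized image itself is passed to the skill as a binary file reference.
        {"name": "image", "source": "/document/normalized_images/*"},
        # layoutText is produced per image by the OCR skill earlier in the skillset.
        {"name": "layoutText", "source": "/document/normalized_images/*/layoutText"}
    ],
    "outputs": [
        {"name": "slices", "targetName": "slices"}
    ]
}
```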
597 | 597 | "metadata": {}, |
598 | 598 | "source": [ |
599 | 599 | "### View Results\n", |
600 | | - "Once the indexer completes execution, you will find the image slices in the knowledge store. The following cell downloads a few images to validate the skill did work." |
| 600 | + "The following cell downloads the image so that you can verify skillset success." |
601 | 601 | ] |
602 | 602 | }, |
603 | 603 | { |
604 | 604 | "cell_type": "code", |
605 | | - "execution_count": null, |
| 605 | + "execution_count": 1, |
606 | 606 | "metadata": {}, |
607 | | - "outputs": [], |
| 607 | + "outputs": [ |
| 608 | + { |
| 609 | + "ename": "NameError", |
| 610 | + "evalue": "name 'STORAGECONNSTRING' is not defined", |
| 611 | + "output_type": "error", |
| 612 | + "traceback": [ |
| 613 | + "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", |
| 614 | + "\u001b[1;31mNameError\u001b[0m Traceback (most recent call last)", |
| 615 | + "\u001b[1;32m<ipython-input-1-7c149f3fc288>\u001b[0m in \u001b[0;36m<module>\u001b[1;34m\u001b[0m\n\u001b[0;32m 3\u001b[0m \u001b[1;32mfrom\u001b[0m \u001b[0mazure\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mstorage\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mblob\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mContainerClient\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 4\u001b[0m \u001b[0mcount\u001b[0m \u001b[1;33m=\u001b[0m \u001b[1;36m0\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 5\u001b[1;33m \u001b[0mcontainer\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mContainerClient\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mfrom_connection_string\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mconn_str\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mSTORAGECONNSTRING\u001b[0m\u001b[1;33m,\u001b[0m \u001b[0mcontainer_name\u001b[0m\u001b[1;33m=\u001b[0m\u001b[0mknow_store_container\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m 6\u001b[0m \u001b[0mblob_list\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mcontainer\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mlist_blobs\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m 7\u001b[0m \u001b[1;32mfor\u001b[0m \u001b[0mblob\u001b[0m \u001b[1;32min\u001b[0m \u001b[0mblob_list\u001b[0m\u001b[1;33m:\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n", |
| 616 | + "\u001b[1;31mNameError\u001b[0m: name 'STORAGECONNSTRING' is not defined" |
| 617 | + ] |
| 618 | + } |
| 619 | + ], |
608 | 620 | "source": [ |
609 | 621 | "from IPython.display import Image\n", |
610 | 622 | "import base64\n", |
|
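The committed traceback shows what this cell does: it builds a `ContainerClient` from `STORAGECONNSTRING` and iterates the blobs projected to `know_store_container` (the `NameError` simply means the configuration cell was not run first). Below is a minimal sketch of that pattern, assuming the configuration cell has been executed; if the projected content is stored as `Base64` strings rather than raw bytes (the cell also imports `base64`), decode it before handing it to `Image`.

```python
from IPython.display import Image, display
from azure.storage.blob import ContainerClient

# STORAGECONNSTRING and know_store_container are defined in the configuration
# cell near the top of the notebook; run that cell first.
container = ContainerClient.from_connection_string(
    conn_str=STORAGECONNSTRING, container_name=know_store_container)

# Download and preview a handful of the projected image slices.
for count, blob in enumerate(container.list_blobs()):
    if count >= 5:
        break
    image_bytes = container.download_blob(blob).readall()
    display(Image(data=image_bytes))
```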