articles/ai-services/immersive-reader/tutorial-ios-picture-immersive-reader.md
---
title: "Tutorial: Create an iOS app that takes a photo and launches it in the Immersive Reader (Swift)"
titleSuffix: Azure AI services
description: Learn how to build an iOS app from scratch and add the Picture to Immersive Reader functionality.
#services: cognitive-services
author: sharmas
ms.service: azure-ai-immersive-reader
ms.topic: tutorial
ms.date: 02/28/2024
ms.author: sharmas
#Customer intent: As a developer, I want to integrate two Azure AI services, the Immersive Reader and the Read API into my iOS application so that I can view any text from a photo in the Immersive Reader.
---
The [Immersive Reader](https://www.onenote.com/learningtools) is an inclusively designed tool that implements proven techniques to improve reading comprehension.

The [Azure AI Vision Read API](../../ai-services/computer-vision/overview-ocr.md) detects text content in an image using Microsoft's latest recognition models and converts the identified text into a machine-readable character stream.

In this tutorial, you build an iOS app from scratch and integrate the Read API and the Immersive Reader by using the Immersive Reader SDK. A full working sample of this tutorial is available [on GitHub](https://github.com/microsoft/immersive-reader-sdk/tree/master/js/samples/ios).

## Prerequisites

* An Azure subscription. You can [create one for free](https://azure.microsoft.com/free/ai-services/).
* MacOS and [Xcode](https://apps.apple.com/us/app/xcode/id497799835?mt=12).
* An Immersive Reader resource configured for Microsoft Entra authentication. Follow [these instructions](how-to-create-immersive-reader.md) to get set up. You need some of the values created here when configuring the sample project properties, so save the output of your session into a text file for future reference.
* A subscription to the Azure AI Vision service. Create an [Azure AI Vision resource in the Azure portal](https://portal.azure.com/#create/Microsoft.CognitiveServicesComputerVision).
:::image type="content" source="media/ios/xcode-create-project.png" alt-text="Screenshot of the Create a new Xcode project screen.":::
Choose **Single View App**.

:::image type="content" source="media/ios/xcode-single-view-app.png" alt-text="Screenshot of the template gallery to select a single view app.":::

## Get the SDK CocoaPod

The easiest way to use the Immersive Reader SDK is via CocoaPods. To install via CocoaPods:

1. Follow the [guide to install CocoaPods](http://guides.cocoapods.org/using/getting-started.html).

2. Create a Podfile by running `pod init` in your Xcode project's root directory.
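The remaining CocoaPods steps (editing the Podfile and installing the pod) aren't shown above. A Podfile for this project might look like the following sketch; the pod name, the `:git` source, and the target name are assumptions, so use the values given in the Immersive Reader SDK documentation.

```ruby
# Podfile sketch. The pod name, :git source, and target name below are
# assumptions; use the values from the Immersive Reader SDK documentation.
platform :ios, '12.0'

target 'picture-to-immersive-reader-swift' do
  use_frameworks!
  pod 'immersive-reader-sdk', :git => 'https://github.com/microsoft/immersive-reader-sdk.git'
end
```

After editing the Podfile, run `pod install` and open the generated `.xcworkspace` file (not the `.xcodeproj`) from then on.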
## Acquire a Microsoft Entra authentication token

You need some values from the Microsoft Entra authentication configuration step in the prerequisites section. Refer back to the text file you saved from that session.
````text
TenantId => Azure subscription TenantId
ClientId => Microsoft Entra ApplicationId
ClientSecret => Microsoft Entra Application Service Principal password
Subdomain => Immersive Reader resource subdomain (the resource 'Name' if the resource was created in the Azure portal, or the 'CustomSubDomain' option if the resource was created with the Azure CLI or PowerShell. Check the Azure portal for the subdomain on the Endpoint in the resource Overview page, for example, 'https://[SUBDOMAIN].cognitiveservices.azure.com/')
````
In the main project folder, which contains the *ViewController.swift* file, create a Swift class file called *Constants.swift*. Replace the class with the following code, adding in your values where applicable. Keep this file local to your machine, and don't commit it into source control, because it contains secrets that shouldn't be made public. We recommend that you don't keep secrets in your app. Instead, use a backend service to obtain the token, so the secrets can be kept outside of the app and off the device. The backend API endpoint should be secured behind some form of authentication (for example, [OAuth](https://oauth.net/2/)) to prevent unauthorized users from obtaining tokens to use against your Immersive Reader service and billing; that work is beyond the scope of this tutorial.
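As a sketch of what *Constants.swift* might look like: the property names mirror the values listed above, the placeholder strings are yours to fill in, and the derived `tokenUrl` endpoint is an assumption based on the token request used later in this tutorial.

```swift
import Foundation

// Hypothetical sketch of Constants.swift. Replace the placeholder strings
// with the values you saved from your Microsoft Entra configuration session.
// Keep this file out of source control.
struct Constants {
    static let tenantId = "<YOUR_TENANT_ID>"
    static let clientId = "<YOUR_CLIENT_ID>"
    static let clientSecret = "<YOUR_CLIENT_SECRET>"
    static let subdomain = "<YOUR_SUBDOMAIN>"

    // Token endpoint derived from the tenant ID (an assumption based on the
    // client-credentials request shown later in this tutorial).
    static var tokenUrl: String {
        "https://login.windows.net/\(tenantId)/oauth2/token"
    }
}
```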
## Set up the app to run without a storyboard
Open *AppDelegate.swift* and replace the file with the following code.
```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // ... (remainder of the file omitted here; see the full sample on GitHub)
}
```
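The listing above is abbreviated. A minimal storyboard-free `AppDelegate` conventionally looks like the following sketch; this is the common pattern, not necessarily the sample's exact code, and `PictureLaunchViewController` is the view controller created in the next section.

```swift
import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Create the window programmatically instead of loading a storyboard.
        window = UIWindow(frame: UIScreen.main.bounds)
        window?.rootViewController = PictureLaunchViewController()
        window?.makeKeyAndVisible()
        return true
    }
}
```

When running without a storyboard, you also typically remove the `Main storyboard file base name` entry from *Info.plist* so UIKit doesn't try to load one at launch.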
## Add functionality for taking and uploading photos
Rename *ViewController.swift* to *PictureLaunchViewController.swift* and replace the file with the following code.
```swift
import UIKit

class PictureLaunchViewController: UIViewController, UINavigationControllerDelegate {

    // ... (photo capture, upload, and Immersive Reader launch code omitted;
    //      see the full sample on GitHub)

    /// Retrieves the token for the Immersive Reader using Microsoft Entra authentication
    ///
    /// - Parameters:
    ///   - onSuccess: A closure that gets called when the token is successfully received using Microsoft Entra authentication.
    ///   - theToken: The token for the Immersive Reader received using Microsoft Entra authentication.
    ///   - onFailure: A closure that gets called when the token fails to be obtained from Microsoft Entra authentication.
    ///   - theError: The error that occurred when the token fails to be obtained from Microsoft Entra authentication.
    // ... (function signature and body partially omitted)
        let tokenForm = "grant_type=client_credentials&resource=https://cognitiveservices.azure.com/&client_id=" + Constants.clientId + "&client_secret=" + Constants.clientSecret
    // ...
}
```
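The `tokenForm` line above builds the body of a standard OAuth 2.0 client-credentials request. A minimal sketch of how that body and its endpoint might be assembled follows; the helper function names are illustrative, and the `login.windows.net` endpoint is an assumption based on the v1.0 shape of the form.

```swift
import Foundation

// Illustrative helpers for the client-credentials token request. The
// function names are assumptions; the form fields match the tokenForm
// string shown above.
func tokenEndpoint(tenantId: String) -> String {
    "https://login.windows.net/\(tenantId)/oauth2/token"
}

func tokenRequestBody(clientId: String, clientSecret: String) -> String {
    "grant_type=client_credentials&resource=https://cognitiveservices.azure.com/"
        + "&client_id=" + clientId
        + "&client_secret=" + clientSecret
}
```

The body is sent as an `application/x-www-form-urlencoded` POST; the JSON response's `access_token` field is the token passed to the Immersive Reader SDK.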
## Build and run the app
Set the archive scheme in Xcode by selecting a simulator or device target.
:::image type="content" source="media/ios/picture-to-immersive-reader-ipad-app.png" alt-text="Screenshot of the sample app with text to be read.":::
Take or upload a photo of text by pressing the **Take Photo** button or **Choose Photo from Library** button. The Immersive Reader then launches and displays the text from the photo.