**articles/app-service/includes/deploy-intelligent-apps/deploy-intelligent-apps-linux-dotnet-pivot.md** (8 additions, 7 deletions)
@@ -10,12 +10,12 @@ ms.author: jefmarti
You can use Azure App Service to work with popular AI frameworks like LangChain and Semantic Kernel connected to OpenAI for creating intelligent apps. In the following tutorial, we're adding an Azure OpenAI service using Semantic Kernel to a .NET 8 Blazor web application.
-####Prerequisites
+## Prerequisites
- An [Azure OpenAI resource](/azure/ai-services/openai/quickstart?pivots=programming-language-csharp&tabs=command-line%2Cpython#set-up) or an [OpenAI account](https://platform.openai.com/overview).
- A .NET 8 Blazor Web App. Create the application with a template [here](https://dotnet.microsoft.com/learn/aspnet/blazor-tutorial/intro), or scaffold one from the CLI as shown after this list.
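If you prefer the command line to the linked tutorial, the .NET 8 Blazor Web App template can scaffold the project in one step. The project name below is a placeholder:

```bash
dotnet new blazor -n IntelligentBlazorApp
cd IntelligentBlazorApp
```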
-###Setup Blazor web app
+## 1. Setup Blazor web app
For this Blazor web application, we're building off the Blazor [template](https://dotnet.microsoft.com/learn/aspnet/blazor-tutorial/intro) and creating a new Razor page that can send and receive requests to an Azure OpenAI or OpenAI service using Semantic Kernel.
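As a rough sketch of the starting point (the markup and member names here are illustrative, not the tutorial's exact code, and it assumes the interactive server render mode from the .NET 8 Blazor Web App template), the new *OpenAI.razor* page can begin as a simple form that binds a prompt and displays the model's response:

```razor
@page "/openai"
@rendermode InteractiveServer

<PageTitle>OpenAI</PageTitle>

<h3>Ask the model</h3>

<textarea rows="4" cols="60" @bind="prompt"></textarea>
<button class="btn btn-primary" @onclick="SendRequest">Send</button>

<p>@response</p>

@code {
    private string prompt = string.Empty;
    private string response = string.Empty;

    private async Task SendRequest()
    {
        // The Semantic Kernel call is wired up in the following sections.
        await Task.CompletedTask;
    }
}
```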
@@ -128,7 +128,7 @@ For OpenAI:
}
```
-### Semantic Kernel
+## 2. Semantic Kernel
Semantic Kernel is an open-source SDK that enables you to easily develop AI agents to work with your existing code. You can use Semantic Kernel with Azure OpenAI and OpenAI models.
@@ -137,7 +137,7 @@ To create the OpenAI client, we'll first start by installing Semantic Kernel.
To install Semantic Kernel, browse the NuGet package manager in Visual Studio and install the **Microsoft.SemanticKernel** package. For NuGet Package Manager instructions, see [here](/nuget/consume-packages/install-use-packages-visual-studio#find-and-install-a-package). For CLI instructions, see [here](/nuget/consume-packages/install-use-packages-dotnet-cli).
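If you take the CLI route mentioned above, the NuGet step is a single command run from the project directory:

```bash
dotnet add package Microsoft.SemanticKernel
```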
@@ -159,7 +159,7 @@ To initialize the Kernel, add the following code to the *OpenAI.razor* file.
Here we're adding the using statement and creating the Kernel in a method that we can use when we send the request to the service.
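As a minimal sketch of that step (shown as plain C# here; in the Blazor page it lives in a method inside the `@code` block):

```csharp
using Microsoft.SemanticKernel;

// Build an empty Kernel for now; the AI service (Azure OpenAI or OpenAI)
// is registered on the builder in the next section, before Build() is called.
IKernelBuilder builder = Kernel.CreateBuilder();
Kernel kernel = builder.Build();
```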
-###Add your AI service
+## 4. Add your AI service
Once the Kernel is initialized, we can add our chosen AI service to the kernel. Here we define our model and pass in our key and endpoint information to be consumed by the chosen model. If you plan to use managed identity with Azure OpenAI, add the service using the example in the next section.
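A sketch of registering the service on the builder follows. The configuration key names (`DEPLOYMENT_NAME`, `ENDPOINT`, `API_KEY`, `OPENAI_API_KEY`) are assumptions; substitute the names your app actually uses.

```csharp
using Microsoft.Extensions.Configuration;
using Microsoft.SemanticKernel;

// Illustrative only: in the Blazor app these values come from injected
// IConfiguration (App Service app settings); the key names are assumptions.
IConfiguration config = new ConfigurationBuilder()
    .AddEnvironmentVariables()
    .Build();

IKernelBuilder builder = Kernel.CreateBuilder();

// Azure OpenAI: deployment name, endpoint, and API key.
builder.AddAzureOpenAIChatCompletion(
    deploymentName: config["DEPLOYMENT_NAME"]!,
    endpoint: config["ENDPOINT"]!,
    apiKey: config["API_KEY"]!);

// OpenAI alternative:
// builder.AddOpenAIChatCompletion(modelId: "gpt-4-turbo", apiKey: config["OPENAI_API_KEY"]!);

Kernel kernel = builder.Build();
```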
@@ -225,7 +225,7 @@ Once the credentials are added to the application, you'll then need to enable ma
Your web app is now added as a Cognitive Services OpenAI User and can communicate with your Azure OpenAI resource.
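If you're using managed identity, a sketch of the same registration with `DefaultAzureCredential` instead of an API key looks like the following (assumes the Azure.Identity package is installed; the deployment name and endpoint are placeholders):

```csharp
using Azure.Identity;
using Microsoft.SemanticKernel;

IKernelBuilder builder = Kernel.CreateBuilder();

// DefaultAzureCredential resolves to the App Service managed identity in Azure
// and to your developer credentials locally, so no key is stored in settings.
builder.AddAzureOpenAIChatCompletion(
    "<your-deployment-name>",
    "https://<your-resource>.openai.azure.com/",
    new DefaultAzureCredential());

Kernel kernel = builder.Build();
```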
-###Configure prompt and create semantic function
+## 5. Configure prompt and create semantic function
Now that our chosen OpenAI service client is created with the correct keys, we can add a function to handle the prompt. With Semantic Kernel, you handle prompts by using a semantic function, which turns the prompt and the prompt configuration settings into a function the Kernel can execute. Learn more about configuring prompts [here](/semantic-kernel/prompts/configure-prompts?tabs=Csharp).
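A sketch of one way this can look: the prompt text, execution settings, and input name below are placeholders, but the pattern of creating and invoking a function from a prompt follows the current Semantic Kernel API.

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

Kernel kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("<deployment>", "https://<resource>.openai.azure.com/", "<key>")
    .Build();

// Turn the prompt template and its configuration into a function the Kernel can execute.
var settings = new OpenAIPromptExecutionSettings { MaxTokens = 400, Temperature = 0.7 };
KernelFunction chat = kernel.CreateFunctionFromPrompt(
    "You are a helpful assistant. Answer the question: {{$input}}",
    settings);

// Run the semantic function with the user's prompt as input.
FunctionResult result = await kernel.InvokeAsync(chat, new KernelArguments { ["input"] = "What is Semantic Kernel?" });
Console.WriteLine(result.GetValue<string>());
```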
@@ -348,7 +348,7 @@ Here's the example in its completed form. In this example, use the Azure OpenAI
Now save the application and follow the next steps to deploy it to App Service. If you would like to test it locally first at this step, you can swap out the config values with the literal string values of your OpenAI service. For example: `string modelId = "gpt-4-turbo";`
-###Deploy to App Service
+## 5. Deploy to App Service
If you have followed the steps above, you're ready to deploy to App Service. If you run into any issues, remember that you need to have done the following: granted your app access to your Key Vault and added the app settings with Key Vault references as their values. App Service resolves the app settings in your application that match what you've added in the portal.
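For reference, an App Service app setting that resolves its value from Key Vault uses a value in this shape (the vault and secret names are placeholders):

```
@Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/<secret-name>/)
```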
@@ -357,3 +357,4 @@ If you have followed the steps above, you're ready to deploy to App Service. If
Although optional, it's highly recommended that you also add authentication to your web app when using an Azure OpenAI or OpenAI service. This adds a level of security with no additional code. Learn how to enable authentication for your web app [here](../../scenario-secure-app-authentication-app-service.md).
Once deployed, browse to the web app and navigate to the OpenAI tab. Enter a query to the service, and you should see a populated response from the server. The tutorial is now complete, and you know how to use OpenAI services to create intelligent applications.
**articles/azure-functions/durable/durable-functions-configure-managed-identity.md** (1 addition, 1 deletion)
@@ -37,7 +37,7 @@ If you don't have an existing Durable Functions project deployed in Azure, we re
## Local development
### Use Azure Storage emulator
-When developing locally, it's recommended that you use Azurite, which is Azure Storage's local emulator. Configure your app to the emulator by specifying `"AzureWebJobsStorage": "UseDevelopmentStorage = true"` in the local.settings.json.
+When developing locally, it's recommended that you use Azurite, which is Azure Storage's local emulator. Configure your app to use the emulator by specifying `"AzureWebJobsStorage": "UseDevelopmentStorage=true"` in the local.settings.json file.
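For context, a minimal local.settings.json that points the app at the emulator might look like the following sketch (the worker runtime value depends on your project):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}
```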
### Identity-based connections for local development
Strictly speaking, a managed identity is only available to apps when they're running in Azure. However, you can still configure a locally running app to use identity-based connections by using your developer credentials to authenticate against Azure resources. Then, when deployed to Azure, the app uses your managed identity configuration instead.
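As a sketch, an identity-based connection for local development drops the connection string and names the resource instead; your signed-in developer credentials (for example, Azure CLI or Visual Studio) supply the token. The storage account name below is a placeholder:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage__accountName": "<your-storage-account>",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
  }
}
```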
title: Migrate from OpenShift SDN to OVN-Kubernetes
description: Discover how to migrate from OpenShift SDN to OVN-Kubernetes.
author: johnmarco
ms.author: johnmarc
ms.service: azure-redhat-openshift
ms.topic: how-to
ms.date: 02/17/2025
---

# Migrate from OpenShift SDN to OVN-Kubernetes
OpenShift SDN, a component of Red Hat OpenShift Networking, is a network plugin that uses software-defined networking (SDN) to create a unified network for your cluster. This network allows communication between pods across the OpenShift Container Platform. OpenShift SDN manages this network by configuring an overlay network using Open vSwitch (OVS).
OpenShift SDN has been deprecated since version 4.14 and will no longer be supported starting with version 4.17. Therefore, if your cluster is using OpenShift SDN, you must migrate to OVN-Kubernetes before upgrading to any minor OpenShift version beyond 4.16.
## Migrating to OVN-Kubernetes for Azure Red Hat OpenShift
If your Azure Red Hat OpenShift (ARO) cluster is using the OpenShift SDN network plugin, you must migrate to the OVN-Kubernetes plugin before updating to version 4.17.
OVN-Kubernetes has been the default network plugin starting with ARO version 4.11. If you installed your cluster with version 4.11 or later, you likely don't need to perform a migration.
OpenShift SDN remains supported on Azure Red Hat OpenShift through version 4.16. See the [Azure Red Hat OpenShift release calendar](support-lifecycle.md#azure-red-hat-openshift-release-calendar) for end-of-life dates.
1. To determine which network plugin your cluster currently uses, run the following command:
    ```
    oc get network.operator.openshift.io cluster -o jsonpath='{.spec.defaultNetwork.type}'
    ```

    If you see an output such as `OpenShiftSDN`, proceed to the next step because you'll need to migrate.
1. See [Limited live migration to the OVN-Kubernetes network plugin overview](https://docs.openshift.com/container-platform/4.16/networking/ovn_kubernetes_network_provider/migrate-from-openshift-sdn.html#nw-ovn-kubernetes-live-migration-about_migrate-from-openshift-sdn) for steps to perform the migration.
> [!IMPORTANT]
> Azure Red Hat OpenShift only supports the limited live migration process. Don't use the offline migration process.
>
## Next steps
- [Learn more about OVN-Kubernetes network provider](concepts-ovn-kubernetes.md).
- [Learn more about the OVN-Kubernetes network plugin](https://docs.openshift.com/container-platform/4.17/networking/ovn_kubernetes_network_provider/about-ovn-kubernetes.html).
> |[Microsoft.Resources](../permissions/management-and-governance.md#microsoftresources)/deployments/read | Get list of update deployment |
-> |[Microsoft.Resources](../permissions/management-and-governance.md#microsoftresources)/deployments/write | Create or update an update deployment |
-> |[Microsoft.Resources](../permissions/management-and-governance.md#microsoftresources)/deployments/operation statuses | Get a list of update deployment operation statuses |
> |[Microsoft.Maintenance](../permissions/management-and-governance.md#microsoftmaintenance)/configurationAssignments/maintenanceScope/InGuestPatch/write | Create or update a maintenance configuration assignment for InGuestPatch maintenance scope. |