* Conversation summarization takes structured text for analysis. For more information, see [data and service limits](../concepts/data-limits.md).
* Conversation summarization works with various spoken languages. For more information, see [language support](language-support.md?tabs=conversation-summarization).
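For orientation, here's a minimal sketch of submitting a text conversation for summarization through the Language REST API. The endpoint, key, API version, and task shape below are assumptions based on the conversation analysis contract, not values from this article; check the conversation summarization quickstart for the current contract.

```shell
# Placeholder endpoint and key -- substitute your Language resource's values.
ENDPOINT="https://<your-language-resource>.cognitiveservices.azure.com"
KEY="<your-key>"

# A minimal conversation analysis job with a summarization task.
BODY='{
  "displayName": "Conversation summarization example",
  "analysisInput": {
    "conversations": [{
      "id": "1", "language": "en", "modality": "text",
      "conversationItems": [
        {"id": "1", "participantId": "Agent", "text": "How can I help you today?"},
        {"id": "2", "participantId": "Customer", "text": "My internet connection keeps dropping."}
      ]
    }]
  },
  "tasks": [{
    "kind": "ConversationalSummarizationTask",
    "parameters": {"summaryAspects": ["issue", "resolution"]}
  }]
}'

# Only call the service once the placeholders are replaced with real values.
case "$ENDPOINT" in
  *"<"*) echo "Update ENDPOINT and KEY before running." ;;
  *) curl -sS -X POST "$ENDPOINT/language/analyze-conversations/jobs?api-version=2023-04-01" \
       -H "Ocp-Apim-Subscription-Key: $KEY" \
       -H "Content-Type: application/json" \
       -d "$BODY" ;;
esac
```

The call is asynchronous: the response carries an `operation-location` header you poll for the job result.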
articles/ai-studio/how-to/deploy-models-phi-3.md
@@ -5,7 +5,7 @@ description: Learn how to deploy Phi-3 family of small language models with Azur
manager: scottpolly
ms.service: azure-ai-studio
ms.topic: how-to
ms.date: 07/01/2024
ms.reviewer: kritifaujdar
reviewer: fkriti
ms.author: mopeakande
@@ -25,20 +25,35 @@ The Phi-3 family of SLMs is a collection of instruction-tuned generative text mo
# [Phi-3-mini](#tab/phi-3-mini)
Phi-3 Mini is a 3.8B-parameter, lightweight, state-of-the-art open model. Phi-3-Mini was trained with the Phi-3 datasets, which include both synthetic data and filtered, publicly available website data, with a focus on high-quality and reasoning-dense properties.

The model belongs to the Phi-3 model family. The Mini version comes in two variants, 4K and 128K, which denote the context length (in tokens) that each model variant can support.
The model underwent a rigorous enhancement process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. When assessed against benchmarks that test common sense, language understanding, math, code, long context, and logical reasoning, Phi-3-Mini-4K-Instruct and Phi-3-Mini-128K-Instruct showcased robust, state-of-the-art performance among models with fewer than 13 billion parameters.
# [Phi-3-medium](#tab/phi-3-medium)
Phi-3 Medium is a 14B-parameter, lightweight, state-of-the-art open model. Phi-3-Medium was trained with the Phi-3 datasets, which include both synthetic data and filtered, publicly available website data, with a focus on high-quality and reasoning-dense properties.

The model belongs to the Phi-3 model family. The Medium version comes in two variants, 4K and 128K, which denote the context length (in tokens) that each model variant can support.
- Phi-3-medium-4k-Instruct
- Phi-3-medium-128k-Instruct
The model underwent a rigorous enhancement process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. When assessed against benchmarks that test common sense, language understanding, math, code, long context, and logical reasoning, Phi-3-Medium-4k-Instruct and Phi-3-Medium-128k-Instruct showcased robust, state-of-the-art performance among models of the same size and the next size up.
# [Phi-3-small](#tab/phi-3-small)
Phi-3-Small is a 7B-parameter, lightweight, state-of-the-art open model. Phi-3-Small was trained with the Phi-3 datasets, which include both synthetic data and filtered, publicly available website data, with a focus on high-quality and reasoning-dense properties.

The model belongs to the Phi-3 model family. The Small version comes in two variants, 8K and 128K, which denote the context length (in tokens) that each model variant can support.

- Phi-3-small-8k-Instruct
- Phi-3-small-128k-Instruct

The model underwent a rigorous enhancement process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures. When assessed against benchmarks that test common sense, language understanding, math, code, long context, and logical reasoning, Phi-3-Small-8k-Instruct and Phi-3-Small-128k-Instruct showcased robust, state-of-the-art performance among models with fewer than 13 billion parameters.
---
@@ -54,7 +69,8 @@ Certain models in the model catalog can be deployed as a serverless API with pay
* East US 2
* Sweden Central

For a list of regions that are available for each of the models supporting serverless API endpoint deployments, see [Region availability for models in serverless API endpoints](deploy-models-serverless-availability.md).
- An [Azure AI Studio project](../how-to/create-projects.md).
- Azure role-based access controls (Azure RBAC) are used to grant access to operations in Azure AI Studio. To perform the steps in this article, your user account must be assigned the __Azure AI Developer role__ on the resource group. For more information on permissions, see [Role-based access control in Azure AI Studio](../concepts/rbac-ai-studio.md).
@@ -80,7 +96,7 @@ To create a deployment:
1. Search for and select **Phi-3-mini-4k-Instruct** to open the model's Details page.

1. Select **Confirm**, and choose the option **Serverless API** to open a serverless API deployment window for the model.

1. Select the project in which you want to deploy your model. To deploy the Phi-3 model, your project must belong to one of the regions listed in the [prerequisites](#prerequisites) section.
1. Select the **Pricing and terms** tab to learn about pricing for the selected model.
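After the serverless deployment completes, you can exercise the endpoint from any HTTP client. The sketch below is a hedged example that assumes an OpenAI-compatible `/v1/chat/completions` route with bearer-key authentication; the actual endpoint URL, route, and key for your deployment are shown on its Details page, so treat every value here as a placeholder.

```shell
# Placeholder endpoint and key -- copy the real values from your
# deployment's Details page in Azure AI Studio.
ENDPOINT="https://<your-phi-3-deployment>.eastus2.models.ai.azure.com"
KEY="<your-api-key>"

# A minimal chat-completions request body.
PAYLOAD='{"messages":[{"role":"user","content":"Explain serverless APIs in one sentence."}],"max_tokens":128}'

# Only call the service once the placeholders are replaced with real values.
case "$ENDPOINT" in
  *"<"*) echo "Update ENDPOINT and KEY before running." ;;
  *) curl -sS "$ENDPOINT/v1/chat/completions" \
       -H "Authorization: Bearer $KEY" \
       -H "Content-Type: application/json" \
       -d "$PAYLOAD" ;;
esac
```

Because you're billed per token on serverless deployments, `max_tokens` is worth setting explicitly while experimenting.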
articles/azure-monitor/logs/logs-dedicated-clusters.md
@@ -595,11 +595,6 @@ After you create your cluster resource and it's fully provisioned, you can edit
>[!IMPORTANT]
>Cluster update should not include both identity and key identifier details in the same operation. If you need to update both, the update should be in two consecutive operations.
description: Learn how to assign access to a workload owner of an Amazon Web Services or Google Cloud Project connector.
ms.author: elkrieger
author: Elazark
ms.topic: how-to
ms.date: 07/01/2024
#customer intent: As a workload owner, I want to learn how to assign access to my AWS or GCP connector so that I can view the suggested recommendations provided by Defender for Cloud.
---

# Assign access to workload owners

When you onboard your AWS or GCP environments, Defender for Cloud automatically creates a security connector as an Azure resource inside the connected subscription and resource group. Defender for Cloud also creates the identity provider as an IAM role that's required during the onboarding process.

You can assign permissions to users on specific security connectors below the parent connector. To do so, determine which AWS accounts or GCP projects you want users to have access to, and then identify the security connectors that correspond to those accounts or projects.
## Prerequisites
- An Azure account. If you don't already have an Azure account, you can [create your Azure free account today](https://azure.microsoft.com/free/).

- At least one security connector for [Azure](connect-azure-subscription.md), [AWS](quickstart-onboard-aws.md) or [GCP](quickstart-onboard-gcp.md).
## Configure permissions on the security connector

Permissions for security connectors are managed through Azure role-based access control (RBAC). You can assign roles to users, groups, and applications at a subscription, resource group, or resource level.

1. Sign in to the [Azure portal](https://portal.azure.com/).

1. Navigate to **Microsoft Defender for Cloud** > **Environment settings**.

1. Locate the relevant AWS or GCP connector.

1. Assign permissions to the workload owners by using either the **All resources** or the **Azure Resource Graph** option in the Azure portal.
### [All resources](#tab/all-resources)
1. Search for and select **All resources**.
:::image type="content" source="media/assign-access-to-workload/all-resources.png" alt-text="Screenshot that shows you how to search for and select all resources." lightbox="media/assign-access-to-workload/all-resources.png":::
:::image type="content" source="media/assign-access-to-workload/show-hidden-types.png" alt-text="Screenshot that shows you where on the screen to find the show hidden types option." lightbox="media/assign-access-to-workload/show-hidden-types.png":::
1. Select the **Types equals all** filter.
1. Enter `securityconnector` in the value field and select the `microsoft.security/securityconnectors` checkbox.
:::image type="content" source="media/assign-access-to-workload/security-connector.png" alt-text="Screenshot that shows where the field is located and where to enter the value on the screen." lightbox="media/assign-access-to-workload/security-connector.png":::
1. Search for and select **Resource Graph Explorer**.
:::image type="content" source="media/assign-access-to-workload/resource-graph-explorer.png" alt-text="Screenshot that shows you how to search for and select resource graph explorer." lightbox="media/assign-access-to-workload/resource-graph-explorer.png":::
1. Copy and paste the following query to locate the security connector:
### [AWS](#tab/aws)
```kusto
resources
| where type == "microsoft.security/securityconnectors"
```
:::image type="content" source="media/assign-access-to-workload/formatted-results.png" alt-text="Screenshot that shows where the formatted results toggle is located on the screen." lightbox="media/assign-access-to-workload/formatted-results.png":::
1. Select the relevant subscription and resource group to locate the relevant security connector.
---
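As an alternative to the portal steps that follow, the same role assignment can be scripted with the Azure CLI. This is a sketch under stated assumptions: the scope string, assignee, and role below are hypothetical placeholders, and you need an authenticated `az` session with permission to create role assignments.

```shell
# Hypothetical resource ID of a security connector -- fill in your own values.
SCOPE="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Security/securityConnectors/<connector-name>"
ASSIGNEE="user@contoso.com"   # user, group, or service principal to grant access

# Only run the assignment once the placeholders are replaced with real values.
case "$SCOPE" in
  *"<"*) echo "Update SCOPE and ASSIGNEE before running." ;;
  *) az role assignment create \
       --assignee "$ASSIGNEE" \
       --role "Reader" \
       --scope "$SCOPE" ;;
esac
```

Scoping the assignment to the connector resource, rather than the subscription or resource group, limits the workload owner's visibility to just that AWS account or GCP project.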
1. Select **Access control (IAM)**.
:::image type="content" source="media/assign-access-to-workload/control-i-am.png" alt-text="Screenshot that shows where to select Access control IAM in the resource you selected." lightbox="media/assign-access-to-workload/control-i-am.png":::
1. Select **+ Add** > **Add role assignment**.
1. Select the desired role.
1. Select **Next**.
1. Select **+ Select members**.
:::image type="content" source="media/assign-access-to-workload/select-members.png" alt-text="Screenshot that shows where the button is on the screen to select the + select members button.":::
1. Search for and select the relevant user or group.
1. Select the **Select** button.
1. Select **Next**.
1. Select **Review + assign**.
1. Review the information.
1. Select **Review + assign**.
After setting the permission for the security connector, the workload owners will be able to view recommendations in Defender for Cloud for the AWS and GCP resources that are associated with the security connector.