Commit 8ff0b10

Rename module to ai-stack (#4)

Signed-off-by: Roman Schwarz <rs@cloudeteer.de>

1 parent a0fa921 commit 8ff0b10

File tree

7 files changed: +50 −50 lines changed


README.md

Lines changed: 44 additions & 44 deletions
```diff
@@ -5,11 +5,11 @@
 
 ---
 
-# terraform-azurerm-azure-ai-foundry-hub
+# terraform-azurerm-ai-stack
 
-[![Terraform Registry](https://img.shields.io/badge/Terraform%20Registry-launchpad-7B42BC?style=for-the-badge&logo=terraform&logoColor=A067DA)](https://registry.terraform.io/modules/cloudeteer/launchpad/azurerm)
-[![OpenTofu Registry](https://img.shields.io/badge/OpenTofu%20Registry-launchpad-4B4B77?style=for-the-badge&logo=opentofu)](https://search.opentofu.org/module/cloudeteer/launchpad/azurerm)
-[![SemVer](https://img.shields.io/badge/SemVer-2.0.0-F77F00?style=for-the-badge)](https://github.com/cloudeteer/terraform-azurerm-launchpad/releases)
+[![Terraform Registry](https://img.shields.io/badge/Terraform%20Registry-ai-stack-7B42BC?style=for-the-badge&logo=terraform&logoColor=A067DA)](https://registry.terraform.io/modules/cloudeteer/ai-stack/azurerm)
+[![OpenTofu Registry](https://img.shields.io/badge/OpenTofu%20Registry-ai-stack-4B4B77?style=for-the-badge&logo=opentofu)](https://search.opentofu.org/module/cloudeteer/ai-stack/azurerm)
+[![SemVer](https://img.shields.io/badge/SemVer-2.0.0-F77F00?style=for-the-badge)](https://github.com/cloudeteer/terraform-azurerm-ai-stack/releases)
 
 This Terraform module is composed of several submodules, which are combined in the primary module to provide a complete solution. Each submodule can also be deployed independently—see [./modules/](./modules/) for details.
 
```

```diff
@@ -115,7 +115,7 @@ resource "azurerm_resource_group" "example" {
 }
 
 module "example" {
-  source = "cloudeteer/azure-ai-foundry-hub/azurerm"
+  source = "cloudeteer/ai-stack/azurerm"
 
   basename = trimprefix(azurerm_resource_group.example.name, "rg-")
   location = azurerm_resource_group.example.location
```
```diff
@@ -186,24 +186,24 @@ Description: The principal ID of a user or group of AI Developers who will have
 
 The following roles will be assigned to the given principal ID:
 
-| Role | Scope |
-| ---------------------------------------- | ------------------------ |
-| Azure AI Developer | AI Foundry Hub |
-| Azure AI Developer | AI Foundry Project |
-| Contributor | Developer Resource Group |
-| Storage Blob Data Contributor | Storage Account |
-| Storage File Data Privileged Contributor | Storage Account |
-| Cognitive Services Contributor | AI Service |
-| Cognitive Services OpenAI Contributor | AI Service |
-| Cognitive Services User | AI Service |
-| User Access Administrator | AI Service |
-| Search Index Data Contributor | AI Search Service |
-| Search Service Contributor | AI Search Service |
-
-| Argument | Description |
-| ----------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `isolation_mode` | Isolation mode for the managed network of a machine learning workspace. Possible values are `AllowOnlyApprovedOutbound`, `AllowInternetOutbound`, or `Disabled`. |
-| `public_network_access` | Whether requests from Public Network are allowed. |
+| Role | Scope |
+| -- | -- |
+| Azure AI Developer | AI Foundry Hub |
+| Azure AI Developer | AI Foundry Project |
+| Contributor | Developer Resource Group |
+| Storage Blob Data Contributor | Storage Account |
+| Storage File Data Privileged Contributor | Storage Account |
+| Cognitive Services Contributor | AI Service |
+| Cognitive Services OpenAI Contributor | AI Service |
+| Cognitive Services User | AI Service |
+| User Access Administrator | AI Service |
+| Search Index Data Contributor | AI Search Service |
+| Search Service Contributor | AI Search Service |
+
+Argument | Description
+-- | --
+`isolation_mode` | Isolation mode for the managed network of a machine learning workspace. Possible values are `AllowOnlyApprovedOutbound`, `AllowInternetOutbound`, or `Disabled`.
+`public_network_access` | Whether requests from Public Network are allowed.
 
 **NOTE**: The `User Access Administrator` role is assigned with the condition that only the `Cognitive Services OpenAI User` role can be assigned to user principals. This is necessary to successfully deploy a Web App on top of an AI Model through the AI Foundry Hub.
 
```

```diff
@@ -223,13 +223,13 @@ Default: `[]`
 
 Description: If set to `true` (default), the following mandatory Azure role assignments will be created:
 
-| Role | Scope | Principal |
-| ------------------------------------- | ----------------- | -------------------------- |
-| Cognitive Services OpenAI Contributor | AI Service | AI Search Service Identity |
-| Search Index Data Reader | AI Search Service | AI Service Identity |
-| Search Service Contributor | AI Search Service | AI Service Identity |
-| Storage Blob Data Contributor | Storage Account | AI Service Identity |
-| Storage Blob Data Reader | Storage Account | AI Search Service Identity |
+| Role | Scope | Principal |
+| -- | -- | -- |
+| Cognitive Services OpenAI Contributor | AI Service | AI Search Service Identity |
+| Search Index Data Reader | AI Search Service | AI Service Identity |
+| Search Service Contributor | AI Search Service | AI Service Identity |
+| Storage Blob Data Contributor | Storage Account | AI Service Identity |
+| Storage Blob Data Reader | Storage Account | AI Search Service Identity |
 
 **NOTE**: If set to `false`, these role assignments must be created manually to ensure the AI Foundry Hub Project functions correctly.
 
```

```diff
@@ -259,10 +259,10 @@ Description: Network configuration for the AI Hub.
 
 Optional arguments:
 
-| Argument | Description |
-| ----------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| `isolation_mode` | Isolation mode for the managed network of a machine learning workspace. Possible values are `AllowOnlyApprovedOutbound`, `AllowInternetOutbound`, or `Disabled`. |
-| `public_network_access` | Whether requests from Public Network are allowed. |
+Argument | Description
+-- | --
+`isolation_mode` | Isolation mode for the managed network of a machine learning workspace. Possible values are `AllowOnlyApprovedOutbound`, `AllowInternetOutbound`, or `Disabled`.
+`public_network_access` | Whether requests from Public Network are allowed.
 
 **NOTE**:
 
```

```diff
@@ -299,19 +299,19 @@ Description: A list of models to deploy to the workspace.
 
 Required parameters:
 
-| Parameter | Description |
-| --------- | --------------------------------------------------------------------------------------------------------------- |
-| `name` | The name of the Cognitive Services Account Deployment model. Changing this forces a new resource to be created. |
+Parameter | Description
+-- | --
+`name` | The name of the Cognitive Services Account Deployment model. Changing this forces a new resource to be created.
 
 Optional parameters:
 
-| Parameter | Description |
-| ----------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| `deployment_name` | The name to assign to the model deployment. If not specified, the value of `name` will be used by default. This property allows you to customize the deployment resource name independently from the model name. |
-| `format` | The format of the Cognitive Services Account Deployment model. Changing this forces a new resource to be created. Possible value is `OpenAI`. |
-| `sku_capacity` | Tokens-per-Minute (TPM). The unit of measure for this field is in the thousands of Tokens-per-Minute. Defaults to `1` which means that the limitation is `1000` tokens per minute. If the resources SKU supports scale in/out then the capacity field should be included in the resources' configuration. If the scale in/out is not supported by the resources SKU then this field can be safely omitted. For more information about TPM please see the [product documentation](https://learn.microsoft.com/azure/ai-services/openai/how-to/quota?tabs=rest). |
-| `sku_name` | The name of the SKU. Possible values include `Standard`, `DataZoneStandard`, `DataZoneProvisionedManaged`, `GlobalBatch`, `GlobalProvisionedManaged`, `GlobalStandard`, and `ProvisionedManaged`. |
-| `version` | The version of Cognitive Services Account Deployment model. If `version` is not specified, the default version of the model at the time will be assigned. |
+Parameter | Description
+-- | --
+`deployment_name` | The name to assign to the model deployment. If not specified, the value of `name` will be used by default. This property allows you to customize the deployment resource name independently from the model name.
+`format` | The format of the Cognitive Services Account Deployment model. Changing this forces a new resource to be created. Possible value is `OpenAI`.
+`sku_capacity` | Tokens-per-Minute (TPM). The unit of measure for this field is in the thousands of Tokens-per-Minute. Defaults to `1` which means that the limitation is `1000` tokens per minute. If the resources SKU supports scale in/out then the capacity field should be included in the resources' configuration. If the scale in/out is not supported by the resources SKU then this field can be safely omitted. For more information about TPM please see the [product documentation](https://learn.microsoft.com/azure/ai-services/openai/how-to/quota?tabs=rest).
+`sku_name` | The name of the SKU. Possible values include `Standard`, `DataZoneStandard`, `DataZoneProvisionedManaged`, `GlobalBatch`, `GlobalProvisionedManaged`, `GlobalStandard`, and `ProvisionedManaged`.
+`version` | The version of Cognitive Services Account Deployment model. If `version` is not specified, the default version of the model at the time will be assigned.
 
 **Note**: `DataZoneProvisionedManaged`, `GlobalProvisionedManaged`, and `ProvisionedManaged` are purchased on-demand at an hourly basis based on the number of deployed PTUs, with substantial term discount available via the purchase of Azure Reservations. Currently, this step cannot be completed using Terraform. For more details, please refer to the [provisioned throughput onboarding documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/provisioned-throughput-onboarding).
 
```
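To make the model-deployment parameters documented above concrete, here is a sketch of one entry. The variable name `models` and the model values are illustrative assumptions, not taken from this commit; only the parameter names come from the documented table.

```hcl
module "example" {
  source = "cloudeteer/ai-stack/azurerm"

  basename = "example"
  location = "West Europe"

  # `models` as the variable name is an assumption based on the description above.
  models = [
    {
      name         = "gpt-4o"         # illustrative model; required, changing it forces a new deployment
      format       = "OpenAI"         # the only possible value per the table
      sku_name     = "GlobalStandard"
      sku_capacity = 10               # capacity is in thousands of Tokens-per-Minute, so 10 = 10,000 TPM
      # `version` omitted: the model's current default version is assigned
      # `deployment_name` omitted: defaults to the value of `name`
    }
  ]
}
```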

examples/usage/main.tf

Lines changed: 1 addition & 1 deletion
```diff
@@ -6,7 +6,7 @@ resource "azurerm_resource_group" "example" {
 }
 
 module "example" {
-  source = "cloudeteer/azure-ai-foundry-hub/azurerm"
+  source = "cloudeteer/ai-stack/azurerm"
 
   basename = trimprefix(azurerm_resource_group.example.name, "rg-")
   location = azurerm_resource_group.example.location
```
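With the rename applied, consumers only need to update the `source` address. A minimal sketch of the renamed call (the resource-group name and the commented `version` pin are illustrative assumptions, not taken from this commit):

```hcl
resource "azurerm_resource_group" "example" {
  name     = "rg-example" # illustrative; any "rg-"-prefixed name works with the trimprefix below
  location = "West Europe"
}

module "example" {
  source = "cloudeteer/ai-stack/azurerm"
  # version = "~> 2.0"  # pin is an assumption; check the registry for the current release

  basename = trimprefix(azurerm_resource_group.example.name, "rg-")
  location = azurerm_resource_group.example.location
}
```

Because only the registry address changes, resource addresses in existing state should be unaffected; re-running `terraform init` fetches the module from its new name.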

modules/ai-foundry-core/README.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -5,7 +5,7 @@ This example demonstrates the usage of this Terraform module with default settin
 
 ```hcl
 module "ai_foundry_core" {
-  source = "cloudeteer/azure-ai-foundry-hub/azurerm//modules/ai-foundry-core"
+  source = "cloudeteer/ai-stack/azurerm//modules/ai-foundry-core"
 
   name = var.basename
   location = var.location
````

modules/ai-foundry-core/examples/usage/main.tf

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,5 +1,5 @@
 module "ai_foundry_core" {
-  source = "cloudeteer/azure-ai-foundry-hub/azurerm//modules/ai-foundry-core"
+  source = "cloudeteer/ai-stack/azurerm//modules/ai-foundry-core"
 
   name = var.basename
   location = var.location
```

modules/ai-foundry-services/README.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -5,7 +5,7 @@ This example demonstrates the usage of this Terraform module with default settin
 
 ```hcl
 module "ai_foundry_services" {
-  source = "cloudeteer/azure-ai-foundry-hub/azurerm//modules/ai-foundry-services"
+  source = "cloudeteer/ai-stack/azurerm//modules/ai-foundry-services"
 
   name = var.basename
   location = var.location
````

modules/ai-foundry-services/examples/usage/main.tf

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,5 +1,5 @@
 module "ai_foundry_services" {
-  source = "cloudeteer/azure-ai-foundry-hub/azurerm//modules/ai-foundry-services"
+  source = "cloudeteer/ai-stack/azurerm//modules/ai-foundry-services"
 
   name = var.basename
   location = var.location
```

tests/remote/main.tftest.hcl

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 #
 # !! WIP !!
 # The remote test is currently not working and needs to be fixed ASAP!
-# https://github.com/cloudeteer/terraform-azurerm-azure-ai-foundry-hub/issues/2
+# https://github.com/cloudeteer/terraform-azurerm-ai-stack/issues/2
 #
 
 # run "remote" {
```
