diff --git a/.gitignore b/.gitignore index 486ad1e55271..31d9092c1e2c 100644 --- a/.gitignore +++ b/.gitignore @@ -233,6 +233,7 @@ launchSettings.json /tools/Modules/tmp /tools/Az/Az.psm1 /tools/AzPreview/AzPreview.psm1 +/tools/Mcp/.logs /Azure.PowerShell.sln # Added due to scan diff --git a/tools/Mcp/README.md b/tools/Mcp/README.md index 38353aa1bb1c..f74084f81f46 100644 --- a/tools/Mcp/README.md +++ b/tools/Mcp/README.md @@ -1,42 +1,64 @@ # Azure PowerShell Codegen MCP Server -A Model Context Protocol (MCP) server that provides tools for generating and managing Azure PowerShell modules using AutoRest. This server helps automate common tasks in the Azure PowerShell code generation process, including handling polymorphism, model directives, and code generation. +A Model Context Protocol (MCP) server that provides tools for generating and managing Azure PowerShell modules using AutoRest. It now also orchestrates help‑driven example generation, CRUD test scaffolding, and an opinionated partner workflow to keep outputs deterministic and consistent. ## Overview This MCP server is designed to work with Azure PowerShell module development workflows. 
It provides specialized tools for:
-- **AutoRest Code Generation**: Generate PowerShell modules from OpenAPI specifications
+- **Module Scaffolding**: Interactive selection of service → provider → API version and creation of the `<Module>.Autorest` structure
+- **AutoRest Code Generation**: Generate PowerShell modules from OpenAPI specifications (reset/generate/build sequence)
+- **Example Generation**: Create example scripts from swagger example JSON while filtering strictly to parameters documented in help markdown
+- **Test Generation**: Produce per‑resource CRUD test files (idempotent, includes a negative test) using the same help‑driven parameter filtering
+- **Help‑Driven Parameter Filtering**: Only parameters present in the generated help (`/src/<Module>/help/*.md`) are allowed in examples/tests
 - **Model Management**: Handle model directives like `no-inline` and `model-cmdlet`
-- **Polymorphism Support**: Automatically detect and configure polymorphic types
-- **YAML Configuration**: Parse and manipulate AutoRest configuration files
+- **Polymorphism Support**: Automatically detect and configure parent/child discriminator relationships
+- **YAML Configuration Utilities**: Parse and manipulate AutoRest configuration blocks
+- **Partner Workflow Prompt**: A single prompt that encodes the end‑to‑end deterministic workflow

 ## Features

 ### Available Tools

-1. **generate-autorest**
-   - Generates PowerShell code using AutoRest
-   - Parameters: `workingDirectory` (absolute path to README.md)
-
-2. **no-inline**
-   - Converts flattened models to non-inline parameters
-   - Parameters: `modelNames` (array of model names to make non-inline)
-   - Useful for complex nested models that shouldn't be flattened
-
-3. **model-cmdlet**
-   - Creates `New-` cmdlets for specified models
-   - Parameters: `modelNames` (array of model names)
-   - Generates cmdlets with naming pattern: `New-Az{SubjectPrefix}{ModelName}Object`
-
-4. 
**polymorphism**
-   - Handles polymorphic type detection and configuration
-   - Parameters: `workingDirectory` (absolute path to README.md)
-   - Automatically identifies parent-child type relationships
+1. **setup-module-structure**
+   - Interactive service → provider → API version selection and module name capture
+   - Scaffolds `src/<Module>/<Module>.Autorest/` plus an initial `README.md`
+   - Output placeholder `{0}` = module name
+
+2. **generate-autorest**
+   - Executes the AutoRest reset, generate, and PowerShell build steps within the given working directory
+   - Parameters: `workingDirectory` (absolute path to the AutoRest folder containing README.md)
+   - Output placeholder `{0}` = working directory
+
+3. **create-example**
+   - Downloads swagger example JSON, filters parameters to those documented in help markdown (`/src/<Module>/help/<Cmdlet>.md`), and writes example scripts under `examples/`
+   - Parameters: `workingDirectory`
+   - Output placeholders: `{0}` = harvested specs path, `{1}` = examples dir, `{2}` = reference ideal example dirs
+
+4. **create-test**
+   - Generates new `<Resource>.Crud.Tests.ps1` files (does not modify stubs) with Create/Get/List/Update/Delete/Negative blocks, using help‑filtered parameters
+   - Parameters: `workingDirectory`
+   - Output placeholders: `{0}` = harvested specs path, `{1}` = test dir, `{2}` = reference ideal test dirs
+
+5. **polymorphism**
+   - Detects discriminator parents and child model names to aid directive insertion
+   - Parameters: `workingDirectory`
+   - Output placeholders: `{0}` = parents, `{1}` = children, `{2}` = working directory
+
+6. **no-inline**
+   - Lists models to be marked `no-inline` (the caller inserts the directive into the README's AutoRest YAML)
+   - Parameters: `modelNames` (array)
+   - Output `{0}` = comma-separated model list
+
+7. 
**model-cmdlet** + - Lists models for which `New-` object construction cmdlets should be added via directives + - Parameters: `modelNames` (array) + - Output `{0}` = comma-separated model list ### Available Prompts -- **create-greeting**: Generate customized greeting messages (example prompt) +- **partner-module-workflow**: Canonical end‑to‑end instruction set (module structure → generation → examples → tests → regeneration) +- **create-greeting**: Sample/demo greeting prompt ## Installation diff --git a/tools/Mcp/src/CodegenServer.ts b/tools/Mcp/src/CodegenServer.ts index e2a10375fc4d..96e7a1b79031 100644 --- a/tools/Mcp/src/CodegenServer.ts +++ b/tools/Mcp/src/CodegenServer.ts @@ -1,14 +1,16 @@ import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js"; import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js"; import { z } from "zod"; -import { responseSchema, toolParameterSchema, toolSchema, promptSchema } from "./types.js"; +import { responseSchema, toolParameterSchema, toolSchema, promptSchema, resourceSchema } from "./types.js"; import { ToolsService } from "./services/toolsService.js"; import { PromptsService } from "./services/promptsService.js"; +import { ResourcesService } from "./services/resourcesService.js"; import { readFileSync } from "fs"; import path from "path"; import { fileURLToPath } from "url"; import { RequestOptions } from "https"; import { ElicitRequest, ElicitResult } from "@modelcontextprotocol/sdk/types.js"; +import { logger } from "./services/logger.js"; const __dirname = path.dirname(fileURLToPath(import.meta.url)); const srcPath = path.resolve(__dirname, "..", "src"); @@ -37,6 +39,7 @@ export class CodegenServer { this.initResponses(); this.initTools(); this.initPrompts(); + this.initResources(); } // dummy method for sending sampling request @@ -74,6 +77,9 @@ export class CodegenServer { await this._mcp.connect(transport); } + public getResponseTemplate(name: string): string | undefined { + return 
this._responses.get(name); + } initTools() { const toolsService = ToolsService.getInstance().setServer(this); @@ -100,7 +106,44 @@ export class CodegenServer { schema.name, schema.description, parameter, - (args: any) => callback(args) + async (args: any) => { + const correlationId = `${schema.name}-${Date.now()}-${Math.random().toString(16).slice(2,8)}`; + logger.debug('Prompt started', { prompt: schema.name, correlationId }); + try { + const result = await callback(args); + logger.info('Prompt completed', { prompt: schema.name, correlationId }); + return result; + } catch (err: any) { + logger.error('Prompt failed', { prompt: schema.name, correlationId }, err); + throw err; + } + } + ); + } + } + + initResources() { + const resourcesService = ResourcesService.getInstance().setServer(this); + const resourcesSchemas = (specs.resources || []) as resourceSchema[]; + for (const schema of resourcesSchemas) { + const parameter = resourcesService.createResourceParametersFromSchema(schema.parameters || []); + const callback = resourcesService.getResources(schema.callbackName, this._responses.get(schema.name)); + this._mcp.resource( + schema.name, + schema.description, + parameter, + async (args: any) => { + const correlationId = `${schema.name}-${Date.now()}-${Math.random().toString(16).slice(2,8)}`; + logger.debug('Resource requested', { resource: schema.name, correlationId }); + try { + const result = await callback(args); + logger.info('Resource provided', { resource: schema.name, correlationId }); + return result; + } catch (err: any) { + logger.error('Resource failed', { resource: schema.name, correlationId }, err); + throw err; + } + } ); } } @@ -110,14 +153,16 @@ export class CodegenServer { let text = response.text; if (text.startsWith("@file:")) { const relPath = text.replace("@file:", ""); - const absPath = path.join(srcPath, "specs", relPath); + const absPath = path.join(srcPath, relPath); try { text = readFileSync(absPath, "utf-8"); - } catch (e) { - 
console.error(`Failed to load prompt file ${absPath}:`, e); + } catch (e: any) { + logger.error(`Failed to load prompt file`, { absPath }, e as Error); } } this._responses.set(response.name, text); }); } + + } diff --git a/tools/Mcp/src/assets/autorest-readme-template.md b/tools/Mcp/src/assets/autorest-readme-template.md new file mode 100644 index 000000000000..d24e3d1fb91d --- /dev/null +++ b/tools/Mcp/src/assets/autorest-readme-template.md @@ -0,0 +1,63 @@ + +# Az.{moduleName} +This directory contains the PowerShell module for the {moduleName} service. + +--- +## Info +- Modifiable: yes +- Generated: all +- Committed: yes +- Packaged: yes + +--- +## Detail +This module was primarily generated via [AutoRest](https://github.com/Azure/autorest) using the [PowerShell](https://github.com/Azure/autorest.powershell) extension. + +## Module Requirements +- [Az.Accounts module](https://www.powershellgallery.com/packages/Az.Accounts/), version 2.7.5 or greater + +## Authentication +AutoRest does not generate authentication code for the module. Authentication is handled via Az.Accounts by altering the HTTP payload before it is sent. + +## Development +For information on how to develop for `Az.{moduleName}`, see [how-to.md](how-to.md). 
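The template above relies on `{moduleName}`, `{commitId}`, `{serviceSpecs}`, and `{swaggerFileSpecs}` tokens that the scaffolding step fills in. A minimal TypeScript sketch of such substitution, assuming a simple token map (`fillTemplate` is an illustrative name, not the server's actual code):

```typescript
// Hypothetical helper: replace {token} markers with supplied values.
// Unknown tokens are left intact so a missing input stays visible in the output.
type TemplateTokens = Record<string, string>;

function fillTemplate(template: string, tokens: TemplateTokens): string {
  return template.replace(/\{(\w+)\}/g, (match, key: string) =>
    key in tokens ? tokens[key] : match
  );
}
```

Rendering `# Az.{moduleName}` with `{ moduleName: "Databricks" }` would yield `# Az.Databricks`, while an unsupplied `{commitId}` remains as-is for the caller to fill later.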
+
+---
+### AutoRest Configuration
+> see https://aka.ms/autorest
+
+```yaml
+
+commit: {commitId}
+
+require:
+  - $(this-folder)/../../readme.azure.noprofile.md
+  - $(repo)/specification/{serviceSpecs}/readme.md
+
+try-require:
+  - $(repo)/specification/{serviceSpecs}/readme.powershell.md
+
+input-file:
+  - $(repo)/{swaggerFileSpecs}
+
+module-version: 0.1.0
+
+title: {moduleName}
+service-name: {moduleName}
+subject-prefix: $(service-name)
+
+directive:
+
+  - where:
+      variant: ^(Create|Update)(?!.*?(Expanded|JsonFilePath|JsonString))
+    remove: true
+
+  - where:
+      variant: ^CreateViaIdentity$|^CreateViaIdentityExpanded$
+    remove: true
+
+  - where:
+      verb: Set
+    remove: true
+```
diff --git a/tools/Mcp/src/assets/example-instructions.md b/tools/Mcp/src/assets/example-instructions.md
new file mode 100644
index 000000000000..7b806760fb4c
--- /dev/null
+++ b/tools/Mcp/src/assets/example-instructions.md
@@ -0,0 +1,43 @@
+## LLM Example Generation Directions
+
+You have just called tool `create-example` for a freshly generated module.
+
+Inputs:
+- `{0}` = source swagger example JSON directory (read only)
+- `{1}` = target examples directory (write here only)
+- `{2}` = reference example dirs (style cues; may be empty)
+- helpDir = parentOf({1}) with `.Autorest` removed + `/help` (read only)
+
+Goal: Produce minimal, runnable PowerShell example scripts for each relevant cmdlet using ONLY parameters documented in help.
+
+Algorithm (repeat per cmdlet needed):
+1. Open `helpDir/<Cmdlet>.md`.
+2. Collect allowed params = (a) params in the first syntax line(s) in code fences + (b) every `### -ParamName` heading. Exclude `CommonParameters`.
+3. For each swagger JSON in `{0}` referencing this cmdlet, map its fields to allowed params; drop non‑allowed silently.
+4. Order parameters: required (in the order of the first syntax signature), then optional alphabetical.
+5. Build one minimal example. Add a second variant ONLY if it demonstrates distinct optional parameters.
+
+Rules:
+* Never invent or rename parameters; casing must match help.
+* Value selection precedence (per allowed parameter):
+  1. If the swagger example JSON (source `{0}`) contains a concrete value for that parameter (after mapping), use that value directly.
+  2. If the swagger value is obviously redacted (e.g. `"string"`, `""`, `"XXXX"`, empty, or null), fall back to a stable placeholder instead of using the dummy.
+  3. Otherwise (no concrete usable value) use a stable placeholder: `<resourceGroupName>`, `<workspaceName>`, `<name>`, `<location>`, etc.
+* Do not substitute placeholders where a good swagger value exists.
+* If no allowed params remain after filtering, create/leave an empty file or a single comment line.
+* Do not copy help prose; output only script lines (and brief inline comments if helpful).
+* Mirror formatting style hints (indentation, spacing) from reference dirs `{2}` without copying their literal values.
+
+Output Handling:
+- Modify/create files ONLY under `{1}`; no other directories.
+- Preserve existing example files, updating parameter sets/order as needed.
+
+Quick Validation Checklist (stop if any fail):
+1. All parameters exist in help.
+2. Required parameters present & ordered first.
+3. No swagger‑only or duplicate parameters.
+4. Placeholders consistent.
+5. No redundant variant scripts.
+
+Produce the final example script contents now; do not restate these instructions.
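The algorithm above (steps 2–4) can be sketched in TypeScript, the server's implementation language. The function names and signatures here are illustrative assumptions, not the actual `create-example` code:

```typescript
// Hypothetical sketch of the help-driven filtering described above.
// allowedParams: harvest parameter names from a help markdown file.
// orderParams: required params keep syntax order; optionals sort alphabetically.

function allowedParams(helpMarkdown: string): Set<string> {
  const params = new Set<string>();
  // (a) every "### -ParamName" heading
  for (const m of helpMarkdown.matchAll(/^### -(\w+)/gm)) {
    params.add(m[1]);
  }
  // (b) "-Name"-style tokens in the syntax lines
  for (const m of helpMarkdown.matchAll(/[\[\s]-(\w+)/g)) {
    params.add(m[1]);
  }
  params.delete("CommonParameters");
  return params;
}

function orderParams(
  fromSwagger: string[],
  requiredInSyntaxOrder: string[],
  allowed: Set<string>
): string[] {
  const usable = fromSwagger.filter((p) => allowed.has(p)); // drop non-allowed silently
  const req = requiredInSyntaxOrder.filter((p) => usable.includes(p));
  const opt = usable.filter((p) => !req.includes(p)).sort();
  return [...req, ...opt];
}
```

A real implementation would scope rule (b) to the fenced SYNTAX block rather than the whole file; the loose regex here is only a sketch.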
+
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksAccessConnector.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksAccessConnector.md
new file mode 100644
index 000000000000..10b2c575a472
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksAccessConnector.md
@@ -0,0 +1,177 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/get-azdatabricksaccessconnector
+schema: 2.0.0
+---
+
+# Get-AzDatabricksAccessConnector
+
+## SYNOPSIS
+Gets an Azure Databricks Access Connector.
+
+## SYNTAX
+
+### List1 (Default)
+```
+Get-AzDatabricksAccessConnector [-SubscriptionId <String[]>] [-DefaultProfile <PSObject>]
+ [<CommonParameters>]
+```
+
+### Get
+```
+Get-AzDatabricksAccessConnector -Name <String> -ResourceGroupName <String> [-SubscriptionId <String[]>]
+ [-DefaultProfile <PSObject>] [<CommonParameters>]
+```
+
+### List
+```
+Get-AzDatabricksAccessConnector -ResourceGroupName <String> [-SubscriptionId <String[]>]
+ [-DefaultProfile <PSObject>] [<CommonParameters>]
+```
+
+### GetViaIdentity
+```
+Get-AzDatabricksAccessConnector -InputObject <IDatabricksIdentity> [-DefaultProfile <PSObject>]
+ [<CommonParameters>]
+```
+
+## DESCRIPTION
+Gets an Azure Databricks Access Connector.
+
+## EXAMPLES
+
+### Example 1: List all access connectors under a subscription.
+```powershell
+Get-AzDatabricksAccessConnector
+```
+
+```output
+Location Name                            ResourceGroupName
+-------- ----                            -----------------
+eastus   azps-databricks-accessconnector azps_test_gp_db
+```
+
+This command lists all access connectors under a subscription.
+
+### Example 2: List all access connectors under a resource group.
+```powershell
+Get-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db
+```
+
+```output
+Location Name                            ResourceGroupName
+-------- ----                            -----------------
+eastus   azps-databricks-accessconnector azps_test_gp_db
+```
+
+This command lists all access connectors under a resource group.
+
+### Example 3: Get an access connector by name.
+```powershell
+Get-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector
+```
+
+```output
+Location Name                            ResourceGroupName
+-------- ----                            -----------------
+eastus   azps-databricks-accessconnector azps_test_gp_db
+```
+
+This command gets an access connector by name.
+
+## PARAMETERS
+
+### -DefaultProfile
+The DefaultProfile parameter is not functional.
+Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription.
+
+```yaml
+Type: System.Management.Automation.PSObject
+Parameter Sets: (All)
+Aliases: AzureRMContext, AzureCredential

+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -InputObject
+Identity Parameter
+To construct, see NOTES section for INPUTOBJECT properties and create a hash table.
+
+```yaml
+Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+Parameter Sets: GetViaIdentity
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: True (ByValue)
+Accept wildcard characters: False
+```
+
+### -Name
+The name of the Azure Databricks Access Connector.
+
+```yaml
+Type: System.String
+Parameter Sets: Get
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -ResourceGroupName
+The name of the resource group.
+The name is case insensitive.
+
+```yaml
+Type: System.String
+Parameter Sets: Get, List
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -SubscriptionId
+The ID of the target subscription.
+The value must be a UUID.
+ +```yaml +Type: System.String[] +Parameter Sets: List1, Get, List +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). + +## INPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IAccessConnector + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksOutboundNetworkDependenciesEndpoint.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksOutboundNetworkDependenciesEndpoint.md new file mode 100644 index 000000000000..a594d207aaf6 --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksOutboundNetworkDependenciesEndpoint.md @@ -0,0 +1,126 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/az.databricks/get-azdatabricksoutboundnetworkdependenciesendpoint +schema: 2.0.0 +--- + +# Get-AzDatabricksOutboundNetworkDependenciesEndpoint + +## SYNOPSIS +Gets the list of endpoints that VNET Injected Workspace calls Azure Databricks Control Plane. +You must configure outbound access with these endpoints. 
+For more information, see https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr + +## SYNTAX + +``` +Get-AzDatabricksOutboundNetworkDependenciesEndpoint -ResourceGroupName -WorkspaceName + [-SubscriptionId ] [-DefaultProfile ] + [] +``` + +## DESCRIPTION +Gets the list of endpoints that VNET Injected Workspace calls Azure Databricks Control Plane. +You must configure outbound access with these endpoints. +For more information, see https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr + +## EXAMPLES + +### Example 1: Gets the list of endpoints that VNET Injected Workspace calls Azure Databricks Control Plane. +```powershell +Get-AzDatabricksOutboundNetworkDependenciesEndpoint -ResourceGroupName azps_test_gp_db -WorkspaceName azps-databricks-workspace-t2 +``` + +```output +Category +-------- +Webapp +Control Plane NAT +Extended infrastructure +Azure Storage +Azure My SQL +Azure Servicebus +``` + +This command gets the list of endpoints that VNET Injected Workspace calls Azure Databricks Control Plane. +You must configure outbound access with these endpoints. +For more information, see https://learn.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr + +## PARAMETERS + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String[] +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WorkspaceName +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). + +## INPUTS + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IOutboundEnvironmentEndpoint + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksVNetPeering.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksVNetPeering.md new file mode 100644 index 000000000000..1cbc3cd9e420 --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksVNetPeering.md @@ -0,0 +1,189 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/az.databricks/get-azdatabricksvnetpeering +schema: 2.0.0 +--- + +# Get-AzDatabricksVNetPeering + +## SYNOPSIS +Gets the workspace vNet Peering. 
+ +## SYNTAX + +### List (Default) +``` +Get-AzDatabricksVNetPeering -ResourceGroupName [-SubscriptionId ] -WorkspaceName + [-DefaultProfile ] [] +``` + +### Get +``` +Get-AzDatabricksVNetPeering -Name -ResourceGroupName [-SubscriptionId ] + -WorkspaceName [-DefaultProfile ] [-PassThru] + [] +``` + +### GetViaIdentity +``` +Get-AzDatabricksVNetPeering -InputObject [-DefaultProfile ] [-PassThru] + [] +``` + +## DESCRIPTION +Gets the workspace vNet Peering. + +## EXAMPLES + +### Example 1: List all vnet peering under a databricks. +```powershell +Get-AzDatabricksVNetPeering -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db +``` + +```output +Name ResourceGroupName +---- ----------------- +vnet-peering-t1 azps_test_gp_db +``` + +This command lists all vnet peering under a databricks. + +### Example 2: Get a vnet peering. +```powershell +Get-AzDatabricksVNetPeering -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db -Name vnet-peering-t1 +``` + +```output +Name ResourceGroupName +---- ----------------- +vnet-peering-t1 azps_test_gp_db +``` + +This command gets a vnet peering. + +## PARAMETERS + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -InputObject +Identity Parameter +To construct, see NOTES section for INPUTOBJECT properties and create a hash table. 
+ +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity +Parameter Sets: GetViaIdentity +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: True (ByValue) +Accept wildcard characters: False +``` + +### -Name +The name of the workspace vNet peering. + +```yaml +Type: System.String +Parameter Sets: Get +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PassThru +Returns true when the command succeeds + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: Get, GetViaIdentity +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. + +```yaml +Type: System.String +Parameter Sets: List, Get +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String[] +Parameter Sets: List, Get +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WorkspaceName +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: List, Get +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). 
+ +## INPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IVirtualNetworkPeering + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksWorkspace.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksWorkspace.md new file mode 100644 index 000000000000..53ab130b9a0c --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Get-AzDatabricksWorkspace.md @@ -0,0 +1,181 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/az.databricks/get-azdatabricksworkspace +schema: 2.0.0 +--- + +# Get-AzDatabricksWorkspace + +## SYNOPSIS +Gets the workspace. + +## SYNTAX + +### List1 (Default) +``` +Get-AzDatabricksWorkspace [-SubscriptionId ] [-DefaultProfile ] + [] +``` + +### Get +``` +Get-AzDatabricksWorkspace -Name -ResourceGroupName [-SubscriptionId ] + [-DefaultProfile ] [] +``` + +### List +``` +Get-AzDatabricksWorkspace -ResourceGroupName [-SubscriptionId ] [-DefaultProfile ] + [] +``` + +### GetViaIdentity +``` +Get-AzDatabricksWorkspace -InputObject [-DefaultProfile ] + [] +``` + +## DESCRIPTION +Gets the workspace. + +## EXAMPLES + +### Example 1: Get a Databricks workspace with name. +```powershell +Get-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t3 +``` + +```output +Name ResourceGroupName Location Managed Resource Group ID +---- ----------------- -------- ------------------------- +azps-databricks-workspace-t3 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t3 +``` + +This command gets a Databricks workspace in a resource group. + +### Example 2: List all Databricks workspaces in a subscription. 
+```powershell +Get-AzDatabricksWorkspace +``` + +```output +Name ResourceGroupName Location Managed Resource Group ID +---- ----------------- -------- ------------------------- +azps-databricks-workspace-t1 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t1 +azps-databricks-workspace-t2 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t2 +azps-databricks-workspace-t3 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t3 +``` + +This command lists all Databricks workspaces in a subscription. + +### Example 3: List all Databricks workspaces in a resource group. +```powershell +Get-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db +``` + +```output +Name ResourceGroupName Location Managed Resource Group ID +---- ----------------- -------- ------------------------- +azps-databricks-workspace-t1 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t1 +azps-databricks-workspace-t2 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t2 +azps-databricks-workspace-t3 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t3 +``` + +This command lists all Databricks workspaces in a resource group. + +## PARAMETERS + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -InputObject +Identity Parameter +To construct, see NOTES section for INPUTOBJECT properties and create a hash table. 
+ +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity +Parameter Sets: GetViaIdentity +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: True (ByValue) +Accept wildcard characters: False +``` + +### -Name +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: Get +Aliases: WorkspaceName + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. + +```yaml +Type: System.String +Parameter Sets: Get, List +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String[] +Parameter Sets: List1, Get, List +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). 
+ +## INPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IWorkspace + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksAccessConnector.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksAccessConnector.md new file mode 100644 index 000000000000..7d07a0d568ce --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksAccessConnector.md @@ -0,0 +1,239 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/az.databricks/new-azdatabricksaccessconnector +schema: 2.0.0 +--- + +# New-AzDatabricksAccessConnector + +## SYNOPSIS +Creates or updates Azure Databricks Access Connector. + +## SYNTAX + +``` +New-AzDatabricksAccessConnector -Name -ResourceGroupName [-SubscriptionId ] + -Location [-IdentityType ] [-Tag ] + [-UserAssignedIdentity ] [-DefaultProfile ] [-AsJob] [-NoWait] + [-WhatIf] [-Confirm] [] +``` + +## DESCRIPTION +Creates or updates Azure Databricks Access Connector. + +## EXAMPLES + +### Example 1: Creates or updates azure databricks accessConnector. +```powershell +New-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector -Location eastus -IdentityType 'SystemAssigned' +``` + +```output +Location Name ResourceGroupName +-------- ---- ----------------- +eastus azps-databricks-accessconnector azps_test_gp_db +``` + +This command creates or updates azure databricks accessConnector. 
+ +## PARAMETERS + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -IdentityType +Type of managed service identity (where both SystemAssigned and UserAssigned types are allowed). + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.ManagedServiceIdentityType +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Location +The geo-location where the resource lives + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Name +The name of the Azure Databricks Access Connector. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Tag +Resource tags. + +```yaml +Type: System.Collections.Hashtable +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -UserAssignedIdentity +The set of user assigned identities associated with the resource. +The userAssignedIdentities dictionary keys will be ARM resource ids in the form: '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identityName}. +The dictionary values can be empty objects ({}) in requests. + +```yaml +Type: System.Collections.Hashtable +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. 
+
+```yaml
+Type: System.Management.Automation.SwitchParameter
+Parameter Sets: (All)
+Aliases: wi
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### CommonParameters
+This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
+
+## INPUTS
+
+## OUTPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IAccessConnector
+
+## NOTES
+
+## RELATED LINKS
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksVNetPeering.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksVNetPeering.md
new file mode 100644
index 000000000000..7b8eabffa0fc
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksVNetPeering.md
@@ -0,0 +1,317 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/new-azdatabricksvnetpeering
+schema: 2.0.0
+---
+
+# New-AzDatabricksVNetPeering
+
+## SYNOPSIS
+Creates a vNet peering for a workspace.
+
+## SYNTAX
+
+```
+New-AzDatabricksVNetPeering -Name <String> -ResourceGroupName <String> -WorkspaceName <String>
+ [-SubscriptionId <String>] [-AllowForwardedTraffic] [-AllowGatewayTransit] [-AllowVirtualNetworkAccess]
+ [-DatabricksAddressSpacePrefix <String[]>] [-DatabricksVirtualNetworkId <String>]
+ [-RemoteAddressSpacePrefix <String[]>] [-RemoteVirtualNetworkId <String>] [-UseRemoteGateway]
+ [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-WhatIf] [-Confirm]
+ [<CommonParameters>]
+```
+
+## DESCRIPTION
+Creates a vNet peering for a workspace.
+
+## EXAMPLES
+
+### Example 1: Create a vNet peering for Databricks.
+```powershell +New-AzDatabricksVNetPeering -Name vnet-peering-t1 -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db -RemoteVirtualNetworkId '/subscriptions/{subId}/resourceGroups/azps_test_gp_db/providers/Microsoft.Network/virtualNetworks/azps-VNnet-t1' +``` + +```output +Name ResourceGroupName +---- ----------------- +vnet-peering-t1 azps_test_gp_db +``` + +This command creates a vnet peering for databricks. + +## PARAMETERS + +### -AllowForwardedTraffic +Whether the forwarded traffic from the VMs in the local virtual network will be allowed/disallowed in remote virtual network. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AllowGatewayTransit +If gateway links can be used in remote virtual networking to link to this virtual network. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AllowVirtualNetworkAccess +Whether the VMs in the local virtual network space would be able to access the VMs in remote virtual network space. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DatabricksAddressSpacePrefix +A list of address blocks reserved for this virtual network in CIDR notation. 
+ +```yaml +Type: System.String[] +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DatabricksVirtualNetworkId +The Id of the databricks virtual network. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Name +The name of the workspace vNet peering. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -RemoteAddressSpacePrefix +A list of address blocks reserved for this virtual network in CIDR notation. + +```yaml +Type: System.String[] +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -RemoteVirtualNetworkId +The Id of the remote virtual network. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -UseRemoteGateway +If remote gateways can be used on this virtual network. +If the flag is set to true, and allowGatewayTransit on remote peering is also true, virtual network will use gateways of remote virtual network for transit. +Only one peering can have this flag set to true. +This flag cannot be set if virtual network already has a gateway. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WorkspaceName +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. 
+ +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). + +## INPUTS + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IVirtualNetworkPeering + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksWorkspace.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksWorkspace.md new file mode 100644 index 000000000000..2d17fb3e7d3b --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksWorkspace.md @@ -0,0 +1,915 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/az.databricks/new-azdatabricksworkspace +schema: 2.0.0 +--- + +# New-AzDatabricksWorkspace + +## SYNOPSIS +Creates a new workspace. 
+
+## SYNTAX
+
+```
+New-AzDatabricksWorkspace -Name <String> -ResourceGroupName <String> [-SubscriptionId <String>]
+ -Location <String> [-ManagedResourceGroupName <String>] [-AmlWorkspaceId <String>]
+ [-Authorization <IWorkspaceProviderAuthorization[]>] [-DefaultCatalogInitialType <InitialType>]
+ [-EnableNoPublicIP] [-EncryptionKeyName <String>] [-EncryptionKeySource <KeySource>]
+ [-EncryptionKeyVaultUri <String>] [-EncryptionKeyVersion <String>] [-LoadBalancerBackendPoolName <String>]
+ [-LoadBalancerId <String>] [-ManagedDiskKeySource <EncryptionKeySource>]
+ [-ManagedDiskKeyVaultPropertiesKeyName <String>] [-ManagedDiskKeyVaultPropertiesKeyVaultUri <String>]
+ [-ManagedDiskKeyVaultPropertiesKeyVersion <String>] [-ManagedDiskRotationToLatestKeyVersionEnabled]
+ [-ManagedServiceKeySource <EncryptionKeySource>] [-ManagedServicesKeyVaultPropertiesKeyName <String>]
+ [-ManagedServicesKeyVaultPropertiesKeyVaultUri <String>]
+ [-ManagedServicesKeyVaultPropertiesKeyVersion <String>] [-NatGatewayName <String>] [-PrepareEncryption]
+ [-PrivateSubnetName <String>] [-PublicIPName <String>] [-PublicNetworkAccess <PublicNetworkAccess>]
+ [-PublicSubnetName <String>] [-RequireInfrastructureEncryption] [-RequiredNsgRule <RequiredNsgRules>]
+ [-Sku <String>] [-SkuTier <String>] [-StorageAccountName <String>] [-StorageAccountSku <String>]
+ [-Tag <Hashtable>] [-UiDefinitionUri <String>] [-VirtualNetworkId <String>] [-VnetAddressPrefix <String>]
+ [-EnhancedSecurityMonitoring <EnhancedSecurityMonitoringValue>]
+ [-AutomaticClusterUpdate <AutomaticClusterUpdateValue>] [-ComplianceStandard <ComplianceStandard[]>]
+ [-EnhancedSecurityCompliance <ComplianceSecurityProfileValue>] [-AccessConnectorId <String>]
+ [-AccessConnectorIdentityType <IdentityType>] [-AccessConnectorUserAssignedIdentityId <String>]
+ [-DefaultStorageFirewall <DefaultStorageFirewall>] [-DefaultProfile <PSObject>] [-AsJob] [-NoWait]
+ [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+## DESCRIPTION
+Creates a new workspace.
+
+## EXAMPLES
+
+### Example 1: Create a Databricks workspace.
+```powershell
+New-AzDatabricksWorkspace -Name azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db -Location eastus -ManagedResourceGroupName azps_test_gp_kv_t1 -Sku Premium
+```
+
+```output
+Name                         ResourceGroupName Location Managed Resource Group ID
+----                         ----------------- -------- -------------------------
+azps-databricks-workspace-t1 azps_test_gp_db   eastus   /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t1
+```
+
+This command creates a Databricks workspace.
+ +### Example 2: Create a Databricks workspace with a customized virtual network. +```powershell +$dlg = New-AzDelegation -Name dbrdl -ServiceName "Microsoft.Databricks/workspaces" +$rdpRule = New-AzNetworkSecurityRuleConfig -Name azps-network-security-rule -Description "Allow RDP" -Access Allow -Protocol Tcp -Direction Inbound -Priority 100 -SourceAddressPrefix Internet -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 3389 +$networkSecurityGroup = New-AzNetworkSecurityGroup -ResourceGroupName azps_test_gp_db -Location eastus -Name azps-network-security-group -SecurityRules $rdpRule +$kvSubnet = New-AzVirtualNetworkSubnetConfig -Name azps-vnetwork-sub-kv -AddressPrefix "110.0.1.0/24" -ServiceEndpoint "Microsoft.KeyVault" +$priSubnet = New-AzVirtualNetworkSubnetConfig -Name azps-vnetwork-sub-pri -AddressPrefix "110.0.2.0/24" -NetworkSecurityGroup $networkSecurityGroup -Delegation $dlg +$pubSubnet = New-AzVirtualNetworkSubnetConfig -Name azps-vnetwork-sub-pub -AddressPrefix "110.0.3.0/24" -NetworkSecurityGroup $networkSecurityGroup -Delegation $dlg +$testVN = New-AzVirtualNetwork -Name azps-virtual-network -ResourceGroupName azps_test_gp_db -Location eastus -AddressPrefix "110.0.0.0/16" -Subnet $kvSubnet,$priSubnet,$pubSubnet +$vNetResId = (Get-AzVirtualNetwork -Name azps-virtual-network -ResourceGroupName azps_test_gp_db).Subnets[0].Id +$ruleSet = New-AzKeyVaultNetworkRuleSetObject -DefaultAction Allow -Bypass AzureServices -IpAddressRange "110.0.1.0/24" -VirtualNetworkResourceId $vNetResId +New-AzKeyVault -ResourceGroupName azps_test_gp_db -VaultName azps-keyvault -NetworkRuleSet $ruleSet -Location eastus -Sku 'Premium' -EnablePurgeProtection +New-AzDatabricksWorkspace -Name azps-databricks-workspace-t2 -ResourceGroupName azps_test_gp_db -Location eastus -ManagedResourceGroupName azps_test_gp_kv_t2 -VirtualNetworkId $testVN.Id -PrivateSubnetName $priSubnet.Name -PublicSubnetName $pubSubnet.Name -Sku Premium +``` + +```output +Name 
ResourceGroupName Location Managed Resource Group ID
+----                         ----------------- -------- -------------------------
+azps-databricks-workspace-t2 azps_test_gp_db   eastus   /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t2
+```
+
+This command creates a Databricks workspace with a customized virtual network in a resource group.
+
+### Example 3: Create a Databricks workspace with encryption enabled.
+```powershell
+New-AzDatabricksWorkspace -Name azps-databricks-workspace-t3 -ResourceGroupName azps_test_gp_db -Location eastus -PrepareEncryption -ManagedResourceGroupName azps_test_gp_kv_t3 -Sku premium
+```
+
+```output
+Name                         ResourceGroupName Location Managed Resource Group ID
+----                         ----------------- -------- -------------------------
+azps-databricks-workspace-t3 azps_test_gp_db   eastus   /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t3
+```
+
+This command creates a Databricks workspace and prepares it for encryption.
+Please refer to the examples of Update-AzDatabricksWorkspace for more encryption settings.
+
+## PARAMETERS
+
+### -AccessConnectorId
+The resource ID of Azure Databricks Access Connector Resource.
+
+```yaml
+Type: System.String
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -AccessConnectorIdentityType
+The identity type of the Access Connector Resource.
+
+```yaml
+Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.IdentityType
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -AccessConnectorUserAssignedIdentityId
+The resource ID of the User Assigned Identity associated with the Access Connector Resource.
+This is required for type 'UserAssigned' and not valid for type 'SystemAssigned'.
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AmlWorkspaceId +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Authorization +The workspace provider authorizations. +To construct, see NOTES section for AUTHORIZATION properties and create a hash table. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IWorkspaceProviderAuthorization[] +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AutomaticClusterUpdate +Status of automated cluster updates feature. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.AutomaticClusterUpdateValue +Parameter Sets: (All) +Aliases: AutomaticClusterUpdateValue + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ComplianceStandard +Compliance standards associated with the workspace. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.ComplianceStandard[] +Parameter Sets: (All) +Aliases: ComplianceSecurityProfileComplianceStandard + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultCatalogInitialType +Defines the initial type of the default catalog. 
+Possible values (case-insensitive): HiveMetastore, UnityCatalog + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.InitialType +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultStorageFirewall +Gets or Sets Default Storage Firewall configuration information + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.DefaultStorageFirewall +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EnableNoPublicIP +The value which should be used for this field. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeyName +The name of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeySource +The encryption keySource (provider). 
+Possible values (case-insensitive): Default, Microsoft.Keyvault + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.KeySource +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeyVaultUri +The Uri of KeyVault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeyVersion +The version of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EnhancedSecurityCompliance +Status of Compliance Security Profile feature. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.ComplianceSecurityProfileValue +Parameter Sets: (All) +Aliases: ComplianceSecurityProfileValue + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EnhancedSecurityMonitoring +Status of Enhanced Security Monitoring feature. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.EnhancedSecurityMonitoringValue +Parameter Sets: (All) +Aliases: EnhancedSecurityMonitoringValue + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -LoadBalancerBackendPoolName +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -LoadBalancerId +The value which should be used for this field. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Location +The geo-location where the resource lives + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeySource +The encryption keySource (provider). +Possible values (case-insensitive): Microsoft.Keyvault + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.EncryptionKeySource +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeyVaultPropertiesKeyName +The name of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeyVaultPropertiesKeyVaultUri +The URI of KeyVault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeyVaultPropertiesKeyVersion +The version of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskRotationToLatestKeyVersionEnabled +Indicate whether the latest key version should be automatically used for Managed Disk Encryption. 
+ +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedResourceGroupName +The managed resource group Id. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServiceKeySource +The encryption keySource (provider). +Possible values (case-insensitive): Microsoft.Keyvault + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.EncryptionKeySource +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServicesKeyVaultPropertiesKeyName +The name of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServicesKeyVaultPropertiesKeyVaultUri +The Uri of KeyVault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServicesKeyVaultPropertiesKeyVersion +The version of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Name +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: WorkspaceName + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NatGatewayName +The value which should be used for this field. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PrepareEncryption +The value which should be used for this field. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PrivateSubnetName +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PublicIPName +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PublicNetworkAccess +The network access type for accessing workspace. +Set value to disabled to access workspace only via private link. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.PublicNetworkAccess +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PublicSubnetName +The value which should be used for this field. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -RequiredNsgRule +Gets or sets a value indicating whether data plane (clusters) to control plane communication happen over private endpoint. +Supported values are 'AllRules' and 'NoAzureDatabricksRules'. +'NoAzureServiceRules' value is for internal use only. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.RequiredNsgRules +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -RequireInfrastructureEncryption +The value which should be used for this field. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Sku +The SKU name. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SkuTier +The SKU tier. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -StorageAccountName +The value which should be used for this field. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -StorageAccountSku +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Tag +Resource tags. + +```yaml +Type: System.Collections.Hashtable +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -UiDefinitionUri +The blob URI where the UI definition file is located. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -VirtualNetworkId +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -VnetAddressPrefix +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. 
+ +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). + +## INPUTS + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IWorkspace + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksWorkspaceProviderAuthorizationObject.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksWorkspaceProviderAuthorizationObject.md new file mode 100644 index 000000000000..1d83466872a2 --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/New-AzDatabricksWorkspaceProviderAuthorizationObject.md @@ -0,0 +1,84 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/Az.Databricks/new-AzDatabricksWorkspaceProviderAuthorizationObject +schema: 2.0.0 +--- + +# New-AzDatabricksWorkspaceProviderAuthorizationObject + +## SYNOPSIS +Create an in-memory object for WorkspaceProviderAuthorization. 
+
+## SYNTAX
+
+```
+New-AzDatabricksWorkspaceProviderAuthorizationObject -PrincipalId <String> -RoleDefinitionId <String>
+ [<CommonParameters>]
+```
+
+## DESCRIPTION
+Create an in-memory object for WorkspaceProviderAuthorization.
+
+## EXAMPLES
+
+### Example 1: Create an in-memory object for WorkspaceProviderAuthorization.
+```powershell
+New-AzDatabricksWorkspaceProviderAuthorizationObject -PrincipalId 024d7367-0890-4ad3-8140-e37374722820 -RoleDefinitionId 2124844c-7e23-48cc-bc52-a3af25f7a4ae
+```
+
+```output
+PrincipalId                          RoleDefinitionId
+-----------                          ----------------
+024d7367-0890-4ad3-8140-e37374722820 2124844c-7e23-48cc-bc52-a3af25f7a4ae
+```
+
+Create an in-memory object for WorkspaceProviderAuthorization.
+
+## PARAMETERS
+
+### -PrincipalId
+The provider's principal identifier.
+This is the identity that the provider will use to call ARM to manage the workspace resources.
+
+```yaml
+Type: System.String
+Parameter Sets: (All)
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -RoleDefinitionId
+The provider's role definition identifier.
+This role will define all the permissions that the provider must have on the workspace's container resource group.
+This role definition cannot have permission to delete the resource group.
+
+```yaml
+Type: System.String
+Parameter Sets: (All)
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### CommonParameters
+This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
+
+## INPUTS
+
+## OUTPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.WorkspaceProviderAuthorization
+
+## NOTES
+
+## RELATED LINKS
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksAccessConnector.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksAccessConnector.md
new file mode 100644
index 000000000000..ffeec43eeb83
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksAccessConnector.md
@@ -0,0 +1,217 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/remove-azdatabricksaccessconnector
+schema: 2.0.0
+---
+
+# Remove-AzDatabricksAccessConnector
+
+## SYNOPSIS
+Deletes the Azure Databricks Access Connector.
+
+## SYNTAX
+
+### Delete (Default)
+```
+Remove-AzDatabricksAccessConnector -Name <String> -ResourceGroupName <String> [-SubscriptionId <String>]
+ [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-PassThru] [-WhatIf]
+ [-Confirm] [<CommonParameters>]
+```
+
+### DeleteViaIdentity
+```
+Remove-AzDatabricksAccessConnector -InputObject <IDatabricksIdentity> [-DefaultProfile <PSObject>] [-AsJob]
+ [-NoWait] [-PassThru] [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+## DESCRIPTION
+Deletes the Azure Databricks Access Connector.
+
+## EXAMPLES
+
+### Example 1: Deletes the azure databricks accessConnector.
+```powershell
+Remove-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector
+```
+
+This command deletes the azure databricks accessConnector.
+
+### Example 2: Deletes the azure databricks accessConnector by pipeline.
+```powershell
+Get-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector | Remove-AzDatabricksAccessConnector
+```
+
+This command deletes the azure databricks accessConnector by pipeline.
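+
+### Example 3: Delete the access connector without waiting for completion.
+The example below is an illustrative sketch rather than a recorded run; it reuses the sample resource names from the examples above and relies on the -NoWait and -PassThru switches documented under PARAMETERS.
+
+```powershell
+# Start the delete and return immediately; -PassThru reports whether the request was accepted.
+Remove-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector -NoWait -PassThru
+```
+
+This command starts the delete asynchronously instead of blocking until the access connector is gone.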
+ +## PARAMETERS + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -InputObject +Identity Parameter +To construct, see NOTES section for INPUTOBJECT properties and create a hash table. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity +Parameter Sets: DeleteViaIdentity +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: True (ByValue) +Accept wildcard characters: False +``` + +### -Name +The name of the Azure Databricks Access Connector. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PassThru +Returns true when the command succeeds + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. 
+ +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). 
+
+## INPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+
+## OUTPUTS
+
+### System.Boolean
+
+## NOTES
+
+## RELATED LINKS
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksVNetPeering.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksVNetPeering.md
new file mode 100644
index 000000000000..4ec972648ab7
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksVNetPeering.md
@@ -0,0 +1,232 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/remove-azdatabricksvnetpeering
+schema: 2.0.0
+---
+
+# Remove-AzDatabricksVNetPeering
+
+## SYNOPSIS
+Deletes the workspace vNetPeering.
+
+## SYNTAX
+
+### Delete (Default)
+```
+Remove-AzDatabricksVNetPeering -Name <String> -ResourceGroupName <String> [-SubscriptionId <String>]
+ -WorkspaceName <String> [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-PassThru]
+ [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+### DeleteViaIdentity
+```
+Remove-AzDatabricksVNetPeering -InputObject <IDatabricksIdentity> [-DefaultProfile <PSObject>] [-AsJob]
+ [-NoWait] [-PassThru] [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+## DESCRIPTION
+Deletes the workspace vNetPeering.
+
+## EXAMPLES
+
+### Example 1: Remove a vnet peering of databricks by name.
+```powershell
+Remove-AzDatabricksVNetPeering -Name vnet-peering-t1 -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db
+```
+
+This command removes a vnet peering of databricks by name.
+
+### Example 2: Remove a vnet peering of databricks by object.
+```powershell
+Get-AzDatabricksVNetPeering -Name vnet-peering-t1 -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db | Remove-AzDatabricksVNetPeering
+```
+
+This command removes a vnet peering of databricks by object.
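+
+### Example 3: Remove a vnet peering of databricks as a background job.
+The example below is an illustrative sketch rather than a recorded run; it reuses the sample resource names from the examples above and relies on the -AsJob switch documented under PARAMETERS.
+
+```powershell
+# Queue the removal as a PowerShell job, then block until it finishes.
+$job = Remove-AzDatabricksVNetPeering -Name vnet-peering-t1 -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db -AsJob
+$job | Wait-Job | Receive-Job
+```
+
+This command removes the vnet peering in a background job so the console stays free while the deletion runs.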
+ +## PARAMETERS + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -InputObject +Identity Parameter +To construct, see NOTES section for INPUTOBJECT properties and create a hash table. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity +Parameter Sets: DeleteViaIdentity +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: True (ByValue) +Accept wildcard characters: False +``` + +### -Name +The name of the workspace vNet peering. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PassThru +Returns true when the command succeeds + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. 
+ +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WorkspaceName +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). 
+
+## INPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+
+## OUTPUTS
+
+### System.Boolean
+
+## NOTES
+
+## RELATED LINKS
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksWorkspace.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksWorkspace.md
new file mode 100644
index 000000000000..9fdfa9fe5a20
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Remove-AzDatabricksWorkspace.md
@@ -0,0 +1,233 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/remove-azdatabricksworkspace
+schema: 2.0.0
+---
+
+# Remove-AzDatabricksWorkspace
+
+## SYNOPSIS
+Deletes the workspace.
+
+## SYNTAX
+
+### Delete (Default)
+```
+Remove-AzDatabricksWorkspace -Name <String> -ResourceGroupName <String> [-SubscriptionId <String>]
+ [-ForceDeletion] [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-PassThru]
+ [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+### DeleteViaIdentity
+```
+Remove-AzDatabricksWorkspace -InputObject <IDatabricksIdentity> [-ForceDeletion] [-DefaultProfile <PSObject>]
+ [-AsJob] [-NoWait] [-PassThru] [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+## DESCRIPTION
+Deletes the workspace.
+
+## EXAMPLES
+
+### Example 1: Remove a Databricks workspace.
+```powershell
+Remove-AzDatabricksWorkspace -Name azps-databricks-workspace -ResourceGroupName azps_test_gp_db
+```
+
+This command removes a Databricks workspace from a resource group.
+
+### Example 2: Remove a Databricks workspace by object.
+```powershell
+Get-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t3 | Remove-AzDatabricksWorkspace
+```
+
+This command removes a Databricks workspace from a resource group.
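+
+### Example 3: Remove a Databricks workspace and confirm the result.
+The example below is an illustrative sketch rather than a recorded run; it reuses the sample resource names from the examples above and relies on the -PassThru switch documented under PARAMETERS.
+
+```powershell
+# -PassThru makes the cmdlet return $true on success, where it would otherwise emit no output.
+$removed = Remove-AzDatabricksWorkspace -Name azps-databricks-workspace -ResourceGroupName azps_test_gp_db -PassThru
+if ($removed) { Write-Host 'Workspace removed.' }
+```
+
+This command removes the workspace and prints a confirmation message when the cmdlet reports success.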
+
+## PARAMETERS
+
+### -AsJob
+Run the command as a job
+
+```yaml
+Type: System.Management.Automation.SwitchParameter
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -DefaultProfile
+The DefaultProfile parameter is not functional.
+Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription.
+
+```yaml
+Type: System.Management.Automation.PSObject
+Parameter Sets: (All)
+Aliases: AzureRMContext, AzureCredential
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -ForceDeletion
+Optional parameter to retain default Unity Catalog data.
+By default the data will be retained if UC is enabled on the workspace.
+
+```yaml
+Type: System.Management.Automation.SwitchParameter
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -InputObject
+Identity Parameter
+To construct, see NOTES section for INPUTOBJECT properties and create a hash table.
+
+```yaml
+Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+Parameter Sets: DeleteViaIdentity
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: True (ByValue)
+Accept wildcard characters: False
+```
+
+### -Name
+The name of the workspace.
+ +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: WorkspaceName + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PassThru +Returns true when the command succeeds + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: Delete +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. 
+
+```yaml
+Type: System.Management.Automation.SwitchParameter
+Parameter Sets: (All)
+Aliases: wi
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### CommonParameters
+This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
+
+## INPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+
+## OUTPUTS
+
+### System.Boolean
+
+## NOTES
+
+## RELATED LINKS
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksAccessConnector.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksAccessConnector.md
new file mode 100644
index 000000000000..f78a0c5a8782
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksAccessConnector.md
@@ -0,0 +1,264 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/update-azdatabricksaccessconnector
+schema: 2.0.0
+---
+
+# Update-AzDatabricksAccessConnector
+
+## SYNOPSIS
+Updates an Azure Databricks Access Connector.
+
+## SYNTAX
+
+### UpdateExpanded (Default)
+```
+Update-AzDatabricksAccessConnector -Name <String> -ResourceGroupName <String> [-SubscriptionId <String>]
+ [-IdentityType <ManagedServiceIdentityType>] [-IdentityUserAssignedIdentity <Hashtable>] [-Tag <Hashtable>]
+ [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-WhatIf] [-Confirm]
+ [<CommonParameters>]
+```
+
+### UpdateViaIdentityExpanded
+```
+Update-AzDatabricksAccessConnector -InputObject <IDatabricksIdentity>
+ [-IdentityType <ManagedServiceIdentityType>] [-IdentityUserAssignedIdentity <Hashtable>] [-Tag <Hashtable>]
+ [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-WhatIf] [-Confirm]
+ [<CommonParameters>]
+```
+
+## DESCRIPTION
+Updates an Azure Databricks Access Connector.
+ +## EXAMPLES + +### Example 1: Updates an azure databricks accessConnector. +```powershell +Update-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector -Tag @{'key'='value'} +``` + +```output +Location Name ResourceGroupName +-------- ---- ----------------- +eastus azps-databricks-accessconnector azps_test_gp_db +``` + +This command updates an azure databricks accessConnector. + +### Example 2: Updates an azure databricks accessConnector by pipeline. +```powershell +Get-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector | Update-AzDatabricksAccessConnector -Tag @{'key'='value'} +``` + +```output +Location Name ResourceGroupName +-------- ---- ----------------- +eastus azps-databricks-accessconnector azps_test_gp_db +``` + +This command updates an azure databricks accessConnector by pipeline. + +## PARAMETERS + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -IdentityType +Type of managed service identity (where both SystemAssigned and UserAssigned types are allowed). 
+
+```yaml
+Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.ManagedServiceIdentityType
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -IdentityUserAssignedIdentity
+The set of user assigned identities associated with the resource.
+The userAssignedIdentities dictionary keys will be ARM resource ids in the form: '/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ManagedIdentity/userAssignedIdentities/{identityName}'.
+The dictionary values can be empty objects ({}) in requests.
+
+```yaml
+Type: System.Collections.Hashtable
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -InputObject
+Identity Parameter
+To construct, see NOTES section for INPUTOBJECT properties and create a hash table.
+
+```yaml
+Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+Parameter Sets: UpdateViaIdentityExpanded
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: True (ByValue)
+Accept wildcard characters: False
+```
+
+### -Name
+The name of the Azure Databricks Access Connector.
+
+```yaml
+Type: System.String
+Parameter Sets: UpdateExpanded
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -NoWait
+Run the command asynchronously
+
+```yaml
+Type: System.Management.Automation.SwitchParameter
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -ResourceGroupName
+The name of the resource group.
+The name is case insensitive.
+ +```yaml +Type: System.String +Parameter Sets: UpdateExpanded +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. +The value must be an UUID. + +```yaml +Type: System.String +Parameter Sets: UpdateExpanded +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Tag +Resource tags. + +```yaml +Type: System.Collections.Hashtable +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). 
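+
+The -IdentityUserAssignedIdentity hashtable described above keys on user-assigned identity ARM resource IDs with empty hashtables as values. The sketch below illustrates that shape; the subscription ID and identity name are placeholders, not values from a recorded run.
+
+```powershell
+# Assign a user-assigned identity to the access connector; the hashtable value stays an empty object.
+$uaiId = '/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/azps_test_gp_db/providers/Microsoft.ManagedIdentity/userAssignedIdentities/azps-uai'
+Update-AzDatabricksAccessConnector -ResourceGroupName azps_test_gp_db -Name azps-databricks-accessconnector -IdentityType UserAssigned -IdentityUserAssignedIdentity @{ $uaiId = @{} }
+```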
+
+## INPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity
+
+## OUTPUTS
+
+### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IAccessConnector
+
+## NOTES
+
+## RELATED LINKS
diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksVNetPeering.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksVNetPeering.md
new file mode 100644
index 000000000000..2db4e1fea5c4
--- /dev/null
+++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksVNetPeering.md
@@ -0,0 +1,362 @@
+---
+external help file: Az.Databricks-help.xml
+Module Name: Az.Databricks
+online version: https://learn.microsoft.com/powershell/module/az.databricks/update-azdatabricksvnetpeering
+schema: 2.0.0
+---
+
+# Update-AzDatabricksVNetPeering
+
+## SYNOPSIS
+Update vNet Peering for workspace.
+
+## SYNTAX
+
+### UpdateExpanded (Default)
+```
+Update-AzDatabricksVNetPeering -Name <String> -ResourceGroupName <String> -WorkspaceName <String>
+ [-SubscriptionId <String>] [-AllowForwardedTraffic <Boolean>] [-AllowGatewayTransit <Boolean>]
+ [-AllowVirtualNetworkAccess <Boolean>] [-DatabricksAddressSpacePrefix <String[]>]
+ [-DatabricksVirtualNetworkId <String>] [-RemoteAddressSpacePrefix <String[]>]
+ [-RemoteVirtualNetworkId <String>] [-UseRemoteGateway <Boolean>] [-DefaultProfile <PSObject>] [-AsJob]
+ [-NoWait] [-WhatIf] [-Confirm] [<CommonParameters>]
+```
+
+### UpdateViaIdentityExpanded
+```
+Update-AzDatabricksVNetPeering -InputObject <IDatabricksIdentity> [-AllowForwardedTraffic <Boolean>]
+ [-AllowGatewayTransit <Boolean>] [-AllowVirtualNetworkAccess <Boolean>]
+ [-DatabricksAddressSpacePrefix <String[]>] [-DatabricksVirtualNetworkId <String>]
+ [-RemoteAddressSpacePrefix <String[]>] [-RemoteVirtualNetworkId <String>] [-UseRemoteGateway <Boolean>]
+ [-DefaultProfile <PSObject>] [-AsJob] [-NoWait] [-WhatIf] [-Confirm]
+ [<CommonParameters>]
+```
+
+## DESCRIPTION
+Update vNet Peering for workspace.
+
+## EXAMPLES
+
+### Example 1: Update AllowForwardedTraffic of vnet peering.
+```powershell
+Update-AzDatabricksVNetPeering -Name vnet-peering-t1 -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db -AllowForwardedTraffic $True
+```
+
+```output
+Name            ResourceGroupName
+----            -----------------
+vnet-peering-t1 azps_test_gp_db
+```
+
+This command updates AllowForwardedTraffic of vnet peering.
+
+### Example 2: Update AllowGatewayTransit of vnet peering by object.
+```powershell
+Get-AzDatabricksVNetPeering -WorkspaceName azps-databricks-workspace-t1 -ResourceGroupName azps_test_gp_db -Name vnet-peering-t1 | Update-AzDatabricksVNetPeering -AllowGatewayTransit $true
+```
+
+```output
+Name            ResourceGroupName
+----            -----------------
+vnet-peering-t1 azps_test_gp_db
+```
+
+This command updates AllowGatewayTransit of vnet peering by object.
+
+## PARAMETERS
+
+### -AllowForwardedTraffic
+[System.Management.Automation.SwitchParameter]
+Whether the forwarded traffic from the VMs in the local virtual network will be allowed/disallowed in remote virtual network.
+
+```yaml
+Type: System.Boolean
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -AllowGatewayTransit
+[System.Management.Automation.SwitchParameter]
+If gateway links can be used in remote virtual networking to link to this virtual network.
+
+```yaml
+Type: System.Boolean
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -AllowVirtualNetworkAccess
+[System.Management.Automation.SwitchParameter]
+Whether the VMs in the local virtual network space would be able to access the VMs in remote virtual network space.
+ +```yaml +Type: System.Boolean +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DatabricksAddressSpacePrefix +A list of address blocks reserved for this virtual network in CIDR notation. + +```yaml +Type: System.String[] +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DatabricksVirtualNetworkId +The Id of the databricks virtual network. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The DefaultProfile parameter is not functional. +Use the SubscriptionId parameter when available if executing the cmdlet against a different subscription. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -InputObject +Identity parameter. +To construct, see NOTES section for INPUTOBJECT properties and create a hash table. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity +Parameter Sets: UpdateViaIdentityExpanded +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: True (ByValue) +Accept wildcard characters: False +``` + +### -Name +The name of the VNetPeering. 
+
+```yaml
+Type: System.String
+Parameter Sets: UpdateExpanded
+Aliases: PeeringName
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -NoWait
+Run the command asynchronously
+
+```yaml
+Type: System.Management.Automation.SwitchParameter
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -RemoteAddressSpacePrefix
+A list of address blocks reserved for this virtual network in CIDR notation.
+
+```yaml
+Type: System.String[]
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -RemoteVirtualNetworkId
+The Id of the remote virtual network.
+
+```yaml
+Type: System.String
+Parameter Sets: (All)
+Aliases:
+
+Required: False
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -ResourceGroupName
+The name of the resource group.
+The name is case insensitive.
+
+```yaml
+Type: System.String
+Parameter Sets: UpdateExpanded
+Aliases:
+
+Required: True
+Position: Named
+Default value: None
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -SubscriptionId
+The ID of the target subscription.
+
+```yaml
+Type: System.String
+Parameter Sets: UpdateExpanded
+Aliases:
+
+Required: False
+Position: Named
+Default value: (Get-AzContext).Subscription.Id
+Accept pipeline input: False
+Accept wildcard characters: False
+```
+
+### -UseRemoteGateway
+If remote gateways can be used on this virtual network.
+If the flag is set to true, and allowGatewayTransit on remote peering is also true, virtual network will use gateways of remote virtual network for transit.
+Only one peering can have this flag set to true.
+This flag cannot be set if virtual network already has a gateway. + +```yaml +Type: System.Boolean +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WorkspaceName +The name of the workspace. + +```yaml +Type: System.String +Parameter Sets: UpdateExpanded +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). 
+ +## INPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IVirtualNetworkPeering + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksWorkspace.md b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksWorkspace.md new file mode 100644 index 000000000000..174152dd46df --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/examples/Update-AzDatabricksWorkspace.md @@ -0,0 +1,799 @@ +--- +external help file: Az.Databricks-help.xml +Module Name: Az.Databricks +online version: https://learn.microsoft.com/powershell/module/az.databricks/update-azdatabricksworkspace +schema: 2.0.0 +--- + +# Update-AzDatabricksWorkspace + +## SYNOPSIS +Updates a workspace. + +## SYNTAX + +### UpdateExpanded (Default) +``` +Update-AzDatabricksWorkspace -Name -ResourceGroupName [-SubscriptionId ] + [-PrepareEncryption] [-EncryptionKeySource ] [-EncryptionKeyVaultUri ] + [-EncryptionKeyName ] [-EncryptionKeyVersion ] [-KeyVaultKeyName ] + [-KeyVaultKeyVersion ] [-KeyVaultUri ] [-AmlWorkspaceId ] [-SkuTier ] + [-Authorization ] [-DefaultCatalogInitialType ] + [-ManagedDiskKeySource ] [-ManagedDiskKeyVaultPropertiesKeyName ] + [-ManagedDiskKeyVaultPropertiesKeyVaultUri ] [-ManagedDiskKeyVaultPropertiesKeyVersion ] + [-ManagedDiskRotationToLatestKeyVersionEnabled] [-ManagedServiceKeySource ] + [-ManagedServicesKeyVaultPropertiesKeyName ] [-ManagedServicesKeyVaultPropertiesKeyVaultUri ] + [-ManagedServicesKeyVaultPropertiesKeyVersion ] [-UiDefinitionUri ] [-Tag ] + [-RequiredNsgRule ] [-PublicNetworkAccess ] [-EnableNoPublicIP] + [-EnhancedSecurityMonitoring ] + [-AutomaticClusterUpdate ] [-ComplianceStandard ] + [-EnhancedSecurityCompliance ] [-AccessConnectorId ] + [-AccessConnectorIdentityType ] [-AccessConnectorUserAssignedIdentityId ] + [-DefaultStorageFirewall ] 
[-DefaultProfile ] [-AsJob] [-NoWait] + [-WhatIf] [-Confirm] [] +``` + +### UpdateViaIdentityExpanded +``` +Update-AzDatabricksWorkspace -InputObject [-PrepareEncryption] + [-EncryptionKeySource ] [-EncryptionKeyVaultUri ] [-EncryptionKeyName ] + [-EncryptionKeyVersion ] [-KeyVaultKeyName ] [-KeyVaultKeyVersion ] + [-KeyVaultUri ] [-AmlWorkspaceId ] [-SkuTier ] + [-Authorization ] [-DefaultCatalogInitialType ] + [-ManagedDiskKeySource ] [-ManagedDiskKeyVaultPropertiesKeyName ] + [-ManagedDiskKeyVaultPropertiesKeyVaultUri ] [-ManagedDiskKeyVaultPropertiesKeyVersion ] + [-ManagedDiskRotationToLatestKeyVersionEnabled] [-ManagedServiceKeySource ] + [-ManagedServicesKeyVaultPropertiesKeyName ] [-ManagedServicesKeyVaultPropertiesKeyVaultUri ] + [-ManagedServicesKeyVaultPropertiesKeyVersion ] [-UiDefinitionUri ] [-Tag ] + [-RequiredNsgRule ] [-PublicNetworkAccess ] [-EnableNoPublicIP] + [-EnhancedSecurityMonitoring ] + [-AutomaticClusterUpdate ] [-ComplianceStandard ] + [-EnhancedSecurityCompliance ] [-AccessConnectorId ] + [-AccessConnectorIdentityType ] [-AccessConnectorUserAssignedIdentityId ] + [-DefaultStorageFirewall ] [-DefaultProfile ] [-AsJob] [-NoWait] + [-WhatIf] [-Confirm] [] +``` + +## DESCRIPTION +Updates a workspace. + +## EXAMPLES + +### Example 1: Updates the tags of a Databricks workspace. +```powershell +Get-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t1 | Update-AzDatabricksWorkspace -Tag @{"key"="value"} +``` + +```output +Name ResourceGroupName Location Managed Resource Group ID +---- ----------------- -------- ------------------------- +azps-databricks-workspace-t1 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t1 +``` + +This command updates the tags of a Databricks workspace. + +### Example 2: Enable encryption on a Databricks workspace. 
+```powershell
+Update-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t2 -PrepareEncryption
+$updWsp = Get-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t2
+Set-AzKeyVaultAccessPolicy -VaultName azps-keyvault -ObjectId $updWsp.StorageAccountIdentityPrincipalId -PermissionsToKeys wrapkey,unwrapkey,get
+Update-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t2 -EncryptionKeySource 'Microsoft.KeyVault' -EncryptionKeyVaultUri https://azps-keyvault.vault.azure.net/ -EncryptionKeyName azps-k1 -EncryptionKeyVersion a563a8021cba47109d93bd6d690621a7
+```
+
+```output
+Name                         ResourceGroupName Location Managed Resource Group ID
+----                         ----------------- -------- -------------------------
+azps-databricks-workspace-t2 azps_test_gp_db   eastus   /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t2
+```
+
+Enabling encryption on a Databricks workspace involves the following steps:
+1. Make sure that the Key Vault has purge protection enabled.
+2. Update the workspace with `-PrepareEncryption` (if it was not created with it).
+3. Find `StorageAccountIdentityPrincipalId` in the output of the previous step and grant key permissions (wrapkey, unwrapkey, get) to that principal.
+4. Update the workspace again to fill in information about the encryption key:
+   - `-EncryptionKeySource`
+   - `-EncryptionKeyVaultUri`
+   - `-EncryptionKeyName`
+   - `-EncryptionKeyVersion`
+5. Important: read the information in the following document in detail: https://learn.microsoft.com/en-us/azure/databricks/security/keys/cmk-managed-services-azure/customer-managed-key-managed-services-azure?WT.mc_id=Portal-Microsoft_Azure_Databricks#--use-the-azure-portal
+
+### Example 3: Disable encryption on a Databricks workspace.
+```powershell +Update-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t3 -EncryptionKeySource 'Default' +``` + +```output +Name ResourceGroupName Location Managed Resource Group ID +---- ----------------- -------- ------------------------- +azps-databricks-workspace-t3 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t3 +``` + +To disable encryption, simply set `-EncryptionKeySource` to `'Default'`. + +### Example 4: Update NsgRule of the Databricks workspace. +```powershell +Update-AzDatabricksWorkspace -ResourceGroupName azps_test_gp_db -Name azps-databricks-workspace-t2 -RequiredNsgRule 'AllRules' +``` + +```output +Name ResourceGroupName Location Managed Resource Group ID +---- ----------------- -------- ------------------------- +azps-databricks-workspace-t2 azps_test_gp_db eastus /subscriptions/{subId}/resourceGroups/azps_test_gp_kv_t2 +``` + +This command updates NsgRule of the Databricks workspace. + +## PARAMETERS + +### -AccessConnectorId +The resource ID of Azure Databricks Access Connector Resource. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AccessConnectorIdentityType +The identity type of the Access Connector Resource. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.IdentityType +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AccessConnectorUserAssignedIdentityId +The resource ID of the User Assigned Identity associated with the Access Connector Resource. +This is required for type 'UserAssigned' and not valid for type 'SystemAssigned'. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AmlWorkspaceId +The value which should be used for this field. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AsJob +Run the command as a job + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Authorization +The workspace provider authorizations. +To construct, see NOTES section for AUTHORIZATION properties and create a hash table. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IWorkspaceProviderAuthorization[] +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -AutomaticClusterUpdate +Status of automated cluster updates feature. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.AutomaticClusterUpdateValue +Parameter Sets: (All) +Aliases: AutomaticClusterUpdateValue + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ComplianceStandard +Compliance standards associated with the workspace. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.ComplianceStandard[] +Parameter Sets: (All) +Aliases: ComplianceSecurityProfileComplianceStandard + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultCatalogInitialType +Defines the initial type of the default catalog. 
+Possible values (case-insensitive): HiveMetastore, UnityCatalog + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.InitialType +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultProfile +The credentials, account, tenant, and subscription used for communication with Azure. + +```yaml +Type: System.Management.Automation.PSObject +Parameter Sets: (All) +Aliases: AzureRMContext, AzureCredential + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -DefaultStorageFirewall +Gets or Sets Default Storage Firewall configuration information + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.DefaultStorageFirewall +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EnableNoPublicIP +The value which should be used for this field. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeyName +The name of Key Vault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeySource +The encryption keySource (provider). 
+Possible values (case-insensitive): Default, Microsoft.Keyvault + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.KeySource +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeyVaultUri +The URI (DNS name) of the Key Vault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EncryptionKeyVersion +The version of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EnhancedSecurityCompliance +Status of Compliance Security Profile feature. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.ComplianceSecurityProfileValue +Parameter Sets: (All) +Aliases: ComplianceSecurityProfileValue + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -EnhancedSecurityMonitoring +Status of Enhanced Security Monitoring feature. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.EnhancedSecurityMonitoringValue +Parameter Sets: (All) +Aliases: EnhancedSecurityMonitoringValue + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -InputObject +Identity parameter. +To construct, see NOTES section for INPUTOBJECT properties and create a hash table. 
+ +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity +Parameter Sets: UpdateViaIdentityExpanded +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: True (ByValue) +Accept wildcard characters: False +``` + +### -KeyVaultKeyName +The name of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -KeyVaultKeyVersion +The version of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -KeyVaultUri +The Uri of KeyVault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeySource +The encryption keySource (provider). +Possible values (case-insensitive): Microsoft.Keyvault + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.EncryptionKeySource +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeyVaultPropertiesKeyName +The name of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeyVaultPropertiesKeyVaultUri +The URI of KeyVault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskKeyVaultPropertiesKeyVersion +The version of KeyVault key. 
+ +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedDiskRotationToLatestKeyVersionEnabled +Indicate whether the latest key version should be automatically used for Managed Disk Encryption. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServiceKeySource +The encryption keySource (provider). +Possible values (case-insensitive): Microsoft.Keyvault + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.EncryptionKeySource +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServicesKeyVaultPropertiesKeyName +The name of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServicesKeyVaultPropertiesKeyVaultUri +The Uri of KeyVault. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ManagedServicesKeyVaultPropertiesKeyVersion +The version of KeyVault key. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Name +The name of the workspace. 
+ +```yaml +Type: System.String +Parameter Sets: UpdateExpanded +Aliases: WorkspaceName + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -NoWait +Run the command asynchronously + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PrepareEncryption +Prepare the workspace for encryption. +Enables the Managed Identity for managed storage account. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -PublicNetworkAccess +The network access type for accessing workspace. +Set value to disabled to access workspace only via private link. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.PublicNetworkAccess +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -RequiredNsgRule +Gets or sets a value indicating whether data plane (clusters) to control plane communication happen over private endpoint. +Supported values are 'AllRules' and 'NoAzureDatabricksRules'. +'NoAzureServiceRules' value is for internal use only. + +```yaml +Type: Microsoft.Azure.PowerShell.Cmdlets.Databricks.Support.RequiredNsgRules +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -ResourceGroupName +The name of the resource group. +The name is case insensitive. 
+ +```yaml +Type: System.String +Parameter Sets: UpdateExpanded +Aliases: + +Required: True +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SkuTier +The SKU tier. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -SubscriptionId +The ID of the target subscription. + +```yaml +Type: System.String +Parameter Sets: UpdateExpanded +Aliases: + +Required: False +Position: Named +Default value: (Get-AzContext).Subscription.Id +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Tag +Resource tags. + +```yaml +Type: System.Collections.Hashtable +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -UiDefinitionUri +The blob URI where the UI definition file is located. + +```yaml +Type: System.String +Parameter Sets: (All) +Aliases: + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -Confirm +Prompts you for confirmation before running the cmdlet. + +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: cf + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### -WhatIf +Shows what would happen if the cmdlet runs. +The cmdlet is not run. 
+ +```yaml +Type: System.Management.Automation.SwitchParameter +Parameter Sets: (All) +Aliases: wi + +Required: False +Position: Named +Default value: None +Accept pipeline input: False +Accept wildcard characters: False +``` + +### CommonParameters +This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216). + +## INPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.IDatabricksIdentity + +## OUTPUTS + +### Microsoft.Azure.PowerShell.Cmdlets.Databricks.Models.Api20240501.IWorkspace + +## NOTES + +## RELATED LINKS diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/metadata.md b/tools/Mcp/src/assets/ideal-modules/Databricks/metadata.md new file mode 100644 index 000000000000..51eb8e7928cb --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/metadata.md @@ -0,0 +1,57 @@ +--- +Module Name: Az.Databricks +Module Guid: fd603f36-03d8-47f4-9f7c-c13a78761936 +Download Help Link: https://learn.microsoft.com/powershell/module/az.databricks +Help Version: 1.0.0.0 +Locale: en-US +--- + +# Az.Databricks Module +## Description +Microsoft Azure PowerShell: Databricks cmdlets + +## Az.Databricks Cmdlets +### [Get-AzDatabricksAccessConnector](Get-AzDatabricksAccessConnector.md) +Gets an Azure Databricks Access Connector. + +### [Get-AzDatabricksOutboundNetworkDependenciesEndpoint](Get-AzDatabricksOutboundNetworkDependenciesEndpoint.md) +Gets the list of endpoints that VNET Injected Workspace calls Azure Databricks Control Plane. +You must configure outbound access with these endpoints. 
+For more information, see https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr + +### [Get-AzDatabricksVNetPeering](Get-AzDatabricksVNetPeering.md) +Gets the workspace vNet Peering. + +### [Get-AzDatabricksWorkspace](Get-AzDatabricksWorkspace.md) +Gets the workspace. + +### [New-AzDatabricksAccessConnector](New-AzDatabricksAccessConnector.md) +Creates or updates Azure Databricks Access Connector. + +### [New-AzDatabricksVNetPeering](New-AzDatabricksVNetPeering.md) +Creates vNet Peering for workspace. + +### [New-AzDatabricksWorkspace](New-AzDatabricksWorkspace.md) +Creates a new workspace. + +### [New-AzDatabricksWorkspaceProviderAuthorizationObject](New-AzDatabricksWorkspaceProviderAuthorizationObject.md) +Create an in-memory object for WorkspaceProviderAuthorization. + +### [Remove-AzDatabricksAccessConnector](Remove-AzDatabricksAccessConnector.md) +Deletes the Azure Databricks Access Connector. + +### [Remove-AzDatabricksVNetPeering](Remove-AzDatabricksVNetPeering.md) +Deletes the workspace vNetPeering. + +### [Remove-AzDatabricksWorkspace](Remove-AzDatabricksWorkspace.md) +Deletes the workspace. + +### [Update-AzDatabricksAccessConnector](Update-AzDatabricksAccessConnector.md) +Updates an Azure Databricks Access Connector. + +### [Update-AzDatabricksVNetPeering](Update-AzDatabricksVNetPeering.md) +Update vNet Peering for workspace. + +### [Update-AzDatabricksWorkspace](Update-AzDatabricksWorkspace.md) +Updates a workspace. 
+ diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksAccessConnector.Tests.ps1 b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksAccessConnector.Tests.ps1 new file mode 100644 index 000000000000..8b04726171bd --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksAccessConnector.Tests.ps1 @@ -0,0 +1,65 @@ +if (($null -eq $TestName) -or ($TestName -contains 'AzDatabricksAccessConnector')) { + $loadEnvPath = Join-Path $PSScriptRoot 'loadEnv.ps1' + if (-Not (Test-Path -Path $loadEnvPath)) { + $loadEnvPath = Join-Path $PSScriptRoot '..\loadEnv.ps1' + } + . ($loadEnvPath) + $TestRecordingFile = Join-Path $PSScriptRoot 'AzDatabricksAccessConnector.Recording.json' + $currentPath = $PSScriptRoot + while (-not $mockingPath) { + $mockingPath = Get-ChildItem -Path $currentPath -Recurse -Include 'HttpPipelineMocking.ps1' -File + $currentPath = Split-Path -Path $currentPath -Parent + } + . ($mockingPath | Select-Object -First 1).FullName +} + +Describe 'AzDatabricksAccessConnector' { + It 'CreateExpanded' { + { + $config = New-AzDatabricksAccessConnector -ResourceGroupName $env.resourceGroup -Name $env.accessConnectorName1 -Location $env.location -IdentityType 'SystemAssigned' + $config.Name | Should -Be $env.accessConnectorName1 + } | Should -Not -Throw + } + + It 'List1' { + { + $config = Get-AzDatabricksAccessConnector -ResourceGroupName $env.resourceGroup + $config.Count | Should -BeGreaterThan 0 + } | Should -Not -Throw + } + + It 'Get' { + { + $config = Get-AzDatabricksAccessConnector -ResourceGroupName $env.resourceGroup -Name $env.accessConnectorName1 + $config.Name | Should -Be $env.accessConnectorName1 + } | Should -Not -Throw + } + + It 'List' { + { + $config = Get-AzDatabricksAccessConnector + $config.Count | Should -BeGreaterThan 0 + } | Should -Not -Throw + } + + It 'UpdateExpanded' { + { + $config = Update-AzDatabricksAccessConnector -ResourceGroupName $env.resourceGroup -Name 
$env.accessConnectorName1 -Tag @{'key' = 'value' } + $config.Name | Should -Be $env.accessConnectorName1 + } | Should -Not -Throw + } + + It 'UpdateViaIdentityExpanded' { + { + $config = Get-AzDatabricksAccessConnector -ResourceGroupName $env.resourceGroup -Name $env.accessConnectorName1 + $config = Update-AzDatabricksAccessConnector -InputObject $config -Tag @{'key' = 'value' } + $config.Name | Should -Be $env.accessConnectorName1 + } | Should -Not -Throw + } + + It 'Delete' { + { + Remove-AzDatabricksAccessConnector -ResourceGroupName $env.resourceGroup -Name $env.accessConnectorName1 + } | Should -Not -Throw + } +} \ No newline at end of file diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksVNetPeering.Tests.ps1 b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksVNetPeering.Tests.ps1 new file mode 100644 index 000000000000..d23158b163bc --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksVNetPeering.Tests.ps1 @@ -0,0 +1,58 @@ +if (($null -eq $TestName) -or ($TestName -contains 'AzDatabricksVNetPeering')) { + $loadEnvPath = Join-Path $PSScriptRoot 'loadEnv.ps1' + if (-Not (Test-Path -Path $loadEnvPath)) { + $loadEnvPath = Join-Path $PSScriptRoot '..\loadEnv.ps1' + } + . ($loadEnvPath) + $TestRecordingFile = Join-Path $PSScriptRoot 'AzDatabricksVNetPeering.Recording.json' + $currentPath = $PSScriptRoot + while (-not $mockingPath) { + $mockingPath = Get-ChildItem -Path $currentPath -Recurse -Include 'HttpPipelineMocking.ps1' -File + $currentPath = Split-Path -Path $currentPath -Parent + } + . 
($mockingPath | Select-Object -First 1).FullName +} + +Describe 'AzDatabricksVNetPeering' { + It 'CreateExpanded' { + { + $config = New-AzDatabricksVNetPeering -Name $env.vNetName1 -WorkspaceName $env.workSpaceName3 -ResourceGroupName $env.resourceGroup -RemoteVirtualNetworkId "/subscriptions/$($env.SubscriptionId)/resourceGroups/$($env.resourceGroup)/providers/Microsoft.Network/virtualNetworks/$($env.vNetName)" + $config.Name | Should -Be $env.vNetName1 + } | Should -Not -Throw + } + + It 'List' { + { + $config = Get-AzDatabricksVNetPeering -WorkspaceName $env.workSpaceName3 -ResourceGroupName $env.resourceGroup + $config.Count | Should -BeGreaterThan 0 + } | Should -Not -Throw + } + + It 'Get' -Skip { + { + $config = Get-AzDatabricksVNetPeering -WorkspaceName $env.workSpaceName3 -ResourceGroupName $env.resourceGroup -Name $env.vNetName1 + $config.Name | Should -Be $env.vNetName1 + } | Should -Not -Throw + } + + It 'UpdateExpanded' -Skip { + { + $config = Update-AzDatabricksVNetPeering -WorkspaceName $env.workSpaceName3 -ResourceGroupName $env.resourceGroup -Name $env.vNetName1 -AllowForwardedTraffic $True + $config.Name | Should -Be $env.vNetName1 + } | Should -Not -Throw + } + + It 'UpdateViaIdentityExpanded' -Skip { + { + $config = Get-AzDatabricksVNetPeering -WorkspaceName $env.workSpaceName3 -ResourceGroupName $env.resourceGroup -Name $env.vNetName1 + $config = Update-AzDatabricksVNetPeering -InputObject $config -AllowForwardedTraffic $True + $config.Name | Should -Be $env.vNetName1 + } | Should -Not -Throw + } + + It 'Delete' { + { + Remove-AzDatabricksVNetPeering -WorkspaceName $env.workSpaceName3 -ResourceGroupName $env.resourceGroup -Name $env.vNetName1 + } | Should -Not -Throw + } +} \ No newline at end of file diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksWorkspace.Tests.ps1 b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksWorkspace.Tests.ps1 new file mode 100644 index 000000000000..62806d2d23b2 --- 
/dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/AzDatabricksWorkspace.Tests.ps1 @@ -0,0 +1,86 @@ +if (($null -eq $TestName) -or ($TestName -contains 'AzDatabricksWorkspace')) { + $loadEnvPath = Join-Path $PSScriptRoot 'loadEnv.ps1' + if (-Not (Test-Path -Path $loadEnvPath)) { + $loadEnvPath = Join-Path $PSScriptRoot '..\loadEnv.ps1' + } + . ($loadEnvPath) + $TestRecordingFile = Join-Path $PSScriptRoot 'AzDatabricksWorkspace.Recording.json' + $currentPath = $PSScriptRoot + while (-not $mockingPath) { + $mockingPath = Get-ChildItem -Path $currentPath -Recurse -Include 'HttpPipelineMocking.ps1' -File + $currentPath = Split-Path -Path $currentPath -Parent + } + . ($mockingPath | Select-Object -First 1).FullName +} + +Describe 'AzDatabricksWorkspace' { + It 'CreateExpanded' { + { + $config = New-AzDatabricksWorkspace -Name $env.workSpaceName2 -ResourceGroupName $env.resourceGroup -Location $env.location -Sku premium + $config.Name | Should -Be $env.workSpaceName2 + } | Should -Not -Throw + } + + It 'List' { + { + $config = Get-AzDatabricksWorkspace -ResourceGroupName $env.resourceGroup + $config.Count | Should -BeGreaterThan 0 + } | Should -Not -Throw + } + + It 'List1' { + { + $config = Get-AzDatabricksWorkspace + $config.Count | Should -BeGreaterThan 0 + } | Should -Not -Throw + } + + It 'Get' { + { + $config = Get-AzDatabricksWorkspace -Name $env.workSpaceName2 -ResourceGroupName $env.resourceGroup + $config.Name | Should -Be $env.workSpaceName2 + } | Should -Not -Throw + } + + It 'OutboundNetworkDependenciesEndpointList' { + { + $config = Get-AzDatabricksOutboundNetworkDependenciesEndpoint -WorkspaceName $env.workSpaceName1 -ResourceGroupName $env.resourceGroup + $config.Count | Should -BeGreaterThan 0 + } | Should -Not -Throw + } + + It 'UpdateExpanded' { + { + $config = Update-AzDatabricksWorkspace -Name $env.workSpaceName2 -ResourceGroupName $env.resourceGroup -Tag @{"key" = "value" } + $config.Name | Should -Be $env.workSpaceName2 + } | 
Should -Not -Throw + } + + It 'UpdateViaIdentityExpanded' { + { + $config = Get-AzDatabricksWorkspace -Name $env.workSpaceName2 -ResourceGroupName $env.resourceGroup + $config = Update-AzDatabricksWorkspace -InputObject $config -Tag @{"key" = "value" } + $config.Name | Should -Be $env.workSpaceName2 + } | Should -Not -Throw + } + + It 'UpdateRequiredNsgRule-EnableNoPublicIP-PublicNetworkAccess' { + { + $config = Update-AzDatabricksWorkspace -Name $env.workSpaceName1 -ResourceGroupName $env.resourceGroup -RequiredNsgRule 'AllRules' -EnableNoPublicIP:$false -PublicNetworkAccess 'Enabled' -Tag @{"key" = "value" } + $config.RequiredNsgRule | Should -Be 'AllRules' + $config.EnableNoPublicIP | Should -Be 'false' + $config.PublicNetworkAccess | Should -Be 'Enabled' + + $config = Update-AzDatabricksWorkspace -Name $env.workSpaceName1 -ResourceGroupName $env.resourceGroup -RequiredNsgRule 'NoAzureDatabricksRules' -EnableNoPublicIP:$true -PublicNetworkAccess 'Disabled' + $config.RequiredNsgRule | Should -Be 'NoAzureDatabricksRules' + $config.EnableNoPublicIP | Should -Be 'true' + $config.PublicNetworkAccess | Should -Be 'Disabled' + } | Should -Not -Throw + } + + It 'Delete' { + { + Remove-AzDatabricksWorkspace -Name $env.workSpaceName2 -ResourceGroupName $env.resourceGroup + } | Should -Not -Throw + } +} \ No newline at end of file diff --git a/tools/Mcp/src/assets/ideal-modules/Databricks/tests/utils.ps1 b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/utils.ps1 new file mode 100644 index 000000000000..0f321ab21a7c --- /dev/null +++ b/tools/Mcp/src/assets/ideal-modules/Databricks/tests/utils.ps1 @@ -0,0 +1,114 @@ +function RandomString([bool]$allChars, [int32]$len) { + if ($allChars) { + return -join ((33..126) | Get-Random -Count $len | ForEach-Object {[char]$_}) + } else { + return -join ((48..57) + (97..122) | Get-Random -Count $len | ForEach-Object {[char]$_}) + } +} +function Start-TestSleep { + [CmdletBinding(DefaultParameterSetName = 'SleepBySeconds')] + param(
[parameter(Mandatory = $true, Position = 0, ParameterSetName = 'SleepBySeconds')] + [ValidateRange(0.0, 2147483.0)] + [double] $Seconds, + + [parameter(Mandatory = $true, ParameterSetName = 'SleepByMilliseconds')] + [ValidateRange('NonNegative')] + [Alias('ms')] + [int] $Milliseconds + ) + + if ($TestMode -ne 'playback') { + switch ($PSCmdlet.ParameterSetName) { + 'SleepBySeconds' { + Start-Sleep -Seconds $Seconds + } + 'SleepByMilliseconds' { + Start-Sleep -Milliseconds $Milliseconds + } + } + } +} + +$env = @{} +if ($UsePreviousConfigForRecord) { + $previousEnv = Get-Content (Join-Path $PSScriptRoot 'env.json') | ConvertFrom-Json + $previousEnv.psobject.properties | Foreach-Object { $env[$_.Name] = $_.Value } +} +# Add script method called AddWithCache to $env, when useCache is set true, it will try to get the value from the $env first. +# example: $val = $env.AddWithCache('key', $val, $true) +$env | Add-Member -Type ScriptMethod -Value { param( [string]$key, [object]$val, [bool]$useCache) if ($this.Contains($key) -and $useCache) { return $this[$key] } else { $this[$key] = $val; return $val } } -Name 'AddWithCache' +function setupEnv() { + # Preload subscriptionId and tenant from context, which will be used in test + # as default. You could change them if needed. 
+ $env.SubscriptionId = (Get-AzContext).Subscription.Id + $env.Tenant = (Get-AzContext).Tenant.Id + + $workSpaceName1 = RandomString -allChars $false -len 6 + $workSpaceName2 = RandomString -allChars $false -len 6 + $workSpaceName3 = RandomString -allChars $false -len 6 + $vNetName1 = RandomString -allChars $false -len 6 + $accessConnectorName1 = RandomString -allChars $false -len 6 + + $env.Add("workSpaceName1", $workSpaceName1) + $env.Add("workSpaceName2", $workSpaceName2) + $env.Add("workSpaceName3", $workSpaceName3) + $env.Add("vNetName1", $vNetName1) + $env.Add("accessConnectorName1", $accessConnectorName1) + + $networkSecurityRuleName = RandomString -allChars $false -len 6 + $networkSecurityGroupName = RandomString -allChars $false -len 6 + $vNetSubnetName1 = RandomString -allChars $false -len 6 + $vNetSubnetName2 = RandomString -allChars $false -len 6 + $vNetSubnetName3 = RandomString -allChars $false -len 6 + $vNetName = RandomString -allChars $false -len 6 + $keyVaultName = "azps" + (RandomString -allChars $false -len 6) + + $env.Add("networkSecurityRuleName", $networkSecurityRuleName) + $env.Add("networkSecurityGroupName", $networkSecurityGroupName) + $env.Add("vNetSubnetName1", $vNetSubnetName1) + $env.Add("vNetSubnetName2", $vNetSubnetName2) + $env.Add("vNetSubnetName3", $vNetSubnetName3) + $env.Add("vNetName", $vNetName) + $env.Add("keyVaultName", $keyVaultName) + + write-host "start to create test group" + $env.Add("location", "eastus") + $resourceGroup = "auto-test-databricks-" + (RandomString -allChars $false -len 2) + $env.Add("resourceGroup", $resourceGroup) + New-AzResourceGroup -Name $env.resourceGroup -Location $env.location + + $dlg = New-AzDelegation -Name dbrdl -ServiceName "Microsoft.Databricks/workspaces" + + write-host "start to create NetworkSecurity env" + $rdpRule = New-AzNetworkSecurityRuleConfig -Name $env.networkSecurityRuleName -Description "Allow RDP" -Access Allow -Protocol Tcp -Direction Inbound -Priority 100 
-SourceAddressPrefix Internet -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 3389 + $networkSecurityGroup = New-AzNetworkSecurityGroup -ResourceGroupName $env.resourceGroup -Location $env.location -Name $env.networkSecurityGroupName -SecurityRules $rdpRule + $kvSubnet = New-AzVirtualNetworkSubnetConfig -Name $env.vNetSubnetName1 -AddressPrefix "110.0.1.0/24" -ServiceEndpoint "Microsoft.KeyVault" + $priSubnet = New-AzVirtualNetworkSubnetConfig -Name $env.vNetSubnetName2 -AddressPrefix "110.0.2.0/24" -NetworkSecurityGroup $networkSecurityGroup -Delegation $dlg + $pubSubnet = New-AzVirtualNetworkSubnetConfig -Name $env.vNetSubnetName3 -AddressPrefix "110.0.3.0/24" -NetworkSecurityGroup $networkSecurityGroup -Delegation $dlg + + write-host "start to create VirtualNetwork env" + $testVN = New-AzVirtualNetwork -Name $env.vNetName -ResourceGroupName $env.resourceGroup -Location $env.location -AddressPrefix "110.0.0.0/16" -Subnet $kvSubnet,$priSubnet,$pubSubnet + $vNetResId = (Get-AzVirtualNetwork -Name $env.vNetName -ResourceGroupName $env.resourceGroup).Subnets[0].Id + $ruleSet = New-AzKeyVaultNetworkRuleSetObject -DefaultAction Allow -Bypass AzureServices -IpAddressRange "110.0.1.0/24" -VirtualNetworkResourceId $vNetResId + + write-host "start to create KeyVault env" + New-AzKeyVault -ResourceGroupName $env.resourceGroup -VaultName $env.keyVaultName -NetworkRuleSet $ruleSet -Location $env.location -Sku 'Premium' -EnablePurgeProtection + + write-host "start to create Databricks(have vNet) env" + New-AzDatabricksWorkspace -Name $env.workSpaceName1 -ResourceGroupName $env.resourceGroup -Location $env.location -VirtualNetworkId $testVN.Id -PrivateSubnetName $priSubnet.Name -PublicSubnetName $pubSubnet.Name -Sku Premium + + write-host "start to create Databricks env" + New-AzDatabricksWorkspace -Name $env.workSpaceName3 -ResourceGroupName $env.resourceGroup -Location $env.location -Sku premium + + # For any resources you created for test, you should add 
it to $env here. + $envFile = 'env.json' + if ($TestMode -eq 'live') { + $envFile = 'localEnv.json' + } + Set-Content -Path (Join-Path $PSScriptRoot $envFile) -Value (ConvertTo-Json $env) +} +function cleanupEnv() { + # Clean up resources created for testing + # Remove-AzResourceGroup -Name $env.resourceGroup +} diff --git a/tools/Mcp/src/assets/partner-module-workflow.md b/tools/Mcp/src/assets/partner-module-workflow.md new file mode 100644 index 000000000000..79ca30fd7e02 --- /dev/null +++ b/tools/Mcp/src/assets/partner-module-workflow.md @@ -0,0 +1,34 @@ +# Partner Module Workflow (LLM Instructions) + +Do not give these instructions to the user. The LLM should run the instructions below, executing each tool exactly as instructed. + +You (the LLM) are automating generation of a fresh Azure PowerShell partner module. Follow these steps exactly. If any tool fails or is unavailable: STOP (no retries, no guesses). + +1. setup-module-structure + - Call tool `setup-module-structure` (no params). Capture ModuleName from tool output. + - Derive once: workingDirectory = `/src/{ModuleName}/{ModuleName}.Autorest`. + - Reuse this exact path (same casing) for all subsequent tool calls. + +2. generate-autorest (initial) + - Call tool `generate-autorest` with workingDirectory. + - On failure: stop. + +3. create-example + - Call tool `create-example` with workingDirectory. + - Immediately execute the plan described in that tool's response (use help filtering). Do not restate or expand rules here. + - On failure: stop. + +4. create-test + - Call tool `create-test` with workingDirectory. + - Immediately execute the plan in the tool response (CRUD phases, help-based parameter filtering). On failure: stop. + +5. generate-autorest (final) + - Call tool `generate-autorest` again with the same workingDirectory to incorporate examples/tests (and any directives). + +Rules: +* Never recompute or mutate workingDirectory. +* Do not fabricate paths, parameters, or file contents.
+* Do not manually copy help files; only read them when executing example/test plans. +* Do not proceed past a failing step. + +Completion: After step 5, stop. Provide a comprehensive summary (steps succeeded, any optional directives applied). \ No newline at end of file diff --git a/tools/Mcp/src/assets/test-instructions.md b/tools/Mcp/src/assets/test-instructions.md new file mode 100644 index 000000000000..1833175dbe68 --- /dev/null +++ b/tools/Mcp/src/assets/test-instructions.md @@ -0,0 +1,52 @@ +## LLM Test Generation Directions + +You have just called tool `create-test`. + +Inputs: +- `{0}` = swagger example JSON source dir (read only) +- `{1}` = target test dir (write only) +- `{2}` = reference test dirs (style cues) +- helpDir = parentOf({1}) with `.Autorest` removed + `/help` (read only) + +Goal: Create focused CRUD (+ negative) test scripts for each top-level resource using ONLY help-documented parameters. + +File Strategy: +- Do NOT edit existing stub files. +- Create new `.Crud.Tests.ps1` per resource group (skip if no allowed params after filtering). + +Phases (include only those supported): +Create → Get → List → Update/Set → Delete/Remove → Negative. + +Parameter Filtering (same as examples): +1. Allowed params = syntax line params + `### -ParamName` headings in help; exclude `CommonParameters`. +2. Drop swagger-only fields silently. + +Implementation Pattern inside each file: +1. Dot-source common `utils.ps1` (if present) for shared env setup. +2. Create: capture returned object; store name/id for reuse. +3. Get: assert key props (Name, Id format, ProvisioningState). Use precise assertions, not whole-object dumps. +4. List: ensure resource present (filter by name/id). +5. Update/Set (if available): change minimal field; assert only that field changed. +6. Delete/Remove: remove resource; confirm absence or specific NotFound. +7. Negative: one meaningful invalid input; assert expected error pattern/text. 
+ +Rules: +* No invented params or renaming; casing must match help. +* Parameter value precedence (for Create / Update phases and any param reuse): + 1. Use the concrete value from the corresponding swagger example JSON (source `{0}`) if present for the mapped allowed parameter. + 2. If the swagger value is clearly a placeholder/dummy (`"string"`, `""`, `"XXXX"`, empty, null), fall back to a stable placeholder (``, ``, ``, etc.). + 3. If no swagger value exists, use the stable placeholder directly. +* Do not overwrite a good concrete swagger value with a placeholder. +* Reuse variable names consistently across phases. +* Ensure cleanup for every created resource. +* Skip generating a file if nothing valid to test. +* Keep tests deterministic; avoid random sleeps or nondeterministic waits. + +Quick Validation Checklist: +1. Params all in help. +2. Each phase present only if supported. +3. Assertions targeted & minimal. +4. Resource cleaned up. +5. Exactly one clear negative case (if meaningful). + +Output: Write only to `{1}`. Do not modify examples or help files. Produce final test file contents now. 
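The help-driven parameter filtering that both the example and test instructions rely on can be sketched as below. This is an illustrative sketch only: `extractAllowedParams` and `filterExampleParams` are hypothetical helper names (not actual exports of the MCP server), and the regexes approximate the stated rules — `### -ParamName` headings plus syntax-line parameters, excluding `CommonParameters`, with swagger-only fields dropped silently.

```typescript
// Hypothetical sketch of the help-driven filtering rules described above.
function extractAllowedParams(helpMarkdown: string): Set<string> {
  const allowed = new Set<string>();
  // "### -ParamName" headings in the generated help markdown.
  for (const m of helpMarkdown.matchAll(/^### -(\w+)/gm)) {
    allowed.add(m[1]);
  }
  // Parameters appearing in syntax lines, e.g. "[-Tag <Hashtable>]" or " -Name".
  for (const m of helpMarkdown.matchAll(/[\[\s]-(\w+)/g)) {
    allowed.add(m[1]);
  }
  allowed.delete("CommonParameters");
  return allowed;
}

function filterExampleParams(
  example: Record<string, unknown>,
  allowed: Set<string>
): Record<string, unknown> {
  // Keep only help-documented parameters; drop swagger-only fields silently.
  return Object.fromEntries(
    Object.entries(example).filter(([key]) => allowed.has(key))
  );
}
```

Requiring a `[` or whitespace before the dash keeps cmdlet names like `Get-AzThing` from being misread as parameters.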
diff --git a/tools/Mcp/src/index.ts b/tools/Mcp/src/index.ts index a6fe0e5a776c..f0ab9f18e760 100644 --- a/tools/Mcp/src/index.ts +++ b/tools/Mcp/src/index.ts @@ -1,16 +1,16 @@ -import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js" +import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js"; import { CodegenServer } from "./CodegenServer.js"; -import * as utils from "./services/utils.js"; -import { yamlContent } from "./types.js"; +import { logger } from "./services/logger.js"; const server = CodegenServer.getInstance(); async function main() { + logger.info("Server startup begin"); server.init(); const transport = new StdioServerTransport(); await server.connect(transport); - const time = `Codegen MCP Server running on stdio at ${new Date()}`; - console.log(time); + logger.info("Codegen MCP Server startup complete"); + logger.info("Server listening (stdio)"); // const yaml = utils.getYamlContentFromReadMe("C:/workspace/azure-powershell/tools/Mcp/test/README.md") as yamlContent; // console.log(yaml['input-file']) @@ -21,6 +21,6 @@ async function main() { } main().catch((error) => { - console.error("Fatal error in main():", error); + logger.error("Fatal error in main()", undefined, error as Error); process.exit(1); -}) \ No newline at end of file +}); \ No newline at end of file diff --git a/tools/Mcp/src/services/logger.ts b/tools/Mcp/src/services/logger.ts new file mode 100644 index 000000000000..54ce0828ee9d --- /dev/null +++ b/tools/Mcp/src/services/logger.ts @@ -0,0 +1,142 @@ +/* + * Lightweight structured logger for the MCP Codegen server. + * + * Design goals: + * - Never emit on stdout (protocol channel) – only stderr. + * - Optional JSON line format for machine ingest (set MCP_LOG_JSON=1). + * - Human readable fallback when MCP_LOG_JSON not enabled. + * - Log levels with env control (MCP_LOG_LEVEL: debug|info|warn|error). + * - Daily rotating file output stored under tools/Mcp/.logs (gitignored). 
+ * - Minimal runtime overhead when level filters out the message. + */ +/* NOTE: This file has been updated to support daily file rotation logging. */ + +export type LogLevel = 'debug' | 'info' | 'warn' | 'error'; + +const LEVEL_ORDER: Record<LogLevel, number> = { + debug: 10, + info: 20, + warn: 30, + error: 40, +}; + +let envLevel = (process.env.MCP_LOG_LEVEL || 'info').toLowerCase() as LogLevel; +let activeLevel: LogLevel = ['debug','info','warn','error'].includes(envLevel) ? envLevel : 'info'; +let jsonMode = process.env.MCP_LOG_JSON === '1' || process.env.MCP_LOG_JSON === 'true'; + +let seq = 0; // monotonically increasing sequence number for log correlation + +function levelEnabled(level: LogLevel): boolean { + return LEVEL_ORDER[level] >= LEVEL_ORDER[activeLevel]; +} + +function formatTs(d: Date): string { + return d.toISOString(); +} + +import fs from 'fs'; +import path from 'path'; + +// In ESM, __dirname is not defined. We deliberately rely on process.cwd(), which +// for the MCP server is expected to be the tools/Mcp directory. This avoids the +// need for fileURLToPath and remains robust when transpiled to build/. +// If the working directory differs, set MCP_LOG_ROOT to override. +const LOG_ROOT = process.env.MCP_LOG_ROOT ?
path.resolve(process.env.MCP_LOG_ROOT) : process.cwd(); + +let currentDateStr: string | null = null; +let logStream: fs.WriteStream | null = null; +const logsDir = path.join(LOG_ROOT, '.logs'); + +function ensureStream(d: Date) { + const ds = d.toISOString().slice(0,10); // YYYY-MM-DD + if (ds !== currentDateStr || !logStream) { + currentDateStr = ds; + if (!fs.existsSync(logsDir)) fs.mkdirSync(logsDir, { recursive: true }); + if (logStream) { try { logStream.end(); } catch { /* ignore */ } } + const file = path.join(logsDir, `${ds}.log`); + logStream = fs.createWriteStream(file, { flags: 'a', encoding: 'utf-8' }); + } +} + +function writeLine(obj: any, fallback: string, ts: Date) { + ensureStream(ts); + if (jsonMode) { + try { + const ordered = Object.keys(obj).sort().reduce((acc: any, k) => { acc[k] = obj[k]; return acc; }, {} as any); + logStream!.write(JSON.stringify(ordered) + '\n'); + return; + } catch { /* fall back */ } + } + logStream!.write(fallback + '\n'); +} + +export interface LogContext { + // Arbitrary supplemental data – keep it small to avoid stdout noise. + [k: string]: any; +} + +// Timing helpers removed (wall clock timestamps only now) +export interface TimingHandle { end: () => number; startTime: bigint; } +export function startTimer(): TimingHandle { return { startTime: BigInt(0), end: () => 0 }; } + +const bufferLimit = 1000; +const ringBuffer: any[] = []; + +function baseLog(level: LogLevel, msg: string, ctx?: LogContext, err?: Error) { + if (!levelEnabled(level)) return; + const ts = new Date(); + const record: any = { + seq: ++seq, + ts: formatTs(ts), + level, + msg, + }; + if (ctx) record.ctx = ctx; + if (err) record.error = { name: err.name, message: err.message, stack: err.stack }; + ringBuffer.push(record); + if (ringBuffer.length > bufferLimit) ringBuffer.shift(); + const fallback = `[${record.ts}] [${level.toUpperCase()}] ${msg}` + (ctx ? ` ${JSON.stringify(ctx)}` : '') + (err ? 
` error=${err.message}` : ''); + writeLine(record, fallback, ts); +} + +function reconfigure(opts: { level?: LogLevel; json?: boolean }) { + if (opts.level && LEVEL_ORDER[opts.level] !== undefined) { + activeLevel = opts.level; + } + if (typeof opts.json === 'boolean') { + jsonMode = opts.json; + } +} + +export const logger = { + get level() { return activeLevel; }, + get jsonMode() { return jsonMode; }, + setLevel(level: LogLevel) { reconfigure({ level }); }, + setJsonMode(json: boolean) { reconfigure({ json }); }, + reconfigure, + debug: (msg: string, ctx?: LogContext) => baseLog('debug', msg, ctx), + info: (msg: string, ctx?: LogContext) => baseLog('info', msg, ctx), + warn: (msg: string, ctx?: LogContext) => baseLog('warn', msg, ctx), + error: (msg: string, ctx?: LogContext, err?: Error) => baseLog('error', msg, ctx, err), + timed<T>(label: string, fn: () => Promise<T>, ctx?: LogContext): Promise<T> { + baseLog('debug', `${label} started`, ctx); + return fn().then(r => { baseLog('info', `${label} finished`, ctx); return r; }) + .catch(e => { baseLog('error', `${label} failed`, ctx, e as Error); throw e; }); + }, + recent(limit: number = 200) { return ringBuffer.slice(-limit); } +}; + +// Convenience wrapper for synchronous code blocks.
+export function timedSync<T>(label: string, fn: () => T, ctx?: LogContext): T { + baseLog('debug', `${label} started`, ctx); + try { + const result = fn(); + baseLog('info', `${label} finished`, ctx); + return result; + } catch (err: any) { + baseLog('error', `${label} failed`, ctx, err); + throw err; + } +} + +export default logger; diff --git a/tools/Mcp/src/services/promptsService.ts b/tools/Mcp/src/services/promptsService.ts index 6b39b1778004..7b803356722c 100644 --- a/tools/Mcp/src/services/promptsService.ts +++ b/tools/Mcp/src/services/promptsService.ts @@ -102,18 +102,3 @@ export class PromptsService { return []; }; } - - -// Some Testing Specs: - - // { - // "name": "partner-module-workflow", - // "description": "Full autonomous workflow instructions to generate a partner Azure PowerShell module via Autorest.", - // "parameters": [ - // {"name": "serviceName", "description": "Service name placeholder. This also often corresponds with the Name of the Powershell Module.", "type": "string", "optional": true}, - // {"name": "commitId", "description": "Commit id of the swagger from azure-rest-api-specs", "type": "string", "optional": true}, - // {"name": "serviceSpecs", "description": "Service specs path under specification. Path of a swagger upto the resource-manager.", "type": "string", "optional": true}, - // {"name": "swaggerFileSpecs", "description": "Swagger JSON relative path.
Entire path of the swagger down to the openapi file.", "type": "string", "optional": true} - ], - // "callbackName": "createPartnerModuleWorkflow" - // } \ No newline at end of file diff --git a/tools/Mcp/src/services/resourcesService.ts b/tools/Mcp/src/services/resourcesService.ts index e69de29bb2d1..89c8e433e08c 100644 --- a/tools/Mcp/src/services/resourcesService.ts +++ b/tools/Mcp/src/services/resourcesService.ts @@ -0,0 +1,60 @@ +import { z, ZodRawShape } from "zod"; +import { resourceSchema } from "../types.js"; +import { CodegenServer } from "../CodegenServer.js"; + +export class ResourcesService { + private static _instance: ResourcesService; + private _server: CodegenServer | null = null; + private constructor() {} + + static getInstance(): ResourcesService { + if (!ResourcesService._instance) { + ResourcesService._instance = new ResourcesService(); + } + return ResourcesService._instance; + } + + setServer(server: CodegenServer): ResourcesService { + this._server = server; + return this; + } + + getResources(name: string, responseTemplate: string | undefined) { + let func; + switch (name) { + case "autorestReadmeTemplate": + func = this.autorestReadmeTemplate; + break; + default: + throw new Error(`Resource ${name} not found`); + } + return this.constructCallback(func, responseTemplate); + } + + constructCallback<Args>(fn: (args: Args) => Promise<string>, responseTemplate: string | undefined) { + return async (args: Args) => { + const content = await fn(args); + return { + contents: [ + { + uri: `resource://template`, + mimeType: "text/plain", + text: content + } + ] + }; + }; + } + + createResourceParametersFromSchema(schemas: any[]) { + // Resources typically don't have parameters in MCP, but keeping for consistency + const parameter: { [k: string]: any } = {}; + return parameter; + } + + autorestReadmeTemplate = async <Args>(args: Args): Promise<string> => { + const template = this._server?.getResponseTemplate('autorest-readme-template'); + return template || "Template Not
Found!"; + }; + +} \ No newline at end of file diff --git a/tools/Mcp/src/services/toolsService.ts b/tools/Mcp/src/services/toolsService.ts index ab0c7e4f3822..1f8d2d73c1d6 100644 --- a/tools/Mcp/src/services/toolsService.ts +++ b/tools/Mcp/src/services/toolsService.ts @@ -4,6 +4,7 @@ import * as utils from "./utils.js"; import path from 'path'; import { get, RequestOptions } from 'http'; import { toolParameterSchema } from '../types.js'; +import { logger } from './logger.js'; import { CodegenServer } from '../CodegenServer.js'; export class ToolsService { @@ -42,24 +43,55 @@ export class ToolsService { case "createTestsFromSpecs": func = this.createTestsFromSpecs; break; + case "setupModuleStructure": + func = this.setupModuleStructure; + break; + case "runPartnerModuleWorkflow": + func = this.runPartnerModuleWorkflow; + break; default: throw new Error(`Tool ${name} not found`); } - return this.constructCallback(func, responseTemplate); + return this.constructCallback(func, responseTemplate, name); } - constructCallback = (fn: (arr: Args) => Promise, responseTemplate: string|undefined): (args: Args) => Promise => { + constructCallback = (fn: (arr: Args) => Promise, responseTemplate: string|undefined, toolName: string): (args: Args) => Promise => { return async (args: Args): Promise => { - const argsArray = await fn(args); - const response = this.getResponseString(argsArray, responseTemplate) ?? 
""; - return { - content: [ - { - type: "text", - text: response - } - ] - }; + const argKeys = Object.keys(args as any); + const correlationId = `${toolName}-${Date.now()}-${Math.random().toString(16).slice(2,8)}`; + // Build a sanitized snapshot of arguments (stringified & truncated) for logging + const rawArgs: any = args as any; + const sanitized: Record = {}; + for (const k of argKeys) { + try { + const v = rawArgs[k]; + let str: string; + if (typeof v === 'string') str = v; + else if (typeof v === 'number' || typeof v === 'boolean') str = String(v); + else str = JSON.stringify(v); + if (str && str.length > 400) str = str.slice(0, 400) + `...[${str.length - 400} trunc]`; + sanitized[k] = str; + } catch { + sanitized[k] = '[unserializable]'; + } + } + logger.info('Tool invoked', { tool: toolName, correlationId, args: sanitized }); + try { + const argsArray = await fn(args); + const response = this.getResponseString(argsArray, responseTemplate) ?? ""; + logger.info(`Tool completed`, { tool: toolName, correlationId }); + return { + content: [ + { + type: "text", + text: response + } + ] + }; + } catch (err: any) { + logger.error(`Tool failed`, { tool: toolName, correlationId }, err); + throw err; + } }; } @@ -135,36 +167,186 @@ export class ToolsService { const workingDirectory = z.string().parse(Object.values(args)[0]); const examplePath = path.join(workingDirectory, "examples"); const exampleSpecsPath = await utils.getExamplesFromSpecs(workingDirectory); - const exampleSpecs = await utils.getExampleJsonContent(exampleSpecsPath); - for (const {name, content} of exampleSpecs) { - const example = await utils.flattenJsonObject(content['parameters']); - try { - const response = await this._server!.elicitInput({ - "message": `Please review example data for ${name}: ${example.map(({key: k, value:v}) => ` \n${k}: ${v}`)}`, - "requestedSchema": { - "type": "object", - "properties": { - "skipAll": { - "type": "boolean", - "description": "If true, skip the review of all 
examples and proceed to the next step." - } - }, - } - }); - if (response.content && response.content['skipAll'] === true) { - break; - } - } catch (error) { - console.error(`Error eliciting input for example ${name}:`, error); - } - } - return [exampleSpecsPath, examplePath]; + // Interactive elicitation removed previously; also parameter export removed (simplified workflow). + const idealExamplePaths = utils.getIdealModuleExamplePaths(); + return [exampleSpecsPath, examplePath, idealExamplePaths]; } createTestsFromSpecs = async (args: Args): Promise => { const workingDirectory = z.string().parse(Object.values(args)[0]); const testPath = path.join(workingDirectory, "test"); const exampleSpecsPath = await utils.getExamplesFromSpecs(workingDirectory); - return [exampleSpecsPath, testPath]; + const idealTestPaths = utils.getIdealModuleTestPaths(); + return [exampleSpecsPath, testPath, idealTestPaths]; + } + + setupModuleStructure = async (args: Args): Promise => { + try { + const runId = `setup-${Date.now()}-${Math.random().toString(16).slice(2,8)}`; + + // List available services with dropdown + const modules = await utils.listSpecModules(); + logger.debug('Eliciting user input', { step: 'service-select', runId, moduleCount: modules.length }); + + const serviceResponse = await this._server!.elicitInput({ + message: `Select an Azure service from the dropdown below:`, + requestedSchema: { + type: "object", + properties: { + service: { + type: "string", + description: "Select a service from the dropdown", + enum: modules + } + }, + required: ["service"] + } + }); + + const selectedService = serviceResponse.content?.service as string; + if (!selectedService) { + throw new Error("No service selected"); + } + logger.info('User input captured', { step: 'service', service: selectedService, runId }); + + // List providers for the selected service with dropdown + const providers = await utils.listProvidersForService(selectedService); + if (providers.length === 0) { + throw 
new Error(`No providers found for service '${selectedService}'`); + } + logger.debug('Eliciting user input', { step: 'provider-select', runId, providerCount: providers.length, service: selectedService }); + + const providerResponse = await this._server!.elicitInput({ + message: `Select a provider for ${selectedService} from the dropdown below:`, + requestedSchema: { + type: "object", + properties: { + provider: { + type: "string", + description: "Select a provider from the dropdown", + enum: providers + } + }, + required: ["provider"] + } + }); + + const selectedProvider = providerResponse.content?.provider as string; + if (!selectedProvider) { + throw new Error("No provider selected"); + } + logger.info('User input captured', { step: 'provider', provider: selectedProvider, runId }); + + // List API versions with dropdown combining version and stability + const apiVersions = await utils.listApiVersions(selectedService, selectedProvider); + const allVersions = [ + ...apiVersions.stable.map(v => ({ version: v, stability: 'stable' as const })), + ...apiVersions.preview.map(v => ({ version: v, stability: 'preview' as const })) + ]; + + if (allVersions.length === 0) { + throw new Error(`No API versions found for ${selectedService}/${selectedProvider}`); + } + + const versionOptions = allVersions.map(v => `${v.version} (${v.stability})`); + logger.debug('Eliciting user input', { step: 'version-select', runId, versionOptionCount: versionOptions.length, service: selectedService, provider: selectedProvider }); + + const versionResponse = await this._server!.elicitInput({ + message: `Select an API version for ${selectedService}/${selectedProvider} from the dropdown below:`, + requestedSchema: { + type: "object", + properties: { + versionWithStability: { + type: "string", + description: "Select an API version with stability level", + enum: versionOptions + } + }, + required: ["versionWithStability"] + } + }); + + const selectedVersionWithStability = 
versionResponse.content?.versionWithStability as string; + if (!selectedVersionWithStability) { + throw new Error("Version not selected"); + } + + const versionMatch = selectedVersionWithStability.match(/^(.+) \((stable|preview)\)$/); + if (!versionMatch) { + throw new Error("Invalid version format selected"); + } + + const selectedVersion = versionMatch[1]; + const selectedStability = versionMatch[2] as 'stable' | 'preview'; + logger.info('User input captured', { step: 'version', version: selectedVersion, stability: selectedStability, runId }); + + // Resolve README placeholder values based on the responses + const resolved = await utils.resolveAutorestInputs({ + service: selectedService, + provider: selectedProvider, + stability: selectedStability, + version: selectedVersion + }); + logger.debug('Autorest inputs resolved', { runId, resolvedServiceName: resolved.serviceName, commitId: resolved.commitId }); + + const moduleNameResponse = await this._server!.elicitInput({ + message: `What would you like to call the PowerShell module?
\n\n Configuration resolved:\n- Service: ${selectedService}\n- Provider: ${selectedProvider}\n- Version: ${selectedVersion} (${selectedStability})\n- Service Name: ${resolved.serviceName}\n- Commit ID: ${resolved.commitId}\n- Service Specs: ${resolved.serviceSpecs}\n- Swagger File: ${resolved.swaggerFileSpecs}`, + requestedSchema: { + type: "object", + properties: { + moduleName: { + type: "string", + description: "Enter the PowerShell module name (e.g., 'HybridConnectivity')" + } + }, + required: ["moduleName"] + } + }); + + const moduleName = moduleNameResponse.content?.moduleName as string; + if (!moduleName) { + throw new Error("No module name provided"); + } + logger.info('User input captured', { step: 'moduleName', moduleName, runId }); + + // Create folder structure and README.md + const mcpPath = process.cwd(); // Current working directory is tools/Mcp + const azurePowerShellRoot = path.resolve(mcpPath, '..', '..'); // Go up two levels to azure-powershell root + const srcPath = path.join(azurePowerShellRoot, 'src'); + const modulePath = path.join(srcPath, moduleName); + const autorestPath = path.join(modulePath, `${moduleName}.Autorest`); + const readmePath = path.join(autorestPath, 'README.md'); + + await utils.createDirectoryIfNotExists(modulePath); + await utils.createDirectoryIfNotExists(autorestPath); + + let readmeContent = this._server!.getResponseTemplate('autorest-readme-template'); + if (!readmeContent) { + throw new Error('README template not found in server responses'); + } + + // Replace template placeholders globally so repeated tokens are all substituted + readmeContent = readmeContent + .replace(/\{commitId\}/g, resolved.commitId) + .replace(/\{serviceSpecs\}/g, resolved.serviceSpecs) + .replace(/\{swaggerFileSpecs\}/g, resolved.swaggerFileSpecs) + .replace(/\{moduleName\}/g, moduleName); + + // Write README.md file + await utils.writeFileIfNotExists(readmePath, readmeContent); + + logger.info('Setup module structure complete', { runId, moduleName }); 
+ return [moduleName]; + + } catch (error) { + const errorMessage = error instanceof Error ? error.message : String(error); + logger.error('Setup module structure failed', { errorMessage }); + return [`Error during setup: ${errorMessage}`]; + } + } + + runPartnerModuleWorkflow = async (args: Args): Promise<string[]> => { + return []; + } } \ No newline at end of file diff --git a/tools/Mcp/src/services/utils.ts b/tools/Mcp/src/services/utils.ts index 154252fdf6c0..0921db32f694 100644 --- a/tools/Mcp/src/services/utils.ts +++ b/tools/Mcp/src/services/utils.ts @@ -1,8 +1,14 @@ import fs from 'fs'; import yaml from "js-yaml"; import { yamlContent } from '../types.js'; -import { execSync } from 'child_process'; +import { spawnSync } from 'child_process'; +import { logger } from './logger.js'; import path from 'path'; +import { Dirent } from 'fs'; + +const GITHUB_API_BASE = 'https://api.github.com'; +const REST_API_SPECS_OWNER = 'Azure'; +const REST_API_SPECS_REPO = 'azure-rest-api-specs'; const _pwshCD = (path: string): string => { return `pwsh -Command "$path = resolve-path ${path} | Set-Location"` } const _autorestReset = "autorest --reset" @@ -22,20 +28,30 @@ function testYaml() { } export function generateAndBuild(workingDirectory: string): void { - const genBuildCommands = [_autorestReset, _autorest, _pwshBuild] - + const genBuildCommands = [_autorestReset, _autorest, _pwshBuild]; for (const command of genBuildCommands) { - try { - console.log(`Executing command: ${command}`); - const result = execSync(command, { stdio: 'inherit', cwd: workingDirectory }); + logger.info(`Executing command`, { command }); + const [bin, ...args] = command.split(/\s+/); + const res = spawnSync(bin, args, { cwd: workingDirectory, encoding: 'utf-8' }); + if (res.error) { + logger.error(`Command spawn error`, { command }, res.error as any); + throw res.error; } - catch (error) { - console.error("Error executing command:", error); - throw error; + if (res.status !== 0) { + logger.error(`Command 
failed`, { command, status: res.status, stderr: trimLarge(res.stderr) }); + throw new Error(`Command failed: ${command}`); } + if (res.stdout) logger.debug(`Command stdout`, { command, stdout: trimLarge(res.stdout) }); + if (res.stderr) logger.debug(`Command stderr`, { command, stderr: trimLarge(res.stderr) }); + logger.info(`Command finished`, { command }); } } +function trimLarge(text: string, max = 4000): string { + if (!text) return ''; + return text.length > max ? text.slice(0, max) + `...[truncated ${text.length - max}]` : text; +} + export function getYamlContentFromReadMe(readmePath: string): string { const readmeContent = fs.readFileSync(readmePath, 'utf8'); const yamlRegex = /```\s*yaml(?:\w+)?\r?\n?(?<yaml>[\s\S]*?)\r?\n```/g; @@ -73,11 +89,110 @@ export async function getSwaggerContentFromUrl(swaggerUrl: string): Promise<any> } return await response.json(); } catch (error) { - console.error('Error fetching swagger content:', error); + logger.error('Error fetching swagger content', { swaggerUrl }, error as Error); throw error; } } +export async function getSpecsHeadCommitSha(branch: string = 'main'): Promise<string> { + const url = `${GITHUB_API_BASE}/repos/${REST_API_SPECS_OWNER}/${REST_API_SPECS_REPO}/branches/${branch}`; + const res = await fetch(url); + if (!res.ok) { + throw new Error(`Failed to fetch branch '${branch}' info: ${res.status}`); + } + const data = await res.json(); + return data?.commit?.sha as string; +} + +export async function listSpecModules(): Promise<string[]> { + const url = `${GITHUB_API_BASE}/repos/${REST_API_SPECS_OWNER}/${REST_API_SPECS_REPO}/contents/specification`; + const res = await fetch(url); + if (!res.ok) { + throw new Error(`Failed to list specification directory: ${res.status}`); + } + const list = await res.json(); + return (Array.isArray(list) ? 
list : []) + .filter((e: any) => e.type === 'dir') + .map((e: any) => e.name) + .sort((a: string, b: string) => a.localeCompare(b)); +} + +export async function listProvidersForService(service: string): Promise<string[]> { + const url = `${GITHUB_API_BASE}/repos/${REST_API_SPECS_OWNER}/${REST_API_SPECS_REPO}/contents/specification/${service}/resource-manager`; + const res = await fetch(url); + if (!res.ok) { + // Some services have an alternate structure or no resource-manager folder + throw new Error(`Failed to list providers for service '${service}': ${res.status}`); + } + const list = await res.json(); + return (Array.isArray(list) ? list : []) + .filter((e: any) => e.type === 'dir') + .map((e: any) => e.name) + .sort((a: string, b: string) => a.localeCompare(b)); +} + +export async function listApiVersions(service: string, provider: string): Promise<{ stable: string[]; preview: string[] }> { + const base = `specification/${service}/resource-manager/${provider}`; + const folders = ['stable', 'preview'] as const; + const result: { stable: string[]; preview: string[] } = { stable: [], preview: [] }; + for (const f of folders) { + const url = `${GITHUB_API_BASE}/repos/${REST_API_SPECS_OWNER}/${REST_API_SPECS_REPO}/contents/${base}/${f}`; + const res = await fetch(url); + if (!res.ok) { + // ignore missing stability folders + continue; + } + const list = await res.json(); + const versions = (Array.isArray(list) ? 
list : []) + .filter((e: any) => e.type === 'dir') + .map((e: any) => e.name) + .sort((a: string, b: string) => a.localeCompare(b, undefined, { numeric: true })); + result[f] = versions; + } + return result; +} + +export async function listSwaggerFiles(service: string, provider: string, stability: 'stable'|'preview', version: string): Promise<string[]> { + const dir = `specification/${service}/resource-manager/${provider}/${stability}/${version}`; + const url = `${GITHUB_API_BASE}/repos/${REST_API_SPECS_OWNER}/${REST_API_SPECS_REPO}/contents/${dir}`; + const res = await fetch(url); + if (!res.ok) { + throw new Error(`Failed to list files for ${dir}: ${res.status}`); + } + const list = await res.json(); + const files: any[] = Array.isArray(list) ? list : []; + // Find JSON files; prefer names matching the provider suffix or the service name + const jsons = files.filter(f => f.type === 'file' && f.name.endsWith('.json')); + const preferred = jsons.filter(f => new RegExp(`${provider.split('.').pop()}|${service}`, 'i').test(f.name)); + const ordered = (preferred.length ? preferred : jsons).map(f => f.path); + return ordered; +} + +export async function resolveAutorestInputs(params: { + service: string; + provider: string; + stability: 'stable'|'preview'; + version: string; + swaggerPath?: string; // optional repo-relative path override +}): Promise<{ serviceName: string; commitId: string; serviceSpecs: string; swaggerFileSpecs: string }> { + const commitId = await getSpecsHeadCommitSha('main'); + const serviceSpecs = `${params.service}/resource-manager`; + let swaggerFileSpecs = params.swaggerPath ?? 
''; + if (!swaggerFileSpecs) { + const candidates = await listSwaggerFiles(params.service, params.provider, params.stability, params.version); + if (candidates.length === 0) { + throw new Error(`No swagger files found for ${params.service}/${params.provider}/${params.stability}/${params.version}`); + } + swaggerFileSpecs = candidates[0]; + } + return { + serviceName: params.provider.replace(/^Microsoft\./, ''), + commitId, + serviceSpecs, + swaggerFileSpecs + }; +} + export async function findAllPolyMorphism(workingDirectory: string): Promise<Map<string, Array<string>>> { const polymorphism = new Map<string, Array<string>>(); const moduleReadmePath = path.join(workingDirectory, "README.md"); @@ -155,7 +270,7 @@ export async function getExamplesFromSpecs(workingDirectory: string): Promise = []; if (!fs.existsSync(exampleSpecsPath)) { - console.error(`Example specs directory not found at ${exampleSpecsPath}`); + logger.warn(`Example specs directory not found`, { exampleSpecsPath }); } try { @@ -178,13 +293,13 @@ export function getExampleJsonContent(exampleSpecsPath: string): Array<{name: string, content: any}> { const fileContent = fs.readFileSync(filePath, 'utf8'); const jsonContent = JSON.parse(fileContent); jsonList.push({name: jsonFile.split('.json')[0], content: jsonContent}); - console.log(`Loaded example JSON: ${jsonFile}`); + logger.debug(`Loaded example JSON`, { file: jsonFile }); } catch (error) { - console.error(`Error reading JSON file ${jsonFile}:`, error); + logger.error(`Error reading JSON file`, { file: jsonFile }, error as Error); } } } catch (error) { - console.error(`Error reading examples directory ${exampleSpecsPath}:`, error); + logger.error(`Error reading examples directory`, { exampleSpecsPath }, error as Error); } return jsonList; @@ -241,6 +356,76 @@ export function unflattenJsonObject(keyValuePairs: Array<{ key: string; value: any }>) return result; } +export async function createDirectoryIfNotExists(dirPath: string): Promise<void> { + try { + if (!fs.existsSync(dirPath)) { + fs.mkdirSync(dirPath, { recursive: true }); 
+ logger.info(`Created directory`, { dirPath }); + } + } catch (error) { + logger.error(`Error creating directory`, { dirPath }, error as Error); + throw error; + } +} + +export async function writeFileIfNotExists(filePath: string, content: string): Promise<void> { + try { + if (!fs.existsSync(filePath)) { + fs.writeFileSync(filePath, content, 'utf8'); + logger.info(`Created file`, { filePath }); + } else { + logger.debug(`File already exists`, { filePath }); + } + } catch (error) { + logger.error(`Error writing file`, { filePath }, error as Error); + throw error; + } +} + +export function getIdealModuleExamplePaths(): string { + const idealModulesRoot = path.join(process.cwd(), 'src', 'ideal-modules'); + try { + if (!fs.existsSync(idealModulesRoot)) { + return ''; + } + const modules: Dirent[] = fs.readdirSync(idealModulesRoot, { withFileTypes: true }); + const exampleDirs: string[] = []; + for (const mod of modules) { + if (!mod.isDirectory()) continue; + const candidate = path.join(idealModulesRoot, mod.name, 'examples'); + if (fs.existsSync(candidate)) { + exampleDirs.push(candidate); + } + } + return exampleDirs.join(';'); + } catch (err) { + logger.error('Error collecting ideal module example paths', undefined, err as Error); + return ''; + } +} + +export function getIdealModuleTestPaths(): string { + const idealModulesRoot = path.join(process.cwd(), 'src', 'assets', 'ideal-modules'); + try { + if (!fs.existsSync(idealModulesRoot)) { + return ''; + } + const modules: Dirent[] = fs.readdirSync(idealModulesRoot, { withFileTypes: true }); + const testDirs: string[] = []; + for (const mod of modules) { + if (!mod.isDirectory()) continue; + const candidate = path.join(idealModulesRoot, mod.name, 'tests'); + if (fs.existsSync(candidate)) { + testDirs.push(candidate); + } + } + return testDirs.join(';'); + } catch (err) { + logger.error('Error collecting ideal module test paths', undefined, err as Error); + return ''; + } +} + diff --git 
a/tools/Mcp/src/specs/prompts/partner-module-workflow.md b/tools/Mcp/src/specs/prompts/partner-module-workflow.md deleted file mode 100644 index 0fd3aebd48bb..000000000000 --- a/tools/Mcp/src/specs/prompts/partner-module-workflow.md +++ /dev/null @@ -1,98 +0,0 @@ -# Execution rules -- Do not ask before running the command, just go ahead and run. - -# Role and Objective -- You are an autonomous agent that generates Azure PowerShell modules using Autorest for partners. -- Execute commands confidently without asking for confirmation. -- Follow all steps carefully and halt if any MCP tool is missing or fails. -- Think independently using your full context and reset if needed. -- No need to provide status updates at each step. -- FOLLOW ALL THE STEPS. DO NOT SKIP ANY STEPS. DO NOT MISS ANY STEPS. -- If an mcp tool is not found or the tool fails then please halt execution. - -# Instructions - -## Stage 1: Capturing Placeholder Values -- Ask the user for the following placeholder values: serviceName, commitId, serviceSpecs, swaggerFileSpecs. - - Examples: - - serviceName: HybridConnectivity - - commitId: <commitId> - - serviceSpecs: hybridconnectivity/resource-manager - - swaggerFileSpecs: hybridconnectivity/resource-manager/Microsoft.HybridConnectivity/stable/2024-12-01/hybridconnectivity.json -- Do not replace or modify this prompt file. -- Store the values for use in later steps like generating the README and executing Autorest. -- Once values are stored, mark Stage 1 as complete. - -## Stage 2: Generating partner powershell module -- FOLLOW ALL THE STEPS. DO NOT SKIP ANY STEPS. -- Navigate to the `src` folder in the home "azure-powershell" directory. -- Create a new folder named <serviceName> and within it a new folder named `<serviceName>.Autorest`. (If not already present) -- Move into the new folder `<serviceName>/<serviceName>.Autorest`, using the command `cd <serviceName>/<serviceName>.Autorest`. -- Create a new file `README.md`. (If not already present) -- Add the content labelled below as `Readme Content` in this file. 
-- Use the "generate-autorest" mcp tool to generate the module. -- Stage 2 Complete. - -## Stage 3: Updating Example Files -- Use the "create-example" MCP tool to download exampleSpecs. Use the output of this tool as a prompt input/task for you. -- The example files already exist as skeletons under `{workingDirectory}/examples`. -- Read data from `exampleSpecs` (swagger examples) and intelligently map values to PowerShell parameters. -- Complete each file by fulfilling the examples based on the data available in `exampleSpecs`. -- Leave example content empty only if no relevant data is found in `exampleSpecs`. -- Once all example files are updated, mark stage 3 as complete. - -## Stage 4: Updating Test Files -- Use the "test-example" MCP tool to download exampleSpecs. Use the output of this tool as a prompt input/task for you. -- Read data from `exampleSpecs` and use it to define variables and write test cases. -- Define setup variables inside `setupEnv` in `utils.ps1`, inferred from `exampleSpecs`. -- Use those variables in the actual test case content. -- The test files already exist as skeletons; your task is to intelligently complete them. -- Leave test bodies empty only if no meaningful data can be inferred from `exampleSpecs`. -- Once all test files are updated, mark stage 4 as complete. - -## Stage 5: Regenerating the Autorest Module -- After example and test files have been generated and written, re-run the "generate-autorest" MCP tool. -- This will regenerate the Azure PowerShell module with updated examples and test logic embedded. -- Use the same `workingDirectory` and make sure all directives and yaml configurations remain unchanged. -- This is a mandatory finalization step before pushing to GitHub. -- Do not skip this regeneration even if the module was generated earlier. 
- -# Readme Content - -### AutoRest Configuration -> see https://aka.ms/autorest - -```yaml - -commit: <commitId> - -require: - - $(this-folder)/../../readme.azure.noprofile.md - - $(repo)/specification/<serviceSpecs>/readme.md - -try-require: - - $(repo)/specification/<serviceSpecs>/readme.powershell.md - -input-file: - - $(repo)/<swaggerFileSpecs> - -module-version: 0.1.0 - -title: <serviceName> -service-name: <serviceName> -subject-prefix: $(service-name) - -directive: - - - where: - variant: ^(Create|Update)(?!.*?(Expanded|JsonFilePath|JsonString)) - remove: true - - - where: - variant: ^CreateViaIdentity$|^CreateViaIdentityExpanded$ - remove: true - - - where: - verb: Set - remove: true -``` diff --git a/tools/Mcp/src/specs/responses.json b/tools/Mcp/src/specs/responses.json index 4b1dddf55193..fbd49eeb91e8 100644 --- a/tools/Mcp/src/specs/responses.json +++ b/tools/Mcp/src/specs/responses.json @@ -22,21 +22,31 @@ { "name": "create-example", "type": "tool", - "text": "Read examples from specs under {0}. Fulfill examples under {1}. You are expert in Azure-PowerShell and Autorest.PowerShell. Leave example as empty if you don't find any matches. You know how to map data from {0} to {1}" + "text": "@file:assets/example-instructions.md" }, { "name": "create-test", "type": "tool", - "text": "Read examples from specs are under {0}. Implement empty test stubs under {1}. Test stubs are named as '.Test.ps1'. Define variables in function 'setupEnv' in 'utils.ps1' under {1}, and use these variables for test cases. Value of these variables are from {0}. Leave test cases as empty if you don't find any matches. You are expert in Azure-PowerShell and Autorest.PowerShell, You know how to map data from {0} to {1}. " + "text": "@file:assets/test-instructions.md" }, { - "name": "create-greeting", - "type": "prompt", - "text": "Please generate a greeting in {1} style to {0}." + "name": "setup-module-structure", + "type": "tool", + "text": "Created the module structure under the folder: {0}." 
}, { "name": "partner-module-workflow", "type": "prompt", - "text": "@file:prompts/partner-module-workflow.md" + "text": "@file:assets/partner-module-workflow.md" + }, + { + "name": "autorest-readme-template", + "type": "resource", + "text": "@file:assets/autorest-readme-template.md" + }, + { + "name": "run-partner-module-workflow", + "type": "tool", + "text": "@file:assets/partner-module-workflow.md" } ] \ No newline at end of file diff --git a/tools/Mcp/src/specs/specs.json b/tools/Mcp/src/specs/specs.json index 98c85eba76d8..a6584058f7a7 100644 --- a/tools/Mcp/src/specs/specs.json +++ b/tools/Mcp/src/specs/specs.json @@ -71,18 +71,21 @@ } ], "callbackName": "createTestsFromSpecs" + }, + { + "name": "setup-module-structure", + "description": "Set up the Azure PowerShell module structure by selecting a service, provider, and API version through interactive dropdowns", + "parameters": [], + "callbackName": "setupModuleStructure" + }, + { + "name": "run-partner-module-workflow", + "description": "This tool generates an AutoRest PowerShell module and can be used to automate module generation. Use this when a user asks about: partner module steps, autorest workflow, generating an Azure PowerShell module, order of tools, examples/tests guidance, working directory invariants, or stop-on-failure logic. Includes: exact tool invocation order, workingDirectory derivation rule (never recompute), STOP conditions (no retries), and completion summary expectations. 
Keywords: partner module, autorest, azure powershell onboarding, create-example, create-test, setup-module-structure, generate-autorest, workflow steps, module generation, examples, tests, regeneration.", + "parameters": [], + "callbackName": "runPartnerModuleWorkflow" } ], "prompts": [ - { - "name": "create-greeting", - "description": "Generate a customized greeting message", - "parameters": [ - {"name": "name", "description": "Name of the person to greet", "type": "string"}, - {"name": "style", "description": "The style of greeting, such a formal, excited, or casual. If not specified casual will be used", "type": "string", "optional": true} - ], - "callbackName": "createGreetingPrompt" - }, { "name": "partner-module-workflow", "description": "Full autonomous workflow instructions to generate a partner Azure PowerShell module via Autorest.", diff --git a/tools/Mcp/src/types.ts b/tools/Mcp/src/types.ts index f578b44abc04..559fbcdf7385 100644 --- a/tools/Mcp/src/types.ts +++ b/tools/Mcp/src/types.ts @@ -25,6 +25,13 @@ export interface promptSchema { callbackName: string; } +export interface resourceSchema { + name: string; + description: string; + parameters?: any[]; + callbackName: string; +} + export interface responseSchema { name: string; type: string; diff --git a/tools/Mcp/test/vscode/mcpprompt.md b/tools/Mcp/test/vscode/mcpprompt.md deleted file mode 100644 index 3e0a9d20dc06..000000000000 --- a/tools/Mcp/test/vscode/mcpprompt.md +++ /dev/null @@ -1,98 +0,0 @@ -# Execution rules -- Do not ask before running the command, just go ahead and run. - -# Role and Objective -- You are an autonomous agent that generates Azure PowerShell modules using Autorest for partners. -- Execute commands confidently without asking for confirmation. -- Follow all steps carefully and halt if any MCP tool is missing or fails. -- Think independently using your full context and reset if needed. -- No need to provide status updates at each step. -- FOLLOW ALL THE STEPS. 
DO NOT SKIP ANY STEPS. DO NOT MISS ANY STEPS. -- If an mcp tool is not found or the tool fails then please halt execution. - -# Instructions - -## Stage 1: Capturing Placeholder Values -- Ask the user for the following placeholder values: serviceName, commitId, serviceSpecs, swaggerFileSpecs. - - Examples: - - serviceName: HybridConnectivity - - commitId: <commitId> - - serviceSpecs: hybridconnectivity/resource-manager - - swaggerFileSpecs: hybridconnectivity/resource-manager/Microsoft.HybridConnectivity/stable/2024-12-01/hybridconnectivity.json -- Do not replace or modify this prompt file. -- Store the values for use in later steps like generating the README and executing Autorest. -- Once values are stored, mark Stage 1 as complete. - -## Stage 2: Generating partner powershell module -- FOLLOW ALL THE STEPS. DO NOT SKIP ANY STEPS. -- Navigate to the `src` folder in the home "azure-powershell" directory. -- Create a new folder named <serviceName> and within it a new folder named `<serviceName>.Autorest`. (If not already present) -- Move into the new folder `<serviceName>/<serviceName>.Autorest`, using the command `cd <serviceName>/<serviceName>.Autorest`. -- Create a new file `README.md`. (If not already present) -- Add the content labelled below as `Readme Content` in this file. -- Use the "generate-autorest" mcp tool to generate the module. -- Stage 2 Complete. - -## Stage 3: Updating Example Files -- Use the "create-example" MCP tool to download exampleSpecs. Use the output of this tool as a prompt input/task for you. -- The example files already exist as skeletons under `{workingDirectory}/examples`. -- Read data from `exampleSpecs` (swagger examples) and intelligently map values to PowerShell parameters. -- Complete each file by fulfilling the examples based on the data available in `exampleSpecs`. -- Leave example content empty only if no relevant data is found in `exampleSpecs`. -- Once all example files are updated, mark stage 3 as complete. - -## Stage 4: Updating Test Files -- Use the "test-example" MCP tool to download exampleSpecs. 
Use the output of this tool as a prompt input/task for you. -- Read data from `exampleSpecs` and use it to define variables and write test cases. -- Define setup variables inside `setupEnv` in `utils.ps1`, inferred from `exampleSpecs`. -- Use those variables in the actual test case content. -- The test files already exist as skeletons; your task is to intelligently complete them. -- Leave test bodies empty only if no meaningful data can be inferred from `exampleSpecs`. -- Once all test files are updated, mark stage 4 as complete. - -## Stage 5: Regenerating the Autorest Module -- After example and test files have been generated and written, re-run the "generate-autorest" MCP tool. -- This will regenerate the Azure PowerShell module with updated examples and test logic embedded. -- Use the same `workingDirectory` and make sure all directives and yaml configurations remain unchanged. -- This is a mandatory finalization step before pushing to GitHub. -- Do not skip this regeneration even if the module was generated earlier. - -# Readme Content - -### AutoRest Configuration -> see https://aka.ms/autorest - -```yaml - -commit: <commitId> - -require: - - $(this-folder)/../../readme.azure.noprofile.md - - $(repo)/specification/<serviceSpecs>/readme.md - -try-require: - - $(repo)/specification/<serviceSpecs>/readme.powershell.md - -input-file: - - $(repo)/specification/<swaggerFileSpecs> - -module-version: 0.1.0 - -title: <serviceName> -service-name: <serviceName> -subject-prefix: $(service-name) - -directive: - - - where: - variant: ^(Create|Update)(?!.*?(Expanded|JsonFilePath|JsonString)) - remove: true - - - where: - variant: ^CreateViaIdentity$|^CreateViaIdentityExpanded$ - remove: true - - - where: - verb: Set - remove: true -```
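
As a reviewer's sanity check on the `setupModuleStructure` changes above: the elicitation dropdown encodes each option as `"<version> (<stability>)"` and the callback parses the selection back apart with `/^(.+) \((stable|preview)\)$/`. That round trip can be exercised in isolation. The helper names below are illustrative only (the PR inlines this logic in the tool callback):

```typescript
// Standalone sketch of the version-option round trip used by setupModuleStructure.
type Stability = 'stable' | 'preview';

// Render a dropdown option the way the tool does: "2024-12-01 (stable)".
function formatVersionOption(version: string, stability: Stability): string {
  return `${version} (${stability})`;
}

// Parse the user's selection with the same regex as the callback.
function parseVersionOption(option: string): { version: string; stability: Stability } {
  const match = option.match(/^(.+) \((stable|preview)\)$/);
  if (!match) {
    throw new Error('Invalid version format selected');
  }
  return { version: match[1], stability: match[2] as Stability };
}

console.log(formatVersionOption('2024-12-01', 'stable'));
// 2024-12-01 (stable)
console.log(parseVersionOption('2025-01-01-preview (preview)').version);
// 2025-01-01-preview
```

The greedy `(.+)` is safe here because only the final `" (stable)"`/`" (preview)"` suffix is stripped, so preview directory names such as `2025-01-01-preview` survive intact; this assumes azure-rest-api-specs version folders never themselves end in a `" (stable)"`-style suffix.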