Commit 7a7489e

Updates
1 parent af3ef67 commit 7a7489e

File tree

3 files changed: +9 -9 lines changed

azure-local/known-issues-2411.md

Lines changed: 6 additions & 6 deletions
@@ -44,7 +44,7 @@ The following table lists the known issues in this release:

|Feature |Issue |Workaround |
|---------|---------|---------|
Removed:
| Security vulnerability <!--ADO--> |Microsoft has identified a security vulnerability that could expose the local admin credentials used during the creation of Arc VMs on Azure Stack HCI to non-admin users on the VM and on the hosts. <br> Arc VMs running on releases prior to Azure Stack HCI 2411 release are vulnerable. |To identify the Arc VMs that require this change and to change the account passwords, see detailed instructions in: https://aka.ms/CVE-2024-49060.|
Added:
| Security vulnerability <!--ADO--> |Microsoft has identified a security vulnerability that could expose the local admin credentials used during the creation of Arc VMs on Azure Local to non-admin users on the VM and on the hosts. <br> Arc VMs running on releases prior to the Azure Local 2411 release are vulnerable. |To identify the Arc VMs that require this change and to change the account passwords, see detailed instructions in: https://aka.ms/CVE-2024-49060.|
| Update <!--ADO--> | With the 2411 release, applying a Solution Builder Extension package requires a separate update run. Solution and Solution Builder Extension update are not combined in a single update run. ||
| Update <!--ADO--> | When applying the solution update, the update fails at the "update ARB and extension" step with the error "Clear-AzContext failed with 0 and Exception calling "Initialize" with "1" argument(s): "Object reference not set to an instance of an object." |Follow these steps on each node of the system. <br> 1. Check if the `Az.Accounts` PowerShell module version 3.0.4 is installed. Run the following command: <br><br> `Get-InstalledModule Az.Accounts`<br><br> Verify that the version in the output is 3.0.4. <br><br> 2. Force-install `Az.Accounts` PowerShell module version 3.0.3. Run the following commands: <br><br> `Uninstall-Module -Name Az.Accounts -RequiredVersion 3.0.4 -Force`<br> `Install-Module -Name Az.Accounts -RequiredVersion 3.0.3 -Force` <br><br> 3. Confirm `Az.Accounts` PowerShell module version 3.0.3 is installed. Run the following command:<br><br> `Get-InstalledModule Az.Accounts` <br><br>Verify that the version in the output is 3.0.3. <br><br>4. Retry the update. |
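
For reference, the Az.Accounts workaround in the row above can be run as one script on each node. This is a minimal consolidation of the steps already listed in the table; the module name and versions are exactly as stated there:

```powershell
# Downgrade Az.Accounts from 3.0.4 to 3.0.3 on each node, then retry the update.

# 1. Check which version is currently installed (expected: 3.0.4).
Get-InstalledModule -Name Az.Accounts

# 2. Remove 3.0.4 and force-install 3.0.3.
Uninstall-Module -Name Az.Accounts -RequiredVersion 3.0.4 -Force
Install-Module -Name Az.Accounts -RequiredVersion 3.0.3 -Force

# 3. Confirm that 3.0.3 is now reported, then retry the solution update.
Get-InstalledModule -Name Az.Accounts
```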

@@ -67,11 +67,11 @@ The following table lists the known issues from previous releases:
| Update | There's an intermittent issue in this release when the Azure portal incorrectly reports the update status as **Failed to update** or **In progress** though the update is complete. |[Connect to your Azure Local instance](./update/update-via-powershell-23h2.md#connect-to-your-azure-local) via a remote PowerShell session. To confirm the update status, run the following PowerShell cmdlets: <br><br> `$Update = get-solutionupdate`\| `? version -eq "<version string>"`<br><br>Replace the version string with the version you're running. For example, "10.2405.0.23". <br><br>`$Update.state`<br><br>If the update status is **Installed**, no further action is required on your part. Azure portal refreshes the status correctly within 24 hours. <br> To refresh the status sooner, follow these steps on one of the cluster nodes. <br>Restart the Cloud Management cluster group.<br>`Stop-ClusterGroup "Cloud Management"`<br>`Start-ClusterGroup "Cloud Management"`|
| Update <!--28299865--> |During an initial MOC update, a failure occurs due to the target MOC version not being found in the catalog cache. The follow-up updates and retries show MOC in the target version, without the update succeeding, and as a result the Arc Resource Bridge update fails.<br><br>To validate this issue, collect the update logs using [Troubleshoot solution updates for Azure Local, version 23H2](./update/update-troubleshooting-23h2.md#collect-update-logs). The log files should show a similar error message (current version might differ in the error message):<br><br>`[ERROR: { "errorCode": "InvalidEntityError", "errorResponse": "{\n\"message\": \"the cloud fabric (MOC) is currently at version v0.13.1. A minimum version of 0.15.0 is required for compatibility\"\n}" }]`|Follow these steps to mitigate the issue:<br><br>1. To find the MOC agent version, run the following command: `'C:\Program Files\AksHci\wssdcloudagent.exe' version`.<br><br>2. Use the output of the command to find the MOC version from the table below that matches the agent version, and set `$initialMocVersion` to that MOC version. Set the `$targetMocVersion` by finding the Azure Local build you're updating to and get the matching MOC version from the following table. Use these values in the mitigation script provided below:<br><br><table><tr><td><b>Build</b></td><td><b>MOC version</b></td><td><b>Agent version</b></td></tr><tr><td>2311.2</td><td>1.0.24.10106</td><td>v0.13.0-6-gf13a73f7, v0.11.0-alpha.38,01/06/2024</td></tr><tr><td>2402</td><td>1.0.25.10203</td><td>v0.14.0, v0.13.1, 02/02/2024</td></tr><tr><td>2402.1</td><td>1.0.25.10302</td><td>v0.14.0, v0.13.1, 03/02/2024</td></tr><tr><td>2402.2</td><td>1.1.1.10314</td><td>v0.16.0-1-g04bf0dec, v0.15.1, 03/14/2024</td></tr><tr><td>2405/2402.3</td><td>1.3.0.10418</td><td>v0.17.1, v0.16.5, 04/18/2024</td></tr></table><br><br>For example, if the agent version is v0.13.0-6-gf13a73f7, v0.11.0-alpha.38,01/06/2024, then `$initialMocVersion = "1.0.24.10106"` and if you are updating to 2405.0.23, then `$targetMocVersion = "1.3.0.10418"`.<br><br>3. Run the following PowerShell commands on the first node:<br><br>`$initialMocVersion = "<initial version determined from step 2>"`<br>`$targetMocVersion = "<target version determined from step 2>"`<br><br># Import MOC module twice<br>`import-module moc`<br>`import-module moc`<br>`$verbosePreference = "Continue"`<br><br># Clear the SFS catalog cache<br>`Remove-Item (Get-MocConfig).manifestCache`<br><br># Set version to the current MOC version prior to update, and set state as update failed<br>`Set-MocConfigValue -name "version" -value $initialMocVersion`<br>`Set-MocConfigValue -name "installState" -value ([InstallState]::UpdateFailed)`<br><br># Rerun the MOC update to desired version<br>`Update-Moc -version $targetMocVersion`<br><br>4. Resume the update. |
| AKS on HCI |AKS cluster creation fails with the `Error: Invalid AKS network resource id`. This issue can occur when the associated logical network name has an underscore. |Underscores aren't supported in logical network names. Make sure to not use underscore in the names for logical networks deployed on your Azure Local instance. |
Removed:
| Update | When viewing the readiness check results for an Azure Stack HCI cluster via the Azure Update Manager, there might be multiple readiness checks with the same name. |There's no known workaround in this release. Select **View details** to view specific information about the readiness check. |
| Deployment<!--27312671--> | In some instances, during the registration of Azure Stack HCI servers, this error might be seen in the debug logs: *Encountered internal server error*. One of the mandatory extensions for device deployment might not be installed. |Follow these steps to mitigate the issue: <br><br> `$Settings = @{ "CloudName" = $Cloud; "RegionName" = $Region; "DeviceType" = "AzureEdge" }` <br><br> `New-AzConnectedMachineExtension -Name "AzureEdgeTelemetryAndDiagnostics" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.AzureStack.Observability" -Settings $Settings -ExtensionType "TelemetryAndDiagnostics" -EnableAutomaticUpgrade` <br><br> `New-AzConnectedMachineExtension -Name "AzureEdgeDeviceManagement" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.Edge" -ExtensionType "DeviceManagementExtension"`<br><br> `New-AzConnectedMachineExtension -Name "AzureEdgeLifecycleManager" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.AzureStack.Orchestration" -ExtensionType "LcmController"` <br><br>`New-AzConnectedMachineExtension -Name "AzureEdgeRemoteSupport" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.AzureStack.Observability" -ExtensionType "EdgeRemoteSupport" -EnableAutomaticUpgrade` |
| Update | There's an intermittent issue in this release when the Azure portal incorrectly reports the update status as **Failed to update** or **In progress** though the update is complete. |[Connect to your Azure Stack HCI](./update/update-via-powershell-23h2.md#connect-to-your-azure-local) via a remote PowerShell session. To confirm the update status, run the following PowerShell cmdlets: <br><br> `$Update = get-solutionupdate` \| `? version -eq "<version string>"`<br><br>Replace the version string with the version you're running. For example, "10.2405.0.23". <br><br>`$Update.state`<br><br>If the update status is **Installed**, no further action is required on your part. Azure portal refreshes the status correctly within 24 hours. <br> To refresh the status sooner, follow these steps on one of the cluster nodes. <br>Restart the Cloud Management cluster group.<br>`Stop-ClusterGroup "Cloud Management"`<br>`Start-ClusterGroup "Cloud Management"`|
| Update <!--28299865--> |During an initial MOC update, a failure occurs due to the target MOC version not being found in the catalog cache. The follow-up updates and retries show MOC in the target version, without the update succeeding, and as a result the Arc Resource Bridge update fails.<br><br>To validate this issue, collect the update logs using [Troubleshoot solution updates for Azure Stack HCI, version 23H2](./update/update-troubleshooting-23h2.md#collect-update-logs). The log files should show a similar error message (current version might differ in the error message):<br><br>`[ERROR: { "errorCode": "InvalidEntityError", "errorResponse": "{\n\"message\": \"the cloud fabric (MOC) is currently at version v0.13.1. A minimum version of 0.15.0 is required for compatibility\"\n}" }]`|Follow these steps to mitigate the issue:<br><br>1. To find the MOC agent version, run the following command: `'C:\Program Files\AksHci\wssdcloudagent.exe' version`.<br><br>2. Use the output of the command to find the MOC version from the table below that matches the agent version, and set `$initialMocVersion` to that MOC version. Set the `$targetMocVersion` by finding the Azure Stack HCI build you're updating to and get the matching MOC version from the following table. Use these values in the mitigation script provided below:<br><br><table><tr><td><b>Build</b></td><td><b>MOC version</b></td><td><b>Agent version</b></td></tr><tr><td>2311.2</td><td>1.0.24.10106</td><td>v0.13.0-6-gf13a73f7, v0.11.0-alpha.38,01/06/2024</td></tr><tr><td>2402</td><td>1.0.25.10203</td><td>v0.14.0, v0.13.1, 02/02/2024</td></tr><tr><td>2402.1</td><td>1.0.25.10302</td><td>v0.14.0, v0.13.1, 03/02/2024</td></tr><tr><td>2402.2</td><td>1.1.1.10314</td><td>v0.16.0-1-g04bf0dec, v0.15.1, 03/14/2024</td></tr><tr><td>2405/2402.3</td><td>1.3.0.10418</td><td>v0.17.1, v0.16.5, 04/18/2024</td></tr></table><br><br>For example, if the agent version is v0.13.0-6-gf13a73f7, v0.11.0-alpha.38,01/06/2024, then `$initialMocVersion = "1.0.24.10106"` and if you are updating to 2405.0.23, then `$targetMocVersion = "1.3.0.10418"`.<br><br>3. Run the following PowerShell commands on the first node:<br><br>`$initialMocVersion = "<initial version determined from step 2>"`<br>`$targetMocVersion = "<target version determined from step 2>"`<br><br># Import MOC module twice<br>`import-module moc`<br>`import-module moc`<br>`$verbosePreference = "Continue"`<br><br># Clear the SFS catalog cache<br>`Remove-Item (Get-MocConfig).manifestCache`<br><br># Set version to the current MOC version prior to update, and set state as update failed<br>`Set-MocConfigValue -name "version" -value $initialMocVersion`<br>`Set-MocConfigValue -name "installState" -value ([InstallState]::UpdateFailed)`<br><br># Rerun the MOC update to desired version<br>`Update-Moc -version $targetMocVersion`<br><br>4. Resume the update. |
| AKS on HCI |AKS cluster creation fails with the `Error: Invalid AKS network resource id`. This issue can occur when the associated logical network name has an underscore. |Underscores aren't supported in logical network names. Make sure to not use underscore in the names for logical networks deployed on your Azure Stack HCI. |
Added:
| Update | When viewing the readiness check results for an Azure Local cluster via the Azure Update Manager, there might be multiple readiness checks with the same name. |There's no known workaround in this release. Select **View details** to view specific information about the readiness check. |
| Deployment<!--27312671--> | In some instances, during the registration of Azure Local machines, this error might be seen in the debug logs: *Encountered internal server error*. One of the mandatory extensions for device deployment might not be installed. |Follow these steps to mitigate the issue: <br><br> `$Settings = @{ "CloudName" = $Cloud; "RegionName" = $Region; "DeviceType" = "AzureEdge" }` <br><br> `New-AzConnectedMachineExtension -Name "AzureEdgeTelemetryAndDiagnostics" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.AzureStack.Observability" -Settings $Settings -ExtensionType "TelemetryAndDiagnostics" -EnableAutomaticUpgrade` <br><br> `New-AzConnectedMachineExtension -Name "AzureEdgeDeviceManagement" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.Edge" -ExtensionType "DeviceManagementExtension"`<br><br> `New-AzConnectedMachineExtension -Name "AzureEdgeLifecycleManager" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.AzureStack.Orchestration" -ExtensionType "LcmController"` <br><br>`New-AzConnectedMachineExtension -Name "AzureEdgeRemoteSupport" -ResourceGroupName $ResourceGroup -MachineName $env:COMPUTERNAME -Location $Region -Publisher "Microsoft.AzureStack.Observability" -ExtensionType "EdgeRemoteSupport" -EnableAutomaticUpgrade` |
| Update | There's an intermittent issue in this release when the Azure portal incorrectly reports the update status as **Failed to update** or **In progress** though the update is complete. |[Connect to your Azure Local](./update/update-via-powershell-23h2.md#connect-to-your-azure-local) via a remote PowerShell session. To confirm the update status, run the following PowerShell cmdlets: <br><br> `$Update = get-solutionupdate` \| `? version -eq "<version string>"`<br><br>Replace the version string with the version you're running. For example, "10.2405.0.23". <br><br>`$Update.state`<br><br>If the update status is **Installed**, no further action is required on your part. Azure portal refreshes the status correctly within 24 hours. <br> To refresh the status sooner, follow these steps on one of the cluster nodes. <br>Restart the Cloud Management cluster group.<br>`Stop-ClusterGroup "Cloud Management"`<br>`Start-ClusterGroup "Cloud Management"`|
| Update <!--28299865--> |During an initial MOC update, a failure occurs due to the target MOC version not being found in the catalog cache. The follow-up updates and retries show MOC in the target version, without the update succeeding, and as a result the Arc Resource Bridge update fails.<br><br>To validate this issue, collect the update logs using [Troubleshoot solution updates for Azure Local, version 23H2](./update/update-troubleshooting-23h2.md#collect-update-logs). The log files should show a similar error message (current version might differ in the error message):<br><br>`[ERROR: { "errorCode": "InvalidEntityError", "errorResponse": "{\n\"message\": \"the cloud fabric (MOC) is currently at version v0.13.1. A minimum version of 0.15.0 is required for compatibility\"\n}" }]`|Follow these steps to mitigate the issue:<br><br>1. To find the MOC agent version, run the following command: `'C:\Program Files\AksHci\wssdcloudagent.exe' version`.<br><br>2. Use the output of the command to find the MOC version from the table below that matches the agent version, and set `$initialMocVersion` to that MOC version. Set the `$targetMocVersion` by finding the Azure Local build you're updating to and get the matching MOC version from the following table. Use these values in the mitigation script provided below:<br><br><table><tr><td><b>Build</b></td><td><b>MOC version</b></td><td><b>Agent version</b></td></tr><tr><td>2311.2</td><td>1.0.24.10106</td><td>v0.13.0-6-gf13a73f7, v0.11.0-alpha.38,01/06/2024</td></tr><tr><td>2402</td><td>1.0.25.10203</td><td>v0.14.0, v0.13.1, 02/02/2024</td></tr><tr><td>2402.1</td><td>1.0.25.10302</td><td>v0.14.0, v0.13.1, 03/02/2024</td></tr><tr><td>2402.2</td><td>1.1.1.10314</td><td>v0.16.0-1-g04bf0dec, v0.15.1, 03/14/2024</td></tr><tr><td>2405/2402.3</td><td>1.3.0.10418</td><td>v0.17.1, v0.16.5, 04/18/2024</td></tr></table><br><br>For example, if the agent version is v0.13.0-6-gf13a73f7, v0.11.0-alpha.38,01/06/2024, then `$initialMocVersion = "1.0.24.10106"` and if you are updating to 2405.0.23, then `$targetMocVersion = "1.3.0.10418"`.<br><br>3. Run the following PowerShell commands on the first node:<br><br>`$initialMocVersion = "<initial version determined from step 2>"`<br>`$targetMocVersion = "<target version determined from step 2>"`<br><br># Import MOC module twice<br>`import-module moc`<br>`import-module moc`<br>`$verbosePreference = "Continue"`<br><br># Clear the SFS catalog cache<br>`Remove-Item (Get-MocConfig).manifestCache`<br><br># Set version to the current MOC version prior to update, and set state as update failed<br>`Set-MocConfigValue -name "version" -value $initialMocVersion`<br>`Set-MocConfigValue -name "installState" -value ([InstallState]::UpdateFailed)`<br><br># Rerun the MOC update to desired version<br>`Update-Moc -version $targetMocVersion`<br><br>4. Resume the update. |
| AKS on HCI |AKS cluster creation fails with the `Error: Invalid AKS network resource id`. This issue can occur when the associated logical network name has an underscore. |Underscores aren't supported in logical network names. Make sure to not use underscore in the names for logical networks deployed on your Azure Local. |
| Repair server <!--27053788--> |In rare instances, the `Repair-Server` operation fails with the `HealthServiceWaitForDriveFW` error. In these cases, the old drives from the repaired node aren't removed and new disks are stuck in the maintenance mode. |To prevent this issue, make sure that you DO NOT drain the node either via the Windows Admin Center or using the `Suspend-ClusterNode -Drain` PowerShell cmdlet before you start `Repair-Server`. <br> If the issue occurs, contact Microsoft Support for next steps. |
| Repair server <!--27152397--> |This issue is seen when the single node Azure Local instance is updated from 2311 to 2402 and then the `Repair-Server` is performed. The repair operation fails. |Before you repair the single node, follow these steps:<br>1. Run version 2402 for the *ADPrepTool*. Follow the steps in [Prepare Active Directory](./deploy/deployment-prep-active-directory.md). This action is quick and adds the required permissions to the Organizational Unit (OU).<br>2. Move the computer object from **Computers** segment to the root OU. Run the following command:<br>`Get-ADComputer <HOSTNAME>` \| `Move-ADObject -TargetPath "<OU path>"`|
| Deployment <!--27118224--> |If you prepare the Active Directory on your own (not using the script and procedure provided by Microsoft), your Active Directory validation could fail with missing `Generic All` permission. This is due to an issue in the validation check that checks for a dedicated permission entry for `msFVE-RecoverInformationobjects – General – Permissions Full control`, which is required for BitLocker recovery. |Use the [Prepare AD script method](./deploy/deployment-prep-active-directory.md) or if using your own method, make sure to assign the specific permission `msFVE-RecoverInformationobjects – General – Permissions Full control`. |
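
For convenience, the MOC mitigation steps from the Update <!--28299865--> row can be collected into one script. The version values shown are the example pair used in the row's table (agent v0.13.0-6-gf13a73f7 initial, 2405.0.23 target); substitute the values that match your environment as described in step 2 of that workaround:

```powershell
# Run on the first node of the system.
$initialMocVersion = "1.0.24.10106"   # MOC version matching the currently installed agent (from the table)
$targetMocVersion  = "1.3.0.10418"    # MOC version matching the build you're updating to (from the table)

# Import the MOC module twice, as the workaround specifies.
Import-Module Moc
Import-Module Moc
$VerbosePreference = "Continue"

# Clear the SFS catalog cache.
Remove-Item (Get-MocConfig).manifestCache

# Set the recorded version back to the pre-update MOC version and mark the install state as failed.
Set-MocConfigValue -Name "version" -Value $initialMocVersion
Set-MocConfigValue -Name "installState" -Value ([InstallState]::UpdateFailed)

# Rerun the MOC update to the target version, then resume the solution update.
Update-Moc -Version $targetMocVersion
```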

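Similarly, the portal status check and Cloud Management restart described in the Update rows can be sketched as follows. The version string "10.2405.0.23" is the example from those rows; replace it with the version you're running:

```powershell
# In a remote PowerShell session to the system, confirm the actual update state.
$Update = Get-SolutionUpdate | Where-Object version -eq "10.2405.0.23"   # replace with your version string
$Update.State

# If the state is Installed, no action is needed; the portal catches up within 24 hours.
# To refresh the portal status sooner, restart the Cloud Management cluster group on one of the nodes.
Stop-ClusterGroup "Cloud Management"
Start-ClusterGroup "Cloud Management"
```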