articles/azure-monitor/agents/agent-windows-troubleshoot.md
If the query returns results, you need to determine if a particular data type is …
|Event ID |Source |Description |Resolution |
|---------|-------|------------|-----------|
|8000 |HealthService |This event indicates that a workflow related to performance counters, events, or another collected data type couldn't be forwarded to the service for ingestion into the workspace. |Event ID 2136 from source HealthService is written together with this event and can indicate that the agent is unable to communicate with the service. Possible reasons include misconfigured proxy and authentication settings, a network outage, or a network firewall or proxy that doesn't allow TCP traffic from the computer to the service.|
|10102 and 10103 |Health Service Modules |Workflow couldn't resolve the data source. |This issue can occur if the specified performance counter or instance doesn't exist on the computer or is incorrectly defined in the workspace data settings. If this is a user-specified [performance counter](data-sources-performance-counters.md#configure-performance-counters), verify the information specified follows the correct format and exists on the target computers. |
|26002 |Health Service Modules |Workflow couldn't resolve the data source. |This issue can occur if the specified Windows event log doesn't exist on the computer. This error can be safely ignored if the computer isn't expected to have this event log registered. Otherwise, if this is a user-specified [event log](data-sources-windows-events.md#configure-windows-event-logs), verify the information specified is correct. |
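To see whether any of the events above have actually been logged on an agent machine, you can query the Operations Manager event log directly. A minimal sketch, run in PowerShell on the monitored computer (the log name assumes a standard MMA installation):

```powershell
# Look for recent occurrences of the troubleshooting event IDs listed above.
Get-WinEvent -FilterHashtable @{
    LogName = 'Operations Manager'
    Id      = 8000, 10102, 10103, 26002
} -MaxEvents 50 -ErrorAction SilentlyContinue |
    Select-Object TimeCreated, Id, ProviderName |
    Format-Table -AutoSize
```

If this returns nothing, either the agent hasn't hit these conditions recently or the Operations Manager log isn't present on the machine.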
## Pinned Certificate Issues with Older Microsoft Monitoring Agents - Breaking Change
*Root CA Change Overview*
As of 30 June 2023, the Log Analytics back end no longer accepts connections from MMAs that reference an outdated root certificate. These are MMA versions released before the Winter 2020 release of the Log Analytics agent and before SCOM 2019 UR3. Agents at Bundle 10.20.18053 / Extension 1.0.18053.0 or later, and SCOM 2019 UR3 or later, aren't affected. Any agent older than that will break and will no longer upload to Log Analytics.
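To check whether a particular machine is affected, one option is to compare the installed agent's file version against the first unaffected version mentioned above. This is a sketch, assuming the agent's default installation path; adjust the path if your agent is installed elsewhere:

```powershell
# Compare the installed MMA (HealthService.exe) version against the first unaffected version.
$fixedVersion  = [version]'10.20.18053.0'
$healthService = Get-Item "$env:ProgramFiles\Microsoft Monitoring Agent\Agent\HealthService.exe" -ErrorAction SilentlyContinue

if (-not $healthService) {
    Write-Host 'Microsoft Monitoring Agent not found at the default path.'
}
elseif ([version]$healthService.VersionInfo.FileVersion -lt $fixedVersion) {
    Write-Host "Installed version $($healthService.VersionInfo.FileVersion) is affected; upgrade the agent."
}
else {
    Write-Host 'Installed version is at or above the fixed version.'
}
```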
*What exactly is changing?*
As part of an ongoing security effort across various Azure services, Azure Log Analytics is officially switching from the Baltimore CyberTrust CA Root to the [DigiCert Global G2 CA Root](https://www.digicert.com/kb/digicert-root-certificates.htm). This change affects TLS communications with Log Analytics if the new DigiCert Global G2 CA Root certificate is missing from the OS, or if the application references the old Baltimore Root CA. **What this means is that Log Analytics will no longer accept connections from MMAs that use this old root CA after it's retired.**
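You can verify whether the new root certificate is present in the machine's Trusted Root store with a quick check like the following. This is a sketch that matches on the certificate's subject name rather than a pinned thumbprint:

```powershell
# Look for the DigiCert Global Root G2 certificate in the LocalMachine Trusted Root store.
$digicertG2 = Get-ChildItem Cert:\LocalMachine\Root |
    Where-Object { $_.Subject -match 'DigiCert Global Root G2' }

if ($digicertG2) {
    Write-Host "Found: $($digicertG2.Subject)"
}
else {
    Write-Host 'DigiCert Global Root G2 is not in the Trusted Root store; update the OS root certificates.'
}
```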
*Solution products*
You may have received the breaking change notification even if you haven't personally installed the Microsoft Monitoring Agent. That's because various Azure products leverage the Windows Log Analytics agent, so you may be affected if you use one of them. For the products linked below, there may be specific instructions that require you to upgrade to the latest agent.
- VM Insights
- [System Center Operations Manager (SCOM)](/system-center/scom/deploy-upgrade-agents)
- [System Center Service Manager (SCSM)](/system-center/scsm/upgrade-service-manager)
- [Microsoft Defender for Server](/microsoft-365/security/defender-endpoint/update-agent-mma-windows)
- [Microsoft Defender for Endpoint](/microsoft-365/security/defender-endpoint/update-agent-mma-windows)
For deployments with a limited number of agents, we highly recommend upgrading your agent per node via [these management instructions](https://aka.ms/MMA-Upgrade).
For deployments with multiple nodes, we've written scripts that detect any affected MMAs per subscription and then upgrade them to the latest version. Run the scripts sequentially, starting with UpdateMMA.ps1 and then UpgradeMMA.ps1. Depending on the machines, the scripts may take a while. PowerShell 7 or later is required to avoid a timeout.
*UpdateMMA.ps1*
This script goes through the VMs in your subscription, checks whether an MMA is installed, and generates a .csv file listing the agents that need to be upgraded.
*UpgradeMMA.ps1*
This script uses the .csv file generated by UpdateMMA.ps1 to upgrade all affected MMAs.
Both of these scripts may take a while to complete.
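A typical end-to-end run, assuming both scripts are in the current directory and the Azure CLI is signed in to the target subscription, might look like this (the `GetInventory` and `Upgrade` arguments come from the script's own usage comments):

```powershell
# Select the subscription the scripts should operate on.
az account set --subscription '<subscription-id>'

# 1. Generate the .csv inventory of agents that need upgrading.
.\UpdateMMA.ps1 GetInventory

# 2. Optionally edit the generated .csv, then run the upgrade pass.
.\UpdateMMA.ps1 Upgrade
```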
# [UpdateMMA](#tab/UpdateMMA)
```powershell
# UpdateMMA.ps1
# This script is to be run per subscription; set the az subscription before running it within the terminal scope.
# This script uses parallel processing; modify the $parallelThrottleLimit parameter to increase or decrease the number of parallel processes.
# PS> .\UpdateMMA.ps1 GetInventory
# The above command generates a .csv file with the details of the VMs and virtual machine scale sets that require an MMA upgrade.
# You can modify the .csv by adding/removing rows if needed.
# Update the MMA by running the script again and passing the .csv file as a parameter, as shown below:
# PS> .\UpdateMMA.ps1 Upgrade
# If you don't want to check the inventory, run the script with an additional -no-inventory-check

# Require PowerShell 7 or newer (condition reconstructed; the original check was elided from this excerpt).
if ($PSVersionTable.PSVersion.Major -lt 7)
{
    Write-Host "This script requires Powershell version 7 or newer to run. Please see https://docs.microsoft.com/powershell/scripting/whats-new/migrating-from-windows-powershell-51-to-powershell-7?view=powershell-7.1."
    exit 1
}

$parallelThrottleLimit = 16
$mmaFixVersion = [version]"10.20.18053.0"

function GetVmsWithMMAInstalled
{
    param(
        $fileName
    )

    $vmList = az vm list --show-details --query "[?powerState=='VM running'].{ResourceGroup:resourceGroup, VmName:name}" | ConvertFrom-Json

    if (!$vmList)
    {
        Write-Host "Cannot get the VM list; this script can only detect running VMs"
        return
    }

    $vmsCount = $vmList.Length

    $vmParallelThrottleLimit = $parallelThrottleLimit
    if ($vmsCount -lt $vmParallelThrottleLimit)
    {
        $vmParallelThrottleLimit = $vmsCount
    }

    if ($vmsCount -eq 1)
    {
        $vmGroups += ,($vmList[0])
    }
    else
    {
        # split the VMs into batches to do parallel processing
        for ($i = 0; $i -lt $vmsCount; $i += $vmParallelThrottleLimit)
        # ...

# ...

$isMMAExtensionInstalled = az vmss extension list -g $resourceGroup --vmss-name $vmssName --query "[?name == 'MicrosoftMonitoringAgent'].name" | ConvertFrom-Json
if ($isMMAExtensionInstalled)
{
    # check an instance in the scale set to see if it needs an MMA upgrade. Since the extension is installed at the VMSS level, checking one instance for the bad version should be fine.
    # ...
```