
Commit bf699c1

rcj1 and noahfalk authored
Dotnet-counters docs updates w.r.t UI and list of counters (#48111)
* updating dotnet-counters UI docs and removing dotnet-counters --list command
* linting
* Fix formatting of counters description in markdown
* UI changes, versioning notices, and other edits
* Update docs/core/diagnostics/dotnet-counters.md

Co-authored-by: Noah Falk <[email protected]>
1 parent 3d30fe1 commit bf699c1

6 files changed: +310 -276 lines changed

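The diffs below assume the .NET 9 (9.x) release of dotnet-counters. As a quick, hedged sketch for readers following along, these standard `dotnet tool` commands check which version is installed and update it; only the tool name comes from this commit, the rest is ordinary `dotnet tool` usage:

```dotnetcli
# List globally installed .NET tools; dotnet-counters and its version appear here.
dotnet tool list --global

# Update an older installation to the latest released version.
dotnet tool update --global dotnet-counters
```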

docs/core/diagnostics/debug-highcpu.md

Lines changed: 99 additions & 51 deletions
@@ -32,60 +32,85 @@ The tutorial uses:
 
 ## CPU counters
 
-Before attempting to collect diagnostics data, you need to observe a high CPU condition. Run the [sample application](/samples/dotnet/samples/diagnostic-scenarios) using the following command from the project root directory.
+Before attempting this tutorial, please install the latest version of dotnet-counters:
 
-```dotnetcli
-dotnet run
-```
+```dotnetcli
+dotnet tool install --global dotnet-counters
+```
 
-To find the process ID, use the following command:
+If your app is running a version of .NET older than .NET 9, the output UI of dotnet-counters will look slightly different; see [dotnet-counters](dotnet-counters.md) for details.
+
+Before attempting to collect diagnostics data, you need to observe a high CPU condition. Run the [sample application](/samples/dotnet/samples/diagnostic-scenarios) using the following command from the project root directory.
 
 ```dotnetcli
-dotnet-trace ps
+dotnet run
 ```
 
-Take note of the process ID from your command output. Our process ID was `22884`, but yours will be different. To check the current CPU usage, use the [dotnet-counters](dotnet-counters.md) tool command:
+To check the current CPU usage, use the [dotnet-counters](dotnet-counters.md) tool command:
 
 ```dotnetcli
-dotnet-counters monitor --refresh-interval 1 -p 22884
+dotnet-counters monitor -n DiagnosticScenarios --showDeltas
 ```
 
-The `refresh-interval` is the number of seconds between the counter polling CPU values. The output should be similar to the following:
+The output should be similar to the following:
 
 ```console
 Press p to pause, r to resume, q to quit.
 Status: Running
 
+Name                                                       Current Value    Last Delta
 [System.Runtime]
-    % Time in GC since last GC (%)                         0
-    Allocation Rate / 1 sec (B)                            0
-    CPU Usage (%)                                          0
-    Exception Count / 1 sec                                0
-    GC Heap Size (MB)                                      4
-    Gen 0 GC Count / 60 sec                                0
-    Gen 0 Size (B)                                         0
-    Gen 1 GC Count / 60 sec                                0
-    Gen 1 Size (B)                                         0
-    Gen 2 GC Count / 60 sec                                0
-    Gen 2 Size (B)                                         0
-    LOH Size (B)                                           0
-    Monitor Lock Contention Count / 1 sec                  0
-    Number of Active Timers                                1
-    Number of Assemblies Loaded                          140
-    ThreadPool Completed Work Item Count / 1 sec           3
-    ThreadPool Queue Length                                0
-    ThreadPool Thread Count                                7
-    Working Set (MB)                                      63
+    dotnet.assembly.count ({assembly})                               111             0
+    dotnet.gc.collections ({collection})
+        gc.heap.generation
+        ------------------
+        gen0                                                           8             0
+        gen1                                                           1             0
+        gen2                                                           0             0
+    dotnet.gc.heap.total_allocated (By)                        4,042,656        24,512
+    dotnet.gc.last_collection.heap.fragmentation.size (By)
+        gc.heap.generation
+        ------------------
+        gen0                                                     801,728             0
+        gen1                                                       6,048             0
+        gen2                                                           0             0
+        loh                                                            0             0
+        poh                                                            0             0
+    dotnet.gc.last_collection.heap.size (By)
+        gc.heap.generation
+        ------------------
+        gen0                                                     811,512             0
+        gen1                                                     562,024             0
+        gen2                                                   1,095,056             0
+        loh                                                       98,384             0
+        poh                                                       24,528             0
+    dotnet.gc.last_collection.memory.committed_size (By)      5,623,808             0
+    dotnet.gc.pause.time (s)                                       0.019             0
+    dotnet.jit.compilation.time (s)                                0.582             0
+    dotnet.jit.compiled_il.size (By)                             138,895             0
+    dotnet.jit.compiled_methods ({method})                         1,470             0
+    dotnet.monitor.lock_contentions ({contention})                     4             0
+    dotnet.process.cpu.count ({cpu})                                  22             0
+    dotnet.process.cpu.time (s)
+        cpu.mode
+        --------
+        system                                                     0.109             0
+        user                                                       0.453             0
+    dotnet.process.memory.working_set (By)                    65,515,520             0
+    dotnet.thread_pool.queue.length ({work_item})                      0             0
+    dotnet.thread_pool.thread.count ({thread})                         0             0
+    dotnet.thread_pool.work_item.count ({work_item})                   6             0
+    dotnet.timer.count ({timer})                                       0             0
 ```
 
-With the web app running, immediately after startup, the CPU isn't being consumed at all and is reported at `0%`. Navigate to the `api/diagscenario/highcpu` route with `60000` as the route parameter:
+Focusing on the `Last Delta` values of `dotnet.process.cpu.time`, these tell you how many seconds of CPU activity occurred within the refresh period (currently set to the default of 1 second). With the web app running, immediately after startup, the CPU isn't being consumed at all and these deltas are both `0`. Navigate to the `api/diagscenario/highcpu` route with `60000` as the route parameter:
 
 `https://localhost:5001/api/diagscenario/highcpu/60000`
 
-Now, rerun the [dotnet-counters](dotnet-counters.md) command. If interested in monitoring just the `cpu-usage` counter, add `--counters System.Runtime[cpu-usage]` to the previous command. We are unsure if the CPU is being consumed, so we will monitor the same list of counters as above to verify counter values are within expected range for our application.
+Now, rerun the [dotnet-counters](dotnet-counters.md) command.
 
 ```dotnetcli
-dotnet-counters monitor -p 22884 --refresh-interval 1
+dotnet-counters monitor -n DiagnosticScenarios --showDeltas
 ```
 
 You should see an increase in CPU usage as shown below (depending on the host machine, expect varying CPU usage):
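Between these two hunks, a quick aside: the removed prose above mentioned narrowing the monitor to a single counter with `--counters System.Runtime[cpu-usage]`. A comparable sketch against the updated command follows; `--counters` is a documented dotnet-counters option, but applying the bracket filter to the new .NET 9+ instrument name is an assumption, not something this commit confirms:

```dotnetcli
# Monitor only the System.Runtime provider instead of every default counter.
dotnet-counters monitor -n DiagnosticScenarios --showDeltas --counters System.Runtime

# Assumed: the old bracket filter syntax applied to a .NET 9+ instrument name.
dotnet-counters monitor -n DiagnosticScenarios --counters "System.Runtime[dotnet.process.cpu.time]"
```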
@@ -94,29 +119,52 @@ You should see an increase in CPU usage as shown below (depending on the host machine, expect varying CPU usage):
 Press p to pause, r to resume, q to quit.
 Status: Running
 
+Name                                                       Current Value    Last Delta
 [System.Runtime]
-    % Time in GC since last GC (%)                         0
-    Allocation Rate / 1 sec (B)                            0
-    CPU Usage (%)                                         25
-    Exception Count / 1 sec                                0
-    GC Heap Size (MB)                                      4
-    Gen 0 GC Count / 60 sec                                0
-    Gen 0 Size (B)                                         0
-    Gen 1 GC Count / 60 sec                                0
-    Gen 1 Size (B)                                         0
-    Gen 2 GC Count / 60 sec                                0
-    Gen 2 Size (B)                                         0
-    LOH Size (B)                                           0
-    Monitor Lock Contention Count / 1 sec                  0
-    Number of Active Timers                                1
-    Number of Assemblies Loaded                          140
-    ThreadPool Completed Work Item Count / 1 sec           3
-    ThreadPool Queue Length                                0
-    ThreadPool Thread Count                                7
-    Working Set (MB)                                      63
+    dotnet.assembly.count ({assembly})                               111             0
+    dotnet.gc.collections ({collection})
+        gc.heap.generation
+        ------------------
+        gen0                                                           8             0
+        gen1                                                           1             0
+        gen2                                                           0             0
+    dotnet.gc.heap.total_allocated (By)                        4,042,656        24,512
+    dotnet.gc.last_collection.heap.fragmentation.size (By)
+        gc.heap.generation
+        ------------------
+        gen0                                                     801,728             0
+        gen1                                                       6,048             0
+        gen2                                                           0             0
+        loh                                                            0             0
+        poh                                                            0             0
+    dotnet.gc.last_collection.heap.size (By)
+        gc.heap.generation
+        ------------------
+        gen0                                                     811,512             0
+        gen1                                                     562,024             0
+        gen2                                                   1,095,056             0
+        loh                                                       98,384             0
+        poh                                                       24,528             0
+    dotnet.gc.last_collection.memory.committed_size (By)      5,623,808             0
+    dotnet.gc.pause.time (s)                                       0.019             0
+    dotnet.jit.compilation.time (s)                                0.582             0
+    dotnet.jit.compiled_il.size (By)                             138,895             0
+    dotnet.jit.compiled_methods ({method})                         1,470             0
+    dotnet.monitor.lock_contentions ({contention})                     4             0
+    dotnet.process.cpu.count ({cpu})                                  22             0
+    dotnet.process.cpu.time (s)
+        cpu.mode
+        --------
+        system                                                     0.344         0.013
+        user                                                      14.203         0.963
+    dotnet.process.memory.working_set (By)                    65,515,520             0
+    dotnet.thread_pool.queue.length ({work_item})                      0             0
+    dotnet.thread_pool.thread.count ({thread})                         0             0
+    dotnet.thread_pool.work_item.count ({work_item})                   6             0
+    dotnet.timer.count ({timer})                                       0             0
 ```
 
-Throughout the duration of the request, the CPU usage will hover around the increased percentage.
+Throughout the duration of the request, the CPU usage will hover around the increased value.
 
 > [!TIP]
 > To visualize an even higher CPU usage, you can exercise this endpoint in multiple browser tabs simultaneously.
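To keep a record of the high-CPU window instead of watching it live, the same counters can be written to disk with the tool's `collect` verb. A minimal sketch; `collect`, `--format`, and `-o` are documented dotnet-counters options, and the output file name is illustrative:

```dotnetcli
# Persist the counters to CSV for later comparison; press q to stop collecting.
dotnet-counters collect -n DiagnosticScenarios --format csv -o highcpu-counters.csv
```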

docs/core/diagnostics/debug-memory-leak.md

Lines changed: 49 additions & 23 deletions
@@ -34,6 +34,8 @@ The tutorial uses:
 
 The tutorial assumes the sample apps and tools are installed and ready to use.
 
+If your app is running a version of .NET older than .NET 9, the output UI of dotnet-counters will look slightly different; see [dotnet-counters](dotnet-counters.md) for details.
+
 ## Examine managed memory usage
 
 Before you start collecting diagnostic data to help root cause this scenario, make sure you're actually seeing a memory leak (growth in memory usage). You can use the [dotnet-counters](dotnet-counters.md) tool to confirm that.
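The monitoring command itself sits between these two hunks and is not touched by this commit; for context, a sketch of the kind of invocation that produces the output below. The process name matches the sample app, and `--refresh-interval` is optional (it already defaults to 1 second), so treat this as an assumption rather than the tutorial's exact command:

```dotnetcli
# Watch the sample app's runtime counters; memory growth shows up in the GC heap instruments.
dotnet-counters monitor -n DiagnosticScenarios --refresh-interval 1
```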
@@ -73,44 +75,68 @@ The live output should be similar to:
 
 ```console
 Press p to pause, r to resume, q to quit.
-Status: Running
+Status: Running
 
+Name                                                       Current Value
 [System.Runtime]
-    # of Assemblies Loaded                           118
-    % Time in GC (since last GC)                       0
-    Allocation Rate (Bytes / sec)                 37,896
-    CPU Usage (%)                                      0
-    Exceptions / sec                                   0
-    GC Heap Size (MB)                                  4
-    Gen 0 GC / sec                                     0
-    Gen 0 Size (B)                                     0
-    Gen 1 GC / sec                                     0
-    Gen 1 Size (B)                                     0
-    Gen 2 GC / sec                                     0
-    Gen 2 Size (B)                                     0
-    LOH Size (B)                                       0
-    Monitor Lock Contention Count / sec                0
-    Number of Active Timers                            1
-    ThreadPool Completed Work Items / sec             10
-    ThreadPool Queue Length                            0
-    ThreadPool Threads Count                           1
-    Working Set (MB)                                  83
+    dotnet.assembly.count ({assembly})                               111
+    dotnet.gc.collections ({collection})
+        gc.heap.generation
+        ------------------
+        gen0                                                           1
+        gen1                                                           0
+        gen2                                                           0
+    dotnet.gc.heap.total_allocated (By)                        4,431,712
+    dotnet.gc.last_collection.heap.fragmentation.size (By)
+        gc.heap.generation
+        ------------------
+        gen0                                                     803,576
+        gen1                                                      15,456
+        gen2                                                           0
+        loh                                                            0
+        poh                                                            0
+    dotnet.gc.last_collection.heap.size (By)
+        gc.heap.generation
+        ------------------
+        gen0                                                     811,960
+        gen1                                                   1,214,720
+        gen2                                                           0
+        loh                                                            0
+        poh                                                       24,528
+    dotnet.gc.last_collection.memory.committed_size (By)      4,296,704
+    dotnet.gc.pause.time (s)                                       0.003
+    dotnet.jit.compilation.time (s)                                0.329
+    dotnet.jit.compiled_il.size (By)                             120,212
+    dotnet.jit.compiled_methods ({method})                         1,202
+    dotnet.monitor.lock_contentions ({contention})                     2
+    dotnet.process.cpu.count ({cpu})                                  22
+    dotnet.process.cpu.time (s)
+        cpu.mode
+        --------
+        system                                                     0.344
+        user                                                       0.344
+    dotnet.process.memory.working_set (By)                    64,331,776
+    dotnet.thread_pool.queue.length ({work_item})                      0
+    dotnet.thread_pool.thread.count ({thread})                         0
+    dotnet.thread_pool.work_item.count ({work_item})                   7
+    dotnet.timer.count ({timer})                                       0
+
 ```
 
 Focusing on this line:
 
 ```console
-GC Heap Size (MB)                                  4
+dotnet.gc.last_collection.memory.committed_size (By)      4,296,704
 ```
 
 You can see that the managed heap memory is 4 MB right after startup.
 
 Now, go to the URL `https://localhost:5001/api/diagscenario/memleak/20000`.
 
-Observe that the memory usage has grown to 30 MB.
+Observe that the memory usage has grown to over 20 MB.
 
 ```console
-GC Heap Size (MB)                                 30
+dotnet.gc.last_collection.memory.committed_size (By)     21,020,672
 ```
 
 By watching the memory usage, you can safely say that memory is growing or leaking. The next step is to collect the right data for memory analysis.
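Once the counters confirm the growth, the tutorial's later steps (outside this diff) capture the managed heap for analysis. A hedged sketch of that step, assuming the dotnet-gcdump global tool; the commands shown are the standard install, process-listing, and collect invocations for these tools:

```dotnetcli
# Install the heap-snapshot tool if it isn't already present.
dotnet tool install --global dotnet-gcdump

# Find the target process ID, then capture a snapshot of the managed heap.
dotnet-counters ps
dotnet-gcdump collect -p <PID>
```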
