Why aren't arrays rented from a shared memory pool by default? #6177
-
I've been perusing the high performance library, specifically the 'When should this be used?' section of the documentation, which implies that renting buffers from the shared pool is generally preferable to allocating new arrays. A bit further down, the docs offer an example where a buffer is rented from the pool rather than allocated with `new`.
My question is: if renting an array from a shared memory pool is more performant than allocating one, and the initial overhead of renting isn't large, why aren't arrays rented from a shared pool by default? Is it as simple as 'the word new comes with the expectation that a new object is actually being created in memory', or is it more complex than I give it credit for?
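For context, here is a minimal sketch (mine, not taken from the docs in question) contrasting the two approaches using `ArrayPool<T>.Shared` from `System.Buffers`:

```csharp
using System;
using System.Buffers;

// Plain allocation: a fresh, zeroed array that the GC will eventually reclaim.
byte[] allocated = new byte[4096];

// Renting: reuses a cached array when one is available, but the caller
// takes on the responsibility of returning it.
byte[] rented = ArrayPool<byte>.Shared.Rent(4096);
try
{
    // The rented array may be larger than requested and may contain
    // stale data from a previous renter.
    Console.WriteLine($"Requested 4096, got {rented.Length}");
}
finally
{
    ArrayPool<byte>.Shared.Return(rented);
}
```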
-
Because a rented array needs to be explicitly returned. Letting the GC return the array can harm performance too.
Additionally, the size at which pooling becomes beneficial is tied to the pooling implementation. As dotnet/runtime#68800 (comment) shows, pooling small arrays can be far slower than allocating them.
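To illustrate the ownership problem explicit returns create, here is a sketch of hypothetical misuse (not from the linked issue; `UseBuffer` is a placeholder):

```csharp
using System;
using System.Buffers;

byte[] buffer = ArrayPool<byte>.Shared.Rent(1024);

// If UseBuffer throws, or the Return call below is simply forgotten, the
// array is never handed back: the GC still collects it eventually, but the
// pool loses it and a later Rent may fall back to a fresh allocation.
UseBuffer(buffer);

ArrayPool<byte>.Shared.Return(buffer);

// Returning the same array twice, or touching it after returning it,
// silently corrupts data for whoever rents it next. Arrays created with
// `new` have no such failure modes.

static void UseBuffer(byte[] b) => Console.WriteLine(b.Length);
```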
This is not true. The initial overhead of renting is so large that CoreLib refuses to use any pool other than the `byte` one. The overhead includes a generic instantiation, and a minimal number of arrays cached in the pool and never reclaimed.
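A sketch of the per-type cost being described (my illustration, not from the comment): each closed generic type gets its own `ArrayPool<T>.Shared` instance with its own buckets of cached arrays.

```csharp
using System.Buffers;

// Each element type instantiates a separate shared pool. Per the comment
// above, some arrays cached in each pool are held and never reclaimed,
// so pooling every element type by default would multiply this baseline cost.
byte[] bytes     = ArrayPool<byte>.Shared.Rent(256);
char[] chars     = ArrayPool<char>.Shared.Rent(256);
double[] doubles = ArrayPool<double>.Shared.Rent(256);

ArrayPool<byte>.Shared.Return(bytes);
ArrayPool<char>.Shared.Return(chars);
ArrayPool<double>.Shared.Return(doubles);
```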