The .NET Garbage Collector (GC) is really cool. It helps provide our applications with virtually unlimited memory, so we can focus on writing code instead of manually freeing up memory. But how does .NET manage that memory? What are hidden allocations? Are strings evil? It still matters to understand when and where memory is allocated. In this talk, we’ll go over the base concepts of .NET memory management and explore how .NET helps us and how we can help .NET – making our apps better. Expect profiling, Intermediate Language (IL), ClrMD and more!
5. Memory management and GC
“Virtually unlimited memory for our applications”
Big chunk of memory pre-allocated
Runtime manages allocation in that chunk
Garbage Collector (GC) reclaims unused memory, making it available again
6. .NET memory management 101
Memory allocation
Objects allocated in “managed heap” (big chunk of memory)
Allocating memory is fast: it’s just incrementing a pointer
Some unmanaged memory is also consumed (not GC-ed)
.NET CLR, Dynamic libraries, Graphics buffer, …
Memory release or “Garbage Collection” (GC)
Generations
Large Object Heap
7. .NET memory management 101
Memory allocation
Memory release or “Garbage Collection” (GC)
GC releases objects no longer in use by examining application roots
GC builds a graph of all the objects that are reachable from these roots
Object unreachable? Remove object, release memory, compact heap
Takes time to scan all objects!
Generations
Large Object Heap
8. .NET memory management 101
Memory allocation
Memory release or “Garbage Collection” (GC)
Generations
Large Object Heap
Generation 0: short-lived objects (e.g. local variables)
Generation 1: in-between objects
Generation 2: long-lived objects (e.g. the app’s main form)
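The generation an object lives in can be inspected from code. A minimal sketch using GC.GetGeneration – promotion behavior can vary with GC mode, so the printed values are typical rather than guaranteed:

```csharp
using System;

class GenerationsDemo
{
    static void Main()
    {
        var obj = new object();
        Console.WriteLine(GC.GetGeneration(obj)); // fresh allocation: gen0

        GC.Collect(); // obj survives the collection and is promoted
        Console.WriteLine(GC.GetGeneration(obj)); // typically gen1

        GC.Collect(); // survives again
        Console.WriteLine(GC.GetGeneration(obj)); // typically gen2

        GC.KeepAlive(obj); // keep obj rooted for the whole demo
    }
}
```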
9. .NET memory management 101
Memory allocation
Memory release or “Garbage Collection” (GC)
Generations
Large Object Heap (LOH)
Special segment for large objects (≥ 85,000 bytes)
Collected only during full garbage collection
Not compacted (by default) -> fragmentation!
Fragmentation can cause OutOfMemoryException
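The LOH threshold and the opt-in compaction can be shown in a short sketch. This assumes .NET 4.5.1 or later, where GCSettings exposes the one-shot compaction mode:

```csharp
using System;
using System.Runtime;

class LohDemo
{
    static void Main()
    {
        // Objects of 85,000 bytes or more go to the Large Object Heap,
        // which is reported as generation 2.
        var small = new byte[80_000];
        var large = new byte[90_000];
        Console.WriteLine(GC.GetGeneration(small)); // small object heap: gen0
        Console.WriteLine(GC.GetGeneration(large)); // LOH: reported as gen2

        // The LOH is not compacted by default, but can be compacted
        // on demand for the next full collection:
        GCSettings.LargeObjectHeapCompactionMode =
            GCLargeObjectHeapCompactionMode.CompactOnce;
        GC.Collect();
    }
}
```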
10. The .NET garbage collector
Runs very often for gen0
Short-lived objects, few references, fast to clean
Local variable, web request/response
Higher generation
Usually more references, slower to clean
GC pauses the running application to do its thing
Usually short, except when not…
Background GC (enabled by default)
Concurrent with application threads
May still introduce short locks/pauses, usually just for one thread
11. The .NET garbage collector
When does it run? Vague… But usually:
Out of memory condition – when the system fails to allocate or re-allocate memory
After some significant allocation – if X memory is allocated since previous GC
Failure of allocating some native resources – internal to .NET
Profiler – when triggered from profiler API
Forced – when calling methods on System.GC
Application moves to background
GC is not guaranteed to run
http://blogs.msdn.com/b/oldnewthing/archive/2010/08/09/10047586.aspx
http://blogs.msdn.com/b/abhinaba/archive/2008/04/29/when-does-the-net-compact-framework-garbage-collector-run.aspx
12. Helping the GC, avoid pauses
Optimize allocations (use struct when it makes sense, Span<T>, object pooling)
Don’t allocate when not needed
Make use of IDisposable / using statement
Clean up references, giving the GC an easy job
Weak references
Allow the GC to collect these objects, no need for checks
Finalizers
Beware! Objects with a finalizer move to the finalizer queue and survive at least one extra collection -> always gen++
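A minimal sketch of the dispose pattern from this slide: deterministic cleanup via using, plus GC.SuppressFinalize so the GC can skip the finalizer queue entirely. The Resource type is made up for illustration:

```csharp
using System;

class Resource : IDisposable
{
    public bool Disposed { get; private set; }

    public void Dispose()
    {
        // release unmanaged resources here
        Disposed = true;
        GC.SuppressFinalize(this); // no finalizer work needed -> no extra GC cycle
    }

    // Fallback only: reaching this finalizer means the object was not disposed
    // and will survive at least one extra collection on the finalizer queue.
    ~Resource() { }
}

class Demo
{
    static void Main()
    {
        using (var r = new Resource())
        {
            // work with r; Dispose runs deterministically at the end of the block
        }
    }
}
```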
15. When is memory allocated?
Not for value types (int, bool, struct, decimal, enum, float, byte, long, …)
Allocated on the stack (or inline in their container), not as a separate heap object
Not managed by garbage collector
For reference types
When you new
When you load data into a variable, object, property, ...
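A small illustration of the difference – the type names here are made up for this example:

```csharp
struct PointStruct { public int X, Y; } // value type: no heap allocation by itself
class PointClass { public int X, Y; }   // reference type: allocated on the managed heap

class Demo
{
    static void Main()
    {
        var a = new PointStruct { X = 1, Y = 2 }; // lives on the stack, no GC involvement
        var b = new PointClass { X = 1, Y = 2 };  // heap allocation, tracked by the GC

        object boxed = a; // careful: assigning a struct to object allocates (boxing)
    }
}
```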
16. Hidden allocations!
Boxing!
Put an int in a box
Take an int out of a box
Lambdas/closures
Allocate a compiler-generated DisplayClass to capture state
Params arrays
And more!
int i = 42;
// boxing - wraps the value type in an "object box"
// (allocating a System.Object)
object o = i;
// unboxing - unpacking the "object box" into an int again
// (CPU effort to unwrap)
int j = (int)o;
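The other hidden allocations from this slide can be made visible in a similar sketch. Sum and MakeCounter are hypothetical helpers for illustration:

```csharp
using System;
using System.Linq;

class HiddenAllocations
{
    // params: every call site allocates a new int[] to carry the arguments
    // (newer compilers use Array.Empty<int>() for the zero-argument case).
    public static int Sum(params int[] values) => values.Sum();

    public static Func<int> MakeCounter()
    {
        int count = 0;
        // The lambda captures 'count', so the compiler generates a hidden
        // "DisplayClass" on the heap to hold the captured state.
        return () => ++count;
    }

    static void Main()
    {
        Console.WriteLine(Sum(1, 2, 3)); // allocates an int[3] behind the scenes
        var counter = MakeCounter();     // allocates the closure object
        Console.WriteLine(counter());    // prints 1
    }
}
```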
17. How to find them?
Past experience
Intermediate Language (IL)
Profiler
“Heap allocations viewer”
ReSharper Heap Allocations Viewer plugin
Roslyn’s Heap Allocation Analyzer
19. Measure!
Don’t do premature optimization – measure!
Allocations don’t always matter (that much)
Measure!
How frequently are we allocating?
How frequently are we collecting?
What generation do we end up on?
Are our allocations introducing pauses?
www.jetbrains.com/dotmemory (and www.jetbrains.com/dottrace)
23. Object pools / object re-use
Re-use objects / collections (when it makes sense)
Fewer allocations, fewer objects for the GC to scan
Less memory traffic that can trigger a full GC
Object pooling - object pool pattern
Create a pool of objects that can be re-used
https://www.codeproject.com/articles/20848/c-object-pooling
“Optimize ASP.NET Core” - https://github.com/aspnet/AspLabs/issues/3
System.Buffers.ArrayPool
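System.Buffers.ArrayPool is the built-in way to pool buffers; a minimal rent/return sketch:

```csharp
using System;
using System.Buffers;

class PoolDemo
{
    static void Main()
    {
        ArrayPool<byte> pool = ArrayPool<byte>.Shared;

        // Rent returns an array of *at least* the requested length,
        // possibly re-used from an earlier Return.
        byte[] buffer = pool.Rent(4096);
        try
        {
            // work with buffer[0 .. 4095] here
        }
        finally
        {
            // Return the buffer so the next Rent can re-use it
            // instead of allocating a fresh array.
            pool.Return(buffer);
        }
    }
}
```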
24. Garbage Collector summary
GC is optimized for high memory traffic in short-lived objects
Use that knowledge! Don’t fear allocations!
Don’t optimize what should not be optimized…
GC is the concept that makes .NET / C# tick – use it!
Know when allocations happen
GC is awesome
Gen2 collections that stop the world, not so much…
Measure!
26. Strings are objects
.NET tries to make them look like a value type, but they are a reference type
Read-only collection of char
Length property
A bunch of operator overloading
Allocated on the managed heap
var a = new string('-', 25);
var b = a.Substring(5);
var c = httpClient.GetStringAsync("http://blog.maartenballiauw.be");
27. String literals
Are all strings on the heap? Are all strings duplicated?
var a = "Hello, World!";
var b = "Hello, World!";
Console.WriteLine(a == b);
Console.WriteLine(Object.ReferenceEquals(a, b));
Prints true twice. So is “Hello, World!” in memory only once?
29. String literals in #US
Compile-time optimization
Literals are stored only once, in the PE metadata #US (user string) stream (ECMA-335 standard, section II.24.2.4)
Reference literals (IL: ldstr)
var a = Console.ReadLine();
var b = Console.ReadLine();
Console.WriteLine(a == b);
Console.WriteLine(Object.ReferenceEquals(a, b));
30. String duplicates
Any .NET application has them (System.Globalization duplicates quite a few)
Are they bad?
.NET GC is fast for short-lived objects, so meh.
Don’t waste memory with string duplicates on gen2
(but: it’s okay to have strings there)
31. String interning
Store (and read) strings from the intern pool
Simply call String.Intern when “allocating” or reading the string
Scans intern pool and returns reference
var url = "http://blog.maartenballiauw.be";
var stringList = new List<string>();
for (int i = 0; i < 1000000; i++)
{
    stringList.Add(string.Intern(url + "/"));
}
32. String interning caveats
Why are not all strings interned by default?
CPU vs. memory
Not on the heap but on intern pool
No GC on intern pool – all strings in memory for AppDomain lifetime!
Rule of thumb
Many long-lived strings, few unique values -> interning helps
Many long-lived strings, many unique values -> no benefit, memory grows
Mostly short-lived strings -> trust the GC
Measure!
34. How would you...
…build a managed type system, stored in memory, in a CPU- and memory-friendly way?
Probably:
Store type info (what’s in there, what’s the offset of fieldN, …)
Store field data (just data)
Store method pointers
Inheritance information
36. Stuff on the Managed Heap
37. Theory is nice...
Microsoft.Diagnostics.Runtime (ClrMD)
“ClrMD is a set of advanced APIs for programmatically inspecting a crash dump of
a .NET program much in the same way that the SOS Debugging Extensions (SOS)
do. This allows you to write automated crash analysis for your applications as well
as automate many common debugger tasks. In addition to reading crash dumps,
ClrMD also supports attaching to live processes.”
“LINQ-to-heap”
Maarten’s definition
39. But... Why?
Programmatic insight into the memory space of a running process
Unit test critical paths and assert behavior (did we clean up what we expected?)
Capture memory issues in running applications
Other (easier) options in this space
dotMemory Unit (JetBrains)
Benchmark.NET
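As a taste of the “LINQ-to-heap” idea, here is a sketch against the ClrMD 2.x API. The process id comes from the command line, and attaching requires a live .NET process of matching bitness – treat this as an outline, not a drop-in tool:

```csharp
using System;
using System.Linq;
using Microsoft.Diagnostics.Runtime;

class HeapInspector
{
    static void Main(string[] args)
    {
        int pid = int.Parse(args[0]);

        // Attach to a running process, suspending it while we inspect.
        using var target = DataTarget.AttachToProcess(pid, suspend: true);
        ClrRuntime runtime = target.ClrVersions.First().CreateRuntime();

        // "LINQ-to-heap": group all heap objects by type, largest first.
        var byType = runtime.Heap.EnumerateObjects()
            .Where(o => o.Type != null)
            .GroupBy(o => o.Type!.Name)
            .Select(g => new { Type = g.Key, Count = g.Count(), Bytes = g.Sum(o => (long)o.Size) })
            .OrderByDescending(x => x.Bytes)
            .Take(20);

        foreach (var row in byType)
            Console.WriteLine($"{row.Count,8} {row.Bytes,12} {row.Type}");
    }
}
```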
42. Conclusion
Garbage Collector (GC) optimized for high memory traffic + short-lived objects
Don’t fear allocations! But beware of gen2 “stop the world”
Don’t optimize what should not be optimized…
Measure!
Using a profiler/memory analysis tool
ClrMD to automate inspections
dotMemory Unit, Benchmark.NET, … to profile unit tests
Blog series: https://blog.maartenballiauw.be
Application roots: Typically, these are global and static object pointers, local variables, and CPU registers.
Open TripDownMemoryLane.sln
Show WeakReferenceDemo (demo “1-1”)
Explain weak reference allows GC to collect reference
Show Cache object – has weak references to data, we expect these to probably be cleaned up by GC
Attach profiler, run demo “1-1”, snapshot, see 20 instances of WeakReference<Data>
Snapshot again, compare – see WeakReference<Data> has been regenerated a couple of times
Show DisposeObjectsDemo (demo “1-2”)
Explain first demo does not dispose and relies on GC + finalizers. This will mean our object remains in memory for two GC cycles!
Explain dispose does clean them up and requires only one cycle
In SampleDisposable, explain GC.SuppressFinalize -> tell the GC no finalizer queue work is needed here!
Open TripDownMemoryLane.sln
Show Demo02_Random
Open IL viewer tool window, show what happens in IL for each code sample
Explain IL viewer + hovering statements to see what they do
BoxingRing() – show boxing and unboxing statements in IL, explain they consume CPU and allocate an object
ParamsArray() – the call to ParamsArrayImpl() actually allocates a new string array! CPU + memory
AverageWithinBounds() – temporary class is created to capture state of all variables, then passed around
IL_0000: newobj instance void TripDownMemoryLane.Demo02.Demo02_Random/'<>c__DisplayClass3_0'::.ctor()
Lambdas() – same thing, temporary class to capture state in the loop
IL_001f: newobj instance void Allocatey.Talk.Demo02_Random/'<>c__DisplayClass4_0'::.ctor()
Show Demo02_ValidateArgumentsDemo – this one is fun!
Explain what we want to do: build a guard function – check a condition, show error
First one is the easy one, but it allocates a string and runs string.Format
Second one is better – does not allocate the string! But does allocate a function and a state capture...
Third one – allocates an array (params)
Fourth one – no allocations, yay! Using overloads...
Show heap allocations viewer!
Open TripDownMemoryLane.sln
Show BeersDemoUnoptimized (demo “3-1” and “3-2”)
Explain we’re building an application that shows all beers in the world and their ratings
Stored in beers.json (show document) with beer name, brewery, number of votes
For a view in our application, read this file into a multi-dimensional dictionary that contains breweries, beers, and their rating
Show BeerLoader and note the dictionary format
Show LoadBeersInsane and explain this is BAD BAD BAD because of the high memory usage
Show LoadBeersUnoptimized, explain what it does, optimized against the insane version as we’re streaming over our file
Load beers a number of times
Inspect snapshots
GC is very visible
Most memory in gen2 (we keep our beers around)
Compare two snapshots: high traffic on dictionary items
(Lots of string allocations - JSON.NET)
Show LoadBeersOptimized, explain what it does, re-using dictionary and updating items as we read the JSON
Load beers a number of times
Inspect snapshots
GC is almost invisible
Fewer allocations happening
Compare two snapshots: almost no traffic
Less work for the GC, fewer pauses!
Measure and make it look good!
There is an old adage in IT that says “don’t do premature optimization”. In other words: maybe some allocations are okay to have, as the GC will take care of cleaning them up anyway. While some do not agree with this, I believe in the middle ground.
The garbage collector is optimized for high memory traffic in short-lived objects, and I think it’s okay to make use of what the .NET runtime has to offer us here. If it’s in a critical path of a production application, fewer allocations are better, but we can’t write software with zero allocations - it’s what our high-level programming language uses to make our developer life easier. It’s not okay to have objects go to gen2 and stay there when in fact they should be gone from memory.
Learn where allocations happen, using any of the above methods, and profile your production applications frequently to see if there are large objects in higher generations of the heap that don’t belong there.
Will print “true” twice.
Open our demo application in dotPeek
Explain PE headers
Show #US table
Open StringAllocationDemo class. Jump to IL code, show ldstr statement for strings that are in #US table
Code = trick question, what if we enter same value twice? String equals, reference not equals!
How many strings are stored?
Open ClrMD.sln
Explain: two projects, one target application, one running ClrMD to analyze what we have
Open ClrMD.Explorer.Program, show attaching ClrMD
Get CLR version – gets info about the current CLR version
Get runtime – gets info about the actual runtime hosting our app
Show DumpClrInfo – get info, stress the DAC (data access components) location – it defines the runtime structures used by ClrMD, the VS debugger, etc. to explore the runtime while debugging/profiling/...
Explore DumpHeapObjects, stress the heap structure
Loop object addresses - foreach (var objectAddress in generation)
Get type of object at address - var type = heap.GetObjectType(objectAddress.Ptr);
Use type info to get value - type.GetValue(objectAddress.Ptr)
Explore type autocomplete – structure to get enum, method addresses, ...
Open TestsWithDMU.sln
Explain: similar Clock leak as in previous demo
Two unit tests that create the clock and timer, then run GC.Collect, then use dotMemory Unit to check whether instances are left in memory.
Easy way to test, after investigation, when a memory leak comes back (or not)