In Part 1, I discussed the basics of WeakReference&lt;T&gt;. Part 2 introduced short and long weak references as well as the concept of resurrection, and also covered how to use the debugger to inspect your memory for the presence of weak references. This article completes the miniseries with a discussion of when to use weak references at all, and a small, practical example.
When to Use
Short answer: rarely. Most applications won’t require this.
Long answer: If all of the following criteria are met, then you may want to consider it:
- Memory use needs to be tightly restricted – this probably means mobile devices these days. If you’re running on Windows RT or Windows Phone, then your memory is restricted.
- Object lifetime is highly variable – if you can predict the lifetime of your objects well, then using WeakReference doesn’t really make sense. In that case, you should just control their lifetime directly.
- Objects are relatively large, but easy to create – WeakReference is ideal for that large object that would be nice to have around, but which, if collected, you could easily regenerate as needed (or just do without).
- The object’s size is significantly more than the overhead of using WeakReference&lt;T&gt; – using WeakReference&lt;T&gt; adds an additional object, which means more memory pressure and an extra dereference step. It would be a complete waste of time and memory to use WeakReference&lt;T&gt; to store an object that’s barely larger than WeakReference&lt;T&gt; itself. However, there are some caveats to this, below.
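To make the “large but easy to regenerate” criterion concrete, here is a minimal sketch of the idiom (all type and member names here are my own invention, not from the article): hold the expensive object through a WeakReference&lt;T&gt;, and rebuild it if the GC has collected it since the last use.

```csharp
using System;

// Hypothetical example: cache an expensive-but-repeatable result weakly
// and rebuild it on demand if the GC has collected it.
class ReportCache
{
    // A weak reference may be created around null and set later.
    private WeakReference<byte[]> weakReport = new WeakReference<byte[]>(null);

    public byte[] GetReport()
    {
        byte[] report;
        if (!this.weakReport.TryGetTarget(out report))
        {
            // The target was collected (or never built): regenerate it.
            report = BuildReport();
            this.weakReport.SetTarget(report);
        }
        return report;
    }

    private static byte[] BuildReport()
    {
        return new byte[64 * 1024]; // stand-in for genuinely expensive work
    }
}
```

As long as a caller holds the returned array, subsequent calls get the same instance; once all strong references are gone, the next collection may reclaim it, and the cache simply rebuilds.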
There is another scenario in which WeakReference may make sense. I call this the “secondary index” feature. Suppose you have an in-memory cache of objects, all indexed by some key. This could be as simple as Dictionary&lt;string, Person&gt;, for example. This is the primary index, and it represents the most common lookup pattern – the master table, if you will.
However, you also want to look up these objects by another key, say a last name. Maybe you want a dozen other indexes. Using standard strong references, you could have additional indexes, such as Dictionary&lt;DateTime, Person&gt; for birthdays, etc. When it comes time to update the cache, you then have to modify all of these indexes to ensure that the Person object gets garbage collected when no longer needed.
This might be a pretty big performance hit if you do it on every update. Instead, you could spread that cost around by having all of the secondary indexes use WeakReference instead: Dictionary&lt;DateTime, WeakReference&lt;Person&gt;&gt; or, if the index has non-unique keys (likely), Dictionary&lt;DateTime, List&lt;WeakReference&lt;Person&gt;&gt;&gt;.
By doing this, the cleanup process becomes much easier: you just update the master cache, which removes the only strong reference to the object. The next time a garbage collection runs (of the appropriate generation), the Person object will be cleaned up. If you ever access a secondary index looking for those objects, you’ll discover the object has been cleaned up, and you can clean up those indexes right then. This spreads out the cost of cleanup of the index overhead, while allowing the expensive cached objects to be cleaned up earlier.
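A small sketch of this secondary-index idea might look like the following (the Person shape, field names, and lazy-pruning strategy are my assumptions, filled in from the description above): the master table holds strong references, the birthday index holds weak ones, and dead index entries are pruned whenever the index is consulted.

```csharp
using System;
using System.Collections.Generic;

class Person
{
    public string Id;
    public DateTime Birthday;
}

// Hypothetical sketch: strong primary index, weak secondary index.
class PersonCache
{
    private readonly Dictionary<string, Person> master =
        new Dictionary<string, Person>();
    private readonly Dictionary<DateTime, List<WeakReference<Person>>> byBirthday =
        new Dictionary<DateTime, List<WeakReference<Person>>>();

    public void Add(Person p)
    {
        this.master[p.Id] = p;
        List<WeakReference<Person>> list;
        if (!this.byBirthday.TryGetValue(p.Birthday, out list))
        {
            list = new List<WeakReference<Person>>();
            this.byBirthday[p.Birthday] = list;
        }
        list.Add(new WeakReference<Person>(p));
    }

    public void Remove(string id)
    {
        // Only the master table is touched; once the strong reference is gone,
        // the GC can collect the Person even though the weak index still has
        // an entry. The index is cleaned lazily in FindByBirthday.
        this.master.Remove(id);
    }

    public List<Person> FindByBirthday(DateTime birthday)
    {
        var results = new List<Person>();
        List<WeakReference<Person>> list;
        if (this.byBirthday.TryGetValue(birthday, out list))
        {
            // Prune entries whose targets were collected since the last lookup.
            list.RemoveAll(wr => { Person p; return !wr.TryGetTarget(out p); });
            foreach (var wr in list)
            {
                Person p;
                if (wr.TryGetTarget(out p)) results.Add(p);
            }
        }
        return results;
    }
}
```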
This Stack Overflow thread has some additional thoughts, with some variations of the example below and other uses.
A rather famous and involved example is using WeakReferences to prevent the dangling event handler problem (where failure to unregister an event handler keeps objects in memory, despite them having no explicit references anywhere in your code).
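The core of that workaround can be sketched in a few lines (this is only the essential shape, under names of my own choosing – real implementations, such as WPF’s WeakEventManager, are considerably more involved): the publisher holds the subscriber only through a WeakReference, so a forgotten unsubscribe no longer pins the subscriber in memory.

```csharp
using System;

// Hypothetical sketch of a weak event subscription. The publisher stores
// these instead of ordinary delegates, and drops any whose target is gone.
class WeakSubscription<TSubscriber> where TSubscriber : class
{
    private readonly WeakReference<TSubscriber> subscriber;
    private readonly Action<TSubscriber> handler;

    public WeakSubscription(TSubscriber subscriber, Action<TSubscriber> handler)
    {
        this.subscriber = new WeakReference<TSubscriber>(subscriber);
        // Caution: the handler must not capture the subscriber itself,
        // or the closure's strong reference defeats the weak reference.
        this.handler = handler;
    }

    // Returns false once the subscriber has been collected, so the
    // publisher can remove this subscription from its list.
    public bool TryInvoke()
    {
        TSubscriber target;
        if (this.subscriber.TryGetTarget(out target))
        {
            this.handler(target);
            return true;
        }
        return false;
    }
}
```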
I had mentioned in Chapter 2 (Garbage Collection) of Writing High-Performance .NET Code that WeakReference could be used in a multilevel caching system to allow objects to gracefully fall out of memory when pressure increases. You can start with strong references and then demote them to weak references according to some criteria you choose.
That is the example I’ll show here. Note that this is not production-quality code. It’s only about 5% of the code you would actually need, even assuming this algorithm makes sense in your scenario. At a minimum, you would probably want to implement IDictionary&lt;TKey, TValue&gt; on it, tighten up some of the temporary memory allocations, and more.
This is a very simple implementation. When you add items to the cache, it adds them as strong references (removing any existing weak references for that key). When you attempt to read a value from the cache, it tries the strong references first, before attempting the weak references.
Objects are demoted from strong to weak references based simply on a maximum age. This is admittedly rather simplistic, but it gets the point across.
sealed class HybridCache&lt;TKey, TValue&gt;
{
    // Per-entry wrapper; the timestamps are tick values (e.g. from
    // Stopwatch.GetTimestamp()) used to decide when to demote.
    private sealed class ValueContainer&lt;T&gt;
    {
        public T value;
        public long additionTime;
        public long demoteTime;
    }

    private readonly TimeSpan maxAgeBeforeDemotion;
    private readonly ConcurrentDictionary&lt;TKey, ValueContainer&lt;TValue&gt;&gt; strongReferences;   // field names assumed
    private readonly ConcurrentDictionary&lt;TKey, WeakReference&lt;ValueContainer&lt;TValue&gt;&gt;&gt; weakReferences;

    // ...rest of the implementation (add, lookup, demotion) omitted...
}
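To make the strong-then-weak lookup path concrete, here is a self-contained sketch of just that part (the type, field, and method names are mine, not the article’s, and the age-based demotion trigger is reduced to an explicit Demote call):

```csharp
using System;
using System.Collections.Concurrent;

// Minimal sketch of a two-tier cache: strong references first, weak second.
sealed class TwoTierLookup<TKey, TValue> where TValue : class
{
    private readonly ConcurrentDictionary<TKey, TValue> strongRefs =
        new ConcurrentDictionary<TKey, TValue>();
    private readonly ConcurrentDictionary<TKey, WeakReference<TValue>> weakRefs =
        new ConcurrentDictionary<TKey, WeakReference<TValue>>();

    public void Add(TKey key, TValue value)
    {
        // Adding always produces a strong entry; remove any demoted one.
        WeakReference<TValue> ignored;
        this.weakRefs.TryRemove(key, out ignored);
        this.strongRefs[key] = value;
    }

    public bool TryGetValue(TKey key, out TValue value)
    {
        // Try the strong references first...
        if (this.strongRefs.TryGetValue(key, out value))
        {
            return true;
        }
        // ...then the weak ones, whose targets may already be collected.
        WeakReference<TValue> weak;
        if (this.weakRefs.TryGetValue(key, out weak) && weak.TryGetTarget(out value))
        {
            return true;
        }
        value = null;
        return false;
    }

    // Demotion: move an entry from the strong tier to the weak tier, making
    // it eligible for collection once no one else references the value.
    public void Demote(TKey key)
    {
        TValue value;
        if (this.strongRefs.TryRemove(key, out value))
        {
            this.weakRefs[key] = new WeakReference<TValue>(value);
        }
    }
}
```

In the full version described above, Demote would be driven by comparing each entry’s age against maxAgeBeforeDemotion rather than called by hand.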
That’s it for the series on Weak References–I hope you enjoyed it! You may never need them, but when you do, you should understand how they work in detail to make the smartest decisions.