The short answer is: this is not how memory management works in .NET. I would guess your misconception stems from experience with classical C++, where going out of function scope really does cause destruction.
In .NET, you are dealing with memory management based on the GC, and object destruction is also based on this architecture. Destruction and reclamation of memory are based on the concept of reachable and unreachable references, which is a fairly complicated criterion. When an object becomes unreachable, its destruction follows, but not immediately. Rather, it happens according to the GC's internal behavior, so you have no control over the moment when the destructor (finalizer) is called and the memory is reclaimed. That's why destructors are rarely written in .NET applications.
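To see the point, here is a minimal sketch (names such as `CreateAndDrop` are mine, made up for illustration) using `WeakReference`, which lets you observe an object without keeping it reachable. It shows that leaving scope only makes the object unreachable; reclamation happens later, at a moment the GC chooses. The explicit `GC.Collect()` call is only there to force that moment for demonstration; you should normally never call it in production code.

```csharp
using System;

class Program
{
    static WeakReference CreateAndDrop()
    {
        var obj = new object();
        // A WeakReference tracks the object without making it reachable.
        var weak = new WeakReference(obj);
        return weak;
        // obj goes out of scope here; the object becomes unreachable,
        // but it is NOT destroyed at this point.
    }

    static void Main()
    {
        WeakReference weak = CreateAndDrop();
        // The object typically still exists here: scope exit did not destroy it.
        Console.WriteLine("Alive right after scope exit: " + weak.IsAlive);

        // Forcing a collection (for demonstration only) lets the GC
        // reclaim the now-unreachable object.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        Console.WriteLine("Alive after collection: " + weak.IsAlive);
    }
}
```

Note that in a Release build the JIT may treat the object as unreachable even earlier, so the exact output of the first line is not guaranteed; that unpredictability is precisely the point.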
This is explained, for example, here:
Garbage collection (computer science) — Wikipedia, the free encyclopedia.
Another, related problem is that of possible memory leaks. First of all, "detection" of a memory leak is often a false-positive observation. Are you drawing your conclusion from Windows Task Manager? If so, don't trust it.
Memory leaks are still possible, but the whole concept is not as trivial. The "accidental" leaks so common in unmanaged code due to bugs are very unlikely; instead, leaks can come from mistakes in overall code design. I discussed this problem in my past answers:
Best way to get rid of a public static List Causing an Out of Memory,
deferring variable inside the loop can cause memory leak?,
Garbage collection takes care of all the memory management,
Memory management in MDI forms,
Memory leak in WPF DataBinding.
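A typical design-induced leak of the kind discussed in those answers is a static collection that outlives its contents' usefulness. The sketch below (the `Cache` class and its members are hypothetical names of mine, not from any of the linked questions) shows why: a static field is a GC root, so everything it references stays reachable, and therefore never reclaimable, for the whole lifetime of the application.

```csharp
using System;
using System.Collections.Generic;

static class Cache
{
    // A static field is a GC root: every object reachable from it
    // can never be collected while the application runs.
    public static readonly List<byte[]> Items = new List<byte[]>();
}

class Program
{
    static void Main()
    {
        for (int i = 0; i < 1000; i++)
        {
            // Each buffer is "forgotten" by the loop variable,
            // but never by Cache.Items, so the GC cannot reclaim it:
            // a managed memory leak by design, not by accident.
            Cache.Items.Add(new byte[1024]);
        }
        Console.WriteLine(Cache.Items.Count); // all 1000 buffers still reachable
    }
}
```

The fix is a design fix, not a GC trick: remove entries when they are no longer needed, or use a structure such as `ConditionalWeakTable` that does not keep its contents reachable.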
—SA