Memory and performance of .NET dictionaries

Since generics were added to .NET, everybody seems to have forgotten about all the different dictionaries available in the framework and just uses Dictionary<,>. I was doing the same until I recently ran into a situation where tens of thousands of dictionaries were created and memory usage grew rather high.

In my case most of the dictionaries are rather small (mostly 2-3 items), so the first thing I tried was switching to HybridDictionary. A short test later I found that this simple switch had reduced memory usage by 10%.

So I went on to investigate and compare the various dictionaries available in .NET: the two mentioned above as well as Hashtable, ListDictionary, OrderedDictionary, SortedDictionary, and SortedList.

The test is very simple – it recursively creates dictionaries (with integers as keys) and captures both the memory usage and the time it takes to create each dictionary and fully read it (once). Of course, each of the above-mentioned classes has its specific use cases, especially the sorted and ordered ones, but this test only looks at a simple, un-optimized scenario.
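For reference, here is a minimal sketch of what such a recursive test can look like. The linked test application below is the authoritative version; the Build/ReadAll structure, the level and child counts, and the choice of HybridDictionary here are only illustrative:

```csharp
using System;
using System.Collections;
using System.Collections.Specialized;
using System.Diagnostics;

class RecursiveDictionaryTest
{
    // Builds a tree of dictionaries: each node holds 'childCount' child
    // dictionaries keyed by sequential integers, down to 'levels' deep.
    static IDictionary Build(int levels, int childCount)
    {
        IDictionary dict = new HybridDictionary(); // swap in Dictionary<int, object>, Hashtable, ...
        if (levels > 0)
        {
            for (int i = 0; i < childCount; i++)
            {
                dict.Add(i, Build(levels - 1, childCount));
            }
        }
        return dict;
    }

    // Walks the whole tree once so that read time is part of the measurement.
    static void ReadAll(IDictionary dict)
    {
        foreach (DictionaryEntry entry in dict)
        {
            IDictionary child = entry.Value as IDictionary;
            if (child != null)
            {
                ReadAll(child);
            }
        }
    }

    static void Main()
    {
        long before = GC.GetTotalMemory(true);
        Stopwatch sw = Stopwatch.StartNew();

        IDictionary root = Build(9, 5);   // 9 levels, 5 child dictionaries each
        ReadAll(root);

        sw.Stop();
        long after = GC.GetTotalMemory(false);
        Console.WriteLine("Time: {0} ms, managed memory delta: {1} bytes",
            sw.ElapsedMilliseconds, after - before);
        GC.KeepAlive(root);               // keep the tree alive until after the measurement
    }
}
```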

[Chart: memory usage and timings for the first test (small dictionaries)]

The first test creates small dictionaries nested 9 levels deep (each dictionary contains 5 other dictionaries). The results show that HybridDictionary (together with ListDictionary, which it uses internally) takes the lead, with noticeably better timings and memory usage than the standard Dictionary<,>.
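HybridDictionary does well at this size because of its two-phase design: it keeps items in a ListDictionary (a simple linked list with linear lookups) while the collection is small and only promotes them to a Hashtable once the count passes a small cut-over threshold. The following is a simplified sketch of that idea, not the actual BCL source; the cut-over value of 9 is an assumption for illustration:

```csharp
using System.Collections;
using System.Collections.Specialized;

// Simplified illustration of the HybridDictionary idea (not the BCL code):
// use a cheap linear-scan ListDictionary while small, promote to a Hashtable
// once the element count passes a cut-over threshold.
class MiniHybridDictionary
{
    const int CutoverPoint = 9;          // illustrative; the real threshold is an implementation detail
    ListDictionary list = new ListDictionary();
    Hashtable hashtable;                 // created lazily on promotion

    public void Add(object key, object value)
    {
        if (hashtable != null)
        {
            hashtable.Add(key, value);
            return;
        }
        if (list.Count + 1 >= CutoverPoint)
        {
            // Promote: copy the existing entries into a Hashtable.
            hashtable = new Hashtable(list.Count + 1);
            foreach (DictionaryEntry entry in list)
            {
                hashtable.Add(entry.Key, entry.Value);
            }
            list = null;
            hashtable.Add(key, value);
        }
        else
        {
            list.Add(key, value);
        }
    }

    public object this[object key]
    {
        get { return hashtable != null ? hashtable[key] : list[key]; }
    }
}
```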

[Chart: memory usage and timings for the second test (larger dictionaries)]

As expected, raising the size of each dictionary puts ListDictionary in a bad position. The reason SortedList with a typed key has surprisingly good memory/performance ratings is that it uses a simple array for storage, and the test uses sequential keys, which is exactly the strong point of this class. A bit strange is the bad memory performance of Hashtable, especially since Hashtable is the class used internally by HybridDictionary at this size and HybridDictionary does not show this weakness. It is not a cold-start effect, as the tests were run multiple times and always produced the same result.
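To see why sequential keys favor SortedList: every Add with a key greater than all existing keys simply appends at the end of the internal key/value arrays, while inserting in the middle (or at the front) has to shift elements. A small sketch that contrasts the two cases (the item count is arbitrary, not taken from the test above):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class SortedListKeyOrder
{
    static void Main()
    {
        const int count = 100000;

        // Sequential keys: every insert lands at the end of the internal
        // key/value arrays, so nothing has to be shifted.
        Stopwatch sw = Stopwatch.StartNew();
        SortedList<int, int> sequential = new SortedList<int, int>();
        for (int i = 0; i < count; i++)
        {
            sequential.Add(i, i);
        }
        Console.WriteLine("Sequential keys: {0} ms", sw.ElapsedMilliseconds);

        // Descending keys: every insert goes to index 0 and shifts the whole
        // array, turning the fill into an O(n^2) operation.
        sw = Stopwatch.StartNew();
        SortedList<int, int> reversed = new SortedList<int, int>();
        for (int i = count - 1; i >= 0; i--)
        {
            reversed.Add(i, i);
        }
        Console.WriteLine("Descending keys: {0} ms", sw.ElapsedMilliseconds);
    }
}
```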

Obviously this needs more research and probably better ways of measuring memory consumption (I am using the GC.GetTotalMemory method), but it should be enough to remind us that performance is not the only thing to measure when using lots of collections – there is a lot of memory overhead that should be investigated to find the best possible match for each scenario.
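For anyone repeating the measurements, one common way to make GC.GetTotalMemory readings a bit more stable is to force a full collection (and wait for finalizers) before each sample. A sketch of that pattern follows – it still only approximates managed heap size, not the real working set, and BuildDictionaries is a hypothetical stand-in for the test's allocation code:

```csharp
using System;

static class MemoryProbe
{
    // Forces a full collection before sampling so that garbage left over from
    // previous iterations does not pollute the reading. This measures the
    // managed heap only - it says nothing about the process working set.
    public static long StableTotalMemory()
    {
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
        return GC.GetTotalMemory(true);
    }
}

// Usage: sample before and after the allocation and keep the root object
// alive so it cannot be collected between the two samples.
//
//   long before = MemoryProbe.StableTotalMemory();
//   object root = BuildDictionaries();              // hypothetical allocation code
//   long after = MemoryProbe.StableTotalMemory();
//   Console.WriteLine(after - before);
//   GC.KeepAlive(root);
```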

Another thing to mention is that the dictionaries increase their capacity in bulk, so even if a dictionary holds 200 items, the capacity of the underlying data stores could well be 300. It might seem like a good idea to set the capacity with the appropriate constructor (as long as the maximum is known beforehand), but a couple of quick tests showed that for Dictionary<,> this might produce even worse results – although this again could be the result of suboptimal testing techniques.
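For completeness, presizing via the constructors looks like the following. Note that the different classes interpret the capacity argument differently, which is one possible reason why presizing Dictionary<,> did not pay off in the quick tests; the comments reflect the .NET Framework behavior as I understand it:

```csharp
using System.Collections;
using System.Collections.Generic;

class PresizedDictionaries
{
    static void Main()
    {
        const int max = 200;

        // Dictionary<,> rounds the requested capacity up to the next prime for
        // its internal buckets/entries arrays, so the allocation can still be
        // larger than the value passed in.
        Dictionary<int, object> generic = new Dictionary<int, object>(max);

        // Hashtable combines the capacity with a load factor (the default of
        // 1.0 maps internally to 0.72), so its bucket array also ends up
        // larger than the requested capacity.
        Hashtable hashtable = new Hashtable(max);

        // SortedList<,> allocates exactly 'max' slots in its key/value arrays,
        // which makes presizing the most predictable here.
        SortedList<int, object> sorted = new SortedList<int, object>(max);

        for (int i = 0; i < max; i++)
        {
            generic.Add(i, null);
            hashtable.Add(i, null);
            sorted.Add(i, null);
        }
    }
}
```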

Source code for the test application.

More results on different dictionary sizes.
