In-memory caching in Xamarin apps


Recently we added in-memory caching to Azure App. You can try it out now on iOS and Android!

It turns out Mono doesn't include the System.Runtime.Caching namespace, which is what makes caching easy to implement in regular .NET apps. We had to find another way.

Caching libraries for Xamarin

We looked at a few caching libraries (e.g., MemoryCache and Akavache), but surprisingly none of them manage cache size or memory. They simply add items to a Dictionary, and if you add too many you get an OutOfMemoryException.

That may not be an issue for many applications, but in the Azure App we need to account for users who have multiple subscriptions with thousands of resources.

BTW: Akavache is a great library. Besides an in-memory cache it also supports a persistent cache, has clean APIs, and comes with a lot of great documentation.

Implementing in-memory cache

After browsing the internet and asking people in the Xamarin chat, we didn't find anything that would work for us, so we decided to implement an in-memory cache ourselves.

using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// IInMemoryCache<T> is our app's cache interface (declaration not shown here).
public class InMemoryCache<T> : IInMemoryCache<T>
{
    private const int LimitedCacheThreshold = 1000;

    private class Reference
    {
        private int _hitCount = 0;

        public DateTimeOffset Timestamp
        {
            get;
            private set;
        }

        public T Data
        {
            get;
            private set;
        }

        public void AddRef()
        {
            Interlocked.Increment(ref _hitCount);
        }

        public int ResetRef()
        {
            // Atomically read and reset the hit count used for eviction ranking.
            return Interlocked.Exchange(ref _hitCount, 0);
        }

        public static Reference Create(T obj)
        {
            return new Reference()
            {
                Timestamp = DateTimeOffset.Now,
                Data = obj,
            };
        }

        private Reference()
        {
        }
    }

    private readonly ConcurrentDictionary<string, WeakReference<Reference>> _weakCache;
    private readonly ConcurrentDictionary<string, Reference> _limitedCache;
    private readonly ConcurrentDictionary<string, Task<T>> _pendingTasks;

    private InMemoryCache()
    {
        _weakCache = new ConcurrentDictionary<string, WeakReference<Reference>>(StringComparer.Ordinal);
        _limitedCache = new ConcurrentDictionary<string, Reference>(StringComparer.Ordinal);
        _pendingTasks = new ConcurrentDictionary<string, Task<T>>(StringComparer.Ordinal);
    }

    public static IInMemoryCache<T> Create()
    {
        return new InMemoryCache<T>();
    }

    // Returns the cached value for `key` if it was created after `expiration`;
    // otherwise runs addFactory, caches the result, and returns it.
    public async Task<T> GetOrAdd(string key, DateTimeOffset expiration, Func<string, Task<T>> addFactory)
    {
        WeakReference<Reference> cachedReference;

        if (_weakCache.TryGetValue(key, out cachedReference))
        {
            Reference cachedValue;
            if (cachedReference.TryGetTarget(out cachedValue))
            {
                if (cachedValue.Timestamp > expiration)
                {
                    cachedValue.AddRef();
                    return cachedValue.Data;
                }
            }
        }

        try
        {
            // Share one in-flight task per key so concurrent misses don't
            // each fire their own request.
            var actualValue = await _pendingTasks.GetOrAdd(key, addFactory);

            // Evict when the strong cache grows past the threshold: drop the half
            // with the fewest hits since the last eviction, oldest first on ties.
            if (_limitedCache.Count > LimitedCacheThreshold)
            {
                var keysToRemove = _limitedCache
                    .Select(item => Tuple.Create(
                        item.Value.ResetRef(),
                        item.Value.Timestamp,
                        item.Key))
                    .ToArray()
                    .OrderBy(item => item.Item1)
                    .ThenBy(item => item.Item2)
                    .Select(item => item.Item3)
                    .Take(LimitedCacheThreshold / 2)
                    .ToArray();

                foreach (var k in keysToRemove)
                {
                    Reference unused;
                    _limitedCache.TryRemove(k, out unused);
                }
            }

            // Store the new value in both layers: the weak layer for lookups,
            // the limited layer to keep it alive until evicted.
            var reference = Reference.Create(actualValue);
            _weakCache[key] = new WeakReference<Reference>(reference);
            _limitedCache[key] = reference;

            return actualValue;
        }
        finally
        {
            Task<T> unused;
            _pendingTasks.TryRemove(key, out unused);
        }
    }
}

We use two layers of caching. The first uses WeakReference, which leaves memory management to the Garbage Collector. Because the GC is not very predictable and may sometimes release a reference unnecessarily, we add a second layer. We call it _limitedCache, and it keeps objects in memory until its size reaches 1000 objects; then we remove the half (500) least used objects from the dictionary. Because the same objects are kept in both dictionaries, a WeakReference will never be collected as long as its object is in _limitedCache. Thus, on lookup we only ever check whether the object is present in _weakCache.
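To make this concrete, here is a hedged usage sketch; the `Resource` type, the cache key, and `FetchResourceAsync` are hypothetical stand-ins for whatever your app fetches:

```csharp
// Hypothetical usage of InMemoryCache<T>; FetchResourceAsync and Resource
// are illustrative names, not part of the actual Azure App codebase.
var cache = InMemoryCache<Resource>.Create();

// Treat anything cached more than five minutes ago as expired.
var expiration = DateTimeOffset.Now.AddMinutes(-5);

var resource = await cache.GetOrAdd(
    "subscriptions/123/resourceGroups/rg/vms/vm-1",
    expiration,
    key => FetchResourceAsync(key)); // invoked only on a cache miss
```

Note that the caller passes the oldest acceptable timestamp, not a time-to-live: a cached entry is used only if its Timestamp is newer than `expiration`.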

There is also a third dictionary that keeps track of the pending tasks responsible for fetching data. This prevents us from sending the same request more than once while the object is not in the cache yet. (Strictly speaking, ConcurrentDictionary.GetOrAdd may invoke the factory more than once under a race, so this is best-effort deduplication.)
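For example, if two view models ask for the same key before the first request finishes, both callers end up awaiting the same stored task (a sketch; `cache`, `expiration`, and `FetchAsync` are hypothetical):

```csharp
// Both calls hit _pendingTasks.GetOrAdd with the same key, so the second
// caller awaits the task stored by the first instead of starting a new fetch.
var first = cache.GetOrAdd("vm-1", expiration, FetchAsync);
var second = cache.GetOrAdd("vm-1", expiration, FetchAsync);

// Typically a single network request; both locals end up with the same data.
await Task.WhenAll(first, second);
```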

Summary

What is great about building apps with Xamarin is the ability to share code across platforms. While implementing the cache we didn't touch any platform-specific code; all of the work was done in a Portable Class Library.

Adding the cache to the Azure App not only decreased users' network data usage, but also improved performance significantly!

If you need an in-memory cache for your app, go ahead and use the code snippet above! If you are looking for a persistent cache, consider using Akavache.

Are you caching? How? Why? Why not?