Valorem Consulting

Tags: Rene Schulte, Strategy

We recently worked on a Xamarin application for the Amazon Fire TV Android platform. We faced some challenges along the way which we would love to share. One of them was backend speed; improving the backend itself was out of our scope, but the client application still needed to interact with that slow backend. After all, with no data there is no application.


We implemented an Electronic Program Guide (EPG). This EPG needed to show data two weeks into the future and one week into the past for more than 1000 TV channels. This is a massive amount of data that was slow to fetch from the backend. Because the EPG data rarely changes, this application is a perfect candidate for client-side caching, which increases application speed substantially. Client-side caching is an area that often receives too little attention, or receives it too late, even though performance is crucial for an optimal UX.


The usual approach one often sees is serialized data in text or binary files inside the app’s local / isolated storage. For our EPG scenario, we had to show the data in a virtualized UI because you cannot load more than 1 million items at once, even from a local cache. If you seek optimal UX, the UI requires incremental loading. This means the cached EPG data needs to be loaded based on four parameters (fromDate – toDate and fromChannel – toChannel). This is challenging to implement efficiently with simple file-based caching but is a perfect task for SQL.
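As a sketch of how such a four-parameter window maps to a query through the SQLite.Net library introduced below (the EpgEntry type and its property names are hypothetical, not our actual code):

```csharp
// Hypothetical EPG row type; the attributes mirror SQLite.Net's ORM usage.
public class EpgEntry
{
    [PrimaryKey]
    public string Id { get; set; }

    [Indexed]
    public int ChannelNumber { get; set; }

    [Indexed]
    public DateTime StartTime { get; set; }

    public string Title { get; set; }
}

// Load only the slice the virtualized UI currently shows.
public Task<List<EpgEntry>> LoadWindowAsync(
    SQLiteAsyncConnection db,
    DateTime fromDate, DateTime toDate,
    int fromChannel, int toChannel)
{
    return db.Table<EpgEntry>()
             .Where(e => e.StartTime >= fromDate && e.StartTime < toDate
                      && e.ChannelNumber >= fromChannel && e.ChannelNumber <= toChannel)
             .ToListAsync();
}
```

The indexes on ChannelNumber and StartTime are what keep such windowed selects fast as the table grows.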


Meet SQLite

We at IdentityMine have successfully used SQLite in several projects on different platforms, either for caching and performance improvements or for offline scenarios.


SQLite is an in-process, relational database system available for many platforms. In-process means it runs in the application’s process on local files and does not need a client-server architecture like most other database management systems.


SQLite is well-known, open source and perfectly suited for such client caching scenarios. Even better, there’s a C# port/wrapper available that includes an ORM, supports LINQ and works as a portable class library: SQLite.Net-PCL (NuGet).
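Assuming the standard Visual Studio NuGet tooling, the package can be pulled in from the Package Manager Console:

```
Install-Package SQLite.Net-PCL
```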


How to use it

Imagine we have the following POCO representing an image whose data is fetched from the web and which we want to cache locally:
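A minimal sketch of such a POCO (only the Id and Uri properties and their attributes are referenced in the discussion below; the payload properties are assumptions):

```csharp
public class ImageData
{
    [PrimaryKey]
    public int Id { get; set; }

    [Indexed]
    public string Uri { get; set; }

    // Assumed payload fields: the raw image bytes and a fetch timestamp.
    public byte[] ImageBytes { get; set; }
    public DateTime FetchedAt { get; set; }
}
```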


Note the attributes [PrimaryKey] and [Indexed]. They come from SQLite.Net and are used by its ORM system. This allows for some nice automatic mapping of properties to table columns and makes it super easy to insert, update or select data using our POCO without any manual SQL handling. [Indexed] means the Uri column of the table gets an index lookup structure, which makes selecting on this column fast. The [PrimaryKey] attribute marks the Id property as the primary key column of the table, which needs to be unique and is implicitly indexed.


Creating a table with SQLite.NET for the ImageData is as simple as these lines of C# code:

var db = new SQLiteAsyncConnection(() => _sqliteConnectionPool.GetConnection(_sqliteConnectionParameters));

await db.CreateTableAsync<ImageData>();


Insert or update is even easier, with serialization taken care of for us by SQLite.Net’s ORM:

await db.InsertOrReplaceAsync(image);


await db.InsertOrReplaceAllAsync(images);


A simple Select with SQLite.Net’s LINQ is just pure gold: fully automatic deserialization and no error-prone SQL strings involved:

ImageData result = await db.Table<ImageData>().Where(gdi => gdi.Uri == uri).FirstOrDefaultAsync();



Usually, such nice developer productivity comes at a cost. So one might wonder how it affects performance compared to traditional, simple file storage. As usual, it depends: foremost on the caching scenario, but also on the platform the application is running on. In the end, all of our applications run on devices with quite different storage hardware and on operating systems that handle file IO differently. The only way to know what is faster for a scenario is to measure it under different conditions.


Below is the time we measured using the ImageData POCO and the SQLite InsertOrReplaceAsync (DB Insert/Update) statements shown above. This was compared to a plain .NET File.WriteAllText of the Json.NET text-serialized data (File write JSON) and to a FileStream write of the Json.NET binary-serialized data (File write BSON). For the file comparison, every single ImageData instance was written to its own file.


All tests were implemented as MSTest unit tests and run several times for 10,000, 20,000, 50,000 and 100,000 images on a high-end laptop with Windows 10 Pro and an SSD.
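A minimal sketch of such a measurement, using .NET’s Stopwatch (this is not the original test code; the cacheDir path and the images collection are assumptions):

```csharp
// Illustrative timing harness comparing the two write paths.
var sw = Stopwatch.StartNew();
await db.InsertOrReplaceAllAsync(images);   // DB Insert/Update
sw.Stop();
Debug.WriteLine($"DB Insert/Update: {sw.ElapsedMilliseconds} ms");

sw.Restart();
foreach (var image in images)               // File write JSON: one file per item
{
    var path = Path.Combine(cacheDir, image.Id + ".json");
    File.WriteAllText(path, JsonConvert.SerializeObject(image));
}
sw.Stop();
Debug.WriteLine($"File write JSON: {sw.ElapsedMilliseconds} ms");
```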


The SQLite Insert/Update takes much longer than the simple file storage, which is not surprising since SQLite also has to build the database structure on first use, and that takes time. Interestingly, text JSON and binary BSON are almost equal in write performance.


Writing and creating the cache takes longer with SQLite, but what about reading the cache? Keep in mind that the performance that matters for caching is the read/select part; that is why we implement caching in the first place. The insert/update/write should ideally happen on a background thread after the data has been shown to the user, so it does not impact the user’s perception.
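As a sketch of that ordering (DisplayImages is a hypothetical UI call standing in for whatever renders the data):

```csharp
// Illustrative: render the fresh data first, then persist the cache
// asynchronously so the write never blocks the UI.
async Task ShowAndCacheAsync(SQLiteAsyncConnection db, List<ImageData> images)
{
    DisplayImages(images);                    // hypothetical UI update
    await db.InsertOrReplaceAllAsync(images); // write happens after the data is on screen
}
```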


Below is the time I measured using the ImageData POCO and the SQLite Where().FirstOrDefaultAsync() (DB Select) statement shown above. This was compared to File.ReadAllText with Json.NET text deserialization (File read JSON) and to a FileStream read with Json.NET binary deserialization (File read BSON).


Reading the data is relatively fast with SQLite and faster than JSON text serialization in all cases. In most cases reading binary BSON is still faster than the SQLite Select, but somewhere between 50,000 and 100,000 items seems to be the sweet spot where the SQLite Select overtakes BSON. One explanation could be file IO cost: SQLite uses a single database file, so it only has to open one file for all 100,000 items, whereas the simple file-based solution stores each BSON ImageData in its own file and therefore has to open 100,000 files.



In general, it is advisable to use SQLite for data caching rather than implementing a custom file-based caching solution. Even simple structures and binary image data can be faster to query with SQLite once cached. More complex scenarios, like EPG queries with more query parameters, are even better suited.


Developer productivity and code quality benefit as well. Because SQLite.Net ships as a PCL, the caching solution can also be covered by unit tests, which is usually a huge boost to productivity and code quality. Caching with SQLite is a win on many levels.


Nevertheless, it always depends, and there will never be an all-embracing solution for every scenario. If the solution involves just a few data items that can be easily distinguished, a simple storage solution with binary JSON (BSON) can be the faster path.


For our Fire TV app and the EPG we had to implement, SQLite.Net again served as the right tool and provided the foundation for a great UX.

2018 IdentityMine, Inc.