Oftentimes when building mobile apps, we want to front-load data to avoid loading time between each section of the app. Today, we will look at building a simple singleton cache in Flutter, and at how using it can create a better user experience.
Let’s first define an abstract class to serve as an interface:
abstract class CacheStorage {
  dynamic read({required String key, bool ignoreTtl = false});
  void update({required String key, required value, Duration ttl});
  bool delete({required String key});
  void destroy();
}
This will act as the blueprint for our cache. The cache can be read, updated, deleted, or destroyed. We can set a “time to live” value commonly abbreviated as TTL, and we can also choose to ignore it. The delete function will remove a single entry in the cache, while the destroy function will empty the cache altogether.
The more experienced among you might have noticed that value on the update doesn’t have a type. To keep this article short and sweet, I’ve decided to omit generics but if you’re interested, you can read more about them in the Dart Language docs.
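For the curious, a generic variant is a small change. The sketch below is my own illustration, not code from this article; the type parameter T is the piece the article deliberately leaves out:

```dart
// A hedged sketch of a type-safe variant of the interface above.
// The class name and the type parameter T are illustrative additions.
abstract class TypedCacheStorage<T> {
  T? read({required String key, bool ignoreTtl = false});
  void update({required String key, required T value, Duration ttl});
  bool delete({required String key});
  void destroy();
}
```

With generics, a cache miss surfaces as a typed `null` rather than a `dynamic`, so callers get compile-time checking on what they read back.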
Now that we have an idea of what a cache is, let’s look at the implementation:
class MemoryCache implements CacheStorage {
  MemoryCache._privateConstructor();

  static final MemoryCache _instance = MemoryCache._privateConstructor();

  factory MemoryCache() {
    return _instance;
  }

  Map<String, CacheItem> cache = {};

  @override
  bool delete({required String key}) {
    return cache.remove(key) != null; /*True if it existed, false if it did not*/
  }

  @override
  void destroy() {
    cache = {};
  }

  @override
  dynamic read({required String key, bool ignoreTtl = false}) {
    if (cache[key] != null &&
        (ignoreTtl || DateTime.now().isBefore(cache[key]!.ttl))) {
      return cache[key]!.value; /*Cache Hit*/
    }
    return null; /*Cache Miss*/
  }

  @override
  void update(
      {required String key,
      required value,
      Duration ttl = const Duration(minutes: 4)}) {
    var expirationDate = DateTime.now().add(ttl);
    cache.update(
      key,
      (_) => CacheItem(value: value, ttl: expirationDate),
      ifAbsent: () => CacheItem(value: value, ttl: expirationDate),
    );
  }
}
class CacheItem {
  CacheItem({
    required this.value,
    required this.ttl,
  });

  final dynamic value;
  final DateTime ttl;
}
You’ll notice that the implementation is simply a map held in memory, and that the class is made a singleton by its private constructor and static instance: every call to MemoryCache() returns the same object, so the cached data outlives any one widget or service. The functions are simple as well. The destroy function just resets the cache to an empty map, and the other functions are plain operations on a map. The cache in this example uses a CacheItem to keep track of TTL. The advantage of using a CacheItem is that, if there were additional information you wanted to perform logic on in the storage implementation, it could be stored here.
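A quick way to convince yourself of the singleton behavior is a scratch demo like the one below (the main function is just for demonstration, not part of the app):

```dart
void main() {
  // Both calls go through the factory constructor, which hands back
  // the single static instance.
  final a = MemoryCache();
  final b = MemoryCache();
  print(identical(a, b)); // true: both variables point at one shared instance

  // A write through one reference is visible through the other,
  // because there is only one underlying map.
  a.update(key: 'greeting', value: 'hello');
  print(b.read(key: 'greeting')); // hello
}
```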
Now that the implementation is done, let’s discuss why you’d use this rather than simply making your services singletons, and the strategy with which this MemoryCache could be used.
In my January 2025 article, A Clever way to Structure Flutter Apps, I stated that one of the tasks of the Service was to call functions on the repo. Since the repo functions are essentially API calls, the service just marshals data. That may make it seem like a tempting candidate for a singleton. Besides the boilerplate of adding private constructors and factory methods, let us consider a phrase from our foredevelopers long ago…
There are only two hard things in Computer Science: cache invalidation and naming things.
https://martinfowler.com/bliki/TwoHardThings.html
Imagine going through all your services, and invalidating all the persisted data. It would be quite easy to miss a spot when adding new services. Hence we delegate this task to the MemoryCache implementation, and compose it into our services. Like so:
class FooService {
  CacheStorage cache = MemoryCache();
  var repo = FooRepo();
  final String key = "foo-service-data";

  fetch() async {
    var data = await repo.getData();
    cache.update(key: key, value: data);
  }

  get() {
    return cache.read(key: key);
  }
}
And just to be sure users are getting fresh data when they log in, you could merely destroy the cache on successful login, then call the user-specific services like so:
class SessionService {
  /*bootup processes*/

  onSuccessfulLogin() async {
    MemoryCache().destroy();
    await Future.wait([ /*Allows you to run async calls in parallel*/
      FooService().fetch(),
      /*All your services you want to load up front for a smooth experience*/
      ...
    ]);
  }
}
This solves the problem of worrying about stale data on login. The caveat here is that you must also update this cache whenever you make any kind of data update. Luckily, since that should happen in the service, we can easily update the cache there.
class FooService {
  ...
  updateFoo(String newFoo) async {
    var updatedData = await repo.updateFoo(newFoo);
    cache.update(key: key, value: updatedData);
  }
}
How does all this enhance the user experience? By reading the cache during a build (when available) rather than making an asynchronous call, we pull the necessary data once at login and avoid extra network calls between page switches. Those page transitions feel instantaneous, and users perceive a speedier app, even if the initial load takes a little longer.
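The read-first pattern described above can be sketched as a small helper. The loadOrFetch name and its parameters are my own illustration, not part of the article’s code; it assumes the MemoryCache defined earlier:

```dart
// A minimal read-through sketch: try the cache first, and only pay
// for a network call on a miss. Assumes the article's MemoryCache.
Future<dynamic> loadOrFetch(
    String key, Future<dynamic> Function() fetchFresh) async {
  var cached = MemoryCache().read(key: key);
  if (cached != null) {
    return cached; // cache hit: returns immediately, no spinner needed
  }
  var fresh = await fetchFresh(); // cache miss: one call to the repo
  MemoryCache().update(key: key, value: fresh);
  return fresh;
}
```

In a widget, you would check the synchronous read path first and only fall back to something like a FutureBuilder when the cache comes up empty.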
One final thought I’d like to leave you with is this: Be aware of your data sizes. While I’ve never encountered an issue with the size of the data I manage, it is important to be cognizant of how large the data coming through your APIs can be and manage it appropriately.
Until Next Time, Happy Coding,
-Joe