
Using API layers on the edge for mobile apps

Recently I've been investigating how edge computing (also referred to as 'the edge') can boost the performance of mobile applications ('apps').

For those unfamiliar with the edge, check out my 10-minute talk from Jamstack Conf 2022, which gives a quick overview.

Two challenges stood out that developers working in this domain often have to account for:

  • Intermittent connectivity; and
  • End users don't update their apps in a timely manner.

Intermittent connectivity

Everyone has likely encountered a 'dead zone' before, where there's suddenly no cell service. Driving through rural areas is where I tend to run into this.

This means that the speed at which a request is fulfilled is critical. The longer a request takes to complete, the higher the risk that it will fail because the user lost connection.
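Because a request that hangs on a dying connection is worse than one that fails fast, mobile clients often put a hard time budget on each request. Here's a minimal sketch in TypeScript, assuming a runtime with the standard fetch and AbortController APIs; the 3-second budget is an illustrative assumption, not a recommendation:

```typescript
// Fail fast on a flaky connection instead of hanging indefinitely.
// The 3-second default budget is an assumption; tune it for your app.
async function fetchWithTimeout(url: string, ms = 3000): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    // The request is aborted if it exceeds the time budget.
    return await fetch(url, { signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}
```

The app can then surface a retry prompt quickly rather than leaving the user staring at a spinner.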

End users don't update their apps

Some folks out there will ignore app updates for not just days but months.

To account for this, developers create architectures that move as much functionality as possible onto servers under their control. This allows them to ship changes without depending on the end user. This design is invaluable when something needs to be shipped as soon as possible (such as a zero-day security fix).

Depending on where the servers are located though, they may be much further away from the user making the request. The side effect of this is longer times to fulfill the request due to the latency incurred from the physical distance the request needs to travel.

To mitigate this while keeping the benefits of hosting the application's functionality on a server, an API layer deployed to the edge can be used.

Using an API layer on the edge

The API layer would make requests to the appropriate backend service on behalf of the mobile application. It can then cache frequent, more generalized responses for better performance on those requests going forward. Due to being physically closer to the end user, the API layer on the edge will be faster than if the cached responses lived on a server within an availability zone.

import { Api } from 'funclify';

const api = new Api();

api.get("/", async (_, res) => {
    // Make request to the Home service and return the response
});

api.get("/login", async (_, res) => {
    // Make request to the Login service and return the response
});

Example using the 'funclify' package my co-worker Ed Stephinson made
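To make a response cacheable at the edge, the API layer can attach a shared-cache directive before returning it. A minimal sketch, assuming a runtime with the standard Response API; the helper name and the 60-second lifetime are illustrative assumptions, not part of funclify:

```typescript
// Wrap a backend response with headers that let a shared edge cache reuse it.
// s-maxage applies to shared caches (like the edge) without affecting the
// browser's own cache behavior.
function cacheableResponse(body: string, seconds: number): Response {
  return new Response(body, {
    headers: {
      "Content-Type": "application/json",
      "Cache-Control": `public, s-maxage=${seconds}`,
    },
  });
}

// e.g. return cacheableResponse(JSON.stringify(homeData), 60);
```

Subsequent requests within that window are served from the edge cache without a round trip to the backend service.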

To take things further, if the API response follows the structure of view models, it can easily be mapped into native UI on the mobile device. This greatly reduces the need to update the app when the content of the response changes.
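A sketch of what such a view-model-shaped response might look like; the type and field names here are hypothetical, not a real schema:

```typescript
// Each view model maps to a native UI component the app already knows how
// to render, so the server can rearrange screens without an app update.
type ViewModel =
  | { type: "header"; title: string }
  | { type: "text"; body: string }
  | { type: "button"; label: string; action: string };

// The API returns a screen as an ordered list of view models.
interface ScreenResponse {
  screen: string;
  components: ViewModel[];
}

const home: ScreenResponse = {
  screen: "home",
  components: [
    { type: "header", title: "Welcome back" },
    { type: "button", label: "Log in", action: "/login" },
  ],
};
```

The app's only job is to walk the list and render each component type natively; changing copy, ordering, or even which components appear is purely a server-side change.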

One aspect to consider in the design of the API layer is the use of REST vs. GraphQL. For simplicity's sake, particularly with respect to caching, I suggest using REST.

Generalized responses are better to cache, as a larger range of users will benefit from the performance gains. The more personalized the response, the fewer users benefit.

GraphQL's ability to query deeply nested fields can be convenient, but it is a form of personalization. When we also consider that edge locations have smaller caches than servers in cloud regions/availability zones, there's even less room to spend on personalized cache entries.
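One way to see the caching difference: a REST resource is identified by its URL, so the URL doubles as the cache key, while a GraphQL request must fold its query body into the key, so every distinct query shape gets its own cache entry. A sketch with hypothetical helper names; real edge platforms derive keys along similar lines:

```typescript
import { createHash } from "node:crypto";

// REST: the method and path fully identify the resource, so they make a
// naturally shared cache key.
function restCacheKey(method: string, path: string): string {
  return `${method.toUpperCase()} ${path}`;
}

// GraphQL: requests all hit one endpoint, so the query body must be hashed
// into the key. Every variation of fields produces a separate entry,
// fragmenting the limited edge cache.
function graphqlCacheKey(query: string): string {
  const digest = createHash("sha256").update(query).digest("hex");
  return `POST /graphql ${digest}`;
}
```

Two users hitting `GET /home` share one REST cache entry, while two slightly different GraphQL queries for the same screen do not.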

Improve mobile app performance with the edge

One of the edge's greatest strengths is that it boosts performance by reducing the physical distance the request has to travel. By caching responses at the edge, we can get further performance gains on top of that.

Mobile apps benefit from leveraging the edge where they can. The gains aren't just in raw performance but in reliability: the faster a request is fulfilled, the lower the chance the user loses connection before it completes.

Thanks to Scott Birksted for his help and advice on this article.
