Edge computing is gaining popularity in the IoT domain. In 2018, it was named one of the top technology trends forming the foundation for the next generation of digital businesses. In parallel, given the massive amounts of data involved and the need to optimize computational resources, ever more data is also being sent to the cloud.
While edge and cloud computing are often seen as mutually exclusive approaches, larger IoT projects frequently require a combination of both. In order to understand today’s vision of IoT and the complementary characters of edge and cloud computing, we want to go back in time and take a look at their evolution over recent decades.
The beginnings of decentralized computing
The origin of edge computing can be traced back to the 1990s, when Akamai launched its content delivery network (CDN). The idea back then was to introduce nodes at locations geographically closer to the end user for the delivery of cached content such as images and videos.
In 1997, in their paper “Agile application-aware adaptation for mobility,” Noble et al. demonstrated how different types of applications (web browsers, video players, and speech recognition) running on resource-constrained mobile devices can offload certain tasks to powerful servers (surrogates). The goal was to relieve the load on the devices' computing resources and, as proposed in later work, to improve their battery life. Today, for example, speech-recognition services from Google, Apple, and Amazon work in a similar way. In 2001, Satyanarayanan et al. generalized this approach to pervasive computing in their paper “Pervasive computing: vision and challenges.”
In 2001, several peer-to-peer overlay networks known as distributed hash tables were proposed for building scalable, decentralized distributed applications. These self-organizing overlay networks enable efficient and fault-tolerant routing, object location, and load balancing. Moreover, they make it possible to exploit the network proximity of the underlying physical connections in the internet, thereby avoiding long-distance links between peers. This not only decreases the overall network load but also improves application latency.
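The core idea of a distributed hash table can be sketched with a toy consistent-hashing ring. This is an illustrative example, not code from any of the systems mentioned above; the node names and the `HashRing` class are hypothetical. Each object is owned by the first node clockwise from its position on a circular ID space, so any peer can locate an object without a central directory.

```python
import hashlib
from bisect import bisect_right

def ring_hash(key: str) -> int:
    """Map a key (node ID or object name) onto a fixed circular ID space."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % 2**32

class HashRing:
    """Toy DHT-style ring: each object is owned by the first node
    clockwise from the object's position on the ring."""

    def __init__(self, nodes):
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def locate(self, obj_key: str) -> str:
        pos = ring_hash(obj_key)
        # Find the first node hash greater than the object's position,
        # wrapping around to the start of the ring if necessary.
        idx = bisect_right([h for h, _ in self.ring], pos)
        return self.ring[idx % len(self.ring)][1]

ring = HashRing(["node-a", "node-b", "node-c"])
owner = ring.locate("video-42")
```

Real DHTs such as Chord or Pastry add routing tables so each peer only knows a few others, yet lookups still finish in a logarithmic number of hops.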
Cloud computing is a major influence in the history of edge computing and therefore deserves special mention. It attracted particular attention in 2006, the year Amazon first promoted its “Elastic Compute Cloud,” which opened up a raft of new opportunities in terms of computation, visualization, and storage capacity.
Nevertheless, cloud computing alone was not the solution for every use case. With the advent of self-driving cars and (industrial) IoT, for example, there was an increasing emphasis on processing information locally in order to enable instantaneous decision-making.
Cloudlets and fog computing
In 2009, Satyanarayanan et al. introduced the term cloudlet in their paper “The case for VM-based cloudlets in mobile computing.” The major focus of this work is latency: it proposes a two-tier architecture, with the cloud (high latency) as the first tier and cloudlets (lower latency) as the second. Cloudlets are decentralized, widely dispersed internet infrastructure components whose compute cycles and storage resources can be leveraged by nearby mobile computers. Moreover, a cloudlet stores only soft state, such as cached copies of data.
In 2012, Cisco introduced the term fog computing for dispersed cloud infrastructures. The aim was to promote IoT scalability, i.e., to handle a huge number of IoT devices and big data volumes for real-time low-latency applications.
Cloud and edge computing for large-scale IoT applications
Today, an IoT solution has to cover a much broader scope of requirements. We see that in most cases, organizations opt for a combination of cloud and edge computing for complex IoT solutions. Cloud computing typically comes into play when organizations require storage and computing power to execute certain applications and processes, and to visualize telemetry data from anywhere. Edge computing, on the other hand, is the right choice for cases that demand low latency, local autonomous actions, reduced backend traffic, or local handling of confidential data.
Are you interested in learning more about how companies benefit from cloud and edge computing when implementing IoT solutions? Read our guide “Edge computing for IoT.”