Edge Computing

By 2020 at the latest, the Internet of Things (IoT) is expected to comprise around 50 billion objects worldwide – the Internet of Things being a network of electronic systems ranging from personal scales to industrial production facilities. All these devices generate continuous data output that must be stored and evaluated in real time for critical applications: a task that established cloud solutions will hardly be able to cope with.

The slowdown in broadband expansion and delays in data transmission between central cloud servers and end devices at the edge of the network are the main obstacles to growth. Edge computing avoids both problems and therefore represents a paradigm shift in the age of cloud computing.

What is edge computing? A definition

Edge computing is a design approach for IoT environments that provides IT resources like storage capacity and computing power as close as possible to the data-generating devices and sensors. This makes the concept an alternative to traditional cloud solutions with central servers.

The term refers to the edge of the network: an allusion to the fact that data processing in this approach does not take place centrally in the cloud, but decentrally at the network edge. Edge computing is intended to provide what the cloud has so far been unable to offer: servers that can evaluate mass data from intelligent factories, supply networks or traffic systems without delay and take immediate action in the event of an incident.

Edge computing basics at a glance

Edge computing serves as a new architectural concept for IoT environments, but does not bring new network components into play. Instead, established technologies in compact design are fused under a new name. Here is an overview of the most important basic terms of edge computing:

  • Edge: In IT jargon, the “edge” is the edge of the network. However, which network components are assigned to the network edge depends on the situation. In telecommunications, for example, a mobile phone can be the edge of the network; in a system of networked, autonomously driving cars, the individual vehicle. In situations like these, you’re talking about an edge device.
     
  • Edge device: Every data-generating device at the edge of the network functions as an edge device. Possible data sources are sensors, machines, vehicles or intelligent devices in an IoT environment – like washing machines, fire detectors, light bulbs or radiator thermostats.
     
  • Edge gateway: An edge gateway is a computer located at the transition between two networks. In IoT environments, edge gateways are used as nodes between the Internet of Things and a core network. They are powerful routers that offer enough computing power to preprocess data from the IoT before forwarding it. Edge gateways provide various interfaces to wired and radio-based transmission technologies and communication standards – like Ethernet, WLAN, Bluetooth, 3G mobile communications, LTE, Zigbee, Z-Wave, CAN bus, Modbus, BACnet or SCADA.
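As a purely hypothetical sketch (the frame formats, handler names and values are invented for illustration, and real protocol parsing is far more involved), a gateway that bridges several of these interfaces can normalize incoming traffic into one internal representation before further processing:

```python
def handle_zigbee(frame: bytes) -> dict:
    # Hypothetical: interpret a minimal Zigbee-style payload as a sensor reading
    return {"transport": "zigbee", "value": int.from_bytes(frame, "big")}

def handle_modbus(frame: bytes) -> dict:
    # Hypothetical: interpret a Modbus-style register value
    return {"transport": "modbus", "value": int.from_bytes(frame, "big")}

# The gateway maps each supported interface to a handler so that
# heterogeneous device traffic ends up in one uniform format.
HANDLERS = {"zigbee": handle_zigbee, "modbus": handle_modbus}

def normalize(transport: str, frame: bytes) -> dict:
    return HANDLERS[transport](frame)

reading = normalize("zigbee", b"\x00\x2a")  # -> {"transport": "zigbee", "value": 42}
```

The dispatch-table design mirrors the role described above: the gateway hides the diversity of transmission technologies from everything upstream of it.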

Edge computing vs. fog computing

The approach of adding local calculation instances to the cloud is not new. As early as 2014, the US technology group Cisco established the marketing term “fog computing”: a concept that relies on the decentralized preprocessing of data in “fog nodes”. Fog nodes are local mini data centers that are located in front of the cloud and therefore represent an intermediate level in the network (referred to as a fog layer). Data generated in IoT environments no longer reaches the cloud directly, but is first merged in fog nodes, evaluated and selected for further processing steps.

Today, edge computing is seen as part of fog computing, where IT resources like computing power and storage capacity move even closer to IoT terminals at the edge of the network. While data processing in fog computing architectures initially takes place on the fog layer, in edge computing concepts it takes place in powerful IoT routers or even directly on the device or sensor. A combination of both concepts is also conceivable. The following graphic shows an architecture like this with cloud, fog and edge layers.

Tip

Reference architectures for fog and edge computing environments are being developed within the framework of the OpenFog Consortium, an open association of industry and science.

Why choose edge computing?

A drilling rig produces around 500 gigabytes of data per week; the turbine of a modern commercial aircraft around 10 terabytes in 30 minutes. Data volumes of this size can neither be loaded into the cloud nor evaluated in real time through mobile networks. In addition, the use of third-party networks is associated with high costs. It must therefore be decided locally how much of the generated information should be transmitted to and stored in central systems, and which data can be evaluated locally. This is where edge computing comes in.
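The local decision described above can be reduced to a simple filter. The following sketch is illustrative only (baseline and tolerance values are invented): readings close to an expected baseline stay local, and only significant deviations are selected for upload to central systems.

```python
def select_for_upload(readings, baseline, tolerance):
    """Keep anything within tolerance of the baseline local;
    forward only significant deviations to the central system."""
    return [r for r in readings if abs(r - baseline) > tolerance]

# Hypothetical pressure readings from a sensor on a drilling rig
readings = [50.1, 49.9, 55.0, 50.0, 43.2]
to_cloud = select_for_upload(readings, baseline=50.0, tolerance=2.0)
# Only the two outliers (55.0 and 43.2) would leave the local network
```

Even this trivial rule cuts the transmitted volume drastically, which is the core economic argument for processing at the edge.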

Currently, central data centers carry the majority of the data load generated by the internet. Today, however, data sources are often mobile and too far away from the central data center to ensure an acceptable response time (latency). This is particularly problematic for time-critical applications like machine learning and predictive maintenance in intelligent production facilities and self-regulating supply networks.

Note

Predictive maintenance is expected to revolutionize the maintenance and administration of future factories. Instead of detecting failures and malfunctions after the fact, the new maintenance concept uses intelligent monitoring systems to identify risks of defects before an actual defect occurs.
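A minimal sketch of such monitoring, under invented assumptions (the sensor values, window size and threshold are illustrative): an edge node keeps a rolling window of recent readings and flags values that deviate strongly from the recent mean, so an inspection can be scheduled before a defect occurs.

```python
from collections import deque
import statistics

class VibrationMonitor:
    """Flags readings that deviate strongly from the recent rolling mean,
    as an early warning before an actual defect occurs."""

    def __init__(self, window=20, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        # Only judge new values once a small baseline history exists
        if len(self.history) >= 5:
            mean = statistics.mean(self.history)
            stdev = statistics.pstdev(self.history) or 1e-9
            if abs(value - mean) / stdev > self.threshold:
                self.history.append(value)
                return True  # anomalous: schedule an inspection
        self.history.append(value)
        return False

monitor = VibrationMonitor()
for v in [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0]:
    monitor.observe(v)       # normal operation, nothing flagged
alarm = monitor.observe(5.0)  # sudden spike is flagged as anomalous
```

Because the whole loop runs on the edge node, the alarm does not depend on cloud round trips; only confirmed anomalies need to be reported upstream.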

Private internet use – streaming high-resolution videos to mobile devices, VR and augmented reality – is already pushing traditional cloud concepts and the bandwidth of available networks to their limits. And according to experts, the introduction of the new 5G mobile communications standard with transmission speeds of up to 10 Gbit/s will not relieve this load, but increase it. Edge computing does not provide a solution to this problem either. Rather, the concept raises the question of whether all data in IoT environments must actually be processed in the cloud.

Edge computing is not seen as a replacement, but as a supplement to the cloud, which provides the following functions:

  • Data collection and aggregation: While data sources in traditional cloud architectures transfer all data to a core data center in the cloud for central analysis, edge computing relies on data collection close to the source. For this purpose, microcontrollers are used directly in the device or in gateways – i.e. “intelligent routers”. These combine data from different devices and enable the preprocessing and selection of the data. Uploading into the cloud only takes place if information cannot be evaluated locally, detailed analyses are required or data is to be archived.
     
  • Local data storage: Edge computing is particularly useful when high-bandwidth data needs to be provided locally. For large volumes of data, real-time transmission from the core data center in the cloud is usually impossible. This problem can be circumvented by storing the corresponding data decentrally at the edge of the network. In a scenario like this, edge gateways act as replica servers in a content delivery network.
     
  • AI supported monitoring: The decentralized processing units of an edge computing environment receive data, evaluate it and subsequently enable continuous monitoring of the connected devices. Combined with machine learning algorithms, status monitoring in real time is possible – for example to control and optimize processes in intelligent factories.
     
  • M2M communication: M2M stands for “machine to machine”, a buzzword for the automated exchange of information between end devices through any communication standards. In IoT environments like an intelligent factory, M2M communication could be used for remote monitoring of machines and plants. Within the framework of process control, both communication between the end devices and communication with a central control center, which functions as a monitoring authority (AI supported monitoring), is possible.
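The data collection and aggregation function above can be sketched in a few lines. This is a hypothetical illustration (device names and values are invented): a gateway combines raw readings from several edge devices into one compact summary, and only that summary would be uploaded.

```python
import json
import statistics

def aggregate(device_readings):
    """Combine raw data from several edge devices into one summary record,
    as an edge gateway might before deciding what to upload."""
    return {
        device: {
            "mean": round(statistics.mean(values), 2),
            "max": max(values),
            "samples": len(values),
        }
        for device, values in device_readings.items()
    }

# Hypothetical raw readings collected locally from two sensors
readings = {
    "sensor-a": [20.1, 20.3, 20.2],
    "sensor-b": [75.0, 75.2, 74.8],
}
summary = aggregate(readings)
payload = json.dumps(summary)  # this compact summary is what reaches the cloud
```

Three raw values per device collapse into one record each; at realistic sampling rates the reduction in upstream traffic is correspondingly larger.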

The following graphic illustrates the basic principle of a decentralized cloud architecture, in which edge gateways act as a mediating instance between a central computer in the public or private cloud and IoT devices at the edge of the network.

Applications for edge computing architectures

Uses for edge computing usually originate from the IoT environment and, like the concept of a decentralized cloud architecture itself, many of them still exist only as future projects. An important growth driver for edge computing technology is the increasing demand for real-time capable communication systems. Decentralized data processing is classified as a key technology for the following projects:

  • Car-to-Car communication
  • Smart Grid
  • Smart Factory

In the future, a Connected Car – networked car – will become more than just a vehicle with an Internet connection. The future of mobility promises cloud-based early warning systems based on car-to-car communication and even completely autonomous means of transport. This requires an infrastructure that makes it possible to exchange data in real time between the vehicles and communication points along the route.

The electricity grid of the future will also be adaptive, and, thanks to decentralized energy management systems, will adapt to fluctuations in output. Smart grids are becoming a key technology in the context of the energy revolution. Switching to renewable energies poses new challenges for electricity grids. Instead of a few large central generators, numerous smaller and decentralized power generators have to be connected to storage facilities and end consumers. Thanks to solar panels, some of the latter even become electricity generators themselves. Intelligent networks therefore not only transport electricity, but they also supply data for its generation, storage and consumption. This enables everyone involved to react to changes in real time. The aim is to keep power grids stable despite increasing complexity and to make them more efficient through intelligent load control. New cloud concepts like Edge and Fog computing are needed to capture, store and process the resulting data masses in the shortest possible time.
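The intelligent load control mentioned above can be sketched with a deliberately simplified local decision rule. The assumptions here are illustrative (a 50 Hz nominal frequency and an invented deadband): grid frequency falls when demand exceeds generation and rises in the opposite case, so an edge node at a flexible consumer can react to it without waiting for a central control center.

```python
def adjust_load(frequency_hz, nominal=50.0, deadband=0.05):
    """Decide locally whether a flexible consumer should shed or add load,
    using measured grid frequency as the control signal."""
    if frequency_hz < nominal - deadband:
        return "shed_load"  # demand too high: switch off flexible loads
    if frequency_hz > nominal + deadband:
        return "add_load"   # surplus generation: e.g. start charging storage
    return "hold"           # within the deadband: no action needed

action = adjust_load(49.9)  # under-frequency -> "shed_load"
```

Real frequency-based demand response involves ramp rates, coordination and safety margins; the point of the sketch is only that the decision can be made at the edge, in real time, from a locally measurable signal.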

A smart factory is a self-organizing production facility and logistics system that ideally no longer requires human intervention. An intelligent factory is practically a system of networked devices, machines and sensors that communicate with each other through the Internet of Things to carry out manufacturing processes. The smart factory communication system even includes the finished product and can therefore automatically react to supply and demand. AI systems and machine learning can be used to automate maintenance processes and production optimization. This requires an IT infrastructure that can evaluate large amounts of data and react to unforeseen events without delay. Traditional cloud systems fail here because of the latency problem. Fog and edge computing architectures solve this problem through distributed data processing.

Edge computing: Advantages and disadvantages

The following overview compares the advantages and disadvantages of edge computing architectures with those of classic cloud environments.

Advantages:

  • Real-time data processing: Edge computing architectures bring processing units closer to the data source, enabling real-time communication. The latency problem common with classic cloud solutions is avoided.
  • Reduced data throughput: Edge computing primarily provides for local data processing in edge gateways. Only data that cannot be evaluated locally or that should be available online is loaded into the cloud.
  • Data security: Edge computing leaves much of the data on the local network. This makes it much easier for companies to meet compliance requirements.

Disadvantages:

  • More complex network structure: A distributed system is much more complex than a centralized cloud architecture. An edge computing environment is a heterogeneous combination of various network components, some from different manufacturers, which communicate with each other through a variety of interfaces.
  • Edge hardware acquisition costs: Centralized cloud architectures are attractive partly because they require significantly less local hardware. This advantage is lost with distributed systems.
  • Higher maintenance costs: A decentralized system with several computing nodes requires more maintenance and administration than a central data center.