What is Edge Computing?

Iqra Anwar
6 min read · Dec 8, 2021


Gartner defines edge computing as “a part of a distributed computing topology in which information processing is located close to the edge — where things and people produce or consume that information.”

At its basic level, edge computing brings computation and data storage closer to the devices where it’s being gathered, rather than relying on a central location that can be thousands of miles away. This is done so that data, especially real-time data, does not suffer latency issues that can affect an application’s performance.

Edge computing developed in response to the exponential growth of IoT devices, which connect to the internet either to receive information from the cloud or to deliver data back to it. Many IoT devices generate enormous amounts of data over the course of their operations.


Why is Edge Computing Important?

For many companies, the cost savings alone can be a driver towards deploying an edge-computing architecture. Companies that embraced the cloud for many of their applications may have discovered that the costs in bandwidth were higher than they expected.

Increasingly, though, the biggest benefit of edge computing is the ability to process and store data faster, enabling more efficient real-time applications that are critical to companies. Before edge computing, a smartphone scanning a person’s face for facial recognition would need to run the facial recognition algorithm through a cloud-based service, which would take a lot of time to process. With an edge computing model, the algorithm could run locally on an edge server or gateway, or even on the smartphone itself, given the increasing power of smartphones. Applications such as virtual and augmented reality, self-driving cars, smart cities, and even building-automation systems require fast processing and response.

Companies such as NVIDIA have recognized the need for more processing at the edge, which is why we’re seeing new system modules that include artificial intelligence functionality built into them.

Privacy and security

However, as is the case with many new technologies, solving one problem can create others. From a security standpoint, data at the edge can be troublesome, especially when it’s being handled by different devices that might not be as secure as a centralized or cloud-based system. As the number of IoT devices grows, it’s imperative that IT understand the potential security issues around these devices, and make sure those systems can be secured. This includes making sure that data is encrypted, and that the correct access-control methods and even VPN tunnelling are utilized.

What about 5G?

Around the world, carriers are deploying 5G wireless technologies, which promise the benefits of high bandwidth and low latency for applications, enabling companies to go from a garden hose to a firehose with their data bandwidth. Instead of just offering faster speeds and telling companies to continue processing data in the cloud, many carriers are working edge-computing strategies into their 5G deployments in order to offer faster real-time processing, especially for mobile devices, connected cars and self-driving cars.

In its recent report “5G, IoT and Edge Compute Trends,” Futuriom writes that 5G will be a catalyst for edge-compute technology. “Applications using 5G technology will change traffic demand patterns, providing the biggest driver for edge computing in mobile cellular networks,” the firm writes. It cites low-latency applications that include IoT analytics, machine learning, virtual reality, and autonomous vehicles as those that “have new bandwidth and latency characteristics that will require support from edge-compute infrastructure.”

How does Edge Computing Work?

Edge computing is all a matter of location. In traditional enterprise computing, data is produced at a client endpoint, such as a user’s computer. That data is moved across a WAN such as the internet, through the corporate LAN, where the data is stored and worked upon by an enterprise application. Results of that work are then conveyed back to the client endpoint. This remains a proven and time-tested approach to client-server computing for most typical business applications.

But the number of devices connected to the internet, and the volume of data being produced by those devices and used by businesses, is growing far too quickly for traditional data centre infrastructures to accommodate. Gartner predicted that by 2025, 75% of enterprise-generated data will be created outside of centralized data centres. The prospect of moving so much data in situations that can often be time- or disruption-sensitive puts incredible strain on the global internet, which itself is often subject to congestion and disruption.

So IT architects have shifted focus from the central data centre to the logical edge of the infrastructure — taking storage and computing resources from the data centre and moving those resources to the point where the data is generated. The principle is straightforward: If you can’t get the data closer to the data centre, get the data centre closer to the data. The concept of edge computing isn’t new, and it is rooted in decades-old ideas of remote computing — such as remote offices and branch offices — where it was more reliable and efficient to place computing resources at the desired location rather than rely on a single central location.

Why is Edge Computing Important for IoT?

There is an ever-increasing number of sensors feeding information into the Internet of Things, and these sensors have traditionally been a source of Big Data. Edge Computing, however, attempts to screen the incoming information, processing useful data on the spot and sending it directly to the user. Consider the sheer volume of data supplied to the Internet of Things by airports, cities, the oil drilling industry, and the smartphone industry. Moving that much data creates problems with network latency, bandwidth, and, most significantly, speed. Many IoT applications are mission-critical, and for them speed is crucial.

Edge Computing can lower costs and provide a smooth flow of service. Mission-critical data can be analyzed locally, allowing a business to choose which services run at the edge and to screen the data sent to the cloud, lowering IoT costs and extracting the most value from IoT data transfers. In effect, Edge Computing delivers screened, pre-filtered Big Data.

Transmitting immense amounts of data is expensive and can strain a network’s resources. Edge Computing processes data from, or near, the source, and sends only relevant data through the network to a data processor or Cloud. For instance, a smart refrigerator doesn’t need to continuously send temperature data to a Cloud for analysis. Instead, the refrigerator can be designed to send data only when the temperature changes beyond a certain range, minimizing unnecessary data. Similarly, a security camera would only send data after detecting motion.
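The report-on-change behaviour described above can be sketched in a few lines. This is a minimal illustration, not a reference implementation; the `EdgeSensorFilter` class and its tolerance value are invented for the example.

```python
class EdgeSensorFilter:
    """Forward a reading upstream only when it drifts outside a tolerance band."""

    def __init__(self, tolerance):
        self.tolerance = tolerance
        self.last_sent = None  # last value actually transmitted

    def filter(self, reading):
        # Transmit the first reading, or any reading that moved more than
        # `tolerance` away from the last transmitted value; suppress the rest.
        if self.last_sent is None or abs(reading - self.last_sent) > self.tolerance:
            self.last_sent = reading
            return reading
        return None


# A refrigerator-style temperature sensor: small fluctuations stay local,
# and only meaningful changes cross the network.
sensor = EdgeSensorFilter(tolerance=1.0)
readings = [4.0, 4.1, 4.05, 6.2, 6.3, 4.0]
transmitted = [r for r in readings if sensor.filter(r) is not None]
```

Six raw readings shrink to three transmissions; the same pattern applies to the motion-triggered camera, where the “tolerance” is the detector’s motion threshold.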

Depending on how the system is designed, Edge Computing can direct manufacturing equipment (or other smart devices) to continue operating without interruption should internet connectivity become intermittent or drop off completely, providing an ideal backup system.
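One way to picture that backup behaviour is a store-and-forward buffer: the device keeps working and queues its readings locally while the uplink is down, then flushes the queue in order once connectivity returns. A minimal sketch, with a hypothetical `StoreAndForward` class standing in for whatever the device firmware actually uses:

```python
from collections import deque


class StoreAndForward:
    """Queue readings locally while offline; flush them in order once online."""

    def __init__(self, capacity=1000):
        # Bounded queue: if an outage outlasts the capacity, the oldest
        # readings are silently dropped rather than halting the device.
        self.pending = deque(maxlen=capacity)

    def record(self, reading, uplink_up, send):
        self.pending.append(reading)
        if uplink_up:
            while self.pending:
                send(self.pending.popleft())


delivered = []
node = StoreAndForward(capacity=10)
node.record("temp=21.4", uplink_up=False, send=delivered.append)  # offline: queued
node.record("temp=21.6", uplink_up=False, send=delivered.append)  # still queued
node.record("temp=21.5", uplink_up=True, send=delivered.append)   # online: flush all
```

The equipment never stops recording; the network outage only delays delivery.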

It is an excellent solution for businesses that need to analyze data quickly in unusual circumstances, such as on aeroplanes, on ships, and in some rural areas. For example, edge devices could detect equipment failures without being connected to a Cloud or control system.

What are the Benefits of Edge Computing?

Edge computing addresses vital infrastructure challenges — such as bandwidth limitations, excess latency, and network congestion — but there are several potential additional benefits that can make the approach appealing in other situations. Edge computing is useful where connectivity is unreliable or bandwidth is restricted because of the site’s environmental characteristics. Examples include oil rigs, ships at sea, remote farms, or other remote locations, such as a rainforest or desert. Edge computing does the compute work on-site, sometimes on the edge device itself (such as water-quality sensors on water purifiers in remote villages), and can save data to transmit to a central point only when connectivity is available. By processing data locally, the amount of data to be sent can be vastly reduced, requiring far less bandwidth or connectivity time than might otherwise be necessary.
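The bandwidth reduction in the last sentence typically comes from local aggregation: instead of shipping every raw sample, the edge device reduces a window of samples to a compact summary and transmits only that. A sketch under assumed field names (nothing here is a standard format):

```python
def summarize(window):
    """Collapse a window of raw samples into one small record for transmission."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": sum(window) / len(window),
    }


# Four raw per-second sensor readings become a single summary packet,
# cutting what crosses the (possibly intermittent) link by a factor of four.
samples = [7.1, 7.3, 6.9, 7.0]
packet = summarize(samples)
```

In practice the window would be much larger (say, a minute of readings per packet), so the savings scale with the sampling rate rather than staying at 4:1.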
