What is Edge Computing?


Posted by Mike Fitch on Aug 19, 2021


Edge computing. Edge technologies. Edge platforms. Edge devices. Edge to enterprise. We’ve seen the term “edge” used so many times in news, articles, blogs and campaigns that we’re starting to experience semantic satiation, wondering whether it’s even a real word anymore.

 

Edge… edge… edge. Okay, never mind; I digress. There are many applicable uses for edge computing, but before we dive into them, let’s build a thorough understanding of the term so that you can easily define it, provide relevant examples and showcase its benefits the next time a colleague or customer brings it up.

 

By the book: Defining “edge”

 

When we think of the literal edge of an object, it indicates the point at which a surface ends, or perhaps the point farthest from the center or core of that surface. If you think of the data center as the center of the object, with connected devices sitting closer to those points farthest from the center, you have the general edge concept. The earliest tech definitions of edge computing were broad, referring to any data stored at the edge of the network. As the overall edge framework has come into focus, edge computing now refers to processing data near a user or device (the edge) rather than in a remote data center (the core).

 

Edge is often used in the same breath as IoT, devices and 5G connectivity. Today, however, less than 10 percent of data is created and processed at the edge, a share Gartner expects to grow to 75 percent by 2025. As we move toward that reality, it’s important to note why: edge computing lets devices become smarter while reducing the need to transport data to and from a remote server for processing.

 

Some additional industry definitions:

 

Gartner — The edge is a part of a distributed computing topology in which information processing is located close to the edge — where things and people produce or consume that information.

 

TechTarget — Edge computing is a distributed information technology (IT) architecture in which client data is processed at the periphery of the network, as close to the originating source as possible.

 

The Verge — Edge computing is computing that’s done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work.

 

Cloudwards — Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data.

 

Cisco — Edge computing is a model that shifts computing resources from central data centers or public clouds closer to devices, that is, embedded at the edge of service provider networks.

 

HPE — Edge computing is a distributed, open IT architecture that features decentralized processing power, enabling mobile computing and Internet of Things (IoT) technologies.

 

IBM — Edge computing is a distributed computing framework that brings enterprise applications closer to data sources such as IoT devices or local edge servers.

 


In essence, edge computing is about geographic proximity. For years, data has been held in the cloud, which has been and will continue to be an asset for data storage. In terms of geographic proximity, though, the cloud is off-premises, sitting at the core in remote data centers. By comparison, edge technology processes data much closer to the source, such as an individual user or a connected device, improving speed, latency and security rather than depending on a distant, variable cloud.

 

For example: Edge use cases

 

With approximately 21.5 billion connected IoT devices around the world, the number of edge use cases is vast. Here are some of the more common examples of where storing and processing data at the edge pays off:

  • Autonomous driving and traffic control — The future of automotive is self-driving, autonomous vehicles. When a full network of autonomous vehicles is communicating through sensors and cameras, ultra-low latency is an absolute priority. With edge computing, data is processed in the vehicles or in the immediate network, allowing for smooth, safe traffic control rather than transporting large volumes of traffic data to a centralized cloud for analysis, which can delay the instant decision-making needed to avoid collisions with other vehicles, people or objects.

  • Smart energy grid — Organizations are placing green, energy-efficient initiatives at the forefront of their sustainability pledges. One way to do this is through sensors and IoT devices that monitor energy usage in warehouses and offices. By analyzing energy consumption through edge computing, real-time adjustments can be made to machinery or lighting during peak or off-peak hours (see the sketch after this list).

  • Online gaming — Online and community gaming, whether casual or competitive, is highly dependent on low latency. Even a few dropped frames per second can make a major difference in the growing esports market. By placing edge connectivity as close to gamers as possible, providers can deliver an immersive, high-speed gaming experience.

  • Smart homes — As a home adds more IoT devices to its network, it takes more power and processing to run them efficiently. Today, most of that data is sent to a remote server, but with edge computing, smart home owners can bring storage and processing closer to the home, reducing backhaul costs, latency and security risks. Devices such as smart speakers respond much more quickly as a result.
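
To make the smart energy grid example a little more concrete, here is a minimal sketch of the kind of control loop an edge device might run. It is illustrative only: read_power_sensor(), dim_lights() and send_summary_to_cloud() are hypothetical placeholders for whatever metering, building-automation and cloud APIs a real deployment would use, and the thresholds are assumed values.

```python
# Minimal sketch of an edge controller for the smart energy grid use case above.
# All names are hypothetical stand-ins: read_power_sensor(), dim_lights() and
# send_summary_to_cloud() represent whatever metering, building-automation and
# cloud APIs a real deployment would use; thresholds are assumed values.
import random
import time

PEAK_HOURS = range(9, 18)   # assumed peak window, 9:00-17:59
PEAK_LIMIT_KW = 50.0        # assumed draw limit during peak hours

def read_power_sensor() -> float:
    """Placeholder for a local meter reading, in kilowatts."""
    return random.uniform(20.0, 80.0)

def dim_lights(level: float) -> None:
    """Placeholder for a local actuator call; no cloud round trip required."""
    print(f"Dimming lights to {level:.0%}")

def send_summary_to_cloud(readings: list) -> None:
    """Only an aggregate leaves the site, not every raw sample."""
    avg = sum(readings) / len(readings)
    print(f"Summary sent to cloud: avg {avg:.1f} kW over {len(readings)} samples")

def control_loop(samples: int = 4) -> None:
    readings = []
    for _ in range(samples):
        kw = read_power_sensor()
        readings.append(kw)
        # The decision is made locally, in milliseconds, instead of waiting
        # for a remote server to analyze the data and respond.
        if time.localtime().tm_hour in PEAK_HOURS and kw > PEAK_LIMIT_KW:
            dim_lights(0.6)
    send_summary_to_cloud(readings)

if __name__ == "__main__":
    control_loop()
```

The point of the sketch is the division of labor: the latency-sensitive decision (dimming lights during a peak) happens locally, while only a small aggregate ever travels to the cloud.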

The payoff: Benefits of data at the edge
 

Cloud computing has delivered many useful benefits, such as convenience, cost savings and efficiency, and edge devices are now taking those benefits to the next tier. As devices get smarter and more powerful, they become more capable of handling and processing large amounts of data, reducing the need for the compute power of a traditional data center. By moving data to the edge, organizations put the emphasis on reducing latency and processing more data close to the source.
 

The primary benefits of creating, processing and storing data at the edge:

  1. Speed — It’s been mentioned several times, but the primary advantage of computing power at the source is reduced latency, or the time it takes to send and receive data. In cloud computing, the server may be in another city, another state or even across the world, which makes immediate, real-time decisions harder to deliver, especially when quick response times are essential.
  2. Efficiency — Data takes up an enormous amount of data center bandwidth. Think about a 24-hour security camera; most of its footage is inconsequential and not the best use of data center storage. By processing the data on the camera itself, only the important footage is saved to the data center and the rest is discarded (see the sketch after this list). Reducing bandwidth at the data center level also reduces transmission costs for the enterprise.
  3. Security — From a security perspective, the fewer points of contact and the less distance data has to travel, the safer it is. Edge data is also distributed across multiple locations rather than held on a single server, so in the event of a breach, the damage is minimized.
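
As a rough illustration of the efficiency benefit, here is a minimal sketch of the security-camera example: the edge device compares consecutive frames and keeps only the ones that actually changed, so only those ever reach the data center. The frame format and MOTION_THRESHOLD value are assumptions made for illustration; a real camera would use a proper computer-vision library rather than plain Python lists.

```python
# Minimal sketch of the security-camera example from the "Efficiency" benefit:
# an edge device compares consecutive grayscale frames and keeps only the ones
# with meaningful motion, so only those ever reach the data center. Frame
# format and thresholds are illustrative assumptions, not a real camera API.
from typing import Iterable, List

MOTION_THRESHOLD = 0.05   # assumed: keep a frame if more than 5% of pixels changed
PIXEL_DELTA = 10          # assumed: a pixel "changed" if its value moved by more than this

def frame_difference(prev: List[int], curr: List[int]) -> float:
    """Fraction of pixels that changed between two same-sized grayscale frames."""
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > PIXEL_DELTA)
    return changed / len(curr)

def filter_at_edge(frames: Iterable[List[int]]) -> List[List[int]]:
    """Keep only frames with meaningful motion; everything else is dropped locally."""
    kept: List[List[int]] = []
    prev = None
    for frame in frames:
        if prev is None or frame_difference(prev, frame) > MOTION_THRESHOLD:
            kept.append(frame)   # worth sending upstream to the data center
        prev = frame
    return kept

if __name__ == "__main__":
    # Four dummy 100-pixel frames: two static, one where motion starts, one where it stops.
    frames = [[0] * 100, [0] * 100, [0] * 50 + [255] * 50, [0] * 100]
    to_upload = filter_at_edge(frames)
    print(f"{len(to_upload)} of {len(frames)} frames uploaded")
```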

Edge computing will be a critical part of enterprise IT strategy moving forward. As IoT devices grow in sophistication, they will be able to handle more of the data creation, management and storage workload that today lives on cloud servers. Over the next few weeks, be on the lookout for additional edge computing content here on Authority as we hear from some of our thought leaders on their perspectives.



About the Author

TD Synnex Editor

Mike Fitch
Content marketer and communicator through and through. ASU grad with more than 10 years of B2B tech marketing/communications experience.