Look sharp: an introduction to edge computing

Posted in: Business Insights, Technical Track

For years, data consumption was associated with individuals: people who stream videos, play games and otherwise live their lives with the help of the internet. But even though the average user's daily data consumption is expected to reach 1.5 gigabytes by 2020, that figure is dwarfed by the exponentially growing data demands of the Internet of Things (IoT). Today, there are more internet-connected devices than people in the world, and Gartner predicts that number will grow to 20 billion by 2020. The Internet of Things is creating new challenges in how all this data gets processed, and edge computing is providing the answers.

What, exactly, is edge computing? It’s an approach to computing that takes much of the burden of processing away from the cloud, offloading it instead to a small server that is physically close to the user.

Edge computing gets its name from the idea that it pushes computing intelligence to the edge of the network, onto or near the devices themselves. The value of this localized processing becomes clear when we look at how IoT is evolving. Consider, for example, the anticipated data needs of a self-driving car. With a data consumption rate of 4 terabytes per day (equal to more than 2,600 individual users), a self-driving car is effectively a cloud on wheels, one that can't afford latency when making life-or-death decisions for passengers and pedestrians. Any network, no matter how fast, would soon be overwhelmed by the processing demands of a fleet of autonomous cars.

And the self-driving car is far from the most demanding use case. A connected airplane uses 5 terabytes each day. A smart hospital needs 3 terabytes. And for a smart factory, the data needs balloon to 3 petabytes each day — that’s 3 million gigabytes.

But the real-world uses for edge computing aren't necessarily as ambitious as the ones described above. The humble digital surveillance camera is a perfect example of a use case that is relevant for the typical enterprise. Pre-edge security cameras were incapable of doing any processing on their own; all data needed to be sent to an external server or to the cloud. But today's security demands are very different. Organizations are installing far more cameras, and those cameras can now recognize faces, licence plates and more. Edge computing allows more of the necessary processing to happen within the camera itself, sparing the network and more distant servers from workloads that would otherwise exceed their bandwidth.
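To make the camera example concrete, here is a minimal Python sketch of the edge pattern it describes: the heavy analysis runs on the device, and only small event records travel upstream instead of raw video. The `detect_faces` function and the event format are hypothetical stand-ins, not any real camera's API.

```python
def detect_faces(frame):
    """Stand-in for an on-camera recognition model (hypothetical).

    A real device would run a trained model on pixel data; here a
    frame is just a string, to keep the sketch self-contained.
    """
    return ["face"] if "person" in frame else []

def process_on_edge(frames):
    """Run detection locally and emit only compact events upstream."""
    events = []
    for frame in frames:
        detections = detect_faces(frame)  # heavy work stays on the device
        if detections:
            # Only a small event record leaves the camera,
            # not the full-resolution video stream.
            events.append({"frame": frame, "detections": detections})
    return events

# Three frames arrive; only the one with a detection produces traffic.
raw_stream = ["empty hallway", "person at door", "empty hallway"]
print(process_on_edge(raw_stream))
```

The design choice is the whole point of edge computing: of the three frames processed, only one generates network traffic, and that traffic is a few bytes of metadata rather than megabytes of video.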

Edge computing is establishing itself as the standard approach to handling the data demands of IoT. But it also presents some risks that need to be considered and planned for. Mistakes in configuration, for example, are far more common when organizations are working with hundreds or thousands of devices. Security becomes even more critical, since the explosion of intelligent devices gives hackers a greatly expanded attack surface. Finally, a financial plan for edge computing should anticipate licensing costs. In the example of digital surveillance, you're no longer done when you pay for the camera; you also need to plan for the costs of specific applications, future support, security upgrades and more.

In the age of IoT, the limits of the centralized data-processing warehouse are painfully clear. Today’s data needs to be processed quickly and reliably, and the best place to do that is near the edge of your network, where the data is being generated. Find out how Pythian can help make edge computing work for your organization.



