Edge Computing and Cloud Computing: Replace or Coexist?

A new trend is rising in the wake of cloud computing: "edge computing." The relationship between the two models is more complex than it first appears. Here's a look at whether edge computing is set to replace the cloud, or whether the two are headed for something closer to coexistence.

The Beginning: An Explosion of Data

All of this traces back to a period of exponential growth in IT and in the sheer volume and availability of data. That growth is what spurred the cloud computing movement in the first place: there was simply too much data for it all to be stored or analyzed locally.

Cloud computing took off because the Internet made it possible to ship that data to remote infrastructure and analyze it with far more computing power than any on-premises server could offer. The approach proved so much more effective than the old way of doing things that nearly everyone adopted it in a remarkably short span of time.

Why Was Edge Computing Even Necessary?

Given how powerful, successful, and widespread cloud computing is, it may seem odd that any new paradigm would be necessary at all. That raises the question: what exactly is lacking in the cloud model that leaves room for another approach, one for workloads that don't fit within the cloud's purview?

The answer is that cloud computing promotes a centralized computing environment, and that isn't ideal for every situation. The continuing exponential growth of data, and of the technologies that generate it, means a purely central approach sometimes won't work. Edge computing addresses this by placing processing power at the edge of the network.

Essentially, it's an alternative to concentrating all processing power at the center of the cloud or in a single centralized data center.

Just What Is Edge Computing and What Is Its Future?

Edge computing refers to processing that happens at the edge of a network, and it's particularly useful in certain situations, above all in the Internet of Things. Given how quickly the Internet of Things is growing right now, and the projections for how important it will become, IoT is central to any discussion of edge computing's future.

With data pouring in from a hundred different smart items, from refrigerators to cleaning robots to watches, lawnmowers, and just about anything else you can think of, and the volume still climbing, the problem worth thinking about is how all of that data can be processed properly.

Edge computing handles this by doing the processing close to the source of the data rather than in a centralized location, as cloud computing normally does. The trick is in the orchestration: splitting the processing between the centralized system the device belongs to and the device itself.

Ideally, the device does the processing itself in this kind of setup; if it can't, something else close to it at the edge of the network does. The reason is that many devices, especially those in the Internet of Things, need their data processed quickly.

You don't want a smart light waiting in line for a decision about what color to display based on whatever criteria it's given. It needs to respond to those criteria instantly, processing quickly enough that the action is still relevant by the time it decides to take it. Keeping the processing close to the source, such as within the bulb itself if it has the capability, is one way edge computing makes sure that happens.
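To make that concrete, here's a rough sketch in Python of how a device might keep latency-sensitive decisions at the edge. Everything in it, the function names, the 50 ms budget, the gateway object, is hypothetical and only meant to illustrate the pattern, not any particular product's API.

```python
import time

# Hypothetical latency budget: the light must react before its decision goes stale.
LATENCY_BUDGET_MS = 50

def choose_color_locally(reading: float) -> str:
    # Tiny on-device rule: fast, simple, always available.
    return "warm" if reading < 0.5 else "cool"

def handle_event(reading: float, device_can_compute: bool, edge_gateway=None) -> str:
    start = time.monotonic()
    if device_can_compute:
        color = choose_color_locally(reading)        # processed in the bulb itself
    elif edge_gateway is not None:
        color = edge_gateway.choose_color(reading)   # one hop away, still the network edge
    else:
        color = choose_color_locally(reading)        # degrade gracefully; never wait on a cloud round trip
    elapsed_ms = (time.monotonic() - start) * 1000
    if elapsed_ms > LATENCY_BUDGET_MS:
        pass  # a real system might log the miss or drop the stale decision here
    return color
```

The point of the tiered fallback is that the cloud never sits on the critical path: the decision is made on the device when possible, at a nearby edge node when not, and only ever falls back to a simple local default.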

This isn't to say that no processing is done centrally at all. Anything that isn't time-sensitive can still be handled in the cloud. General analytics across all devices, for example, usually aren't a job for edge computing, since it makes more sense to run them centrally.
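Here's the complementary half sketched the same way: telemetry that nobody needs instantly gets batched on the device and shipped to a central endpoint in bulk for fleet-wide analytics. The endpoint URL and payload shape below are assumptions for illustration, not a real service.

```python
import json
import time
from urllib import request

ANALYTICS_URL = "https://example.com/ingest"  # hypothetical central analytics endpoint
BATCH_SIZE = 100

_buffer = []

def record(event: dict) -> None:
    # Queue an event locally; nothing here is latency-sensitive.
    _buffer.append({**event, "ts": time.time()})
    if len(_buffer) >= BATCH_SIZE:
        flush()

def flush() -> None:
    # Send the accumulated batch to the cloud in one round trip.
    if not _buffer:
        return
    body = json.dumps(_buffer).encode("utf-8")
    req = request.Request(ANALYTICS_URL, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)  # one upload per batch, not one per event
    _buffer.clear()
```

Batching like this is exactly the kind of work the cloud remains good at: the data can arrive seconds or minutes late without any harm, and the central system sees the whole fleet at once.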

In general, this is the future of edge computing: Internet of Things devices, ATMs, and other systems that fit the model of processing that needs to happen away from a central location.

The Dance Between Edge Computing and Cloud Computing

There’s a case to be made that edge computing is more of a complement to cloud computing than its inevitable replacement. It can’t really replace cloud computing because there’s likely going to continue to be a need for centralized processing for some time.

Instead, edge computing is a response to cloud computing, a way of covering for some of its shortcomings. The combination works best when the cloud handles centralized workloads while processing power is also deployed at the edges, under an edge computing model, for the devices where that's advantageous.

Expect this constant back-and-forth, this mutualism, between the two models to continue.
