Developers may redirect resources from cloud to edge computing


New applications such as augmented and virtual reality as well as autonomous driving functions require huge amounts of memory and processing power – and they need them instantaneously. 

There is not enough time to send the data to the cloud and wait for instructions to come back. In the case of autonomous driving, milliseconds could be the difference between safety and an accident.

In simple terms, the time between input and output is called latency, and the autonomous functions already available in many cars can tolerate almost none of it.
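To make the idea concrete, here is a minimal Python sketch, illustrative only, that times a purely local computation against a round trip to a remote server; the URL is a stand-in for any cloud endpoint, and the actual numbers will vary with the device and the network.

import time
import urllib.request

def measure(label, fn):
    """Time a single call and report the latency in milliseconds."""
    start = time.perf_counter()
    fn()
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{label}: {elapsed_ms:.1f} ms")

# Latency of a purely local computation (stays on the edge device).
measure("local computation", lambda: sum(x * x for x in range(100_000)))

# Latency of a round trip to a remote server (example.com is a placeholder;
# any cloud endpoint adds network time on top of the processing time).
measure("cloud round trip",
        lambda: urllib.request.urlopen("https://example.com", timeout=5).read())

Even in a rough test like this, the network round trip typically dwarfs the local computation, which is the gap edge computing sets out to close.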

And as such new applications grow in popularity, some in the industry argue that cloud computing may be taking a back seat to edge computing.

Edge computing means the computing is done on the edge device: the device closest to the user, whether that is a computer, a smartphone or a car.

It’s obvious that if the edge device does not need to send data to and receive data from the cloud, it’s likely to operate faster. The problem so far has been that edge devices have not been powerful enough to handle all the data they need to process.

Recently, however, the increasing miniaturisation of computing components, from sensor-laden chipsets to smaller, more powerful processors, has raised the possibility that edge devices could cope on their own, managing those heavyweight apps without a cloud connection.

One of the keys to making edge computing work, however, lies in development. Developers who are used to thinking of the cloud first and the edge device last may have to think the other way round.
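One way to picture that shift is an "edge-first" dispatch pattern: try to handle the work on the device and fall back to the cloud only when the device cannot. The Python sketch below is a hypothetical illustration; process_locally and send_to_cloud are placeholders for whatever inference or rendering work an application actually does.

import time

def process_locally(frame):
    # Placeholder for on-device work (e.g. running a small model);
    # here it just performs a fast local computation.
    return sum(frame) / len(frame)

def send_to_cloud(frame):
    # Placeholder for a network round trip; the delay is simulated so the
    # example stays self-contained.
    time.sleep(0.05)  # pretend network latency of roughly 50 ms
    return sum(frame) / len(frame)

def handle_request(frame, device_can_run_model):
    """Edge-first dispatch: compute on the device when possible,
    otherwise fall back to a remote server."""
    if device_can_run_model:
        return process_locally(frame)   # no network hop
    return send_to_cloud(frame)         # adds round-trip latency

frame = list(range(1000))
print(handle_request(frame, device_can_run_model=True))
print(handle_request(frame, device_can_run_model=False))

The design choice is simply which path is the default: a cloud-first application treats the device as a thin client, while an edge-first one treats the cloud as the fallback.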

One possible way the cloud could stay relevant is that there may be more data centres, smaller and located closer to the end user. But that is just a possibility, and even then it would not be as fast as the edge device doing all its computing on board.

Some developers are already shifting their emphasis, recognising that the latency inherent in connecting to cloud computing systems may be unworkable for some applications.

In an article in Wired magazine, Juhani Honkala, founder and CEO of Hatch, a spin-off from Angry Birds developer Rovio, says the company has developed a game streaming service that reduces the number of steps required to process data and connect to the cloud.

Honkala told Wired: “We are one of the first consumer-facing use cases for edge computing. But I believe there will be other use cases that can benefit from low latency, such as AR/VR, self-driving cars, and robotics.”

Hatch is working with a company called Packet, which is said to be installing servers in as many locations as possible, effectively creating mini data centres everywhere.

But even if the server were in the same room as the user, it still would not be as fast as not needing the server at all.