
In recent years there has been a massive migration of processing and data into the cloud. But however powerful the arguments for centralising compute and storage in large data centres, that architecture is not suitable for every application.

We’re not talking here about the usual arguments against cloud, such as data sovereignty and control. We’re talking about latency, the need to process data and make decisions on it close to its source, and about reducing the sheer volume of data, much of it redundant, that some applications can generate. Welcome to the world of edge computing, aka fog computing.

Edge computing is very much a child of the Internet of Things. Some of these things – of which there will be billions – will generate massive amounts of data. In some cases most of it will be redundant. In some cases it will require an instant response: hence the need to decentralise compute resources to the edge of the network, near where the action is.
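To make that concrete, here is a minimal sketch (not taken from any vendor’s product) of the pattern just described: a gateway sitting next to the sensors reacts instantly to readings that can’t wait for a cloud round trip, and discards redundant readings so that only meaningful changes are forwarded upstream. The sensor, alarm and cloud functions are hypothetical placeholders.

```python
import random
import time

TEMP_ALARM_C = 85.0   # threshold above which the gateway acts locally, at once
MIN_CHANGE_C = 0.5    # readings that change less than this are treated as redundant


def read_sensor():
    """Simulated temperature reading; a real gateway would poll local hardware."""
    return random.gauss(70.0, 5.0)


def trigger_local_alarm(reading):
    """Hypothetical immediate local action, taken without a round trip to a distant data centre."""
    print(f"ALARM: {reading:.1f} C")


def send_to_cloud(reading):
    """Hypothetical upstream call; only non-redundant readings get this far."""
    print(f"Forwarding {reading:.1f} C to the cloud")


def run_gateway(cycles=10):
    last_sent = None
    for _ in range(cycles):
        reading = read_sensor()

        # Latency-sensitive decision taken at the edge, near the data source.
        if reading >= TEMP_ALARM_C:
            trigger_local_alarm(reading)

        # Data-volume reduction: drop readings that barely differ from the last one sent.
        if last_sent is None or abs(reading - last_sent) >= MIN_CHANGE_C:
            send_to_cloud(reading)
            last_sent = reading

        time.sleep(1)


if __name__ == "__main__":
    run_gateway()
```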

Fog computing is a term coined by Cisco for its version of edge computing.  

It seems to have made its debut in an academic paper from 2012. Originally it was received with some scepticism. One critic branded the concept “an ill-conceived marketing metaphor that further confuses the cloud market,” saying it represented a reason why “reporters should be trained not to parrot the marketing hype of large companies.”

However, far from clearing, the fog has thickened with the launch in November 2015 of the OpenFog Consortium, whose founding members are Cisco, chipmakers ARM and Intel, Dell, Microsoft and Princeton University, through its Edge Laboratory.

But to get back to the more generic concept of edge computing: OpenFog Consortium members Intel, Dell and Microsoft all have their own edge computing implementations. Intel and HP announced in November 2015 that they would jointly develop edge computing products. Akamai has teamed up with IBM. And Dell announced its Edge Gateway in October 2015.

All these vendors have their own take on edge computing. If you want a more neutral overview, a good place to start is the white paper The Drivers and Benefits of Edge Computing from Schneider Electric.

As a provider of data centre infrastructure, it’s pretty agnostic about the processing and data architectures it supports. The white paper “explains the drivers of edge computing and explores the various types of edge computing available.”

While edge computing can be implemented in small devices out in the field, Schneider is in the business of supplying data centre infrastructure. So it suggests there is a whole gamut of implementations, stretching from devices through localised data centres with one to 10 racks to regional data centres. It concludes that the architecture that “provides the deployment speed and capacity in line with future IoT application demands” is the localised 1-10 rack version.

However, the model chosen will depend heavily on the number of IoT devices, the volume of data they generate and the nature of the application.
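As a rough, hypothetical illustration of the volume question: 10,000 sensors each sending a 200-byte reading every second amount to about 2 MB per second, or roughly 170 GB a day, before any edge filtering; a handful of devices reporting once a minute might need nothing more than a single gateway.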
