Alex Hardie, Principal
“…and Leon is getting larger!”
For the pop culture-savvy, this is a bit from the movie “Airplane!”. For those in the telecom and industrial process and control industries, it’s a warning that fog computing has rolled in. It’s a gratuitous segue, but any reference to a line by Johnny is worth the effort, however clumsy.
By empirical definition, “fog” is the condensation of water vapor in the air, at dew point, where low-level clouds can form. Fog computing, however, is an emerging section of the wide area network and the natural extension of “cloud” networks. Properly defined, fog computing sits at the border between the data center and the wide area network (WAN). In a fog network, the compute function is distributed to the edge of the network. It’s there that data is acquired and/or created, and can be acted upon at the most logical and efficient place between the data source and the cloud.
There are some differences between fog and cloud computing. Application latency requirements are low in fog computing, while in cloud computing they can be much higher. Fog computing is highly distributed and designed for real-time interactions, rather than centralized and designed for batch processing. Additionally, the communications links to the cloud are both terrestrial and wireless, whereas communications to the fog are primarily wireless.
What is Driving Fog Computing?
The Internet of Things or IoT has many definitions.
The massive growth of sensory networks is creating a situation where the line between WAN and data center becomes increasingly obscured. IoT means many things to many people, but they all mean massive scale both in terms of the number of devices and in terms of the volume of data generated by these sensory and control networks.
First generation IoT applications all follow similar architectures – star-hub or branch-tree. With each, the intent is to collect field data and measurements and then transmit them to the data center for processing, classification and action.
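The star-hub pattern can be sketched as follows. This is a minimal illustration with invented names (Hub, DataCenter); a real deployment would use a proper transport such as MQTT rather than in-process calls:

```python
# Sketch of a first-generation star-hub IoT design: every endpoint
# sends raw readings to one hub, and the hub forwards them all to
# the data center unprocessed. Class names are illustrative.

class DataCenter:
    def __init__(self):
        self.received = []

    def ingest(self, readings):
        # All processing, classification and action happens centrally.
        self.received.extend(readings)

class Hub:
    def __init__(self, data_center):
        self.data_center = data_center
        self.buffer = []

    def collect(self, sensor_id, value):
        self.buffer.append({"sensor": sensor_id, "value": value})

    def flush(self):
        # Transmit everything upstream; no local computation.
        self.data_center.ingest(self.buffer)
        self.buffer = []

dc = DataCenter()
hub = Hub(dc)
for sid, v in [("temp-1", 21.5), ("temp-2", 22.1), ("flow-7", 3.4)]:
    hub.collect(sid, v)
hub.flush()
```

Note that the hub adds no intelligence: every byte of field data still crosses the WAN, which is exactly the bottleneck the next generation addresses.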
Next generation IoT will need far more distributed sensor and actuator nodes, with near real-time processing of the collected data at the point of acquisition. This requires compute power to be available in the WAN before information is disseminated to the data center.
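As a sketch of processing at the point of acquisition, a fog node might reduce a window of raw samples to a compact summary and make alarm decisions locally, sending only the summary upstream. The values and the alarm threshold are illustrative assumptions:

```python
# Sketch of next-generation local processing: the fog node
# aggregates raw samples and can act on anomalies in near real
# time, without a round trip to the cloud.

def summarize(samples):
    """Reduce a window of raw sensor samples to a summary record."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

raw_window = [20.8, 21.0, 21.4, 55.0, 21.1]  # one anomalous spike
summary = summarize(raw_window)

# The alarm fires locally; only the small summary record needs to
# travel across the WAN to the data center.
alarm = summary["max"] > 50.0
```

Five raw readings become one record, and the spike is caught where the data was acquired.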
Fog Versus Edge Computing
Although these terms are often used interchangeably, there is a subtle difference.
In a fog design, sensory data is transmitted across a local network from many endpoints to a shared compute platform for processing. In an edge design, the sensors are directly connected to the compute platform.
Edge computing allows for faster processing, reduced latency and removes a possible failure point as it eliminates the transmission step before processing. Fog computing is more scalable and comes at a slightly lower cost as the computing node is shared among more data points.
Fog/Edge Computing Characteristics
Fog computing nodes are deployed away from the main cloud data centers, at the edge. Running cloud workloads on fog nodes enables low and predictable latency. Fog application code runs on fog computing nodes as part of a distributed cloud application, and may include code required only for that location-specific context, such as serial-to-IP conversion.
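A serial-to-IP conversion task like the one mentioned above might look like the sketch below. The frame layout (a one-byte device id followed by a two-byte big-endian value) and the node name are invented for illustration; real serial protocols vary widely:

```python
# Hypothetical sketch of a location-specific fog task: converting a
# raw serial sensor frame into a JSON payload suitable for IP
# transport. The 3-byte frame layout is an illustrative assumption.
import json
import struct

def serial_to_ip(frame: bytes, node_id: str) -> bytes:
    # ">BH" = big-endian: 1-byte device id, 2-byte unsigned value.
    device_id, value = struct.unpack(">BH", frame)
    message = {"node": node_id, "device": device_id, "value": value}
    return json.dumps(message).encode("utf-8")

payload = serial_to_ip(b"\x07\x01\x2c", "fog-node-3")
```

Only the fog node at that location needs to know the serial framing; everything upstream sees ordinary JSON over IP.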
Fog computing nodes are widespread and provide applications with awareness of device geographical location and device context. They can also cope with the mobility of devices: for example, if a device moves farther away from the fog node currently servicing it, that node can redirect the application on the mobile device to associate with a new application instance on a fog computing node that is now closer to the device.
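The handover decision described above reduces, in its simplest form, to picking the fog node nearest the device’s reported position. This sketch uses planar coordinates and invented node names; a real system would use geographic distance and network metrics:

```python
# Sketch of fog-node selection for a mobile device: associate the
# device's application with whichever node is currently closest.
# Coordinates and node names are illustrative assumptions.
import math

fog_nodes = {
    "node-a": (0.0, 0.0),
    "node-b": (10.0, 0.0),
    "node-c": (5.0, 8.0),
}

def nearest_node(device_pos, nodes=fog_nodes):
    """Return the name of the fog node closest to the device."""
    return min(nodes, key=lambda n: math.dist(device_pos, nodes[n]))

# A device near the origin associates with node-a; after moving
# east, a re-check hands it over to node-b.
start = nearest_node((1.0, 1.0))
after_move = nearest_node((9.0, 1.0))
```

Re-running the selection as the device reports new positions is what lets the fog redirect the application to a closer instance.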
The Impact on Network Communications
Within the Internet of Things, multiple heterogeneous wireless communications technologies such as WiFi, ZigBee, Bluetooth, LTE and cellular coexist. The enablers of the IoT are the communications equipment manufacturers, the telecommunications carriers, and the implementation of enabling software technologies such as Network Functions Virtualization (NFV) and Software-Defined Networking (SDN).
Communications equipment manufacturers such as Redline Communications (www.rdlcom.com) design communications equipment for deployments over very large coverage areas that can scale from a few remote devices to over 100,000. Freewave Technologies (www.freewave.com), another communications equipment manufacturer, has developed wireless machine-to-machine (M2M) communications equipment enabling direct communications between devices on the sensory network.
Telecommunications carriers such as AT&T (www.att.com) and Verizon Wireless (www.verizonwireless.com) are upgrading to the next generation of networking, 5G. While the focus on mobile broadband will continue with 5G, support for a much wider set of diverse usage scenarios is expected. The three major usage scenarios include: (1) enhanced mobile broadband; (2) ultra-reliable and low-latency communications; and (3) massive machine-type communications from IoT. For the carriers, 5G is a scalable, energy-efficient, secure communications infrastructure.
To deliver the features proposed for 5G and beyond, it will be necessary to design and deploy a network architecture that moves away from proprietary solutions and toward open platforms that offer significantly improved scalability, as well as increased efficiency, agility and flexibility. Additionally, these open platforms offer more programmability and automation capabilities to simplify infrastructure management and complexity. Although many service providers are discovering the benefits of implementing some aspects of NFV and SDN within their current networks, the role of these technologies will be vastly expanded beyond their current implementations and will become foundationally critical to 5G.
Keeping it Running
As dependence on IoT devices increases, the infrastructure that makes the IoT a functional reality must improve. Customers will demand stringent Service Level Agreements (SLAs). A fog computing infrastructure will require automatic detection of and recovery from outages. The ability to provision, maintain and repair critical infrastructure will also be essential as dependency on IoT functionality increases.
One of the most overlooked and most critical components of fog computing is power. These IoT sensory networks and 5G small cell deployments make assumptions about the availability of power. Facilities will not always be present, requiring a new type of renewable, portable power. Business use cases around portable power/compute/network services include:
- Oil & Gas
- Security and Surveillance
The ability to provide reliable, uninterrupted power to the sensory network, the communications network and the compute platform is the foundation on which the reliability of a fog computing network is built.
Solis Energy (www.solisenergy.com) has addressed the requirements of fog computing by creating small, self-contained, hardened tier 1 and tier 2 data centers. These systems integrate compute, monitoring, battery and power into a ruggedized cabinet that can be installed virtually anywhere, powered by solar, wind or regular grid power, with connections for auxiliary generator power.
Fog computing, IoT and the cloud are already making an impact on industrial processes. The success of fog computing will be determined by its reliability and resilience. As fog computing evolves, remote power and compute platforms will become tightly integrated into both the 5G and IoT networks, providing power, compute and network connectivity wherever they’re needed.
Surely fog computing will benefit everyone. Of course it will… and don’t call me Shirley.
As a 25-year veteran in telecommunications, Alex Hardie is an Internet pioneer who worked on the development of the early technologies on which the modern Internet is based. Having worked closely with the world’s largest wireless and cable operators, the author has deployed public and private networks world-wide. Alex’s exposure to new technologies, and their subsequent path to mass market acceptance, has helped governmental and large enterprise users alike.
As a contributing principal for SDNNFV.NET Labs, the author has worked to champion open source and open compute initiatives such as OpenStack and Facebook’s Open Compute Project. Working closely with a team of technology veterans, SDNNFV.NET is committed to bringing real-world open source solutions to the business and industrial marketplace.