OREANDA-NEWS. April 28, 2016. The Internet of Things is one of those disruptive technologies that, like cloud computing, is rapidly moving into mainstream enterprise deployment. According to 451 Research’s Voice of the Enterprise survey, more than 40% of decision-makers across vertical markets from IT to manufacturing have deployed IoT or have a short- to long-term plan to do so.

As the enterprise embraces the Internet of Things (IoT), more companies need to perform edge computation, analysis and other applications on the growing volume of data created by IoT devices, and they need to do it with extremely low latency (<10 ms) and high availability (99.9999% uptime). This has revived a computing term that has been around since 2012: “fog computing,” which means bringing the cloud closer to the edge of the corporate network.

In its March 2016 report, “Clearing the Fog Around Edge Computing in the Internet of Things,” authored by Christian Renaud, 451 Research explores the current state of the edge/fog computing market as it applies to IoT and the vendors that are enabling enterprises to adopt it.

We’re already at the edge

For many industrial applications that predate the cloud, edge computing is already the default paradigm. So it makes sense to bring the cloud to them, rather than bring them to the cloud.

One reason is latency: by placing cloud resources close to latency-sensitive applications (either on-premises or at a nearby service provider or colocation point of presence), companies can eliminate the delays caused by backhauling traffic to a centralized cloud provider for computation, analysis or action. Local access also improves uptime and resilience in the event of a wide area network outage.
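As a rough back-of-the-envelope illustration (the distances and processing overhead below are assumptions, not figures from the report), the following Python sketch shows how propagation delay alone can consume a <10 ms budget once traffic is backhauled to a distant, centralized data center:

```python
# A back-of-the-envelope sketch (illustrative assumptions, not report figures) of why
# backhaul breaks a <10 ms latency budget: light in optical fiber travels roughly
# 200 km per millisecond, so distance alone can use up the budget before any
# computation happens.
FIBER_KM_PER_MS = 200.0   # approximate signal speed in optical fiber

def round_trip_ms(distance_km: float, processing_ms: float = 2.0) -> float:
    """Round-trip propagation delay plus an assumed fixed processing overhead."""
    return 2 * distance_km / FIBER_KM_PER_MS + processing_ms

if __name__ == "__main__":
    for label, km in [("metro-local edge (~50 km)", 50),
                      ("regional cloud (~800 km)", 800),
                      ("distant centralized cloud (~2500 km)", 2500)]:
        print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
```

Under these assumptions, a metro-local hop stays comfortably inside the budget (~2.5 ms), while a distant centralized cloud does not (~27 ms).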

The benefits go beyond lower latency. Bringing applications and clouds closer to the devices being monitored can also improve bandwidth efficiency, reduce costs, and strengthen security and privacy. However, the 451 Research report notes a drawback to fog computing: it can increase the CAPEX and OPEX associated with purchasing and maintaining high-function distributed devices. For example, the IoT gateways that translate proprietary protocols to IP are more expensive than a centralized cloud model that simply forwards data without taking any local action. Devices at the edge can also pose a security threat because they offer additional targets for intrusion.
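To make the gateway’s translation role concrete, here is a minimal Python sketch, purely illustrative and not drawn from the report, of an edge gateway decoding a hypothetical proprietary binary sensor frame and forwarding it upstream as JSON over TCP/IP. The frame layout, field names and upstream endpoint are assumptions.

```python
# A minimal sketch (not from the report) of the protocol translation an IoT edge
# gateway performs: parsing a hypothetical proprietary binary sensor frame and
# re-publishing it as JSON over IP.
import json
import socket
import struct

# Hypothetical proprietary frame: device id (uint16), temperature x100 (int16),
# humidity x100 (uint16), status flags (uint8) -- big-endian, 7 bytes total.
FRAME_FORMAT = ">HhHB"

def translate_frame(frame: bytes) -> dict:
    """Decode one proprietary frame into an IP-friendly, JSON-serializable dict."""
    device_id, temp_c100, humidity_c100, flags = struct.unpack(FRAME_FORMAT, frame)
    return {
        "device_id": device_id,
        "temperature_c": temp_c100 / 100.0,
        "humidity_pct": humidity_c100 / 100.0,
        "alarm": bool(flags & 0x01),
    }

def forward_to_cloud(reading: dict, host: str = "127.0.0.1", port: int = 9000) -> None:
    """Send the translated reading upstream as newline-delimited JSON over TCP."""
    payload = (json.dumps(reading) + "\n").encode("utf-8")
    with socket.create_connection((host, port), timeout=5) as conn:
        conn.sendall(payload)

if __name__ == "__main__":
    # Example frame as it might arrive over a serial or fieldbus link.
    raw = struct.pack(FRAME_FORMAT, 42, 2153, 4870, 0x01)
    reading = translate_frame(raw)
    print(reading)               # local (edge) processing: inspect, filter, alert
    # forward_to_cloud(reading)  # uncomment when an upstream collector is listening
```

The local decode-and-act step is precisely the capability that makes such a gateway more capable, and more expensive, than a device that only forwards raw data.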

These cost and security issues can be mitigated with an edge/fog computing deployment model that uses virtualized gateway capabilities within a multitenant data center or interexchange carrier, such as Equinix, to aggregate multiple locations into a single virtual gateway.

This approach enables enterprises with multiple metro or global locations to directly and securely connect to the virtualized edge IoT gateway and reduce the overall complexity and cost associated with multiple edge/fog computing devices. And performance can be greatly improved, since the gateway is at the network edge closest to the enterprise network connection.
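The sketch below illustrates the aggregation idea in Python; it is not Equinix’s actual service, and the listening port and message format are assumptions. A single “virtual gateway” process accepts connections from several metro sites and merges their newline-delimited JSON readings into one stream.

```python
# An illustrative sketch of aggregating multiple locations into a single virtual
# gateway: one asyncio TCP server stands in for the virtualized edge gateway that
# every enterprise site connects to directly.
import asyncio
import json

async def handle_site(reader: asyncio.StreamReader, writer: asyncio.StreamWriter) -> None:
    """Consume readings from one connected site and tag them with its address."""
    site = writer.get_extra_info("peername")
    try:
        while True:
            line = await reader.readline()
            if not line:          # site disconnected
                break
            reading = json.loads(line)
            reading["site"] = str(site)
            # A real deployment would feed analytics or storage; here we simply
            # print the merged stream.
            print(reading)
    finally:
        writer.close()
        await writer.wait_closed()

async def main() -> None:
    server = await asyncio.start_server(handle_site, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```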

451 Research anticipates that this type of deployment model will gain momentum throughout 2016 and 2017 as trial deployments transition to high-volume production deployments, and the economics of the topology and architecture come into focus.

According to 451 Research, “The diversity and scale of the Internet of Things will include multiple use cases where minimizing network delays (latency) and/or reducing the amount of data processed and stored from IoT edge devices are mission-critical for success.” The report presents these use cases for edge/fog computing, as well as an overview of vendors that provide edge computing solutions for IoT services.

To download the full report, go to: “Clearing the Fog Around Edge Computing in the Internet of Things.”