As I mentioned in my previous blog, Fog Computing supports emerging Internet of Things (IoT) applications that demand real-time response and predictable latency, such as industrial automation, transportation, and networks of sensors and actuators. Thanks to its wide geographical distribution, the Fog model is well positioned for real-time big data and real-time analytics. But how can we make Fog real for IoT? We believe Cisco IOx is the answer.
Cisco IOx delivers an application enablement framework that brings the Fog concept to life: it provides distributed computing capabilities and creates an intermediate layer between the “things” and the cloud.
So what exactly is Cisco IOx? In simple terms, Cisco is combining the communication and computing resources that are required for IoT into a single platform for application enablement at the network edge.
Now companies that have devices with an XYZ interface, for example, or that want to use an ABC application to monitor and act upon sensor information, can bring their own interface and have it run on Cisco networked devices such as our CGR 1240 and 819 ISR routers, and even some of our Video Surveillance Cameras.
Cisco IOx enables users, partners, ISVs, and solution providers to unlock new sources of value right at the edge of the network by providing the resources required to Bring Your Own Application (BYOA) and Bring Your Own Interface (BYOI) as close to the devices as possible. This enables a new wave of distributed intelligence capabilities, fueling new business models and bringing together a larger ecosystem of companies that can drive innovative solutions for the Internet of Things.
Cisco IOx also supports a paradigm shift in how data is processed. Today, captured data is first transmitted to the cloud and stored; from there it is analyzed, commands are sent to act upon that information, and operators are notified. IOx overcomes the costly need to constantly move data around: by performing critical data processing and analysis at the edge of the network, it allows analysis and notification to occur before the critical information is stored. Not only is the Cisco IOx model faster and more efficient, it also helps meet the compliance and regulatory policies that many of these connected things must satisfy across different industries.
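To make that shift concrete, here is a minimal sketch of the idea in plain Python (not tied to any particular IOx SDK; the temperature threshold, window size, and `send_to_cloud` uplink are hypothetical). The edge app acts on each reading locally and forwards only alerts and periodic summaries instead of the raw stream.

```python
import statistics
from collections import deque

WINDOW = 60          # samples kept locally at the edge before summarizing
THRESHOLD_C = 85.0   # hypothetical temperature above which the cloud is notified

recent = deque(maxlen=WINDOW)

def send_to_cloud(payload):
    """Placeholder for the uplink; a real app would call its own backend's API."""
    print("uplink:", payload)

def on_sample(temperature_c):
    """Handle one sensor reading locally; only exceptions and summaries leave the edge."""
    recent.append(temperature_c)
    if temperature_c > THRESHOLD_C:
        # Act and notify immediately, without a round trip to the cloud.
        send_to_cloud({"event": "overheat", "value": temperature_c})
    elif len(recent) == WINDOW:
        # Otherwise forward a periodic summary rather than every raw reading.
        send_to_cloud({"event": "summary", "mean": statistics.mean(recent)})
        recent.clear()
```

In this pattern the raw data never has to cross the network: the edge node decides, acts, and reports, which is exactly the reduction in data movement described above.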
As part of the application enablement framework, Cisco IOx also offers services to manage those apps and interfaces in a smart and efficient way, so companies leveraging IOx can start, stop, update, and modify apps as they see fit. Finally, through Cisco DevNet, Cisco offers tools, education, and support for individuals and companies interested in developing and offering their own apps and interface drivers, along with a platform to offer them to all customers that use Cisco’s network infrastructure products.
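For illustration only, that kind of app lifecycle control could look like the short Python sketch below; the endpoint, paths, and app name are hypothetical placeholders, not the actual Cisco IOx management API.

```python
import requests  # standard HTTP client; the endpoint below is a placeholder

# Hypothetical management endpoint on an edge node (not the real IOx API).
EDGE_NODE = "https://edge-node.example.com/api/apps"

def set_app_state(app_name, action):
    """Ask the edge node to 'start' or 'stop' the named app (illustrative only)."""
    resp = requests.post(f"{EDGE_NODE}/{app_name}/{action}")
    resp.raise_for_status()
    return resp.json()

# Example: restart an app after pushing an updated version to the node.
set_app_state("sensor-filter", "stop")
set_app_state("sensor-filter", "start")
```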
If you liked this blog you may also like:
Cisco IOx: An Application Enablement Framework for the Internet of Things
IOx at Distributech: An Open Framework for the Internet of Things
Business News: Increasing Business Investments and IT Opportunities
Download Cisco’s Enterprise Networks App for Android or iDevice
Hi Roberto, great post!
With the amount of data being generated, it’s pointless to send all kinds of data to the cloud.
The real value of IoT is when we start making decisions right at the edge!
See you in San Francisco, right?
Cheers!
Thanks Arua, fully agreed. Distributed computing capabilities are a cornerstone of a successful IoT architecture. And about SF, if you are talking about CL, you bet I’ll be there.
The only thing that I see missing from most (all?) IoT discussions is operational impact.
Traditional monitoring/management solutions were built on the client/server model and struggle with real time, low latency solutions (like Cisco collaboration – voice and video) that are centralized, let alone a massively distributed compute platform like this would be.
The real value and uptake in these solutions *will not* – in my opinion, be realized unless the solution consumer has complete confidence that the solution is always there, available and delivering accurate information (see the correlation – can’t commit to causation! – between the growth in Twitter and the reduction of their infamous “fail whale” occurrences).
Something to think about.
rob
Thanks for the comment Rob,
I agree with your point. One of the advantages of the fog computing model is that, with some of the intelligence being distributed, the system achieves a superior level of reliability at the scale needed for the Internet of Things. And as we go from thousands or tens of thousands of endpoints to millions and millions of connected devices, this concept makes more and more sense.