Data Gravity
Data Gravity: Driving Design for Intelligent Mission Fabric
What is Data Gravity? The term ‘Data Gravity’ was coined by Dave McCrory in 2010. In his blog post he asserted that, analogous to gravitational pull between objects in space, “. . . as data builds mass there is a greater likelihood that additional services and applications will be attracted to th…
Paradigm Shift with Edge Intelligence
In my Internet of Things keynote at LinuxCon 2014 in Chicago last week, I touched upon a new trend: the rise of a new kind of utility or service model, the so-called IoT-specific service provider model, or IoT SP for short. I recently had a conversation with a team of physicists at the Large Hadron Co…
Open Source at The Large Hadron Collider and Data Gravity
I am delighted to announce a new Open Source cybergrant awarded to the Caltech team developing the ANSE project at the Large Hadron Collider. The project team, led by Caltech Professor Harvey Newman, will be further developing the world’s fastest data forwarding network with OpenDaylight. The LHC ex…
The Three Mega Trends in Cloud and IoT
A consequence of the Moore-Nielsen prediction is the phenomenon known as Data Gravity: big data is hard to move around, so it is much easier for smaller applications to come to the data. Consider this: it took mankind over 2000 years to produce 2 Exabytes (2×10¹⁸ bytes) of data until 2012; now we produce…
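As a rough back-of-envelope sketch of why data at this scale stays put (an illustration, not from the original post, and the 100 Gbps link speed is an assumption): moving even the 2012-era 2 Exabytes over a dedicated 100 Gbps link would take on the order of five years, which is why applications migrate to the data rather than the other way around.

```python
# Back-of-envelope illustration of Data Gravity: how long it would take
# to move a large dataset over a network link. Figures are assumptions
# chosen for illustration only.

def transfer_time_seconds(data_bytes: float, link_bits_per_second: float) -> float:
    """Ideal transfer time, ignoring protocol overhead and congestion."""
    return (data_bytes * 8) / link_bits_per_second

EXABYTE = 1e18
SECONDS_PER_YEAR = 365 * 24 * 3600

data = 2 * EXABYTE   # roughly all data produced up to 2012
link = 100e9         # a dedicated 100 Gbps link (assumed)

years = transfer_time_seconds(data, link) / SECONDS_PER_YEAR
print(f"Moving 2 EB over 100 Gbps takes ~{years:.1f} years")  # ~5.1 years
```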