Surrounded as we are by technical acronyms and jargon, why do we feel the need for yet more terminology? Doesn’t it just serve to confuse people? Shouldn’t we be marketing technology in a way that everybody understands? Or do IT companies feel the only way to innovate is by coining new words for existing technologies?

An article about fog computing recently sparked a discussion among our technical team about what this ‘new’ technology is. According to the article, it means localising some functionality and resources that need higher bandwidth than the cloud can offer, in order to maintain high performance. In other words, some devices and storage are placed at the client level, much as in the past.

Compare that to a definition of hybrid cloud – a cloud environment in which an organisation provides and manages some resources in-house (localising them) while others are provided externally in the cloud.

These definitions seem pretty much the same to me, and when we already have terminology that is accepted by the industry, what does one of the big players hope to achieve by adding its own new term? The big player coining the phrase ‘fog computing’ argues that this is genuinely new in IT and will be big for real-time big data and real-time analytics: “Fog devices are geographically distributed over heterogeneous platforms, spanning multiple management domains. Fog services are abstracted inside a container for ease of orchestration to accommodate heterogeneity.” However, fog devices can push data to the cloud or to other places on the network if required, which isn’t anything new.
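The pattern being described – handle latency-sensitive work on a local (fog) node, and push the rest up to the cloud – can be sketched as a tiny routing rule. Everything below (the threshold, the field names, the doubling stand-in for “local processing”) is invented for illustration; it is not any vendor’s API:

```python
# Illustrative sketch of the fog pattern: readings with tight deadlines are
# processed on the local fog node; the rest are batched for the cloud.
# The 20 ms budget and all field names are assumptions for this example.

LATENCY_BUDGET_MS = 20  # assumed threshold: below this, a cloud round-trip is too slow

def route_reading(reading, cloud_batch, local_results):
    """Decide where a single sensor reading is processed."""
    if reading["deadline_ms"] < LATENCY_BUDGET_MS:
        # Too time-critical for the cloud: process on the fog node
        # (doubling the value stands in for real local processing).
        local_results.append({"id": reading["id"], "value": reading["value"] * 2})
    else:
        # No tight deadline: defer to the cloud in a batch.
        cloud_batch.append(reading)

readings = [
    {"id": 1, "value": 10, "deadline_ms": 5},    # real-time: stays on the fog node
    {"id": 2, "value": 20, "deadline_ms": 500},  # tolerant: pushed to the cloud
]
cloud_batch, local_results = [], []
for r in readings:
    route_reading(r, cloud_batch, local_results)
```

Notice that the same sketch describes a hybrid cloud deployment equally well, which is rather the point of the comparison above.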

So maybe this is just a new name for something that has been available for a while, or maybe it really is something new, in which case it may be wise to sharpen up the marketing and explain the differences between fog computing and hybrid cloud with clear features and benefits. We will be watching closely.