The impact of big data on datacenter infrastructure will be of a scale that is hard to imagine today. The current infrastructure of datacenters was built to handle smaller workloads, based on simple human-machine interactions. As the industry moves forward with big data, it will require an infrastructure that's at least as cost-effective as the current one, but far more efficient.

The traditional internet: humans talking to machines

Up to now, there have essentially been two consumer-facing business models for delivering content on the internet:

  • Web content: Usually, there’s no charge to the user, but the publisher makes revenue by showing the user ads. Some computing power, mostly via third-party ad servers and exchanges, is used to determine which ads to display, based on location, demographics, user history, and other factors.
  • Mobile data content: Usually, the user gets a free or inexpensive phone, but pays for data. Some computing power is used to route calls and transmit data.

Within these two models, very little compute power is needed, on an industrial scale, to process the data or to decide how to present it to the user. VoIP packets are simply switched and routed. And on the web, most of the data flows “down” the infrastructure: users receive much more than they transmit.

The new internet: machine-to-machine (M2M) and the IoT

The Internet of Things (IoT) will bring major changes:

  • Individual device “users” will be transmitting far more data than individual human users do today.
  • That data will be transmitted “up” the infrastructure, rather than down; that is, the user/device will primarily be sending the data, rather than receiving it.
  • The business model will involve making sense of large quantities of data. Importantly, this is not a human-based model. Revenue does not depend on the interaction of a person with a machine (such as a phone or a personal computer). Instead, it depends on the interaction of (at least) two machines: the one collecting the data and the one analyzing it.
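The M2M pattern described above, one machine collecting the data and another analyzing it, can be sketched in a few lines of Python. This is a purely illustrative simulation; the function names, readings, and threshold are invented for this example:

```python
import random
import statistics

def device_readings(n, seed=42):
    """The 'collecting machine': simulate a sensor emitting n
    temperature readings (degrees C) with some noise."""
    rng = random.Random(seed)
    return [20.0 + rng.gauss(0, 1.5) for _ in range(n)]

def analyze(readings, threshold=23.0):
    """The 'analyzing machine': summarize the stream and flag
    readings that exceed a threshold."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "alerts": sum(1 for r in readings if r > threshold),
    }

summary = analyze(device_readings(1000))
print(summary)
```

In a real deployment the two roles would run on separate machines connected by an ingestion pipeline or message queue; the point is that the revenue-generating work happens entirely between the collector and the analyzer, with no human interaction required.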

The impact of big data on datacenter infrastructure

We’re talking about big data. Examples of the massive upstream data on which decisions will need to be made include:

  • EKG heart data from patients
  • routing, maintenance, and safety data from driverless cars
  • fuel and efficiency data from jet engines

Where will all this data go? Onto what storage? Using what infrastructure?
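A rough back-of-envelope calculation suggests the scale involved. Every figure below is an illustrative assumption, not a sourced estimate:

```python
# Back-of-envelope sketch: how quickly upstream device data accumulates.
# All figures are illustrative assumptions, not sourced estimates.

devices = 1_000_000            # assumed fleet of connected devices
bytes_per_reading = 200        # assumed size of one telemetry record
readings_per_day = 24 * 3600   # assume one reading per second

daily_bytes = devices * bytes_per_reading * readings_per_day
print(f"{daily_bytes / 1e12:.1f} TB per day")         # 17.3 TB per day
print(f"{daily_bytes * 365 / 1e15:.1f} PB per year")  # 6.3 PB per year
```

Even at these modest per-device rates, a single fleet generates petabytes of upstream data per year, which is exactly the storage and infrastructure question posed above.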

The current datacenter infrastructure is not up to the task: it was built to handle much smaller workloads, based on simple human-machine interactions. As the industry moves forward with big data, it will need an infrastructure that's at least as cost-effective as the current one, but far more efficient.

Below, watch Jason Hoffman explain to Rick Ramsey the shift away from human-based data flow.


To explore Jason's ideas on platforms and other cloud business models in more depth, download his article “The Wisdom of Clouds” from the latest Ericsson Business Review.


Background photo by Deirdré Straughan



Deirdré Straughan

Deirdré Straughan has been communicating online since 1982 and has worked in technology nearly as long, in technical writing and editing, training, UI design, marketing, open source community management, events, social media, video, and more. She operates at the interfaces between companies and customers, technologists and non-technologists, marketers and engineers, and anywhere else that people need help communicating with each other about technology. She recently had a run-in with breast cancer. More about her life and work can be found at beginningwithi.com.
