Big Data Isn’t a Big Problem

Russell Poole



Equinix UK managing director Russell Poole explains why he believes big data is a good thing.


Discussions around big data often present it as a problem of epic proportions for business leaders. Too much data! Too fast! Too many sources! I, respectfully, beg to differ.

In this article I’d like to explain why I believe big data is not only a good thing, but that it also holds real value for businesses.

A good place to start is by considering what characterises ‘big’. In 2001 analyst firm Gartner saw big data sets as having three notable ‘v’ dimensions: volume, velocity and variety. Humanity was generating and storing more data than ever before, at an increasing rate and in diverse types and formats. It was more than most organisations could handle with relational databases and traditional analytics tools – big was officially a problem.

However, we should place all of this in context. In 2001 Google was celebrating its third birthday, Mark Zuckerberg had yet to enrol at Harvard University and YouTube would not exist for another four years! Big data was going to explode, yet businesses would rise to the challenge and develop methods to extract valuable information.

Perhaps this was all to be expected. The challenge of big data persists because, much like Moore’s Law, the underlying relationship holds steady: for the foreseeable future, businesses will generate, access and hold more data than they can fully analyse or manage, in spite of data processing advances like MapReduce and Hadoop. Big data continues to motivate technology giants to innovate and extract deeper insights from the variety of information they collect and hold.

And it’s not just the headline acts. I’ve spoken to CIOs from companies of all shapes and sizes that are uncovering valid and actionable facts. So I don’t believe big data should be seen as a problem for businesses. Instead, it is an opportunity, providing you have sound objectives and take a scientific approach.

At Equinix we have learned several best practice tips from our customers, which I’m happy to share with you to help IT decision makers address their big data needs:

  • Get the business drivers in place. Understand the required business outcomes of the insights you seek and – importantly – be certain that the outcomes to the organisation merit the cost and effort.
  • Only deal with the data you need. Take a scientific approach and test hypotheses with limited data sets before you commit. Such tests help you to determine whether particular data types and sources are relevant.
  • Be on top of governance. Be aware of any personal or confidential information contained within data sets, or that could be inferred from it. And understand how governance criteria affect your use of this information.
  • Don’t get lost in the science. Data expertise is crucial – you can’t harmonise multiple data sources without it – but you should remain objective.
  • Make the results accessible. Decision makers need actionable and clear recommendations. So bear data accessibility and service delivery in mind.

In 1998, the founders of Equinix asked the question ‘How can we ensure the vitality of the digital economy?’ at a time when the Web was revolutionising information-sharing between businesses – and straining against its capacity for growth. Their answer was to create large data centres where businesses could reliably run and grow their operations and securely exchange critical information. It’s a different context, of course, but I’d say focusing on enabling growth and identifying critical information is sound advice for approaching big data too.

I hope that all of these tips help you to find real, actionable information from your sources of big data. Let me know how your exploration of big data goes!


View our whitepaper: Strategic Implications of Big Data on the Data Centre

