
“Big Data” – why are we waiting?

4 min read

23 October 2013

The rate of data growth continues to escalate. Analyst firm Wikibon estimates the global big data market opportunity will reach $53bn by 2017, up from $5bn in 2012. Yet while data volumes and interest in big data are both growing fast, translating that interest into actual big data projects is proving a harder challenge.

In a recent Talend-sponsored survey, 24 per cent of respondents said there was no interest in big data within their organisation – a huge drop from similar research carried out a year earlier, in which 61 per cent said they had no interest. Over the same period, the proportion engaged in preliminary discussions about adopting a big data approach has risen from 24 per cent to 36 per cent.

Unfortunately, many businesses have still not taken practical steps to roll out a big data strategy. Just one in ten respondents in the latest survey is actually engaged in a large-scale roll-out – up from two per cent last year, but still a disappointingly low figure. So, what’s the problem? What’s holding businesses back?

Two of the top three constraints identified are budgetary restrictions and skills shortages, both widely recognised as key barriers to any IT endeavour. The potential scale of most big data projects explains the financial concerns. Skills matter because big data projects need people who can integrate any number of large, inflexible and disparate data sources, as well as skilled data scientists to analyse the combined data streams.

Some of these issues are more perception than reality. Contrary to popular belief, big data projects do not have to come with a massive price tag. Running open-source databases and integration tools on top of an open-source Hadoop platform allows big data integration and analysis applications to run cost-effectively across clusters of commodity servers, reducing hardware spend.

The report argues that tools are available today that can exploit Hadoop’s MapReduce framework, breaking the vendor lock-in of proprietary ETL engines and their escalating licence fees. Graphical tools have been developed that can be used by Java programmers, eliminating the need for rare and expensive data scientists. These open-source tools can also be downloaded and tried for free, which cuts the cost of sinking the big data equivalent of a test well and removes the technical and economic limitations on scaling up.
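To give a sense of what “exploiting Hadoop’s MapReduce framework” means in practice, the sketch below is a minimal MapReduce job written directly in Java – the kind of plumbing that the graphical tools mentioned above generate for you. It is illustrative only: the class name, the CSV layout (customerId,product,amount) and the input/output paths are assumptions, not anything described in the survey or report.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical example: count orders per customer from CSV lines "customerId,product,amount".
public class OrdersPerCustomer {

  public static class OrderMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text customer = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      String[] fields = value.toString().split(",");
      if (fields.length >= 1 && !fields[0].isEmpty()) {
        customer.set(fields[0]);        // first column is the customer id
        context.write(customer, ONE);   // emit (customerId, 1) for each order line
      }
    }
  }

  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable total = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();                 // add up the counts for this customer
      }
      total.set(sum);
      context.write(key, total);        // emit (customerId, orderCount)
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "orders per customer");
    job.setJarByClass(OrdersPerCustomer.class);
    job.setMapperClass(OrderMapper.class);
    job.setCombinerClass(SumReducer.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory on HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory on HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Even this simple aggregation runs in parallel across a cluster of commodity servers, which is the scalability argument: the same job works on a few gigabytes or many terabytes without rewriting the logic.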

At the same time, a range of factors is actively driving big data adoption. For 24 per cent of respondents to the survey, it is the increasing volume of data that is driving uptake. For around one in five, it is the business requirement to increase revenue. Compliance pressures and product and service development each motivate 11 per cent, while six per cent say big data in their organisation is driven by the need to find new customers, keep up with competitors or collaborate more closely with business partners.

So, there is reason to be hopeful. The long-term future for big data adoption looks positive, with the technology in place, the key drivers getting stronger and levels of business interest growing all the time. In the short term, organisations need to address the barriers holding many of them back – and once again technology can be key, streamlining the process and making big data strategies more effective to implement.

Yves de Montcheuil is VP of Marketing at Talend
