All you need to know about big data


Big Data – the term that organisations worldwide have been obsessed with for some time now – refers to data sets so huge or complex that traditional software tools are inadequate to capture, curate, manage, and process them. With the world’s technological per capita capacity to store information roughly doubling every two years, big data and its challenges have drawn considerable attention in recent years.

In 2001, Gartner Inc., an American research and advisory firm, introduced a “3Vs” model to describe data growth challenges and opportunities. It defined big data as “high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimisation.”

Intensive research into sampling big data and extracting maximum value from it soon took the IT sector by storm. Since then, various processing models have continued to evolve to manage this vast, ever-growing body of information. In 2000, Seisint Inc. developed a C++-based distributed file-sharing framework for data storage and querying, and more advanced approaches such as MapReduce and Hadoop followed in later years.
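The MapReduce model mentioned above can be illustrated with a minimal, single-machine sketch. This is a hypothetical word-count example written for this article, not Hadoop code: a map step emits key-value pairs, a shuffle step groups them by key, and a reduce step aggregates each group.

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map step: emit a (word, 1) pair for every word in the document."""
    return [(word, 1) for word in document.lower().split()]

def shuffle_phase(pairs):
    """Shuffle step: group emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the counts for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

# Invented sample documents for illustration.
documents = ["big data big insights", "big opportunities"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
counts = reduce_phase(shuffle_phase(pairs))
print(counts["big"])  # 3
```

In a real cluster, the map and reduce steps run in parallel across many machines and the shuffle moves data over the network, but the programming model is the same.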

Volume 1 of the Big Data and Advanced Analytics Survey, 2015, by Evans Data Corporation presented the following insights:

  • 6 per cent of all big data apps developed for manufacturing were being created by enterprises.
  • 2 per cent of all big data and advanced analytics apps in use were in customer-facing departments.
  • 2 per cent of all big data and advanced analytics developers were concentrating on the software and computing industry.
  • Enterprises in the software and computing, manufacturing and financial industries were investing most heavily in big data and analytics app development.
  • Marketing departments had quickly become the most common users of big data and advanced analytics apps, followed by the IT and R&D departments.

Evidently, we are seeing a rapid evolution in how organisations deal with big data today. Employees interpreting big data must bring a broad skill set to the table: statistical and technical expertise coupled with an analytical mindset and an acute business sense.

Statistical and technical expertise is needed as data is the fuel that feeds the machine. Employees must have an intimate knowledge of data analysis, data security, data visualisation and data quality. They must be comfortable with massive volumes of unstructured data and be able to organise it in a consumable way. Employees must be open to using advanced technology and tools in machine learning.
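Organising unstructured data "in a consumable way" often starts with parsing raw text into structured records. Below is a minimal sketch of that idea; the log-line format and field names are assumptions for illustration, not from the article.

```python
import re

# Hypothetical log format: "YYYY-MM-DD LEVEL message".
LOG_PATTERN = re.compile(r"(?P<date>\d{4}-\d{2}-\d{2}) (?P<level>\w+) (?P<message>.+)")

def parse_line(line):
    """Turn one unstructured log line into a structured record, or None."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# Invented sample lines for illustration.
raw_lines = [
    "2015-06-01 ERROR disk quota exceeded",
    "2015-06-01 INFO job completed",
    "not a log line at all",
]

# Keep only lines that parsed; structured records can then be filtered,
# aggregated, or fed to a visualisation tool.
records = [r for r in (parse_line(l) for l in raw_lines) if r]
errors = [r for r in records if r["level"] == "ERROR"]
print(len(records), len(errors))  # 2 1
```

At big data scale the same transformation runs inside a distributed pipeline rather than a list comprehension, but the principle of imposing structure early is unchanged.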

Today, data segmentation has crossed the barriers of demography and has begun to run through the veins of organisations as a whole. With advances in machine learning, analytics and computing power, there has been a huge improvement in the ability of organisations to target customers at the individual level and create personalised offers for each customer.
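One common machine-learning technique behind such segmentation is clustering. The sketch below groups customers with a tiny, self-contained k-means loop; the customer features (annual spend, store visits per month) and all data are invented for illustration, and production systems would use a library and far richer features.

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(cluster):
    """Component-wise mean of a non-empty cluster."""
    n = len(cluster)
    return tuple(sum(dim) / n for dim in zip(*cluster))

def kmeans(points, k, iterations=20, seed=0):
    """Minimal k-means: assign points to nearest centroid, then recompute centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[idx].append(p)
        # Keep the old centroid if a cluster happens to be empty.
        centroids = [mean(c) if c else centroids[i] for i, c in enumerate(clusters)]
    return centroids, clusters

# Invented customers: (annual spend, store visits per month).
customers = [(120, 1), (150, 2), (130, 1), (900, 8), (950, 9), (880, 7)]
centroids, clusters = kmeans(customers, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]: a low-spend and a high-spend segment
```

Each resulting segment can then receive its own personalised offer, which is the individual-level targeting the paragraph above describes.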

Enterprises today have enormous opportunities to harness big data to improve their competitiveness. In this new age of the Internet of Things (IoT), around 80 per cent of information and data comes from a multitude of different sources worldwide.

We’re entering an era where companies are no longer just keeping up with technology, but are instead exploring ways to reshape their business processes to take advantage of big data. Every industry today is realising that this Internet-connected world is providing the information it needs to change and grow – to introduce new products or offer better service based on that information. By 2018, we will see virtually every enterprise taking advantage of big data.

(Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)