The dictionary defines big data as “extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions”.

Many businesses now have data management systems for processing and storing big data. Gartner has characterised the three V’s of big data:

  • Volume – the sheer amount of data produced in many environments
  • Variety – the wide variety of data that is stored in data systems
  • Velocity – the speed at which data is generated, collected and processed

Although the term “big data” includes the word “big”, it doesn’t refer to any specific volume of data; in practice, big data sets typically range from terabytes (TB) to petabytes (PB) and even exabytes (EB).

Why is big data so important?

More and more companies are choosing to use the data they have accumulated on their systems to improve their business processes, provide their customers with a better service and, in the long run, increase their profitability. The consensus is that companies that tap into big data have a competitive advantage over those that don’t, as it allows them to make more informed business decisions more efficiently (as long as they use their data effectively).

Big data enables companies to become much more customer-focused. It can provide businesses with insights into their customers that will allow them to refine their marketing strategies, focus on customer engagement, and improve conversion rates.

Big data examples

Big data can be collated from a wide range of sources, including (but not limited to):

  • Business transaction systems
  • Customer databases
  • Medical records
  • Internet click logs
  • Mobile apps
  • Social media networks
  • Scientific research repositories

The data collected may be left in its raw form or preprocessed with data preparation software or data mining tools so that it is ready for a particular analytical use.
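As a simple illustration, that preparation step might look something like the following Python sketch using pandas; the file name and column names here are purely hypothetical:

    import pandas as pd

    # Load raw click-log data (hypothetical file and column names).
    raw = pd.read_csv("click_logs.csv")

    # Typical preparation: drop incomplete records, parse timestamps,
    # and derive a field the analysis will need later.
    clean = raw.dropna(subset=["user_id", "url"])
    clean["timestamp"] = pd.to_datetime(clean["timestamp"])
    clean["hour"] = clean["timestamp"].dt.hour

    # Persist the prepared data for its particular analytical use.
    clean.to_parquet("click_logs_prepared.parquet")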

Examples of industry use of big data

Big data also has its uses in the pharmaceutical industry, where medical researchers tap into it to identify potential disease risk factors, and some doctors even use it to help diagnose conditions in their patients. As we have seen this year with the COVID-19 pandemic, big data derived from social media, the web, electronic health records (EHRs) and other sources can provide the Government and healthcare organisations with real-time information about infectious disease outbreaks.

Big data can also be used by energy companies, such as oil and gas companies, to help them identify potential drilling locations and to monitor pipeline operations and electrical grids. Financial companies use big data systems for real-time market data analysis and risk management. Those in the manufacturing and logistics industry also rely on big data to optimise their delivery routes and manage their supply chains successfully. The list of uses for big data is endless.

More about the three V’s of big data

Big data covers a wide variety of data types:

  • Structured data – data organised to a fixed schema, such as the relational tables queried with Structured Query Language (SQL)
  • Unstructured data – text and document files, often held in Hadoop clusters or NoSQL systems
  • Semistructured data – data with some internal structure but no rigid schema, such as streaming data from sensors or web server logs

These different big data types can be stored together in what is known as a “data lake”, usually built on Hadoop or a cloud object storage service, as sketched below. On top of this, big data applications can also include multiple data sources that may not otherwise be integrated.
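As a rough sketch of how those three data types might land side by side in an object-store-backed data lake, here is a Python example using the AWS boto3 client; the bucket name, object keys and files are all hypothetical:

    import json
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-data-lake"  # hypothetical bucket name

    # Structured data: a CSV export from a relational (SQL) system.
    with open("orders.csv", "rb") as f:
        s3.put_object(Bucket=BUCKET, Key="structured/orders.csv", Body=f)

    # Semistructured data: a JSON sensor reading.
    reading = {"sensor_id": 42, "temperature": 21.7}
    s3.put_object(Bucket=BUCKET, Key="semistructured/reading.json",
                  Body=json.dumps(reading).encode("utf-8"))

    # Unstructured data: a free-text document.
    with open("report.txt", "rb") as f:
        s3.put_object(Bucket=BUCKET, Key="unstructured/report.txt", Body=f)

Because the lake stores each object as-is, every data type keeps its native format until an analytical tool actually needs it.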

We touched above on the three V’s of big data, as characterised by Gartner, but since they were originally identified in 2001, another three V’s have been added:

  • Veracity – the degree to which big data can be trusted
  • Value – the business value of the data collected
  • Variability – the different ways the data can be formatted and used

Velocity is one of the most important of the six V’s of big data: data assets are often updated in real time, rather than on the weekly or monthly refresh cycle that tends to apply in traditional data warehouses. This means data scientists and other data analysts need an in-depth understanding of the available data, and need to know what sort of answers they are looking for, to get the best results out of it.
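To make the contrast concrete, here is a minimal Python sketch of consuming a real-time feed as records arrive, rather than waiting for a batch refresh. It assumes the kafka-python client, and the topic and broker names are hypothetical:

    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "events",                            # hypothetical topic name
        bootstrap_servers="localhost:9092",  # hypothetical broker address
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    # Each record is handled as it arrives, not on a weekly or monthly cycle.
    for message in consumer:
        event = message.value
        print(event.get("user_id"), event.get("action"))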

Data velocity, and the management of it, is also critical for artificial intelligence (AI) and machine learning, as models will tend to find patterns in the collected data automatically and generate insights from them.
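As a minimal illustration of a model finding patterns on its own, the following sketch uses scikit-learn’s k-means clustering on synthetic data invented purely for the example:

    import numpy as np
    from sklearn.cluster import KMeans

    # Stand-in for collected customer data: (visits per month, average spend).
    rng = np.random.default_rng(0)
    data = np.vstack([
        rng.normal([2, 10], 1.0, size=(50, 2)),   # occasional, low-spend customers
        rng.normal([20, 80], 5.0, size=(50, 2)),  # frequent, high-spend customers
    ])

    # The model groups customers into segments without being told the labels.
    model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
    print(model.cluster_centers_)  # one centre per discovered segment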

The challenge of big data analytics

At the end of the day, the effectiveness and value of big data depend on one thing: the analysts tasked with understanding it and formulating the correct queries to direct the project. Specialised niche big data tools give less technical users opportunities to get to grips with big data, while Hadoop-based big data appliances can help businesses implement a computing infrastructure that minimises the need for in-house hardware and distributed-software expertise.

If you would like to be part of helping an organisation gain a competitive advantage using big data, then please get in touch with one of our big data consultants at info@agilerecruit.com.
