The Strategic Consolidation of Big Data

The enormous volume of digital data generated daily by every human activity constitutes a powerful tool for optimizing strategic decision-making in companies. We only need to deploy technological solutions that allow us to analyze it with agility, precision and reliability, adapting to changes in context and to the growing needs of the market and of customers.

The collection and strategic analysis of large amounts of digital data, commonly known as Big Data, long ago ceased to be just a fashionable concept. Today it is a successful and growing economic activity, one that moves almost US$10 billion and is expected to grow at an annual rate close to 30% through 2023.

These strong business figures show that the new “data science” is at the core of business digitization and, like it or not, it will cover, to a greater or lesser degree, all everyday aspects of human activity.

Furthermore, the entire gigantic process of technological and cultural transformation of companies is based on the massive implementation of data analytics and its “sister disciplines”, among which are Business Intelligence (BI), the Internet of Things (IoT), Artificial Intelligence (AI), Cloud Computing, and of course automation.


Technically, the expression “Big Data” encompasses both the methodological approach focused on the collection and analysis of digital data from different areas (such as the economy, culture, society, defense and politics, among others) and the technologies used to carry out this task.

This implies, broadly speaking, gathering large amounts of data (volume), at the fastest possible pace (velocity), and in a great diversity of formats (variety).

However, this data collection does not rely only on conscious or specially directed actions (such as classic opinion polls, focus groups or market studies, for example). It also involves collecting data from the everyday, unconscious activities that anyone performs when online. Among these actions we can highlight filling out registration forms on blogs or multimedia portals, “like” clicks on social networks, keywords entered in search engines and browsers, and visits to certain web pages.

All these actions generate valuable data that is continuously collected, analyzed and used to obtain actionable information and automated actions, generally aimed at achieving concrete benefits for the positioning and competitiveness of companies, brands and/or products.

This process becomes deeper and more efficient every day, as increasingly sophisticated algorithms are constantly designed to “learn” from and take advantage of our online behavior.

Of course, this activity is no longer limited to the online world. The advent of cloud computing and the IoT also allows cities, shopping malls, public and private offices, streets and even our own homes to become “smart”, which in turn allows them to become effective tools for the collection of more and better data.

In other words, it is a new “intelligent and ultra-connected civilization” that thrives precisely thanks to strategic decision-making based on Big Data.


The correct application of Big Data can have great and varied implications for the progress of business and, with it, of society as a whole.

For example, for public administrators, politicians and legislators, the large amount of knowledge derived from Big Data can translate into the formulation of better and more cost-effective public policies, as well as into the choice of better means to advance toward the objectives demanded by the common good.

In turn, for the average citizen it translates, for example, into better purchasing decisions, optimized intercity trips, better time management, or more informed, transparent and democratic elections. In short, a better quality of life.


This rapid paradigm shift confronts companies with the need to leave behind strategies focused only on information, to replace them with those based on data analysis.

Those who have understood this evolutionary significance are now progressing thanks to the active use of analytics, moving from a strictly descriptive approach to a diagnostic, predictive and prescriptive analysis model.

In this way, the organizations that today aspire to conquer the market act, more and more successfully, from the knowledge and predictions based on data.

This involves operating from certain key attributes:

– Emphasize data collection.

– Constantly invest in tools and skills to collect and analyze that data.

– Commit to making the data widely accessible.

– Be open to considering data-based ideas arising from all levels of the company, without discrimination of any kind.

– Commit to continuous improvement, also guided by data.


This enormous work of collecting, analyzing and interpreting large amounts of data (often different and dissimilar, even when they come from relatively homogeneous audiences) requires attention to certain technical and methodological details.

First, it is not just about emphasizing volume, velocity and variety in the collection. Specialists have also identified seven additional characteristics that must be taken into account when trying to understand, study and apply the concept of Big Data.

This leads to the following 10 characteristics that, for practical convenience, all begin with the letter “V”:

Volume: This is perhaps the most widespread and significant characteristic, since more than 90% of all digital data was generated in the last two years alone. The current generation speed is simply staggering. For example, people upload 300 hours of video content to YouTube every minute.

Velocity: Refers to how fast we generate data, and today the figures can seem bewildering. For example, Facebook alone claims to ingest 600 terabytes of data every day, while Google claims to process an average of 40,000 queries per second, equivalent to 3.5 billion searches per day.

Variety: Approximately 90% of all available data is unstructured or semi-structured. Consequently, to work efficiently we have to put aside the classic columns and rows of tabulated data, as well as other traditional methods of analysis. We now need new applications and technological tools to analyze data better and more efficiently.

Variability: Refers to the “contamination” of the data, that is, the number of existing discrepancies, which we can address with anomaly and outlier detection methods. Although it is often confused with variety, this characteristic is more specific and relates to the multiplicity of sources and inconsistent data types, resulting in a multitude of discrepant points that are key to correct interpretation.

Veracity: It refers mainly to the origin and reliability of the data, as well as its context and relevance for subsequent analysis. This is an important feature, especially in the context of automated functionalities, as trust in data tends to decrease as some or all of the above properties increase.

Validity: Raises the need to know whether the data is correct for its intended use. Its importance is such that data scientists spend approximately 60% of their time cleaning data before even considering it in analysis. Therefore, excellent data governance is imperative to ensure reliable results.

Vulnerability: Privacy and security must be an essential part of the DNA of any organization that collects, stores and processes data. This is an unavoidable imperative, since any breach in the Big Data chain opens a gap that can lead to a potential disaster.

Volatility: In a short time, certain data becomes obsolete and useless. For this reason, organizations must establish clear rules for their exchange and timely availability, as well as guaranteeing rapid retrieval of information.

Visualization: Involves making the large amounts of available data understandable, so that they are easier to translate into action. Visualization delivers the elusive insights everyone is looking for, even though using graphics to tell complex stories is extremely difficult.

Value: The use of Big Data involves great costs, but also offers the opportunity to close important deals. In other words, we have to be sure that the return on investment covers the costs involved in efficient data analysis.

The correct combination of these 10 “V’s” will allow us to better understand clients, better respond to their needs, optimize procedures and workflows, reduce costs, increase returns and, consequently, improve business performance.
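To ground two of these “V’s” in practice, here is a minimal, hypothetical sketch in Python (standard library only) that applies a basic validity filter and then flags variability outliers using a simple two-standard-deviation rule. The records, field names and thresholds are invented for illustration; real pipelines use far more robust methods.

```python
import statistics

# Hypothetical raw records: daily sales figures, some invalid (validity)
# and one anomalous (variability).
raw_records = [
    {"day": "mon", "sales": 120.0},
    {"day": "tue", "sales": 132.0},
    {"day": "wed", "sales": None},    # invalid: missing value
    {"day": "thu", "sales": 127.0},
    {"day": "fri", "sales": 9500.0},  # anomalous: likely a data-entry error
    {"day": "sat", "sales": 118.0},
    {"day": "sun", "sales": 125.0},
]

# Validity: keep only records whose value is present and non-negative.
valid = [r for r in raw_records
         if isinstance(r["sales"], (int, float)) and r["sales"] >= 0]

# Variability: flag values more than 2 standard deviations from the mean.
values = [r["sales"] for r in valid]
mean, stdev = statistics.mean(values), statistics.stdev(values)
outliers = [r for r in valid if abs(r["sales"] - mean) > 2 * stdev]
clean = [r for r in valid if r not in outliers]

print(f"kept {len(clean)} of {len(raw_records)} records; outliers: {outliers}")
```

Of the seven original records, one fails the validity check and one is flagged as an outlier, leaving five clean records for analysis.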


Management expertise, or the knowledge and skills of the leader, are important factors in making strategic decisions in a company. However, the data itself (and its subsequent analysis) can also lead to “managerial wisdom.”

To achieve this goal, we can use the DIKW pyramid, or DIKW hierarchy. This term refers to a model that represents the structural and functional relationships that we can build between data, information, knowledge and wisdom.

This idea is summarized in a quote from researcher Jennifer Rowley, who, in her study “The wisdom hierarchy: representations of the DIKW hierarchy”, published in 2007 in the Journal of Information Science, stated that “typically information is defined in terms of data, knowledge in terms of information, and wisdom in terms of knowledge”.

Therefore, it is logical to say that decision-making is based on wisdom; and, in turn, wisdom is based on data collection.

Technically, the “DIKW pyramid” illustrates the way in which data-based decision processes should take place, following this scheme:

At first, we collect and enter quality data into the company’s management and control systems.

The resulting information is then evaluated for completeness, correctness, currency, consistency, and accuracy. If it scores enough on these scales, this information is assimilated as “knowledge”.

The quality of the resulting knowledge is evaluated against procedural knowledge already incorporated into management and control systems, such as templates for measuring competitors’ strengths, assumptions about their future positioning and investment campaigns, and doctrinal assumptions, often encoded as rules, among other possible variables.

Finally, the wisdom resulting from these analyses is evaluated in terms of its usefulness within the knowledge base incorporated into the management and control systems.
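The four steps above can be sketched as a toy pipeline, with one function per stage of the DIKW hierarchy. All function names, quality scores and thresholds below are illustrative assumptions, not a standard implementation.

```python
# A toy DIKW pipeline: each stage refines the output of the previous one.
# Names and thresholds are invented for this example.

def to_information(data):
    """Data -> information: aggregate raw measurements into a summary."""
    sales = [d["sales"] for d in data]
    return {"avg_sales": sum(sales) / len(sales), "n": len(sales)}

def to_knowledge(info, quality_threshold=0.8):
    """Information -> knowledge: accept only if quality criteria are met."""
    # Stand-in for the completeness/correctness/currency/consistency checks.
    quality = 1.0 if info["n"] >= 5 else 0.5
    return info if quality >= quality_threshold else None

def to_wisdom(knowledge, target=100.0):
    """Knowledge -> wisdom: turn validated knowledge into a decision."""
    if knowledge is None:
        return "collect more data before deciding"
    return "expand" if knowledge["avg_sales"] >= target else "hold"

data = [{"sales": s} for s in (120.0, 132.0, 127.0, 118.0, 125.0)]
decision = to_wisdom(to_knowledge(to_information(data)))
print(decision)  # -> expand
```

If the quality check fails (too few records, in this sketch), the pipeline deliberately refuses to decide, mirroring the idea that wisdom should rest only on validated knowledge.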

By applying these indicators and instruments, a powerful decision-making model becomes available, based on relevant, high-quality data.

This leads us to conclude that data, by itself, is completely useless. Its true value emerges from analysis performed with the correct algorithms. Only in this way will it be possible to extract the information needed to drive better business decisions.

Therefore, only companies that implement a comprehensive Big Data strategy will be able to stay at the forefront in terms of innovation and adaptation to the different variables that today give more and more dynamism to the market.

Likewise, they will promote an effective modernization of their organizational culture, based on information management that motivates employees to make better use of processes based on data analysis and the resulting knowledge.


About the author

Francisco Gonzalez
