We didn’t have to wait long to see how Big Data would inundate our lives. When making a simple online search to plan a trip, listen to a song or buy a book, as if by magic we receive suggestions not just about what we’re looking for but about similar alternatives too.
The world of business has once again been the pioneer, but Big Data is already spreading to other areas, such as economic analysis. What’s more, the way it will do so will be revolutionary, since, after all, data, the study of patterns and the ability to draw inferences constitute much of the raw material for us analysts. Ever since new technologies started to gather and process data in real time on the decisions of individuals, businesses, organisations and governments, the volume of new data we can access has grown astronomically.
Yes, you read correctly: in real time! This is in itself a great step forward for short-term economic analysis, given the lag with which the majority of economic indicators are still published. Measuring the business cycle, prices, etc. in real time (“now-casting”) will be of particular benefit, offering more accurate forecasts. This will help us at every level, since we shall be able to improve resource allocation, analyse the consumption and investment decisions of individuals and businesses in greater detail and gain a better understanding of the mechanism whereby economic policies are transmitted. We shall have to wait a while longer for the development of “Big Models” to help us know not only what is happening but what will happen in the future. Here the old stock market maxim “past trends do not guarantee future ones” is still very much alive.
Not only will Big Data improve what we do, but it will extend to new analyses, such as those that until now have been merely qualitative. A good example of this is the natural language processing of texts (or even images), or textual analysis. Would you like to know which sentence in the Brexit letter carried the most negative sentiment? “In security terms, a failure to reach agreement would mean our cooperation in the fight against crime and terrorism would be weakened.” The algorithms that analyse sentiment in language also picked this out, and they not only do so faster than we do, but they can review and compare millions of texts at the same time. And they’re getting better and better at it. The fact that words can be translated into numbers not only increases the volume of data available to the analyst but also allows us to study aspects that until now have been beyond our reach: economic policy uncertainty, the political climate, relations between different countries, geopolitical tensions, how an individual or business is perceived in the media, or even the joys and fears reflected in the millions of news items published every second…
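In its simplest form, the sentiment analysis described above just counts positively and negatively charged words and ranks sentences by the net score. The tiny lexicon below is invented for illustration; real systems use large weighted lexicons or trained models.

```python
# Minimal lexicon-based sentiment scoring: count positive and negative
# words per sentence and pick the sentence with the lowest net score.
# The word lists are a toy lexicon, invented purely for this sketch.
NEGATIVE = {"failure", "weakened", "crime", "terrorism", "fight"}
POSITIVE = {"agreement", "cooperation", "security"}

def sentiment(sentence: str) -> int:
    """Net sentiment: positive hits minus negative hits."""
    words = [w.strip(".,").lower() for w in sentence.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

sentences = [
    "We want a deep and special partnership.",
    ("In security terms, a failure to reach agreement would mean our "
     "cooperation in the fight against crime and terrorism would be weakened."),
]

# The sentence with the most negative net score.
most_negative = min(sentences, key=sentiment)
print(most_negative)
```

A production system would weight words, handle negation and context, and be trained on millions of documents, but the translation of words into numbers works on exactly this principle.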
These are what Robert Shiller calls “narratives”, and they can affect the state of the economy in that our decisions are malleable and liable to become self-fulfilling prophecies.
Fascinating, isn’t it? But remember, Big Data consists of millions of unstructured scraps of information that can be extracted and processed quickly but which, at the end of the process, will still, above all, have to be analysed. We can easily build a network with millions of interrelated nodes and intersections, but only a well-trained analyst will know how to look inside this tangled mess of relationships and, harder still, draw inferences about its future. We’ll know sooner if we’re facing an economic slowdown, but it will still be difficult to know whether we’re dealing with a sudden screeching halt or a turning point in the cycle.
The potential of Big Data is enormous, but so is the volume of knowledge it demands, which leaves some room for craftsmanship. We will need not just Big Data but Big Models and, above all, Big Analysts. Quite a challenge!