
Big data is a hot topic in the business press. Its promise of greater insight and efficiency, improved innovation and competitiveness, not to mention new income streams for the providers of data analytics tools, is a rich source of discussion. Several recent developments indicate the power of using data analysis and statistics effectively to reach conclusions, and we almost certainly ‘ain’t seen nothing yet’ as big data techniques emerge. However, statistics do not necessarily tell the whole story and are open to radically different interpretations. That said, the power of numbers and modelling is rising.

What is changing?

In the run-up to the American election the outcome was constantly being called ‘too close to call’ by almost all pundits, except one – Nate Silver. He had already correctly predicted the outcome of the 2008 election. On his 538 website (named for the number of electoral college votes in the USA), he analysed as much available data as possible, using Bayesian probability to understand the trends and implications and then modelling options and outcomes. He was right in 50 out of 50 states.
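Silver’s actual models are proprietary, and the figures below are invented purely for illustration, but the broad approach – turning state-level poll margins into win probabilities and then simulating many possible election outcomes – can be sketched in a few lines of Python:

import random
from math import erf, sqrt

# Hypothetical, made-up polling data for illustration only:
# (state, electoral college votes, poll margin %, margin of error %)
states = [
    ("State A", 18, 2.0, 3.0),
    ("State B", 29, 0.5, 3.5),
    ("State C", 13, 1.5, 3.0),
]

def win_probability(margin, moe):
    # Treat the polling margin as normally distributed with the given
    # margin of error; return the probability that the leader actually wins.
    return 0.5 * (1 + erf(margin / (moe * sqrt(2))))

def simulate(trials=10000, safe_votes=200):
    # Monte Carlo simulation: draw each contested state independently and
    # count how often the candidate reaches 270 electoral college votes,
    # assuming a hypothetical 200 'safe' votes from uncontested states.
    wins = 0
    for _ in range(trials):
        total = safe_votes
        for _, votes, margin, moe in states:
            if random.random() < win_probability(margin, moe):
                total += votes
        if total >= 270:
            wins += 1
    return wins / trials

print("Estimated probability of reaching 270 votes:", simulate())

A real forecast would also weight polls by quality and recency, model correlations between states and update the probabilities as new polls arrive – the Bayesian element of the approach.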

He had previously used his skills and analysis to make a living at poker, to predict the career capabilities of basketball players and, in his book, to demonstrate that the 9/11 terror attacks were not as ‘unforeseeable’ as we imagined. Other areas that are data rich in this conventional sense, where clear and fairly structured data are available, include housing markets, the economy, local government and weather. But that list is growing as analytical tools and modelling become increasingly capable of managing less structured data.

European and US weather services correctly predicted the very rare path of Hurricane Sandy into New Jersey and the likely devastation along the Eastern Seaboard: the timing of its arrival, the wind speeds and the storm surge with its 13-foot waves. This accuracy was possible as a result not only of increased computation and higher-resolution modelling, but also of the increased volume of data from ocean sensors and atmospheric data collection.

A new report from researchers in Norway has used a statistical model to examine the future of war and peace, where conflict is defined as being between governments and political organisations that use violence, and in which at least 25 people die. Based on a combination of factors, including socio-economic development and the declining social acceptability of violence, it predicts a continuing sharp reduction in conflict by 2050, with fewer than 7% of nations involved in conflict compared with 15% in 2009. However, the model has already had to be amended to reflect events in the Middle East, which have weakened the link between socio-economic development and the absence of civil war; and a very similar set of data has been used to predict a world of slums, inequality and conflict.

Why is this important?

The weather service in America has a budget in the region of $1 billion, while the estimated cost of damage from Sandy is in the region of $10 billion, and that may be an underestimate. Yet the service has been facing budget cuts in recent years. As the insurance and investment industries grow increasingly worried about the cost of extreme weather events, and as their frequency continues to appear to grow, the need for more prevention and prediction will increase.

And science, too, is being changed. The increased power of modelling and computation is enabling new approaches. Regional ecosystem modelling, such as of the impacts of changes in the Amazon rainforest, can now explore hundreds of different combinations of assumptions; astronomy is able to digitally scan vast areas of the heavens and use automated analysis to discover new stars and even galaxies.

Big data will take statistics and modelling further as it moves from historical to ever more real-time and then predictive analysis, and as it creates far more integrated data sets combining wide varieties of structured and unstructured data. The list of potential applications is almost endless: product and service development and personalisation, cost reduction, detection of fraud or security threats, monitoring of product and system performance, public health warnings such as food poisoning outbreaks, medical research, and more effective and efficient use of traffic systems or power supplies – hence its attraction. The capabilities of new computer models have been described by one expert as “one of the more positive trends we’re going to see this century”. The power of numbers is rising and providing us with important new capabilities, but we should perhaps also approach with caution; the old danger of Garbage In: Garbage Out, or, even more dangerously, Gospel Out (i.e. seen as absolute truth) may also rise. See also an earlier Trend Alert, Big Data: Big hype or big value.

By Sheila Moorcroft

About the author

Sheila has over 20 years’ experience helping clients capitalise on change – identifying changes in their business environment, assessing the implications and responding effectively to them. As Research Director at Shaping Tomorrow she has completed many futures projects on topics as diverse as health care, telecommunications, innovation management and premium products, for clients in the public and private sectors. Sheila also writes a weekly Trend Alert to highlight changes that might affect a wide range of organisations. www.ShapingTomorrow.com