In the decades ahead, most technologies will not be used to uncover or analyze ‘truth’ but to help reveal what people think about truth. We live in an age of ‘sentiment analysis’ and multi-platform updates, obsessed with shifting perceptions and changing value.
This is what our leading institutions now try to measure: how people think and feel. They deploy algorithms and pore over real-time insights into the sentiments of millions of agents and markets worldwide.
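To make the idea concrete, here is a minimal sketch in Python of the kind of crude, lexicon-based sentiment scoring such systems build on; the lexicon, function names and sample posts are hypothetical illustrations, not any institution's actual method, and real deployments are vastly more elaborate.

```python
# A toy illustration of lexicon-based sentiment aggregation:
# score each post by summing word weights, then bucket the stream
# into a rough "how do people feel right now?" signal.
# The lexicon and sample posts are hypothetical placeholders.

from collections import Counter

SENTIMENT_LEXICON = {
    "great": 1, "love": 1, "win": 1,
    "terrible": -1, "hate": -1, "lose": -1,
}

def score(text: str) -> int:
    """Sum the sentiment weights of known words in a post."""
    return sum(SENTIMENT_LEXICON.get(word, 0) for word in text.lower().split())

def aggregate(posts: list[str]) -> Counter:
    """Bucket a stream of posts into positive / negative / neutral counts."""
    buckets = Counter()
    for post in posts:
        s = score(post)
        buckets["positive" if s > 0 else "negative" if s < 0 else "neutral"] += 1
    return buckets

if __name__ == "__main__":
    sample = ["I love this rally", "terrible turnout", "the weather is fine"]
    print(aggregate(sample))  # Counter({'positive': 1, 'negative': 1, 'neutral': 1})
```

Notice what such a tally measures: not whether the rally was large or the turnout poor, but only how people say they feel about it.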
It’s why so many observers can disagree – in real time and based solely on party affiliation – over something as basic as the size of a crowd in a photograph.
For no matter how far apart rival facts may sit, they all depend on outside references to justify their validity. For example, if we can’t agree on universal photographic standards, or on how representative a particular photo is of an event, then how can we agree on what it means?
The ‘truth’ is thus always defined in relation to external references, whether scientific, moral, tribal, cultural or political – or any combination thereof. And when these references or guides are altered, facts themselves change.
This is how populist rulers often ascend to power: by reinterpreting facts, or by discrediting those whose interpretations they oppose. Trump, Xi, Putin, Erdoğan and Orbán are evidence that these forces are clearly on the rise.
So the big lesson about Big Data is clear: scientific authority and ‘objective’ truth can succumb to other types of authority. Big data can help a dictator just as much as a scientist.
In the 21st century, belief still matters. Even scientists recognize the fragility of what we now call ‘facts’, for what is ‘true’ today may not be true in five years, or when measured by a different standard.
To an extent, even the scientific method depends on faith.
Big data is no different.