Theory
[Article] Yuval Noah Harari:
The World After Coronavirus
"It is crucial to remember that anger, joy, boredom and love are biological phenomena just like fever and a cough. The same technology that identifies coughs could also identify laughs. If corporations and governments start harvesting our biometric data en masse, they can get to know us far better than we know ourselves, and they can then not just predict our feelings but also manipulate our feelings and sell us anything they want — be it a product or a politician."
"You could, of course, make the case for biometric surveillance as a temporary measure taken during a state of emergency... but temporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon."
"Centralised monitoring and harsh punishments aren’t the only way to make people comply with beneficial guidelines. When people are told the scientific facts, and when people trust public authorities to tell them these facts, citizens can do the right thing even without a Big Brother watching over their shoulders. A self-motivated and well-informed population is usually far more powerful and effective than a policed, ignorant population."
[Video] The Theory, Practice and Limits of Big Data for the Social Sciences
Discourse:
Social Control
"Before Big Data, social sciences was like trying to do Astronomy without a telescope."
[Journal] Biometric Recognition: Challenges and Opportunities
Once we adopt biometrics, an irrevocable link is created between the individual and their persistent data record. "Unlike most other forms of recognition, biometric techniques are firmly tied to our physical bodies." There is no escaping our identity when our identity is our body.
The Oracle:
Big Data + Social Control
"This call may be recorded for quality and training purposes."

This is a phrase with which I am sure we are all very familiar. In fact, we may hear it so often that we forget to consider what we are actually being alerted to – the fact that our upcoming conversation, which may well contain a detailed account of our opinions, frustrations and personality, might be stored and 'owned' by a private company. After all, we have become accustomed to surrendering our data on a daily basis.

Surely this is not much of a problem? We might imagine that the recording has a benign intention: a manager driving to work, listening back to the recording, ready to critique his staff later on their conversational skills. But according to Martin Hilbert (author, speaker and Professor at the University of California) the truth is a little more sinister.

In his talk 'The Theory, Practice and Limits of Big Data for the Social Sciences', Martin Hilbert makes a clear connection between Big Data and Social Science by demonstrating how data connects us to the stories of micro + macro human behaviours. He provides many examples of how big data can reveal human behaviours to us, and of how this insight can allow us to manipulate behaviours. He reveals how data is used to manipulate the behaviours of people using online dating, or to train algorithms that match us with a call-centre operator who is most compatible with our emotional needs.

Unfortunately, he does so without taking much time to address the very dangerous socio-political repercussions of this manipulation. In fact, he portrays data as an ambivalent force, capable of being used for good or evil, but makes no attempt to provide a framework for 'good' uses of data, or to explain how corruption might be prevented. His message to the audience is powerful: big data and social science have the potential to unlock the mysteries of human behaviour... but he delivers it indiscriminately, providing no caveat or forewarning. In the wrong hands, his message could lead to dangerous consequences.

Thankfully, the paper 'Biometric Recognition: Challenges and Opportunities' by the National Research Council serves this purpose exactly. It provides an overview of the wide range of social, political and cultural considerations that must be made when integrating data-collection technologies, and what the repercussions might be to individuals, communities and societies as a whole. For example, it explains how certain cultural/religious beliefs (such as a headscarf) might deter people from using biometric systems (such as facial recognition), thus excluding an entire community from the system. From this example, we can clearly see how data collection relates to intimate human behaviours, and raises issues of social equality, inclusion and power.

Big data is nothing new. Corporations and governments have been creating data profiles on millions of consumers for the past decade or more. However, Covid-19 has provided institutions with an easy justification for their obsessive data harvesting: people are far more willing to surrender their data if they believe that their health is at stake. In an article for the Financial Times, Yuval Noah Harari argues that it would be quite possible to safeguard health and privacy at the same time, if our institutions wanted to: "We can choose to protect our health and stop the coronavirus epidemic not by instituting totalitarian surveillance regimes, but rather by empowering citizens".

Therefore, this moment in history is important for determining the future of our relationship with data and social control. This is an opportunity to set a precedent: how much data can a government harvest from its population and still get away with it, under the guise of 'safeguarding public health'? How much control will we, as a population, allow our governments to have over our behaviour? Are we able to tell when we are being manipulated? Are we willing to allow it for the good of society at large?

My project aims to demonstrate how my classmates' data can be used to manipulate their emotions and behaviour. My algorithm will extract insights from their data, and then tell them how they compare to other people. These social 'nudges' may prompt them to behave differently in the future. They may be surprised, pleased, annoyed or upset about the conclusions that their data provides – but ultimately, they will have to confront the fact that their own data is the source of this judgement. Hopefully this experience will prompt them to view their personal data in a wider social context, and reconsider whether or not their health is worth sacrificing their privacy and freedom for in a post-corona society.
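To make the idea concrete, the social 'nudge' described above could be sketched roughly as follows. This is only an illustrative assumption of how such an algorithm might work, not the project's actual code: every name, metric and threshold here is hypothetical.

```python
# Hypothetical sketch of a social "nudge": compare one person's data point
# to the rest of the group and generate a comparative judgement.
# The metric, names and z-score thresholds are illustrative assumptions.
from statistics import mean, stdev

def social_nudge(name: str, value: float, group: list[float], metric: str) -> str:
    """Return a short comparative message based on a z-score against the group."""
    mu = mean(group)
    sigma = stdev(group)
    z = (value - mu) / sigma if sigma else 0.0
    if z > 1:
        position = f"well above the class average for {metric}"
    elif z < -1:
        position = f"well below the class average for {metric}"
    else:
        position = f"close to the class average for {metric}"
    return f"{name}, your data places you {position}."

# Example: one classmate's value compared against the rest of the class.
print(social_nudge("Alex", 7.5, [3.0, 4.5, 5.0, 4.0, 3.5],
                   "daily screen time (hours)"))
```

Even a comparison this simple carries the manipulative charge discussed above: the judgement feels authoritative precisely because it is derived from the person's own data.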