If you feel you are being watched,
you change your behavior.


Big Data is supercharging this effect. 

This could limit your desire to take risks or exercise free speech.
Over the long term these 'chilling effects' could 'cool down' society.

This is how it works:


Your data is turned into thousands of different scores. 

There are stars behind the cloud:

Data brokers use algorithms to find patterns in society. This allows them to deduce the likelihood of thousands of details about you that you may never have disclosed. These are actual examples:

Rape victim
Into dieting
Into gardening  
Number of online friends
Number of real friends
Political views
Had abortion 
Projected sexual orientation 
Real sexual orientation 
Reads magazines on travel
Reads books on travel
Planning to have a baby

Parents divorced before the age of 21
Date of Birth  
Into Fashion

Has house plants  
Economic stability 
Potential inheritor  
Year house built
Smoker in the household  
Has 'senior needs'
Has 'diabetic focus'
Easily addictable                           
Physical frailty
Communication device preference
Adult 'empty nester'
Education level
Into Elvis Memorabilia 
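The mechanism behind such lists can be sketched as a simple probabilistic model. The signals, weights, and trait below are invented for illustration — real broker models are proprietary — but the principle is the same: a trait you never disclosed is inferred from signals you did leave behind.

```python
# Toy sketch of 'derived data' (hypothetical signals and weights,
# not a real broker model): inferring an undisclosed trait from
# observed behavior with a logistic-style score.
import math

def sigmoid(x):
    # Squash a weighted sum into a probability between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))

# Invented weights linking observed signals to the undisclosed
# trait "planning to have a baby".
WEIGHTS = {
    "bought_prenatal_vitamins": 2.5,
    "searched_baby_names": 1.8,
    "age_25_to_34": 0.6,
    "follows_parenting_blogs": 1.2,
}
BIAS = -3.0  # base rate: most people are not planning a baby

def derived_score(signals):
    """Return an inferred probability for the trait, given a set
    of observed behavioral signals."""
    z = BIAS + sum(WEIGHTS[k] for k in signals if k in WEIGHTS)
    return sigmoid(z)

# A few innocuous signals already push the inferred likelihood
# far above the base rate.
print(round(derived_score(set()), 2))                      # base rate
print(round(derived_score({"bought_prenatal_vitamins",
                           "searched_baby_names",
                           "follows_parenting_blogs"}), 2))
```

Note that no single signal is sensitive on its own; it is the combination that becomes revealing.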

Their 'derived data', which is protected as corporate free speech, is more valuable than 'your data'.
If they say they don't sell your data, ask if they are selling theirs.


People are starting to realize that this 'digital reputation' could limit their opportunities.

(And that these algorithms are often biased, and built on poor data.) 



People are changing their behavior to get better scores.

This has good and bad sides.

Social Cooling describes the long-term negative side effects of living in a reputation economy:

1. A culture of conformity

Have you ever hesitated to click on a link because you thought your visit might be logged, and it could look bad?

More and more people feel this pressure, and they are starting to self-censor. This is called a 'chilling effect'.

The irony: freedoms are not being taken away, we are just afraid to use them.

2. A culture of risk-aversion

When doctors in New York were given scores, the results were unexpected.

Doctors who tried to help advanced cancer patients had higher mortality rates, which translated into lower scores.

Doctors who didn't try to help were rewarded with high scores, even though their patients died prematurely.

Rating systems can create unwanted incentives, and increase the pressure to conform to a bureaucratic average.

3. Increased social rigidity

Digital reputation systems are limiting our ability and our will to protest injustice.

In China each adult citizen is getting a government mandated "social credit score". This represents how well behaved they are, and is based on crime records, what they say on social media, what they buy, and even the scores of their friends.

If you have a low score you can't get a government job, visa, cheap loan, or even a nice online date.
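The role of friends' scores can be sketched in a few lines. The blending weight and the loan cutoff below are invented for illustration — the source only tells us that friends' scores factor in — but they show how the mechanism creates social pressure: low-scoring friends drag your own score down.

```python
# Purely illustrative sketch (invented weight and cutoff, not the
# actual system): a citizen's score blended with the average score
# of their friends.

def social_credit(base_score, friend_scores, friend_weight=0.3):
    """Blend a citizen's own score with the average of their
    friends' scores, weighted by friend_weight."""
    if not friend_scores:
        return base_score
    avg_friends = sum(friend_scores) / len(friend_scores)
    return (1 - friend_weight) * base_score + friend_weight * avg_friends

# A well-behaved citizen (800) with low-scoring friends falls below
# a hypothetical 750 cutoff for cheap loans anyway.
print(social_credit(800, [400, 450, 500]))  # 695.0
```

This is exactly why such systems generate pressure to drop "bad" friends: your reputation is no longer only your own.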

Social pressure is the most powerful and most subtle form of control. 

As our weaknesses are mapped...

We are becoming too transparent.


This is breeding a society where self-censorship and risk-aversion are the new normal. 


Yes, we've had credit ratings before. But this is a whole new scale, with an incredible level of integration, automation and accessibility.

The big philosophical question:

Are we becoming more well behaved, but less human?

What does it mean to be free in a world where surveillance is the dominant business model?

The big economic question:

Are we undermining our creative economy?

In a creative economy the people who dare to be different are our greatest resource.

The big societal question:

Will this impact our ability to evolve as a society? 

Yesterday's fight for equality by a minority is today's widely accepted norm. But will minority views still flourish?


The solution?



We should compare this problem to Global Warming.

•  Social Cooling is subtle. The pollution of our social environment is invisible to most people, just like air pollution was at first.  

•  Social Cooling is complex. It cannot be solved by politicians, citizens, entrepreneurs or scientists on their own.    




Public awareness is still very low.

It took 40 years to get the problems with oil on the agenda, and 80 years to get to where we are now.
We can't take that long with Social Cooling.





In the next 10 years we will need to spread a more mature and nuanced perception of data and privacy.








When pressure to be perfect increases...


When algorithms judge everything we do, we need to protect the right to make mistakes. 

When everything is remembered as big data, we need the right to have our mistakes forgotten.  



In our data-driven world...

Privacy-friendly sharing