Dear all,
Last week was dominated by Facebook’s latest series of problems. A worldwide scandal has erupted following the revelation that Cambridge Analytica, a firm that worked for the Trump campaign in 2016, illegally harvested data about millions of Facebook users’ social graphs.
I happen to know the technological context quite well. Back in 2010-2012, I was the CEO of a Paris-based tech startup whose core business was to exploit information about people’s social graphs. Following the example of the 2008 Obama presidential campaign, my cofounders and I were seeking to orchestrate value-creating relationships between individuals who didn’t know each other but had a lot in common—whether it was to help elect a candidate, find a new job or purchase a new product.
Now, I have to say, we never collected that much data. Like many startups, mine never really took off. But I remember that third-party developers like us were authorized to collect information about people’s social graphs through the Facebook Graph API. I also remember the restrictions: obviously, the user had to give their consent; I think we weren’t allowed to store personal information on our own servers; and we were strictly forbidden from selling the data to another party—a rule that was blithely violated in Cambridge Analytica's case. (And by the way, Facebook terminated that API feature in 2014.)
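To give you a concrete sense of what this looked like, here is a rough, purely illustrative sketch of a consented friends-list query against the Graph API of that era. It is not code from our actual product: the endpoint is real, but the version number, parameters and helper names are placeholders of my own.

```python
import requests

# Placeholder endpoint and version; the actual path depended on the API version in use.
GRAPH_URL = "https://graph.facebook.com/v1.0/me/friends"

def fetch_friends(user_access_token):
    """Return the consenting user's friend list, following the pagination cursor."""
    friends = []
    url = GRAPH_URL
    params = {"access_token": user_access_token, "limit": 100}
    while url:
        resp = requests.get(url, params=params, timeout=10)
        resp.raise_for_status()
        payload = resp.json()
        friends.extend(payload.get("data", []))
        # The Graph API paginates: "paging.next" carries the full URL for the
        # next page (access token included), or is absent when we're done.
        url = payload.get("paging", {}).get("next")
        params = None
    return friends
```

The key point is that every call was scoped to a single access token, granted through a single user’s consent dialog; the restrictions on storage and resale applied to what you did with the response afterwards.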
My point is this: in the current discussion about data and privacy, we shouldn’t confuse the technical availability of data (Facebook) with the use of said data in violation of the attached rules (Cambridge Analytica). In fact, we have to realize that there are many cases in which people’s data are easily accessible, even out in the open, and yet it is still illegal to use them for certain purposes.
One way to understand this is to think about being a parent. The fact that you have children can hardly be hidden from parties such as your employer. Yet this very fact can lead to discrimination (because you have children, you won’t be as dedicated as your childless colleagues, blah blah blah). But would you ever consider hiding the fact that you have children? No—you even tend to be proud of them. And so the solution is not to try to hide your loved ones. It’s to impose strict anti-discrimination rules that make it harder/illegal to disadvantage a worker because they have children who regularly require care at home.
It's the same for sexuality. We’re slowly exiting the terrible world of “Don’t ask, don’t tell”, in which it was customary to hide being LGBTQ because it could lead to retaliation/discrimination, notably in the workplace, in retail, or in housing. It’s much better to encourage people to come out (which boosts social acceptance, as seen over the last decade) and to punish those who keep discriminating against them. This is another case in which the solution is not to prevent the collection of personal data, but rather to crack down on organizations that misuse them (whether it’s the government, an employer, or predatory businesses such as Cambridge Analytica).
The longer we focus on Facebook as the problem (despite the fact that they had appropriate terms of service), the more we weaken the simple idea that rules should exist to forbid what Cambridge Analytica did. It's a very different focus: in one case, it’s on Facebook collecting personal data; in the other, it’s on third parties misusing those data and violating Facebook's terms of service—which Facebook apparently spotted and penalized in that particular case (even if their PR efforts have been rather atrocious).
Let me share three ideas. First, it’s not so much that privacy is obsolete. It’s that we need to refine the idea of privacy and introduce new categories.
On the one hand, there are private data that are out in the open but that other parties are not allowed to use against me (or to manipulate me). There will be more and more attempts to do just that (third parties using abundant personal data to manipulate people or discriminate against them), so we should be prepared to enact and enforce rules to prevent it instead of succumbing to anti-data-collection hysteria.
On the other hand, there are data that simply shouldn’t be collected, like pictures of what happens in your bedroom or the thoughts you share with your therapist (or confessor). This is a different category, a particular version of privacy that we could call “intimacy”.
Second, introducing new categories is critical because the flow of personal data is to the Entrepreneurial Age what the flow of automobiles was to the Fordist economy: it’s highly dangerous, but it creates a tremendous amount of value and now determines our way of life, so we have to live with it.
The flow of personal data is not going away, because it makes it possible to improve the user experience—something we all enjoy as consumers. In this context, a successful company is one that, just like Facebook, inspires enough trust that individuals consent to regular and systematic data collection, which in turn makes it possible to personalize the experience, which in turn inspires more trust.
Conversely, once that positive feedback loop is broken, it is swiftly replaced by a negative one, in which lack of trust leads to resistance against data collection, which in turn degrades the experience, which further erodes trust. For Facebook, it's far from certain that the scandal will be enough to break what has so far been an impressive, trust-generating positive feedback loop. But in the presence of increasing returns, that kind of butterfly effect can be enough to turn winners into losers.
Third, I think that the most vocal people hate Facebook not because of what happened with Cambridge Analytica, but for political reasons. Conservatives have always hated Silicon Valley because it’s liberal. Now liberals hate Silicon Valley because its tools somehow played a role in Trump’s victory in 2016. And so the scandal exists not because Facebook was careless with data. It exists because a particular use of data has struck a responsive chord in people—one that resounds because of political circumstances.
Where do we go from here? I don’t think that calls to “regulate tech companies” will lead us anywhere in the absence of other major societal changes. After all, a new techno-economic paradigm calls for many new institutions, not a few more regulations.
What I think is that Silicon Valley needs to take up its part of that effort, as recently advocated by Gary Kamiya: embrace its liberal roots, work on building what I call a , and help mitigate the suffering that led to the presidency of Donald Trump. Will Facebook be the first to head down that path?
Here are a few readings:
Computer Mediated Transactions (Hal Varian, Google’s Chief Economist, May 2010)
Beyond Goods and Services: The (Unmeasured) Rise of the Data-Driven Economy (Michael Mandel, October 2010)
The Second Economy (W. Brian Arthur, October 2011)
Don’t Delete Your Digital Past (Navneet Alang, November 2015)
The Birth And Death Of Privacy: 3,000 Years of History Told Through 46 Images (Greg Ferenstein, November 2015)
How Online Shopping Makes Suckers of Us All (Jerry Useem, May 2017)
(me, September 2017)
Secrecy Is Dead. Here's What Happens Next (Alexis Sobel Fitts, December 2017)
(me, January 2018—TL;DR: it’s not for privacy reasons, it’s because I find my newsfeed boring)
Warm regards (from London, UK),
Nicolas