Kennedy was absolutely correct, but it was beside the point. To be fair, it's not just Facebook. After all, who actually reads an app's user agreement before clicking "accept" as fast as their fingers allow?
The deeper issue is that Facebook's basic business model is all about creating a personalized dossier about you from all the data it collects and selling targeted advertising based on that dossier.
In fact, this is the same basic business model shared by the providers of other "free" online platforms. As such, Facebook will always try to maximize that model in two ways: one, finding ever more intrusive ways to collect your online and offline behavioral data; and two, finding ever more creative ways to slice and dice that data into dossiers with even greater predictive value for vendors seeking to sell their products more effectively.
To make this data collection as comprehensive as possible while giving users a sense of control over their privacy, Facebook employs an opt-out process.
This means that, by default, Facebook gets to collect all the data it wants from your usage of the platform, and you have to individually opt out of each channel (e.g. image recognition, targeted advertising, calendar invite acceptances and others) through which Facebook collects and refines its data. In practice, a user would have to be informed enough about Facebook's data collection methods to pick and choose according to his or her level of comfort and desired convenience.
I tried, and it isn't easy. Further, I don't know about you, but my parents would have no idea where to even start. And judging from Zuckerberg's hearing on the Hill, American senators are not much better informed. In fact, it's pretty obvious that many senators don't know how Facebook even works. While Facebook's privacy policy is nominally based on informed consent, it's largely "consent" first and "informed" maybe.
The viability of this business model is contingent on a "high-trust" environment between the platform and its users. Imagine if every user actually chose to become "informed" and vet each data collection channel; the whole system would grind to a halt. Worse, if Facebook were forced to switch from an opt-out to an opt-in model for data collection, its business model would no longer be tenable. As such, protecting its reputation for trustworthiness is essential to Facebook's ability to maintain that "high-trust" environment.
Unfortunately for Facebook, the recent scandal surrounding Cambridge Analytica undermines the very high-trust environment that Facebook needs to thrive and sustain its business model.
Most users already know, and are okay with, the fact that their data is being commoditized and sold; after all, nothing is free. However, if users begin to feel that they are being exploited in a betrayal of trust, then it becomes a different issue.
That Facebook is a "social" medium makes this trust factor even more important. Being "social" is all about engaging with one another, even strangers, in a community of your own creation. We do this because we have a need to connect in an environment of empathy and trust. Connections make us feel good.
Facebook provides that "feel good" environment. But if the environment itself is no longer trustworthy, then the community is not viable because the connections are tainted with a sense of distrust. Worse, it no longer feels good.
A friend of mine who is a huge social media influencer posted the following sentiment on Facebook: "Every single day, I struggle with whether or not I should quit Facebook. I can't stand the loss of privacy. It's a Faustian pact. I want to connect and have influence, but at what cost? At some point, it affects our sense of self and dignity."
The cost she's talking about is the socio-emotional cost of using Facebook. If using a product requires you to run an emotional deficit, especially a hit to your sense of self-identity, then you won't be using it for long. Social media is an extension of one's self-identity. We trusted Facebook with a piece of ourselves, and it was found wanting. This is a key lesson that Facebook must learn if it is to survive.
On a different note, and purely out of intellectual curiosity, I can't help but question the politics of the current outrage. In other words, is the volume of outrage against Facebook over the Cambridge Analytica misuse so loud because that misuse helped elect Donald Trump?
I suspect that the demographics and political leanings of the majority of American Facebook users skewed away from Trump. Did the fact that their personal data was misused to the advantage of a candidate many found distasteful add to the sense of betrayal? Would the criticism, fully justified as it is, have been more muted had the winning candidate been someone more likeable, such as Barack Obama? It's just something to think about.
Jason Lim (jasonlim@msn.com) is a Washington, D.C.-based expert on innovation, leadership and organizational culture.