Jason Lim
Facebook wants your naked pictures, really
Posted : 2017-11-10 17:37
Updated : 2017-11-10 17:37
By Jason Lim

According to a Washington Post story by Travis M. Andrews dated Nov. 8, Facebook wants you to upload your own explicit photos – for your own good.

This is how it works. You have explicit pictures that you worry your ex-lover might post to Facebook to embarrass you. This is your typical revenge porn scenario. To proactively prevent this, you upload your own explicit photos ― those that you suspect your ex has and might post ― to some type of secure Facebook portal. Facebook won't store the photos but will use an AI-driven algorithm to create a digital fingerprint of each photo, so that it can recognize the image and block it if someone tries to post it to the platform at some later date.
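
Facebook has not published the exact matching technique, but the idea resembles perceptual hashing: the service keeps only a compact fingerprint of the image, not the image itself, and compares the fingerprints of future uploads against it. Below is a minimal sketch assuming a generic "average hash" scheme; the function names, file names and matching threshold are purely illustrative, not Facebook's actual implementation.

```python
from PIL import Image

def average_hash(path, hash_size=8):
    """Compute a simple perceptual 'average hash' of an image.

    The image is shrunk to hash_size x hash_size grayscale pixels;
    each bit records whether a pixel is brighter than the mean.
    Only this compact fingerprint needs to be stored, not the photo.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1, h2):
    """Count the differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: block an upload whose fingerprint is close to one
# the user registered in advance.
registered = average_hash("photo_you_fear_will_be_posted.jpg")
incoming = average_hash("attempted_upload.jpg")
if hamming_distance(registered, incoming) <= 5:  # threshold is illustrative
    print("Upload blocked: matches a registered image fingerprint.")
```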

In a way, this is a very forward-leaning, prevention-focused way to address the growing revenge porn problem, rather than engaging in incident management after the fact. At the same time, it does require you to submit your naked, explicit photos (and videos, I would assume, in the next iteration of this capability) to Facebook. That is a whole lot of trust that Facebook is asking you for.

Being the old geezer that I am, the question that immediately pops up is, "How about not taking naked, explicit photos of yourself and sharing them with your significant others?" Frankly, while I intellectually understand that sexting and associated behaviors are a huge problem among high school students and young people in general, I haven't quite come to terms with the fact that such behavior is common enough to warrant a specific technical solution like the one Facebook is offering. But I guess I am wrong.

This brings me to a bigger question. How far does Facebook have to go in providing solutions to check the negative effects of the (mal)intentional behaviors of its users? And not just Facebook, but any other widely used social media platform on which users generate content to share with one another.

This question goes to the heart of the disinformation campaign that Russia waged during the last U.S. presidential election. Is it Facebook's fault if a Russian operative posts a fake news story about Hillary Clinton and her alleged connections to all sorts of conspiracies? Or what if the same operative announces a Black Lives Matter protest event using incendiary language against the police? Should Facebook be responsible for ferreting these out?

Twitter is struggling with a similar challenge. Originally, Twitter prided itself on being the free speech champion, a place where anybody could say anything to anyone else – even anonymously – without being censored by the platform. Twitter's primary brand was that of an unfettered, neutral platform for free speech. The preamble to Twitter's rules (2009-2015) read in part, "Each user is responsible for the content he or she provides. We do not actively monitor and will not censor user content, except in limited circumstances."

However, free speech is not necessarily civil speech. Trolls of all colors have had a field day harassing Twitter users whose posts they don't agree with, launching ad hominem attacks in truly vile, hateful language. And it's not just the words used in the attacks but the frequency and sheer number of attacks, all designed to shut the person up, not to engage in spirited debate over the merits of an idea or position. In other words, unfettered and unmoderated free speech has led to a lessening of the "freedom" of speech by forcing users to take themselves off the platform.

In short, the Twitter experience shows us that free speech without civility does not lead to neutrality. Let's turn this around: without enforcing civility, you won't have neutrality. For Facebook, the equation is similar with slightly different variables: without enforcing transparency and accountability, it will lose trust.

But how do you enforce civility? And where do you draw the line between civility and censorship? That's the key question. One person's civil discourse could be another person's trigger. Also, does civility cover only abusive words, or does it extend to the tone and actual substance of the discussion? Does an overly cynical or mocking tone – without any apparently hateful language – constitute harassment? How about discussions of the scientific legitimacy of eugenics, the positive aspects of the Holocaust, or justifications for assassinating police officers? Are these allowed as long as they stay on the good side of George Carlin?

And the answer is necessarily, "It depends." Ugh. How unsatisfactory. How true, though.

What's acceptable and unacceptable is a function of the social and cultural context we live in. Try insulting the king in Thailand and see what happens. Or try denying the Holocaust in Germany. Or rejecting that the Rape of Nanking ever happened while in China. Or claiming in Korea that the comfort women were willing prostitutes.

On a more subtle and difficult note, the whole debate about "safe spaces" is a case in point, in which different demographic groups with differing social and cultural sensitivities want a space where they won't be exposed to speech and ideas that they find offensive. Too much moderation runs the risk of turning Twitter, Facebook and the like into virtual safe spaces catering to the sensitivities of self-appointed "Safety Officials." That sounds ominous, and deservedly so.


Jason Lim (jasonlim@msn.com) is a Washington, D.C.-based expert on innovation, leadership and organizational culture. He has been writing for The Korea Times since 2006.


 