Facebook responded to the leak of its moderator rules: ‘We get things wrong’



Justin Sullivan/Getty Images

Facebook CEO Mark Zuckerberg.

  • Facebook’s moderation rules have leaked, revealing what is and isn’t allowed on the social network.
  • A Facebook exec has written a column defending the company, admitting: “We get things wrong.”

Facebook’s moderation policies have been in the spotlight this week, after The Guardian published leaked documents detailing how the social network decides what is and isn’t acceptable on its platform.

The statement “to snap a b—h’s neck make sure to apply all your pressure to the middle of her throat” can be permissible, for example. “Someone shoot Trump,” on the other hand? Not allowed.

The company’s head of global policy management, Monika Bickert, has now responded to the leaks with a lengthy column defending its practices — also published in The Guardian.

From the outset, Bickert strikes a conciliatory tone, lauding The Guardian’s coverage (its “reporting on how Facebook deals with difficult issues/images such as this gets a lot of things right”), and defending particular practices Facebook has been criticised for in the wake of the reports.

For example, the social network does not take down livestreaming videos of people attempting to self-harm. This is because, she says, “experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats … When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time. We are aware of at least another half-dozen cases like this from the past few months.”


AP Photo/Nick Ut

Pulitzer Prize-winning photo ‘The Terror of War.’

She also defends not publishing Facebook’s moderation policies in detail, as some (including this author) have called for, saying the company doesn’t “want to encourage people to find workarounds.”

Facebook’s standards for moderation have previously attracted heavy criticism. In 2016, it censored the iconic Vietnam War photo “The Terror of War,” and censured Aftenposten, Norway’s biggest newspaper, for publishing it. It has also banned a photo of a Renaissance-era Italian statue for being “sexually explicit,” and suspended users who posted a photo of Aboriginal women in traditional dress, among other examples.

Bickert acknowledges mistakes have been made before: “We get things wrong, and we’re constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find right answers, even when there aren’t any.

“I hope that readers will understand that we take our role extremely seriously.”

Earlier this month, Facebook announced it is hiring 3,000 extra reviewers, a fact Bickert reiterates in her column. The hires come after a spate of murders, accidental deaths and suicides were streamed on Facebook Live, its live video feature.

She signs off by arguing that Facebook and broader society are still trying to work out what is “acceptable,” but that Facebook is trying as best it can: “Technology has given more people more power to communicate more widely than ever before. We believe the benefits of sharing far outweigh the risks. But we also recognise that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part of that conversation.”


Read more stories on Business Insider, Malaysian edition of the world’s fastest-growing business and technology news website.

Source: Business Insider (http://ift.tt/2qfaSvc)


