Whistleblower testifies, accuses FACEBOOK of putting profits above everything else, says Sree Iyer

Sree Iyer says that FACEBOOK needs to introspect as well as fix the way it filters offensive content. Hinting that its own employees could be viewing content through filtered vision, Iyer suggests that the Social Media giant has much to answer for and that merely brushing off the allegations won’t be enough

width="956" height="538" frameborder="0" allowfullscreen="allowfullscreen">

Mark Zuckerberg of Facebook is having a bad week. First, all of its social media platforms – Facebook, WhatsApp, and Instagram – went down. It was a massive global outage that plunged the services, and the businesses and people who rely on them, into chaos for hours[1]. Facebook said late Monday that “the root cause of this outage was a faulty configuration change” and that there is “no evidence that user data was compromised as a result” of the outage.

Second, there was the devastating testimony by whistleblower Frances Haugen, a former product manager at Facebook, who provided The Wall Street Journal with internal documents that exposed the company’s awareness of the harms caused by its products and decisions.[2]

She went public after having previously filed complaints with federal law enforcement alleging that Facebook’s own research shows how it magnifies hate and misinformation and leads to increased polarization. In my opinion, this is a major cause for concern.

Like-minded people form a group and start sharing opinions that become increasingly radical, sometimes resulting in one or more individuals deciding to act on them. Haugen’s contention is that Facebook knew about this and placed profits ahead of the danger it was causing to society. Another telling allegation of hers is that Facebook knew that some of the posts from foreign countries during the 2016 elections were capable of causing waves and still chose to allow them, because it put profits ahead of principles.[3]

Mark Zuckerberg replied in a Facebook post that I am sure was vetted and re-vetted by his legal team, saying: “Many of the claims don’t make any sense. If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing? And if social media were as responsible for polarizing society as some people claim, then why are we seeing polarization increase in the US while it stays flat or declines in many countries with just as heavy use of social media around the world? At the heart of these accusations is this idea that we prioritize profit over safety and well-being. That’s just not true. The argument that we deliberately push content that makes people angry for profit is deeply illogical. We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business, and product incentives all point in the opposite direction.”[4]

And on and on it goes – I have put a link to his post in the References section below. That Haugen testified before the Senate points to some truth in her allegations. That The Wall Street Journal would have carefully researched her documentation lends further credence to this.

In summary, a behemoth that connects about 3 billion people should have a stronger policy to ensure that its platform, especially the ads placed on it, does not put out fake narratives.

Thankfully, Facebook has done much better at filtering out offensive content. But the fact remains that it has several internal groups of its own employees, formed around religion or cause, that spew invective based on false information. It needs to look internally too and fix this. Why? Because I don’t think Facebook wants its employees to apply their own filters while looking at content that has been reported as being offensive. For someone wearing red-colored glasses, everything he or she reads will appear red. Will it? Only time will tell.

References:

[1] Facebook, WhatsApp, Instagram suffer worldwide outage, Oct 06, 2021, ABC News

[2] Facebook whistleblower Frances Haugen believes in the ‘potential’ of social platform, Oct 05, 2021, USA Today

[3] Facebook whistleblower’s explosive testimony: Company makes ‘disastrous’ choices, prioritizes profit, Oct 05, 2021, USA Today

[4] Mark Zuckerberg trashes ‘profit over safety’ allegation on Facebook as ‘deeply illogical’, Oct 06, 2021, MSM
