Greed Is to Blame for the Radicalization of Social Media – ALEXANDER HEFFNER BUSINESS 06.30.19 08:00 AM

Illustration: Casey Chin

Last week, Reddit quarantined “r/The_Donald,” a pro-Trump message board, after the company determined that the subgroup had encouraged and threatened violence. Likewise, Twitter is signaling that it will flag—but not remove—posts by government officials who violate its rules. As with YouTube’s demonetization (rather than deletion) of anti-gay videos, these are welcome but insufficient measures.

Until recently, social media platforms could feign ignorance about the scope and impact of harmful content on their sites. Now bigotry and conspiracy theorizing, which could once be dismissed as the rants of outliers, have hijacked mainstream discourse—including on media produced by President Trump and his allies. As such posts have become some of the most popular, moneymaking content being viewed and shared, social media executives simply cannot excuse their indifference.

Growing up, your parents most likely diverted your attention away from invented tabloid cover stories at the supermarket checkout counter. They might have admonished you that such stories were fake—or, better yet, made up. Today, such tabloids may be financially imperiled, if not defunct, but their conspiracy-driven, largely fiction-first approach has become the dominant culture of social media.

This leads us to ask: Where would society be today if parents failed to draw the distinction between fact and fiction? And what if CEOs like Susan Wojcicki, Mark Zuckerberg, and Jack Dorsey behaved more like responsible parents than greedy shareholders?

Wojcicki says that she and her colleagues are trying to “understand” what’s on the platform rather than removing explicitly bigoted content and distinguishing legitimate news outlets from purveyors of fiction. Meanwhile, Google warned its employees against protesting YouTube’s policies in this weekend’s Pride Parade. This is similar to Twitter’s decision to “study”—instead of decisively assert—whether white supremacists are harmful to the platform. Apparently, that is not a clear-cut answer.

According to Pew Research, YouTube has become the most popular online platform among American adults. But the vicious tone that was once relegated to comment sections is now rampant across the site.

Insatiable profit motives have led to the toleration of bigotry and the exploitation of users across major social media platforms. There is money in hate. In his testimony to Congress last week, former Facebook chief security officer Alex Stamos revealed what most social media platforms refuse to admit: Artificial intelligence cannot replace human judgment in resolving the online extremism epidemic. “These white supremacist groups have online hosts who are happy to host them,” Stamos said, adding that platforms like Gab publish racist content with impunity.
