Despite years of firmly denying claims that its algorithms divide users into polarized world views, Facebook today released new tools designed to make it easier for users to switch to a non-algorithmic view of their News Feed.
Many of these tools were introduced in previous updates; this latest release simply makes them much easier to access. The update affects users’ News Feed as well as the newly implemented Favorites feature and the Most Recent section, and adds a new Feed Filter Bar to the mobile version of Facebook. Users can also manually determine who can and cannot comment on their posts via the Options menu on each post.
The update also aims to make the Facebook algorithm more transparent to the lay user. AI-driven contextual suggestions will be paired with a “Why am I seeing this?” link that leads to an explanation of how the algorithm works.
As real life becomes increasingly intertwined with our social media personas, Facebook users have turned to post-filtering to hide their posts from certain audiences, such as co-workers or close family members. With the new comment controls, you can decide not only who sees your post but also who is allowed to join the conversation on it. This should prove particularly valuable to influencers, public figures, and brands looking to manage the discussions on their posts.
Tech giants like Facebook have come under criticism for their role in distributing misinformation and directing users toward extreme, radicalizing content, such as coronavirus vaccine skepticism, conspiracy theories, and hate groups. Many also blame Facebook for facilitating events like the storming of the US Capitol.
Facebook has remained defensive of its algorithms, even amid growing fears of the polarizing effect that such AI-based technologies can have on users. The company’s official stance is that the algorithms don’t lead people to extreme content; rather, users consciously seek it out, and the topics that appear in a user’s News Feed are a direct reflection of the content they have historically preferred.
The new update follows several major changes to the way Facebook delivers content to its users. Back in January, Facebook CEO Mark Zuckerberg announced in a conference call that the company was considering changes to the platform in response to community feedback that political posts were triggering arguments among users. Facebook also stopped recommending political groups to users, a decision made before the last US presidential election; that policy has since been expanded globally.
Nick Clegg, VP of Global Affairs for Facebook, published a blog post restating the value that such algorithms provide not just for the owners of tech platforms but for their users, too. In the post, Clegg places the burden of regulating digital content on lawmakers.