How to Turn Off Facebook News Feed

(Washington Post illustration; Facebook screenshots; iStock)

Why Facebook won't let you control your own news feed

Lawmakers want social networks to offer users a chronological timeline. Leaked documents help to explain why Facebook doesn't.

Updated November 15, 2021 at 12:45 p.m. EST | Published November 15, 2021 at 12:42 p.m. EST

In at least two experiments over the years, Facebook has explored what happens when it turns off its controversial news feed ranking system — the software that decides for each user which posts they'll see and in what order, internal documents show. With the ranking turned off, users see all the posts from all of their friends in simple chronological order.

Both tests appear to have taught Facebook's researchers the same lesson: Users are better off with Facebook's software calling the shots.

The internal research documents, some previously unreported, help to explain why Facebook seems so wedded to its automated ranking system, known as the news feed algorithm. That system is under intense public scrutiny.

In testimony to the U.S. Congress and abroad, whistleblower Frances Haugen has pointed to the algorithm as central to the social network's problems, arguing that it systematically amplifies and rewards hateful, divisive, misleading and sometimes outright false content by putting it at the top of users' feeds. And previously reported internal documents, which Haugen provided to regulators and media outlets, including The Washington Post, have shown how Facebook crafts its ranking system to keep users hooked, sometimes at the cost of angering or misinforming them.

A growing number of lawmakers in both parties now think users should have an option to disable such automated ranking systems — for good. A bill introduced in the House of Representatives this week would require social media companies to offer a version of their services that doesn't rely on opaque algorithms to decide what users see. It joins a similar bill in the Senate. Both are sponsored by high-ranking members of both parties, giving the legislation a viable path to become law. (They are distinct from previous proposals that seek to regulate algorithms through other means, such as by allowing platforms to be sued when they amplify illegal content.)

The political push raises an old question for Facebook: Why not just give users the power to turn off their feed ranking algorithms voluntarily? Would letting users opt to see every post from the people they follow, in chronological order, be so bad?

The documents suggest that Facebook's defense of algorithmic rankings stems not only from its business interests, but from a paternalistic conviction, backed by data, that its sophisticated personalization software knows what users want better than the users themselves. It's a view that likely extends beyond Facebook: Rivals such as Twitter, TikTok and YouTube rely heavily on automated content recommendation systems, as does Facebook's corporate sibling Instagram.

But critics say this view misses something important: the value of giving users more agency over their information diet.

Since 2009, three years after it launched the news feed, Facebook has used software that predicts which posts each user will find most interesting and places those at the top of their feeds while burying others. That system, which has evolved in complexity to take in as many as 10,000 pieces of information about each post, has fueled the news feed's growth into a dominant information source.
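The difference between that ranked feed and a purely chronological one can be illustrated with a short, purely hypothetical sketch. The post fields, signal names ("p_like," "p_comment") and weights below are invented for illustration and are not drawn from Facebook's documents or code; real systems reportedly weigh thousands of signals per post.

```python
# Conceptual sketch only: NOT Facebook's code. It contrasts a chronological feed
# with a ranked feed, assuming each post carries a timestamp and a few
# hypothetical engagement-prediction signals.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    timestamp: float                               # seconds since epoch
    signals: dict = field(default_factory=dict)    # hypothetical predicted-engagement scores

def chronological_feed(posts):
    """Show everything the user follows, newest first, with no ranking model."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def ranked_feed(posts, weights):
    """Score each post from its prediction signals and surface the highest scores first."""
    def score(p):
        return sum(weights.get(name, 0.0) * value for name, value in p.signals.items())
    return sorted(posts, key=score, reverse=True)

# Usage: the same two posts, ordered two different ways.
posts = [
    Post("friend_a", 1_000_000, {"p_like": 0.1, "p_comment": 0.02}),
    Post("page_b",   999_500,   {"p_like": 0.8, "p_comment": 0.30}),
]
print([p.author for p in chronological_feed(posts)])                              # ['friend_a', 'page_b']
print([p.author for p in ranked_feed(posts, {"p_like": 1.0, "p_comment": 2.0})])  # ['page_b', 'friend_a']
```

In this toy example, the newest post wins under chronological ordering, while the post predicted to draw more engagement wins under ranking — the basic trade-off the article describes.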

The proliferation of false information, conspiracy theories and partisan propaganda on Facebook and other social networks has led some to wonder whether we wouldn't all be better off with a simpler, older system: one that simply shows people all the messages, pictures and videos from everyone they follow, in the order they were posted. That was more or less how Instagram and Twitter worked until 2016. But Facebook has long resisted it.

"Research we've done shows that unranked feeds can lead to integrity issues and other problems," spokeswoman Ariana Anthony said, asked why Facebook won't let users turn off ranking permanently.

Internal documents make clear that Facebook's decisions around feed ranking have not always been guided by concerns about "integrity," which is Facebook's term for content that may be harmful or misleading. Rather, they appear to have been informed mostly by data on user engagement, at least until recently.

"Whenever we've tried to compare ranked and unranked feeds, ranked feeds just seem better," wrote an employee in a memo titled, "Is ranking good?", which was posted to the company's internal network, Facebook Workplace, in 2018. That employee, who said they had worked on and studied the news feed for two years, went on to question whether automated ranking might also come with costs that are harder to measure than the benefits. "Even asking this question feels slightly blasphemous at Facebook," they added.

In 2014, another internal report, titled "Feed ranking is good," summarized the results of tests that found allowing users to turn off the algorithm led them to spend less time in their news feeds, post less often and interact less. Ultimately, they began logging into Facebook less often, imperiling the years-long growth in user engagement that has long powered the company's lucrative advertising business. Without an algorithm deciding which posts to show at the top of users' feeds, concluded the report's author, whose name was redacted, "Facebook would probably be shrinking."

What many users may not realize is that Facebook actually does offer an option to see a mostly chronological feed, called "most recent," if you select it from a settings menu. To reach it today on Facebook's mobile app, you have to tap the tiny "menu" icon at the bottom of your feed, then find and select "most recent." A shortcut that Facebook introduced in March, called the "feed filter bar," did not work at all on this reporter's account.

But there's a catch: The setting only applies for as long as you stay logged in. When you leave and come back, the ranking algorithm will be back on.

In the 2014 test, which has not been previously reported, the company toyed with how long to honor the "most recent" setting after a user selected it — that is, with how long to leave the ranking algorithm off before reverting to it. The results were not encouraging, from Facebook's standpoint. The longer Facebook left the user's feed in chronological order, the less time they spent on it, the less they posted, and the less often they returned to Facebook.

In a comment on the report, one Facebook employee asked whether the company would be better off removing the chronological feed option altogether: "It seems like a really bad experience to click 'Most recent' and then have it default back after 12 hours. This seems like it would be more frustrating than not having the option at all."

A separate report from 2018, first described by Alex Kantrowitz's newsletter Big Technology, found that turning off the algorithm unilaterally for a subset of Facebook users, and showing them posts mostly in the order they were posted, led to "massive engagement drops." Notably, it also found that users saw more low-quality content in their feeds, at least at first, although the company's researchers were able to mitigate that with more aggressive "integrity" measures.

That last finding has since become Facebook's go-to justification for its ranking algorithm.

Nick Clegg, the company's vice president of global affairs, said in a TV interview last month that if Facebook were to remove the news feed algorithm, "the first thing that would happen is that people would see more, not less, hate speech; more, not less, misinformation; more, not less, harmful content. Why? Because those algorithmic systems precisely are designed like a great sort of giant spam filter to identify and deprecate and downgrade bad content."

Some critics say that's a straw-man argument. Simply removing automated rankings for a subset of users, on a social network that has been built to rely heavily on those systems, is not the same as designing a service to work well without them, said Ben Grosser, a professor of new media at the University of Illinois at Urbana-Champaign. Those users' feeds are no longer curated, but the posts they're seeing are still influenced by the algorithm's reward systems. That is, they're still seeing content from people and publishers who are vying for the likes, shares and comments that drive Facebook's recommendations.

And because the algorithm has always been there, Facebook users haven't been given the time or the tools to curate their feeds for themselves in thoughtful ways. In other words, Facebook has never really given a chronological news feed a fair shot to succeed.

Grosser runs a small, experimental social network called "Minus," which has a chronological feed, no likes or other visible reward system, and no ranking algorithm.

"My experience from watching a chronological feed within a social network that isn't always trying to optimize for growth is that a lot of these problems" — such as hate speech, trolling and manipulative media — "just don't exist."

Facebook is not the only social platform with an opaque ranking algorithm, of course. Twitter also uses machine-learning software to rank the tweets people see in their timelines. Like Facebook, it offers an option to see tweets in chronological order. In Twitter's case, that setting is much more accessible, requiring a single tap on the "sparkle" icon above the main feed. TikTok's "For You" page, meanwhile, is entirely algorithmic, with no option to turn off automated rankings. The same is true of Instagram.

Facebook has not taken an official stand on the legislation that would require social networks to offer a chronological feed option, but Clegg said in an op-ed last month that the company is open to regulation around algorithms, transparency, and user controls.

Twitter, for its part, signaled potential support for the bills.

"We agree that increased transparency and choice in tech are important, and we're encouraged that Congress is focusing on these issues," said Lauren Culbertson, Twitter's head of U.S. public policy. "We firmly believe that people should have meaningful control over their experience on Twitter, and that people should be provided with the information they need to make informed choices."

Interesting as Facebook's own research on chronological feeds might be, it shouldn't be considered definitive for purposes of policymaking, said Nathalie Maréchal, senior policy and partnerships manager for the nonprofit Ranking Digital Rights.

"Only companies themselves can do the experiments to find the answers. And as talented as industry researchers are, we can't trust executives to make decisions in the public interest based on that research, or to let the public and policymakers access that research."

"I think users have the right to expect social media experiences free of recommendation algorithms," Maréchal added. "As a user, I want to have as much control over my own experience as possible, and recommendation algorithms take that control away from me."

Correction: Twitter launched its algorithmic timeline in 2016. An earlier version of this story incorrectly said it launched in 2017.

Source: https://www.washingtonpost.com/technology/2021/11/13/facebook-news-feed-algorithm-how-to-turn-it-off/
