Metaverse News Outlet
Facebook’s Algorithm Is ‘Influential’ but Doesn’t Necessarily Change Beliefs, Researchers Say

2023-07-27

The algorithms powering Facebook and Instagram, which drive what billions of people see on the social networks, have been in the cross hairs of lawmakers, activists and regulators for years. Many have called for the algorithms to be abolished to stem the spread of viral misinformation and to prevent the inflammation of political divisions.

But four new studies published on Thursday — including one that examined the data of 208 million Americans who used Facebook in the 2020 presidential election — complicate that narrative.

In the papers, researchers from the University of Texas, New York University, Princeton and other institutions found that removing some key functions of the social platforms’ algorithms had “no measurable effects” on people’s political beliefs. In one experiment on Facebook’s algorithm, people’s knowledge of political news declined when their ability to re-share posts was removed, the researchers said.

At the same time, the consumption of political news on Facebook and Instagram was highly segregated by ideology, according to another study. Ninety-seven percent of the people who read links to “untrustworthy” news stories on the apps during the 2020 election identified as conservative and largely engaged with right-wing content, the research found.

The studies, which were published in the journals Science and Nature, paint a nuanced and at times contradictory picture of how Americans have used, and have been affected by, two of the world’s biggest social platforms. The conflicting results suggest that untangling social media’s role in shaping public discourse may take years.

The papers also stood out for the large numbers of Facebook and Instagram users who were included and because the researchers obtained data and formulated and ran experiments with collaboration from Meta, which owns the apps. The studies are the first in a series of 16 peer-reviewed papers. Previous social media studies have relied mostly on publicly available information or were based on small numbers of users with information that was “scraped,” or downloaded, from the internet.

Talia Stroud, the founder and director of the Center for Media Engagement at the University of Texas at Austin, and Joshua Tucker, a professor and co-founder of the Center for Social Media and Politics at New York University, who helped lead the project, said they “now know just how influential the algorithm is in shaping people’s on-platform experiences.”

But Ms. Stroud said in an interview that the research showed the “quite complex social issues we’re dealing with” and that there was likely “no silver bullet” for social media’s effects.

“We must be careful about what we assume is happening versus what actually is,” said Katie Harbath, a former public policy director at Meta who left the company in 2021. She added that the studies upended the “assumed impacts of social media.” People’s political preferences are influenced by many factors, she said, and “social media alone is not to blame for all our woes.”

Meta, which announced it would participate in the research in August 2020, spent $20 million on the work from the National Opinion Research Center at the University of Chicago, a nonpartisan agency that aided in collecting some of the data. The company did not pay the researchers, though some of its employees worked with the academics. Meta was able to veto data requests that violated its users’ privacy.

The work was not a model for future research since it required direct participation from Meta, which held all the data and provided researchers only with certain kinds, said Michael Wagner, a professor of mass communications at the University of Wisconsin-Madison, who was an independent auditor on the project. The researchers said they had final say over the papers’ conclusions.

Nick Clegg, Meta’s president of global affairs, said the studies showed “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have meaningful effects on these outcomes.” While the debate about social media and democracy would not be settled by the findings, he said, “we hope and expect it will advance society’s understanding of these issues.”

The papers arrive at a tumultuous time in the social media industry. This month, Meta rolled out Threads, which competes with Twitter. Elon Musk, Twitter’s owner, has changed the platform, most recently renaming it X. Other sites like Discord, YouTube, Reddit and TikTok are thriving, with new entrants such as Mastodon and Bluesky appearing to gain some traction.

In recent years, Meta has also tried shifting the focus away from its social apps to its work on the immersive digital world of the so-called metaverse. Over the past 18 months, Meta has seen more than $21 billion in operating losses from its Reality Labs division, which is responsible for building the metaverse.

Researchers have for years raised questions about the algorithms underlying Facebook and Instagram, which determine what people see in their feeds on the apps. In 2021, Frances Haugen, a former Facebook employee turned whistle-blower, further put a spotlight on them. She provided lawmakers and media with thousands of company documents and testified in Congress that Facebook’s algorithm was “causing teenagers to be exposed to more anorexia content” and was “literally fanning ethnic violence” in countries such as Ethiopia.

Lawmakers including Senator Amy Klobuchar, a Democrat of Minnesota, and Senator Cynthia Lummis, a Republican of Wyoming, later introduced bills to study or limit the algorithms. None have passed.

In three of the four studies published on Thursday, Facebook and Instagram users were asked for, and gave, their consent to participate, with their identifying information obscured. In the fourth study, the company provided researchers with anonymized data on 208 million Facebook users.

One of the studies was titled “How do social media feed algorithms affect attitudes?” In that research, which included more than 23,000 Facebook users and 21,000 Instagram users, researchers replaced the algorithms with reverse chronological feeds, which means people saw the most recent posts first instead of posts that were largely tailored to their interests.

Yet neither people’s “polarization” nor their political knowledge measurably changed, the researchers found. In the academics’ surveys, people did not report shifting their behaviors, such as signing more online petitions or attending more political rallies, after their feeds were changed.

Worryingly, a feed in reverse chronological order increased the amount of untrustworthy content that people saw, according to the study.

The study that looked at the data from 208 million American Facebook users during the 2020 election found they were divided by political ideology, with those who identified as conservatives seeing more misinformation than those who identified as liberals.

Conservatives tended to read far more political news links that were also read almost exclusively by other conservatives, according to the research. Of the news articles marked by third-party fact checkers as false, more than 97 percent were viewed by conservatives. Facebook Pages and Groups, which let users follow topics of interest to them, shared more links to hyperpartisan articles than users’ friends.

Facebook Pages and Groups were a “very powerful curation and dissemination machine,” the study said.

Still, the proportion of false news articles that Facebook users read was low compared with all news articles viewed, researchers said.

In another paper, researchers found that reducing the amount of content in 23,000 Facebook users’ feeds that was posted by “like-minded” connections did not measurably alter the beliefs or political polarization of those who participated.

“These findings challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy,” the study’s authors said.

In a fourth study that looked at 27,000 Facebook and Instagram users, people said their knowledge of political news fell when their ability to re-share posts was taken away in an experiment. Removing the re-share button ultimately did not change people’s beliefs or opinions, the paper concluded.

Researchers cautioned that their findings were affected by many variables. The timing of some of the experiments right before the 2020 presidential election, for instance, could have meant that users’ political attitudes had already been cemented.

Some findings may be outdated. Since the researchers embarked on the work, Meta has moved away from showcasing news content from publishers in users’ main news feeds on Facebook and Instagram. The company also regularly tweaks and adjusts its algorithms to keep users engaged.

The researchers said they nonetheless hoped the papers would lead to more work in the field, with other social media companies participating.

“We very much hope that society, through its policymakers, will take action so this kind of research can continue in the future,” said Mr. Tucker of New York University. “This should be something that society sees in its interest.”
