Last week, the first papers from a collaboration between Meta’s Facebook and a team of external researchers studying the 2020 election were finally published. Two of these studies asked: Are we trapped in filter bubbles, and are they tearing us apart? The results suggest that filter bubbles are at least somewhat real, but countering them algorithmically doesn’t seem to bring us any closer together.

Some are interpreting these results as proof that Facebook divides us. Others are claiming these experiments are a vindication of social media. It’s neither.

The first study tried to figure out whether we’re really in informational echo chambers, and if so, why. Unsurprisingly, the segregation in our information diets starts with who we follow. This mirrors offline life, where most people’s in-person social networks are highly segregated.

But what we actually see in our Feed is more politically homogeneous than what is posted by those we follow, suggesting that the Feed algorithm really does amplify the ideological leanings of our social networks.

There are even larger partisan differences in what we engage with, and Facebook, like pretty much every platform, tries to give people more of what they click, like, comment on, or share. In this case, it looks like the algorithm is sort of meeting human behavior halfway. The difference in our information diets is partly due to what we’ve chosen, and partly the result of using computers to guess—often correctly—what buttons we’ll click.
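To make that mechanism concrete, here is a minimal sketch of what engagement-based ranking looks like in general: a weighted sum of predicted interactions. The signals, weights, and function names are hypothetical illustrations, not Facebook's actual system.

```python
# Hypothetical sketch of engagement-based ranking, not Facebook's actual code.
# Each candidate post is scored from predicted probabilities of the actions
# mentioned above: click, like, comment, share.

# Illustrative weights; real systems tune these constantly.
WEIGHTS = {"click": 1.0, "like": 1.5, "comment": 4.0, "share": 6.0}

def score_post(predicted: dict) -> float:
    """Weighted sum of predicted engagement probabilities (all assumed)."""
    return sum(WEIGHTS[action] * predicted.get(action, 0.0) for action in WEIGHTS)

def rank_feed(candidates: list) -> list:
    """Order candidate posts by engagement score, highest first."""
    return sorted(candidates, key=lambda post: score_post(post["predicted"]), reverse=True)

# Example: a post a user is likely to interact with outranks one they are not,
# which is how the feed ends up "meeting human behavior halfway".
feed = rank_feed([
    {"id": "a", "predicted": {"click": 0.30, "like": 0.10, "share": 0.02}},
    {"id": "b", "predicted": {"click": 0.05, "like": 0.01, "share": 0.00}},
])
```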

This raises the question of how ideologically similar people’s news should be. You can read the computed values of the “isolation index” in the paper, but it’s not clear what numbers we should be aiming for. Also, this study is strictly concerned with “news and civic content.” This might be democratically important, but it makes up only a few percent of impressions on Facebook. It’s possible that positive interactions with people who are politically different change us the most, even if it’s just reading their posts on unrelated topics.
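For a feel for what such an index measures, here is a back-of-the-envelope sketch in the spirit of standard media-segregation measures; the numbers and the exact formula are illustrative assumptions, not the study's own definition or data.

```python
# Illustrative isolation-index calculation (assumption: a Gentzkow/Shapiro-style
# measure; the paper's exact definition may differ).
# Each record: (user_ideology, share_of_news_exposure_from_conservative_sources)
exposures = [
    ("conservative", 0.80),
    ("conservative", 0.70),
    ("liberal", 0.25),
    ("liberal", 0.15),
]

def mean_conservative_exposure(records, group):
    shares = [share for ideology, share in records if ideology == group]
    return sum(shares) / len(shares)

# Isolation index: how much more conservative the average conservative's news
# diet is than the average liberal's. 0 = identical diets, 1 = total segregation.
isolation = (mean_conservative_exposure(exposures, "conservative")
             - mean_conservative_exposure(exposures, "liberal"))
print(f"isolation index: {isolation:.2f}")  # 0.55 with these made-up numbers
```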

The second study directly tested whether increasing the political diversity of people and publishers in your feed affects polarization. For about 20,000 consenting participants, researchers reduced the amount of content from like-minded sources by about a third. Because the total amount of time spent on Facebook didn't change, this shifted consumption toward both neutral and cross-cutting sources.
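Conceptually, that intervention amounts to downweighting like-minded sources at ranking time. The sketch below illustrates the idea under assumed field names and scoring; it is not the researchers' implementation, which ran inside Facebook's own ranking system.

```python
# Simplified sketch of downweighting like-minded content by roughly a third.
# 'source_ideology' and 'score' are hypothetical fields for illustration.
DOWNWEIGHT = 2.0 / 3.0  # keep about two-thirds of the ranking weight

def adjust_scores(candidates: list, user_ideology: str) -> list:
    adjusted = []
    for post in candidates:
        score = post["score"]
        if post["source_ideology"] == user_ideology:  # like-minded source
            score *= DOWNWEIGHT
        adjusted.append({**post, "score": score})
    # Feed length stays the same, so neutral and cross-cutting posts fill the
    # slots that like-minded posts would otherwise have taken.
    return sorted(adjusted, key=lambda p: p["score"], reverse=True)
```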

Of the eight polarization variables measured—including affective polarization, extreme ideological views, and respect for election norms—none changed in a statistically significant way. This is pretty good evidence against the most straightforward version of the “algorithmic filter bubbles cause polarization” thesis.

But this is not the end of the story, because filter bubbles aren't the only way of thinking about the relationship between media, algorithms, and democracy. A review of hundreds of studies has found a positive correlation between general “digital media” use and polarization worldwide, as well as a positive correlation with political knowledge and participation. In other words, social media use has many effects, both good and bad. For example, there's evidence that engagement-based algorithms amplify divisive content, and that tools to reach targeted audiences can also be used for propaganda or harassment.
