Almost everything we think we know about social media is wrong
Do especially large numbers of people become radicalized on social media?
Though we should be very skeptical of responses from technology companies on this point, there is surprisingly little evidence to support the idea that algorithms facilitate radicalization.
A large study of this issue by a group of computational social scientists found that some YouTube users advanced from moderate content to more extreme content, but the study estimated this happens to only 1 out of every 100,000 users.
Similarly, an earlier study of 10.1 million Facebook users concluded that the overwhelming majority of ideological segregation on this platform was driven by the individual decisions of users about what to engage with, rather than the algorithms that determined the order in which users see messages.
More recently, political scientists Kevin Munger and Joseph Phillips conducted an extensive analysis of radical political content on YouTube.
Though they discovered a large amount of such content in their analysis, they also found that most of the traffic to these videos came from new users, which suggests that these videos may be serving a previously unmet demand for extreme content rather than gradually radicalizing the platform's existing users.
Even more surprisingly, Munger and Phillips showed that the most extreme content on YouTube has been viewed considerably less often in recent years.
Much more research on the relationship between algorithms and radicalization is needed—ideally using better data from social media companies—but for now machine learning does not seem to be a smoking gun.
Do especially large numbers of people see fake news there?
Another popular criticism frequently leveled at Facebook, Twitter, and other social media platforms is that they failed to stop malicious foreign actors from spreading fake news and sowing discord.
Though such criticisms are widespread—echoing across the floors of Congress and media studios around the world—there is actually very little evidence that these misinformation campaigns succeeded in dividing Americans.
The political scientist David Lazer led a team of researchers who linked Twitter accounts to voter files (records that states keep about who is registered to vote).
By merging these data, the researchers were able to estimate how many people encountered fake news during the 2016 campaign.
Lazer's team estimated that less than 1 percent of Twitter users were exposed to 80 percent of the fake news that the team could identify, and that 0.1 percent of users were responsible for sharing 80 percent of such messages.
Another study by economists Hunt Allcott and Matthew Gentzkow found the average American saw—and remembered—approximately one fake news article in 2016.
A subsequent study by the political scientists Andrew Guess, Jonathan Nagler, and Joshua Tucker found similar patterns by linking a public opinion survey to a tool that tracked the browsing histories of survey respondents.
The researchers found that only a small fraction of Facebook users shared fake news, and most of that news was shared by elderly conservatives.
People are far less easily influenced than most assume:
This finding fits into a broader trend that many people don't know about: most mass media campaigns have minimal effects.
Political campaigns are not a hypodermic needle that injects opinion into the masses.
Instead, most people ignore the campaigns—and the few who do engage with them already have such strong beliefs that their overall impact is negligible.
It is still possible that fake news and foreign misinformation campaigns can influence voting behavior.
But studies indicate that even the most sophisticated targeted campaigns to persuade voters in the 2016 election probably had little or no impact—and possibly even a negative impact upon voters who were mistargeted.
Voters who are accidentally shown ads intended for others, for example, are less likely to vote for the candidates advertised than those who were not targeted.
Perhaps even more surprisingly, there is also very little evidence that microtargeted advertising influences consumer behavior.
Though being targeted by an online advertising campaign surprisingly close to one's interests is certainly creepy, research indicates such campaigns have very little influence on what we buy.
Only a few people spend time in echo chambers:
Still another phenomenon for which Silicon Valley companies are blamed in the debate over political polarization is the echo chamber.
But if Facebook and Twitter exposed people to those with opposing views—as many of the Silicon Valley apostates described above prescribe—the analysis I've presented throughout this book indicates that this strategy might backfire.
There is also an even deeper issue here, however.
The most recent studies indicate that the prevalence of the echo chamber phenomenon has been greatly exaggerated.
A 2019 study from NYU's Center for Social Media and Politics concluded that 40 percent of Twitter users do not follow any political accounts (that is, accounts of elected officials, political pundits, or media outlets).
The researchers found that most of the remaining users consume information from a mix of liberal and conservative sources.
The only people stuck deep inside echo chambers, they concluded, were the minority of people on the extremes of the liberal-conservative continuum.
Even more surprisingly, another recent study of the browsing histories of more than fifty thousand internet users concluded that people who use social media are exposed to more opposing views than people who do not.
From the book "Breaking the Social Media Prism" by Chris Bail.