By Sandi Doughton
The Seattle Times
WWR Article Summary (tl;dr): The University of Washington’s “Center for an Informed Public” was launched in December to study the ways misinformation spreads and the best methods to combat it. One class there, called “Calling Bullshit,” teaches students to distinguish truth from spin.
Seattle
When a mysterious virus began racing around the globe early this year, scientists at the University of Washington’s newly created Center for an Informed Public described it as the perfect storm for bogus information, both innocent and malicious.
So what’s the situation six months later, now that the coronavirus pandemic is playing out in tandem with a passionate push for racial justice and the opening volleys of the presidential race? The perfect superstorm?
Pretty much, says Kate Starbird, a co-founder of the center.
“As time goes on, what we’re seeing is the convergence between COVID-19 and election 2020,” she said. And that means the flood of half-truths, distortions and flat-out lies the World Health Organization calls an “infodemic” is only going to intensify. “Things are becoming more politicized,” Starbird said.
The UW center was launched in December to study the ways misinformation spreads and the best methods to combat it. The team was gearing up for the census and the presidential contest, which they expected to be their first test cases.
“Then the pandemic hit and everything was crazy,” said director Jevin West, who also co-teaches the UW’s popular “Calling Bullshit” class where students learn to distinguish truth from spin. “This is like nothing I’ve ever seen in terms of the volume, and it’s growing massively every day,” he said.
A lot of coronavirus misinformation began as honest attempts to share knowledge and help others, Starbird said. When emotions and uncertainty are high, people are particularly vulnerable to seizing on simple solutions like home remedies or the oft-repeated, but baseless, claim that it’s possible to diagnose yourself by holding your breath. One widely shared tweet, purportedly from a scientist, falsely warned that hand sanitizer can’t kill viruses. In the Black community, rumors spread that dark skin protects people from infection.
Even Starbird shared what turned out to be misguided tips for families trying to protect relatives in senior living facilities, because she was so worried about her own parents.
“When we don’t have the information we need and feel anxious about making decisions about our health, our families and our livelihoods, people come together and try to make sense of what’s going on,” she said.
West and his colleagues have already amassed a data set that includes more than a billion Twitter messages about the novel coronavirus. “People will be digging into this data for decades to come,” he said.
While the emergence of a new virus was a surprise, the proliferation of misinformation wasn’t. It happens during every crisis, as people desperate to figure out what’s going on share rumors and scraps of information, some useful, some dangerously wrong, Starbird said.
In most crises, like earthquakes or hurricanes, the period of uncertainty, when people engage in what’s called “collective sense-making,” is short. But a pandemic is a slow-moving process with high stakes. And because this pathogen is new, even basic information, like how people become infected, was initially unknown. With our understanding changing so quickly, what seemed true yesterday may not hold up tomorrow.
Social media is the perfect platform for lightning-fast communication, and for manipulation by people seeking to profit, sow discord or promote a political agenda. Factor in society’s preexisting fault lines, and you’ve got those “perfect storm” conditions.
We can be equally vulnerable to misinformation when our political identities are engaged, which makes us more likely to pass on posts that align with our views without critical consideration, Starbird said. As the coronavirus pandemic quickly became as much a political debate as a public health crisis, partisan players increasingly amplified messages to serve their own agendas.
Starbird and her colleagues analyzed the spread of an initially obscure post on Medium, a blogging platform that doesn’t fact-check most content, that was quickly elevated by conservative commentators. Authored by a Silicon Valley marketer, the post was packed with data and graphs and argued that the epidemic wasn’t serious enough to warrant shutdowns.
Experts including UW biology professor Carl Bergstrom, who studies pandemic response and teaches the “Calling B.S.” class with West, pointed out multiple flaws in the reasoning, including a mistaken assumption the pandemic would follow a bell-shaped curve. Medium deleted the post within 32 hours. But the piece had already gone viral by then, thanks largely to being highlighted by several Fox News personalities, Starbird found. Some of them later acknowledged the problems with the post, but those corrections were not widely circulated.
The pandemic has also fostered conspiracy theories, though many of them aren’t actually new, said Mike Caulfield, a digital literacy expert at Washington State University, Vancouver. The pandemic is only the latest of many ills blamed on 5G networks, though perhaps the first to inspire people to burn dozens of cell towers in Britain, Belgium and the Netherlands. Bill Gates is a perennial target. An old, discredited video now circulating again purports to show the Microsoft co-founder briefing the CIA about a plan to use vaccination to inject people with microchips.
A video called “Plandemic” pins the viral outbreak on a cabal that includes Gates, the World Health Organization, big pharma and military labs that supposedly manipulated the virus. Starbird and her colleagues traced the way the video blazed across the Twitterverse, amplified by anti-vaccination groups among others, before it was deleted by YouTube and other platforms.
But the damage lingers, Caulfield said. One of the video’s more outrageous claims is that wearing a mask can activate viruses and sicken people, an argument now being raised by citizens furious about local mask mandates. Wearing masks, a staple of infection control in hospitals, has now become a badge of political identity, with many conservative politicians until recently scoffing at the idea.
Tropes that circulated during previous Black Lives Matter demonstrations are also surfacing again, like claims that liberal billionaire George Soros is funding protest marches. False claims about busloads of antifa activists inspired armed citizens to take to the streets to defend their communities, including the town of Snohomish north of Seattle. Protesters in Washington, D.C., were duped by the viral #DCblackout hashtag warning that police were blocking cellphone communications.
Some of the rumors may have originated organically, while others were deliberate misinformation, Starbird said. But so far, she hasn’t seen evidence of the type of international meddling documented during 2016, when Russian trolls masqueraded as both BLM protesters and their critics, with the likely goal of deepening divisions among Americans.
Automated bots, programmed to churn out tweets and retweets, almost certainly are playing a role in spreading coronavirus disinformation, but how much is still not known, West said. Researchers at Carnegie Mellon University recently estimated that nearly half of Twitter accounts posting about coronavirus may not be actual people. But the group’s research hasn’t been published yet and is being questioned by other experts.
“It’s really hard to determine whether something’s a bot or not,” West said.
Perhaps the most corrosive effect of pervasive misinformation and disinformation is the way it undermines confidence in the very institutions we all rely on, especially during crises, he added. “The thing that scares me the most is that we’re getting to the point where some people don’t trust anything.”
As the election approaches, the pandemic and its human and economic impacts are certain to become even more entwined with politics and other disinformation efforts, Caulfield said. “I think what we’re looking at now makes what we were facing in 2016 look almost quaint,” he said, referring to organized networks intent on creating confusion and swaying the outcome.
With in-person campaigning on hold, political groups are increasingly creating internet sites that often mimic legitimate news sites and conceal their partisan roots, according to NewsGuard, a startup that tracks internet misinformation and identifies the top purveyors.
A headline on one such partisan site, linked to a Democratic super PAC, chides Republican lawmakers for leaving nursing homes “defenseless against COVID-19.” A similar site, associated with the Republican Governors Association, features a story praising GOP states’ reopening efforts. “Watch out for more ‘news’ sites like these that are directly tied to political campaigns,” NewsGuard warns.
Part of the UW center’s mission is to look for solutions. One surprisingly effective way to combat misinformation is to simply correct yourself, and others, without assuming the worst, Starbird said. “People share misinformation. It happens,” she said. “I don’t think we should be judging ourselves or others as bad people.”
Fact-checking individual tweets or posts and labeling some as false or questionable, as Twitter recently started to do, can also make others less likely to pass on bad information, West said.
While there’s a lot individuals can do to identify misinformation and reduce their role in spreading it, the main culprits are Twitter, Facebook and other platforms engineered to maximize speed, engagement, and corporate profits, said UW associate law professor Ryan Calo.
All of the platforms have taken steps to tamp down the spread of dangerous health misinformation during the pandemic, by highlighting reputable sources or being quicker to delete bogus material, he pointed out. That shows they can do it, if they are motivated.
“I think these platforms need to take the lion’s share of responsibility for what’s going on, because their business practices and their platforms are what is enabling these lies to get halfway around the world before the truth puts its shoes on,” Calo said, paraphrasing an old saying.
With many advertisers boycotting Facebook over its role in the proliferation of hate speech, damaging content and misinformation, the company announced last week it will start putting warning labels and links to reliable information on some posts, including those from President Donald Trump, that break the platform’s rules.
___
Distributed by Tribune Content Agency, LLC.