The links in this week's newsletter were submitted and contextualized by a few of the students participating in our Fellows Edition of the Media Genius Master Class. Each example relates to media intelligence, the underlying theme of our second session. Learn more about the program here.

Infodemiology 101

The digerati have long drawn parallels between technology and biology. Due in part to the musings of Marshall McLuhan, the early “cyber elite” of the 1990s predicted that technology would form a collective consciousness similar to a central nervous system. In his book Out of Control, Wired founding editor Kevin Kelly wrote, "Life is the ultimate technology. As we improve our machines they will become more organic, more biological, more like life, because life is the best technology for living." While this sounds abstract, maybe even far-fetched, it foreshadowed a way of thinking about one of the biggest challenges we face today: information disorders.

"Infodemiology” is an emerging field of study at research universities, think tanks, and institutions like the 
World Health Organization. In academic terms, these research pioneers call for the "adaptation, development, validation, and evaluation of evidence-based measures and practices to prevent, detect, and respond to mis- and disinformation." In short, infodemiology is the science of managing infodemics. The field is an important marker of the significance information disorders have had on elections, geopolitics, pandemics, and beyond.

How do we diagnose information disorders? How do we trace their origin? How do we track them? How do we measure their severity? If we can’t understand information ecology the way doctors understand biology, we can’t properly diagnose what’s going on in the world. Viruses spread through networks much like they do through bodies, causing massive harm if not properly diagnosed and controlled. As real-world actions resulting from COVID-19 misinformation demonstrate, the impact can be as deadly.

The impact of information disorders extends beyond COVID-19 and elections around the world. Last week, India banned TikTok, Twitch suspended President Trump, Reddit banned the largest pro-Trump subreddit, Facebook banned a violent anti-government network, and YouTube banned a cluster of major channels. These decisions were made largely in response to hate speech or misinformation spreading like wildfire. Disorders have also materialized through conspiracy theories like 5G links to coronavirus, a rumored Confederate flag burning by anti-fascist groups, and Facebook groups perpetuating misinformation about protests for racial justice.

To demonstrate the severity of these issues, we focused our latest Master Class session on the topic. Claire Wardle, Executive Director of First Draft News, led students through a real-time simulation of reporting on a disinformation campaign. The class evaluated the specific tactics used, deciding when to ignore them, when to report them, and how to report them. The exercise opened students' eyes to how complicated the playbook really is. You'll find more examples from this group about the implications of information disorders for health, politics, and platforms in the links below.

Information disorders are not just a platform problem; they reflect a wider cultural shift and media challenge. Fair-use practices allow anyone to repurpose content and information, and technologies like deepfakes make the results hard to discern from the real thing. Recent platform decisions are small steps toward mitigating a different kind of virus wreaking havoc on society. It's on us to understand this trend and not unknowingly contribute to the problem.

Chris Perry

@cperry248

As always, if you find this newsletter valuable, we would be grateful if you encouraged others to sign up by directing them here.
What We're Reading
Media Intelligence
Instagram Will Be the New Front Line in the Misinformation Wars
By The Economist

"Some experts believe Instagram will be the next breeding ground for election misinformation, similar to criticisms of Facebook in 2016. Although efforts have been made to fact check posts on both platforms (especially amidst the COVID-19 pandemic), images and memes remain effective mediums for misinformation. Instagram combines the mindless ease of scrolling with the potential to flit between pretty photos and disinformation at a rapid pace. In this way, it almost lulls you into a deceptive sense of security, making it easy to believe in the verity of the posts." 
Submitted by Alyssa Wood, University of Edinburgh 

Media Intelligence
I Faked My Instagram for a Week and The World Believed It...
By ChrisMD

"We like to think that reliable sources are the secret to avoiding misinformation, but trusted news sites have repeatedly shown themselves to be easy to dupe. A recent example of this can be found in British YouTuber ChrisMD’s week-long faking experiment 'to demonstrate how you shouldn’t believe everything you see on the internet.' One particular video, featuring Chris Dixon’s cat ‘Meownuel Neuer’ has more than 12.5 million views on his Twitter account and was picked up by news outlets around the world, notably the BBC and ESPN. Incredibly, when asked for the rights to redistribute his content, Dixon made clear to them that the footage was faked, but they shared it regardless, with him even turning down an invitation to appear on BBC News."
Submitted by Hugo Ninham, The University of Warwick

Media Intelligence
An Army of Volunteers Is Taking On Vaccine Disinformation Online
By Megan Molteni, Wired

"Public Good Projects is using media intelligence and native social media interactions to combat disinformation in the public health sector on topics such as mental health, reproductive health, and conspiracy claims surrounding COVID-19 vaccines. Its current marketing campaign, called Stronger, 'aims to take the fight to anti-vaccine organizers where they’ve long had the upper hand: on social media.' The approach focuses on social media tactics that engage users in familiar ways, showing love in the comments section, and incorporating influencers, so as to strategically direct conversations toward fact, and to avoid any potential 'backfire effect.'"
Submitted by Charlotte Barkdull, University of Virginia

Media Intelligence
Content Farms and YouTube Algorithms: Match Made in Heaven or Partners in Crime?
By Quynh (Stephanie) Bui, Medium Magazine

"DIY videos frequently pop up on Facebook timelines and the YouTube home screen due to high engagement. Ann Reardon, a food scientist and dietician, debunks the information featured in many of these kinds of videos from creators such as So Yummy and 5-Minute Crafts. The issue with these DIYs? Many are full-on disinformation creations, with deceitful video shots and recipes designed by entities looking for views and money. Many of these videos are titled "...for kids," but contain activities involving alcohol, fire, and even harmful chemicals. While not all DIY videos include poorly derived content, many are in need of fact and safety checks."
Submitted by Meg Grice, Iowa State University

Media Intelligence
Twitter’s COVID-19 Misinformation Filter Was So Broken It Became a Meme
By Edward Ongweso Jr, VICE

"This VICE article sheds light on Twitter’s COVID-19 misinformation filter. While it was meant to reduce the spread of tweets related to the 5G cellphone tower COVID-19 conspiracy theory, it ended up tagging nearly every tweet mentioning “5G” or “oxygen” with a misinformation warning. Twitter spokesperson Liz Kelly explains that Twitter’s algorithm prioritizes over-labeling to err on the side of caution and reduce harm, but in this instance, so much caution was taken it was turned into a viral joke by Twitter users."
Submitted by Sarah Jones, College of William and Mary

Media Intelligence
When Your Family Spreads Misinformation
By Khushbu Shah, The Atlantic

"Family group chats present a vulnerable platform through which false information is spread. Psychologists have found that individuals are more likely to believe information if it is shared by a family member or friend. When people share information with the purpose of comfort, they are less likely to prioritize whether the information is actually factual. The recipients do not just assess the quality of the information, but they also assess their social and cultural similarities to the sender. Even if the recipient realizes the information is incorrect, studies show that family members are not very willing to challenge the misinformation." 
Submitted by Samantha Patterson, Baruch College

Media Intelligence
Why Is Blackfishing So Controversial?
By The Week

"Blackfishing can be described as an act commonly perpetrated by white women to appear of African or Arab ancestry and some even believe it’s the modern-day version of blackface. This can be anything from using makeup, hair products or even undergoing surgery to change their appearance and pass as Black. Many female celebrities have pursued the 'look' in recent years. This visual misinformation has been an avenue for white women to capitalize on an entire culture and identity with many of them receiving monetary or social gain because of it."
Submitted by Lauren Tritch, University of Nebraska-Lincoln

Emerging Technology
Twitter Testing Tool to Prompt Users to Open Article Before Retweeting
By Chris Mills Rodrigo, The Hill
Twitter is testing a new tool that prompts users to open articles before retweeting them. This is the latest in a series of tests Twitter has run aimed at improving discourse on the platform and limiting misinformation.
Deep Take
How You Can Help Transform the Internet into a Place of Trust
By Claire Wardle
How can we stop the spread of misleading, sometimes dangerous content while maintaining an internet with freedom of expression at its core? First Draft News Executive Director Claire Wardle explores the new challenges of our polluted online environment and maps out a plan to transform the internet into a place of trust — with the help of everyday users. "Together, let's rebuild our information commons," she says. As she expressed to our students last week, whether it's online, in our communities, or within our own families, we all have a role to play in fighting misinformation. 
Copyright © 2020 Weber Shandwick


