Misinformation spread via social media is rampant and can have disastrous results. In 2013, for example, a single false tweet briefly wiped out almost $140 billion in U.S. stock market value, sending traders into chaos until the truth was discovered (Aral, 2020). If one tweet can hold that power, imagine the influence of a more organized campaign.


     The spread of misinformation is back in the news with recent overdoses of the drug ivermectin, an antiparasitic used on horses and cows, which many falsely believe can prevent or cure COVID-19. The WHO, the CDC, and Merck, the company that produces ivermectin, have all warned against its use. “You are not a horse,” the Food and Drug Administration felt compelled to declare in a tweet. “You are not a cow. Seriously, y’all. Stop it.” Ivermectin is not an antiviral, the agency says, noting reports of more people landing in the hospital from ivermectin overdoses. (Your typical human, it seems, shouldn’t take the same dose of a treatment as a 1,000-pound horse.) (Mukherjee, 2021) Yet these warnings seem to have little effect. Lisa VanNatta, a 61-year-old Texas rancher, maintained that the animal medicine is safe in the doses she said she has been taking. VanNatta said she is still avoiding a coronavirus vaccine, citing videos circulating on Facebook in which a veterinarian claims that mass vaccination will create dangerous virus variants (Gowen & Knowles, 2021). When you consider that 88,000 ivermectin prescriptions were written last week, compared with a typical weekly total of 3,300, it becomes clear that we need to be concerned about the effects that manipulated scientific information, spread through social media, can have on our behavior, both individually and as a society (Vijaykumar, 2019). Echo chambers, in which this information gains credence, are a particular concern.


      To better understand how echo chambers emerge, Kellogg professor Brian Uzzi teamed up with a group of researchers to examine the activity of 12 million Facebook and YouTube users. The users started out interacting with both conspiracy and science videos. “We thought they would continue to look at a variety of content. But it turned out they quickly became very selective,” Uzzi says. Even people who start out holding two points of view will most likely be in a decided echo chamber by the time they have made their fiftieth comment, share, or “like” (Ford, 2017). Further, while it might seem easy to debunk false information, studies have shown the opposite: polarized social-media users do not engage with views that contradict their beliefs. In fact, debunking attempts can cause them to cling to those beliefs more fiercely. In an Italian study, researchers examined 9 million Facebook users drawn to conspiracy-type thinking. When shown a debunking post, only 100,000 commented on it, and most of those who did interact reacted by increasing their engagement with conspiracy-related pages (Ford, 2017). Once polarized, users tend to become even more polarized; inside an echo chamber, the thing that makes people’s thinking evolve is the even more extreme point of view (Quattrociocchi, 2016). “This problem is complicated further by the personalization algorithms underlying social media. These tend to feed us content consistent with our beliefs and clicking patterns, helping to strengthen the acceptance of misinformation,” notes Santosh Vijaykumar, a senior researcher in digital health at Northumbria University. Further, he states, “rapid advances in digital technologies will also ensure that misinformation arrives in unexpected formats and with varying levels of sophistication” as technology continues to evolve (Vijaykumar, 2019).
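The feedback loop Vijaykumar describes can be illustrated with a toy simulation. This is an invented sketch, not any platform's actual ranking code: the two viewpoint labels, the starting click counts, and the 10 percent exploration rate are all assumptions made purely for illustration. The point is only that ranking content by past engagement, and letting each click feed back into the ranking, tends to converge on a one-sided feed.

```python
import random

def simulate_feed(rounds=50, seed=42):
    """Toy model of engagement-based personalization.

    Content items carry one of two viewpoint labels, 'A' or 'B'.
    Each round the feed shows the label the user has clicked most
    (with a small chance of showing something else), and each click
    further skews the click history -- a self-reinforcing loop.
    """
    rng = random.Random(seed)
    clicks = {"A": 3, "B": 1}  # the user arrives with a mild preference
    shown = []
    for _ in range(rounds):
        if rng.random() < 0.1:
            # Occasional "exploration": show a random viewpoint.
            item = rng.choice(["A", "B"])
        else:
            # Rank by past engagement: show the most-clicked viewpoint.
            item = max(clicks, key=clicks.get)
        shown.append(item)
        # The user is more likely to click content matching the viewpoint
        # they already engage with, which reinforces the ranking signal.
        p_click = clicks[item] / (clicks["A"] + clicks["B"])
        if rng.random() < p_click:
            clicks[item] += 1
    return shown

feed = simulate_feed()
dominant = max(set(feed), key=feed.count)
print(dominant, feed.count(dominant) / len(feed))
```

Even with a nearly balanced start, the dominant viewpoint quickly crowds out the other, which mirrors the finding above that users are typically inside an echo chamber within about fifty interactions.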


       “The platforms profit from misinformation because the more outrageous the content, the more people interact with it – this type of ‘engagement’ is what the platforms are looking for; people reacting to things. It doesn’t matter if it’s true or false as long as they engage,” stated Roger Entner, technology analyst at Recon Analytics. News editors in traditional media check, double-check, and triple-check facts before broadcasting or publishing. Social media has no such checks and balances: misinformation gets clicks, and there are no repercussions. “Everything gets sacrificed on the altar of monetization through engagement” (Suciu, 2021). Karen Kornbluh, director of GMF Digital, stated that while tamping down on misinformation runs against the economic incentives of social media firms, it must be done because it is infecting our discourse and affecting the long-term health of our democracy (Alba, 2020). Surgeon General Vivek Murthy agrees; he recently declared vaccine misinformation spread through social media a threat to public health. Further, the White House’s communications director, Kate Bedingfield, stated that social media giants should be held accountable for publishing this misinformation, and that the Biden administration is reviewing policies that could include amending Section 230 of the Communications Decency Act. Acting on the findings of a detailed study, the Biden administration has asked Facebook to suspend the accounts of a group of 12 people responsible for sharing 65 percent of all anti-vaccine messaging on social media. But there will be a backlash: experts in digital communications note that these 12 individuals will most likely be heralded as martyrs of an influential movement, and that targeting them will ignite questions of free speech.



       “Governments blame the platforms for turning a blind eye to the weaponization of their technology. And the people blame their governments and the platforms for inaction. But the truth is, we’ve all been asleep at the switch,” writes Sinan Aral, director of the MIT Initiative on the Digital Economy. “We must step out of our tendency to armchair-theorize about how social media affects us and develop a rigorous scientific understanding of how it works” (Aral, 2020). We need to teach digital literacy across the board, beginning at the elementary school level, so that people develop the skills to distinguish real news from fraudulent news and to understand how algorithms work. People should also be educated about tools like FlipFeed, a plug-in that allows users to replace their own Twitter feeds with those of random, ideologically different strangers, and Escape Your Bubble, a plug-in that inserts opposing political views into users’ Facebook newsfeeds. “Social media could deliver an incredible wave of productivity, innovation, social welfare, democratization, equality, health, positivity, unity, and progress. At the same time, it can and, if left unchecked, will deliver death blows to our democracies, our economies, and our public health. Today we are at a crossroads of these realities” (Aral, 2020). We need to take an informed step in the right direction.


Alba, D. (2020, October 12). On Facebook, misinformation is more popular now than in 2016. The New York Times.

Aral, S. (2020, September 15). The promise and peril of the hype machine. MIT Sloan.

Gowen, A., & Knowles, H. (2021, September 2). Doctors dismayed by patients who fear coronavirus vaccines but clamor for unproven ivermectin. The Washington Post.

Mukherjee, S. (2021, September 3). The psychology behind why people will take horse paste, but not COVID vaccines. Fortune.

Suciu, P. (2020, April 7). During COVID-19 pandemic it isn’t just fake news but seriously bad misinformation that is spreading on social media. Forbes.

Suciu, P. (2021, August 2). Spotting misinformation on social media is increasingly challenging. Forbes.

Vijaykumar, S. (2019, August 7). How pseudoscience is taking over social media and putting us all at risk. The Independent.