Media literacy expert teaches students how to detect fake news

Author Julie Smith of Webster University argued that the amount, design and message of fake information contribute to its proliferation.

by Nicole Hawley ’21

Fake news has become increasingly prevalent in today’s digital world – not only as an accusation, but as an actuality.

In a Nov. 7 speech titled “Click here, Comrade: Misinformation in a Global World,” media literacy expert Julie Smith discussed the dangers of fake content and how to detect false information among the influx of media that Americans consume daily.

She delivered her remarks at Elon University in a talk sponsored by the Oaks Neighborhood, the School of Communications, the Student Government Association, the Society of Professional Journalists and Elon News Network. Her visit was arranged by Colin Donohue, director of student media, instructor of communications and faculty director of the Oaks Neighborhood, and Trianne Smith, community director of the Oaks.

Smith, an instructor of communications at Webster University, shared a staggering statistic: While 80 percent of people believe the media affect society, only 12 percent think the media affect them personally. That disconnect, she said, is why she trains her students to “evaluate and critically assess the media we create and consume.”

She used a clip from the Nov. 6 episode of “Jimmy Kimmel Live!” to highlight her point. In it, someone from Kimmel’s show interviewed pedestrians about well-known musician Kid Rock’s election to the U.S. Senate. One problem: Kid Rock was not elected. He didn’t even run.

But all the people shown in the video believed the news, and one woman even acted as though she had already heard it. Smith said people too frequently fail to question and dig deeper into information.

“The media aren’t bad,” said Smith, the author of “Master the Media: How Teaching Media Literacy Can Save Our Plugged-in World.” “But if we’re spending this much time with something, we should be talking about it more than we do.”

Generally, Smith believes that the fake news dilemma boils down to a lack of critical thinking.

She detailed two systems of thinking. System 1 is fast and consists of initial perceptions, while System 2 is slower and requires deliberate thought and research, she said. System 1 thinking most frequently leads to the sharing and circulation of fake news and misinformation.

It’s also why information labeled as “breaking news” automatically catches people’s attention. According to a study at the University at Buffalo, only 5 to 9 percent of people seek to confirm the information in breaking news tweets.

“If they don’t yell breaking news, we probably won’t pay attention to it,” Smith said.

Performative sharing is also to blame: People instantly share what they perceive as significant information, without checking the facts, so that they appear up-to-date and educated.

“A lot of people will share news-type information without checking because they want to appear like they’re the first,” Smith said.

But identifying fake news can be difficult because of its sheer volume and the ease with which it can be created.

“Here’s the thing: The amount of information makes it next to impossible to figure out what’s real and what’s not,” Smith said. “The design of information makes it next to impossible to figure out what’s real and what’s not. And the message itself — if we agree with it — typically means that we’re not going to check it for authenticity.

“Anything that’s visually compelling will go viral regardless.”

Take the “liquid mountaineers,” for example: a group of men pretending to be experts in running on water. The video that illustrated their athletic training featured slick production and editing.

“We’ve got slo-mo, we’ve got failure, we’ve got talking about equipment, we’ve got hot guys with accents – don’t discount the accents, they work,” Smith said. “It’s shot in the style of an extreme sports documentary.”

But the message is what speeds up dissemination. In an increasingly polarized political climate, people are quick to trust information they agree with or that fits a narrative.

“If we agree with it, we aren’t going to check its authenticity,” Smith said. “We share what we want to believe.”

And so enter the Russians, who orchestrated the spread of false information via social media during the 2016 presidential election. A website called Hamilton 68 tracks the social media habits of 600 Russian Twitter accounts. Most of them are bot accounts that share common hashtags such as #maga, #votered, #magarally and #trumptrain. And they are meant to rile up Americans.

“They tweet inaccurate things to stir us up,” Smith said.

In fact, most of the bots are programmed to retweet each other based on their shared hashtags and common topics. How can you tell if an account you’re looking at is actually a bot? Ask yourself these questions: Does it go viral despite having few followers? Does it fail to conjugate English verbs properly? Does it misuse infinitives? If the answer to any of those questions is yes, the account is likely a bot.

But there are other red flags that may indicate that information is misleading or fabricated.

For instance, Smith showed a picture depicting President Obama serving food to people in a buffet-style line. The user who shared the picture said the former president was serving the victims of Hurricane Harvey, a storm that hit Texas in the summer of 2017. But both Obama and the person across from him were wearing sweaters – a telltale sign that the picture was not taken during a Texas summer. The picture was actually from Thanksgiving a few years prior.

Smith shared several websites people can use to fact-check the news and media content they are consuming, including Snopes.com, Emergent.info and FactCheck.org.

She also strongly encouraged the use of Google reverse image search, which she believes is the easiest and clearest way to detect misinformation and prevent it from circulating. “If you remember one thing from tonight, this is it,” Smith said.

But, she said, the best resource is the person viewing the content. Learn to be skeptical, to ask questions and to realize when something just doesn’t look quite right.

“You want to have your B.S. detectors turned on all the time,” Smith said.