Elon University

Breakout Panel Session – Content and Conduct: Countering Violent Extremism and Promoting Human Rights Online

Brief session description:

Thursday, July 14, 2016 – In the wake of the uptick in terrorist attacks globally, countering violent extremism (CVE) has become a major policy agenda domestically and internationally. The UN has called on each country to develop its own national strategy. U.S. President Barack Obama convened an international CVE summit last year. The role of the Internet in radicalization and recruitment has been front and center in most discussions. Governments have a legitimate interest in combatting terrorism, but there must be a discussion about the impact of the CVE agenda on human rights online. The debate over countering violent extremism poses critical questions about how to balance human rights and legitimate national security interests online. This session was set up to focus on: how measures being taken to tackle online extremism and those being considered might impact human rights online; potential impacts on marginalized and at-risk communities, journalists and activists; and whether those measures are compatible with human rights standards. You can read a written account and view video highlights on this page or watch full archived video here.

Details of the session:

The session was moderated by Courtney Radsch, advocacy director at the Committee to Protect Journalists. Panelists included:

  • Yolanda Rondon, staff attorney, American-Arab Anti-Discrimination Committee (ADC)
  • Matt Mitchell, Black Lives Matter, a former data journalist who organizes Harlem Cryptoparty
  • Carl Schonander, senior director for international public policy, Software & Information Industry Association
  • JD Maddox, director of analytics, Global Engagement Center, U.S. Department of State

Panelists debated the level of government intervention necessary to monitor potentially violent content online and the ongoing implications for human rights. They said that both globally and domestically there is a tension between monitoring online content and allowing the Internet to remain free and open for expression.

“We can’t just focus on ISIS,” said panel moderator Courtney Radsch, advocacy director at the Committee to Protect Journalists.

Radsch argued that finding a “balance” between surveillance and free expression is the wrong approach, saying there are already established human rights principles that must be continuously considered.

“We need to make sure we vigorously defend the First Amendment right of freedom of speech,” said Yolanda Rondon, a staff attorney for the American-Arab Anti-Discrimination Committee.

A self-described advocate for a free and open Internet, Rondon said she fears that with too much restriction people will self-censor and no longer want to express themselves, especially if they feel what they have to say is in opposition to the norm.

“The best way to fight hate speech is more speech, better speech,” she said.

How do you define extremist speech and what is its impact?

In tackling the complex issue of free speech and security, moderator Radsch acknowledged that a major part of the problem is the lack of a clear definition of extremism. Because the issue is highly politicized, the conversation too often becomes limited to talking only about ISIS.

“Do you expand the definition or do you try to limit it?” Radsch asked.

It was pointed out that violent extremism in the United States encompasses more than just the recent series of suspected ISIS attacks.

Rondon said that 250 Americans in the U.S. have been targeted by ISIS since 2012.

JD Maddox, director of analytics at the Global Engagement Center, noted that over the last two years there has been a 45 percent decline in pro-ISIS tweets.

Maddox attributed the decline in large part to the dissemination of government information and to monitoring techniques directed against online propaganda. Rondon, however, questioned whether it is necessary to focus government resources on terrorism when unresolved homicide rates continue to rise in the nation.

Though there is no single clear answer as to why the decline occurred, Maddox suspects that voluntary measures taken by social media companies, coupled with legal mechanisms and concrete data, are all important facets to consider.

Online companies can take measured action against bad actors

Carl Schonander, senior director for international public policy at the Software & Information Industry Association, said he believes all companies should be transparent about their policies on hate speech and the incitement of violence. He also said a perfect or uniform solution is not likely. “This is an incredibly difficult area to work in,” he noted. “Beyond just saying we need to have a conversation, we need to have good data and annual production of data to find out where we need to go.”

Section 230 of the Communications Decency Act of 1996 protects online companies, including social media outlets, from bearing legal responsibility for what their users post. However, the companies have the right to remove content they deem offensive to their users.

For instance, Facebook has a zero-tolerance policy regarding ISIS accounts, and, according to the Brookings Institution, Twitter has removed thousands of accounts related to extremist groups.

Online content vs. behavior

The discussion turned to the lack of data suggesting any correlation between a person’s posting or viewing of online content and his or her behavior.

Panelists agreed that the main drivers that lead individuals to extremism do not generally include targeted social media campaigns. Those most likely to join groups like ISIS generally have pre-existing personal issues that lead them to view and/or post such materials and possibly engage in extremist acts.

“Views and expression do not equal violent acts,” Radsch said.

Matt Mitchell of Black Lives Matter and CryptoHarlem concurred, questioning the necessity for government intervention at all.

“We are best when we have diversity of thought,” he said. “We are not a country of blocking and censorship. We have a right to express our views, to assemble, whether that’s at home or in person.”

The same freedoms are not afforded worldwide, and many countries use the global Countering Violent Extremism (CVE) movement to rationalize content censorship. In many countries, exercising free speech is dangerous, and journalists and bloggers endure extreme risks to deliver information. Radsch noted the challenge of coming up with an alternative approach to simply shutting these accounts down.

Given the lack of correlation and concrete data, there are challenges on all sides in approaching violent extremism online.

In her final remarks, Radsch pointed out that the issue is multifaceted and complex and approaches to it differ across the world. It touches on the Internet governance issues of freedom of expression online, freedom of association online and privacy.

– By Mackenzie Dunn


The multimedia reporting team for Imagining the Internet at IGF-USA 2016 included the following Elon University School of Communications students, staff and faculty:

Bryan Anderson, Janna Anderson, Bryan Baker, Elizabeth Bilka, Ashley Bohle, Courtney Campbell, Colin Donohue, Melissa Douglas, Mackenzie Dunn, Maya Eaglin, Christina Elias, Rachel Ellis, Caroline Hartshorn, Paul LeBlanc, Emmanuel Morgan, Joey Nappa, Diego Pineda Davila, Alyssa Potter, Kailey Tracy, Andrew Steinitz, Anna Zwingelberg