SAN FRANCISCO – A recent lawsuit filed in the U.S. District Court for the Northern District of California against Google, Facebook and Twitter raises questions about where the limits of freedom of speech lie online.
Reynaldo Gonzalez, father of Nohemi Gonzalez – a California State University, Long Beach student who died in the Paris terrorist attacks – is suing the three social media platforms, alleging they "knowingly permitted" the Islamic State group to recruit, raise money and spread propaganda via their services.
Currently, under the U.S. Communications Decency Act, internet companies are generally immune from liability for what their users post. Though legal reform may seem like a solution, it is important to keep that immunity in place, Aaron Mackey, a Frank Stanton Legal Fellow at the Electronic Frontier Foundation, told the Northern California Record.
“If, at any time, those platforms had to be careful about their liability, they would limit a lot of what is allowed on their platforms," Mackey said. "What a new law would do is jeopardize the robust free speech and usefulness of these platforms.”
Gonzalez’s lawsuit is not the first of its kind that social media has seen in recent years.
In a similar case, Fields v. Twitter, a widow sued Twitter after her husband died in a terrorist attack in Jordan, claiming the site gave a voice to the Islamic State. Ultimately, Twitter won a motion to dismiss the lawsuit on causation grounds, as the complaint did not show how Twitter's action or inaction led to the violence that occurred.
“I feel awful about what happened to Mrs. Fields and the father of this girl, but there is danger in holding online platforms liable for terrorists’ activities,” Mackey said.
Causation is often the biggest hurdle in cases like Fields' and Gonzalez's, as it is nearly impossible to show that a social media platform could have changed the outcome.
Beyond encroaching on freedom of speech, requiring platforms to monitor for any suspicious activity would also be detrimental to political discourse and to the public's access to news.
"The worry is that filtering posts on platforms, through filters for mentions or things that look like terrorist speech, will also filter out a bunch of important political speech," Mackey said. "Terrorism is one of the most interesting and newsworthy topics of our time. If we filter out any terrorist speech, we lose all political debate and discussion on the topic, including researchers and reporters losing access to it."
Although the companies are not filtering their content wholesale, Facebook and YouTube have since begun blocking extremist videos.