Regulating Speech In Cyberspace: Dr. Emily Laidlaw on Corporate Social Responsibility

From Facebook Groups dedicated to rape jokes to death threats on Twitter, the Internet can seem like a free speech free-for-all. Anyone can say anything, because who is going to stop them? In her presentation, Regulating Speech in Cyberspace, University of Calgary professor Dr. Emily Laidlaw answers that question.

She first examined the current state of affairs on the Internet. Key corporate players in the technology industry, like Google and Facebook, hold a significant amount of power over what can and can't be said. They may not control every shadowy back alley of the Internet, but they do control the platforms that most users rely on to find and read content. That gives them control not only over what is written on their own sites, but also over whether you can or will see what is written elsewhere. They are the gatekeepers to our participation online.

In particular, Laidlaw highlighted a lack of accountability, predictability and transparency in the current handling of online speech issues. Despite the great power they wield, technology companies are understandably reluctant to face greater government regulation. They flourished in an era of light restrictions and do not welcome rigid, unresponsive policies imposed from outside. But leaving everything to self-governance has problems of its own.

In her research, she found that the technology industry has largely avoided both government regulation and many of the broader working groups addressing free speech, privacy and other governance issues. Tech companies prefer to band together with other tech companies to create their own industry working groups. With few exceptions, those groups are voluntary: corporations are not required to join them, nor are they required to comply with any recommendations the groups issue. The result is that these corporations are not actually accountable to any outside body for their actions or inaction.

Laidlaw also criticized the way tech companies address complaints. When users file complaints, or have complaints filed against them, there is often little information available on the progress of an issue or its potential outcome. Decisions are usually final and often opaque, leaving no opportunity for appeal. Without information on how past complaints have been handled, there is little room for users to learn from the process; at best, they can comb through the densely written, ever-changing terms of service and privacy policies to figure out what is and isn't appropriate on a given platform.
Even aggregate information is rarely made available to the public or to researchers, so there is no data to study for trends or to spot problems in how complaints are handled. The issues that do get addressed may be subject to the whims of publicity or corporate public relations goals, rather than any systematic understanding of what is appropriate or important to act on.

She offered a few examples of what these corporate practices look like in action:

First, she pointed to past instances in which Facebook hosted groups dedicated to sharing rape jokes. People reported those groups to Facebook and asked that they be removed as hate speech. But is a rape joke hate speech? Or is it simply a joke? Facebook's initial stance was that these groups did not violate any of their terms of service, and they were allowed to stay. Only after much greater public protest did Facebook reverse that decision, remove the groups and update their terms of service to disallow that type of group. The new policy shifts how these specific complaints are dealt with, but it does not alter the overall process, which remains complaint-based and opaque. You can file a complaint, but you have no path to appeal Facebook's decision.

Another of her examples concerned the Internet Watch Foundation (IWF), a group created by the UK ISP industry to respond to concerns about child sexual abuse photos. The group was created under threat of government regulation: if the industry did not self-regulate to the satisfaction of the government, it would be regulated externally. The IWF responds to complaints about child sexual abuse photos and manages a block list that UK ISPs use to ensure they do not allow access to known repositories. While the IWF was created as a self-governance mechanism, it has limited oversight. Remedies for incorrectly blocked sites are minimal, and there is no notification that a site has even been blocked. She also raised concerns that the tools the IWF has developed for addressing child sexual abuse photos could be expanded into other areas, like copyright enforcement, where the lines between what is acceptable and what is criminal or infringing are less clearly drawn.

In place of these ad hoc, inadequate mechanisms, Laidlaw proposes a new model of corporate social responsibility and governance. Under this model, governments would mandate that corporations have policies and procedures for handling free speech and human rights issues, including complaint mechanisms that are transparent, predictable and accountable. The government regulations would stop short of specifying what those processes must actually look like, leaving it to individual corporations to develop procedures that meet both their internal requirements and the government's.

Without a specific procedure dictated by the government, corporations would need additional support in implementing these requirements. Laidlaw suggests that support could range from model policies to assist with drafting, to audit tools that would let corporations verify they are properly executing their procedures. With greater transparency built into the new processes, additional data would become available for research and education. Researchers could assess various policies to determine best practices and hold corporations accountable to their own procedures. The information could also be used to educate the public, raising expectations about how corporations should act when faced with complaints.

Given the tech industry's history of pushback against outside regulation, corporate buy-in is key to the success of any such initiative. Laidlaw's model may well offer enough flexibility that corporations will accept it, without so much flexibility that they can continue to escape responsibility altogether.


Jacquilynne Schlesier is an IPilogue Editor and a JD candidate at Osgoode Hall Law School.