Online Content Regulation
Charting a Way Forward

Monika Bickert
VP, Content Policy
February 2020
Table of Contents

I. A New Governance Challenge
II. Four Key Questions
   Question 1: How can content regulation best achieve the goal of reducing harmful speech while preserving free expression?
   Question 2: How should regulation enhance the accountability of internet platforms to the public?
   Question 3: Should regulation require internet companies to meet certain performance targets?
   Question 4: Should regulation define which “harmful content” should be prohibited on internet platforms?
III. Principles for Future Regulators
IV. The Beginning of a Conversation
End Notes
01 | A New Governance Challenge
Billions of people have come online in the past decade, gaining unprecedented
access to information and the ability to share their ideas and opinions with a wide
audience. Global connectivity has improved economies, grown businesses, reunited
families, raised money for charity, and helped bring about political change. At the
same time, the internet has also made it easier to find and spread content that could
contribute to harm, like terror propaganda. While this type of speech is not new,
society’s increased reliance on the internet has shifted the power among those who
control and are affected by mass communication.
For centuries, political leaders, philosophers, and activists have wrestled with the
question of how and when governments should place limits on freedom of
expression to protect people from content that can contribute to harm. Increasingly,
privately run internet platforms are making these determinations, as more speech
flows through their systems. Consistent with human rights norms, internet platforms
generally respect the laws of the countries in which they operate, and they are
also free to establish their own rules about permissible expression, which are often
more restrictive than the law requires. As a result, internet companies make calls every day
that influence who has the ability to speak and what content can be shared on
their platforms.
Problems arise when people do not understand the decisions that are being made
or feel powerless when those decisions impact their own speech, behavior, or
experience. People may expect similar due process channels to those that they
enjoy elsewhere in modern society. For example, people in liberal democracies may
expect platforms to draft content standards with input from users and civil society,
and to offer an avenue for redress if they feel a decision was made in error. People
are able to register preferences through market signals by moving to other
platforms, but they may not be satisfied by this as their only option.
As a result, private internet platforms are facing increasing questions about how
accountable and responsible they are for the decisions they make. They hear from
users who want the companies to reduce abuse but not infringe upon freedom
of expression. They hear from governments, which want companies to remove not
only illegal content but also legal content that may contribute to harm, while also
ensuring that companies are unbiased in how they adopt and apply their rules. Faced
with concerns about bullying or child sexual exploitation or terrorist recruitment,
for instance, governments may find themselves doubting the effectiveness
of a company’s efforts to combat such abuse or, worse, unable to satisfactorily
determine what those efforts are.
This tension has led some governments to pursue regulation over the speech
allowed online and companies’ practices for content moderation. The approaches
under consideration depend on each country’s unique legal and historical traditions.
In the United States, for example, the First Amendment protects a citizen’s ability to
engage in dialogue online without government interference except in the narrowest
of circumstances. Citizens in other countries often have different expectations about
freedom of expression, and governments have different expectations about platform
accountability. Unfortunately, some of the laws passed so far do not always
strike the appropriate balance between speech and harm, unintentionally pushing
platforms to err too much on the side of removing content.
Facebook has therefore joined the call for new regulatory frameworks for online
content—frameworks that ensure companies are making decisions about online
speech in a way that minimizes harm but also respects the fundamental right
to free expression. This balance is necessary to protect the open internet, which
is increasingly threatened—even walled off—by some regimes.
Facebook wants to be a constructive partner to governments as they weigh the
most effective, democratic, and workable approaches to address online content
governance. As Mark Zuckerberg wrote in a recent op-ed:
It’s impossible to remove all harmful content from the Internet, but when
people use dozens of different sharing services—all with their own policies
and processes—we need a more standardized approach … Regulation
could set baselines for what’s prohibited and require companies to build
systems for keeping harmful content to a bare minimum.1
This paper explores possible regulatory structures for content governance
outside the United States and identifies questions that require further discussion.
It builds on recent developments on this topic, including legislation proposed or
passed into law by governments, as well as scholarship that explains the various
content governance approaches that have been adopted in the past and may be
taken in the future.2 Its overall goal is to help frame a path forward—taking into
consideration the views not only of policymakers and private companies, but also
of civil society and the people who use Facebook’s platform and services.
This debate will be central to shaping the character of the internet for decades
to come. If designed well, new frameworks for regulating harmful content can
contribute to the internet’s continued success by articulating clear, predictable, and
balanced ways for government, companies, and civil society to share responsibilities
and work together. Designed poorly, these efforts may stifle expression, slow
innovation, and create the wrong incentives for platforms.
02 | Four Key Questions
Since the invention of the printing press, developments in communications
technology have always been met with calls for state action. In the case of internet
companies, however, it is uniquely challenging to develop regulation to ensure
accountability. There are four primary challenges.
1. Legal environments and speech norms vary.
Many internet companies have a global user base with a broad spectrum
of expectations about what expression should be permitted online and
how decision makers should be held accountable. The cross-border nature
of communication is also a defining feature of many internet platforms,
so companies generally maintain one set of global policies rather than
country-specific policies that would interfere with that experience.
2. Technology and speech are dynamic.
Internet services are varied and ever changing. Some are focused on
video or images, while some are primarily text. Some types of internet
communications are one-to-one and analogous to the private sphere,
while others are more like a town square or broadcasting—where thousands
or millions of people can access the content in question. Some services
are ephemeral, and some are permanent. Among these different interaction
types, norms for acceptable speech may vary. Just as people may say
things in the privacy of a family dinner conversation that they would not say
at a town hall meeting, online communities cultivate their own norms that
are enforced both formally and informally. All are constantly changing to
compete and succeed.
3. Enforcement will always be imperfect.
Given the dynamic nature and scale of online speech, the limits of
enforcement technology, and the different expectations that people have
over their privacy and their experiences online, internet companies’
enforcement of content standards will always be imperfect. Their ability
to review speech for policy or legal violations will always be limited. Even
in a hypothetical world where enforcement efforts could perfectly track
language trends and identify likely policy violations, companies would still
struggle to apply policies because they lack context that is often necessary,
such as whether an exchange between two people is in jest or in anger, or
whether graphic content is posted for sadistic pleasure or to call attention
to an atrocity.
4. Companies are intermediaries, not speakers.
Publisher liability laws that punish the publication of illegal speech are
unsuitable for the internet landscape. Despite their best efforts to thwart
the spread of harmful content, internet platforms are intermediaries for such
speech, not its speakers, and it would be impractical and harmful to require
them to approve each post before publication.
The ability of individuals to communicate without journalistic intervention
has been central to the internet’s growth and positive impact. Imposing
traditional publishing liability would seriously curtail that impact by limiting
the ability of individuals to speak. Companies seeking assurance that
their users’ speech is lawful would need to review and approve each post
before allowing it to appear on the site. Companies that could afford to
offer services under such a model would be incentivized to restrict service
to a limited set of users and err on the side of removing any speech close
to legal lines.
Legal experts have cautioned that holding internet companies liable for
the speech of their users could lead to the end of these services altogether.
As explained by Jennifer Granick, surveillance and cybersecurity counsel
at the American Civil Liberties Union, “There is no way you can have a
YouTube where somebody needs to watch every video. There is no way you
can have a Facebook if somebody needs to watch every post. There would
be no Google if someone had to check every search result.”3
Such liability
would stifle innovation as well as individuals’ freedom of expression. This
model’s flaws appear all the more clear in light of the significant efforts
many companies make to identify and remove harmful speech—efforts that
often require the protective shield of intermediary liability protection laws.
(Instead, as this paper explains, other models for regulating internet
platforms would better empower and encourage them to effectively address
harmful content.)
These challenges suggest that retrofitting the rules that regulate offline
speech for the online world may be insufficient. Instead, new frameworks are
needed. This paper will put forward four questions that underpin the debate
about those frameworks:
01. How can content regulation best achieve the goal of reducing harmful speech while preserving free expression?
02. How should regulation enhance the accountability of internet platforms?
03. Should regulation require internet companies to meet certain performance targets?
04. Should regulation define which “harmful content” should be prohibited on internet platforms?
Question 1: How can content regulation best achieve the goal of reducing harmful speech while preserving free expression?
While governments have different thresholds for limiting speech, the general
goal of most content regulation is to reduce harmful speech while preserving free
expression. Broadly speaking, regulation could aim to achieve this goal in three
ways: (1) holding internet companies accountable for having certain systems and
procedures in place, (2) requiring that companies meet specific performance targets
for content that violates their policies, or (3) requiring companies to restrict
specific forms of speech, even if the speech is not illegal. Questions 2, 3, and 4
examine these approaches in more detail.
Facebook generally takes the view that the first approach—requiring companies to
maintain certain systems and procedures—is the best way to ensure an appropriate
balancing of safety, freedom of expression, and other values. By requiring systems
such as user-friendly channels for reporting content or external oversight of
policies or enforcement decisions, and by requiring procedures such as periodic
public reporting of enforcement data, regulation could provide governments
and individuals the information they need to accurately judge social media
companies’ efforts.
Governments could also consider requiring companies to hit specific performance
targets, such as decreasing the prevalence of content that violates a site’s hate
speech policies, or maintaining a specified median response time to user or
government reports of policy violations. While such targets may have benefits, they
could also create perverse incentives for companies to find ways to decrease
enforcement burdens, perhaps by defining harmful speech more narrowly, making
it harder for people to report possible policy violations, or stopping efforts to
proactively identify violating content that has not been reported.
Finally, regulators could require that internet companies remove certain content—
beyond what is already illegal. Regulators would need to clearly define that content,
and the definitions would need to be different from the traditional legal definitions
that are applied through a judicial process where there is more time, more context,
and independent fact-finders.
These approaches are not mutually exclusive nor are they the only options.
However, they do represent some of the most common ideas that we have heard
in our conversations with policymakers, academics, civil society experts, and
regulators. The approach a government may pursue will depend on what sort of
platform behavior it wants to incentivize. Ultimately, the most important elements
of any system will be due regard for each of the human rights and values at stake,
as well as clarity and precision in the regulation.
Question 2: How should regulation enhance the accountability of internet platforms to the public?
People in democratic societies are accustomed to being able to hold their
governments accountable for their decisions. When internet companies make
decisions that have an impact on people’s daily lives, those people expect the same
accountability. This is why internet companies, in their conversations with people
using their services, face questions like, “Who decides your content standards?,”
“Why did you remove my post?,” and “What can I do to get my post reinstated when
you make a mistake?”
Regulation could help answer those questions by ensuring that internet content
moderation systems are consultative, transparent, and subject to meaningful
independent oversight. Specifically, procedural accountability regulations could
include, at a minimum, requirements that companies publish their content
standards, provide avenues for people to report to the company any content
that appears to violate the standards, respond to such user reports with a
decision, and provide notice to users when removing their content from the site.
Such regulations could require a certain level of performance in these areas to
receive certifications or to avoid regulatory consequences.
Regulation could also incentivize—or where appropriate, require—additional
measures such as:

- Insight into a company’s development of its content standards.
- A requirement to consult with stakeholders when making significant changes to standards.
- An avenue for users to provide their own input on content standards.
- A channel for users to appeal the company’s removal (or non-removal) decision on a specific piece of content to some higher authority within the company or some source of authority outside the company.
- Public reporting on policy enforcement (possibly including how much content was removed from the site and for what reasons, how much content was identified by the company through its own proactive means before users reported it, how often the content appears on its site, etc.); a simple sketch of such a report follows this list.
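As an illustration of that last item, here is a minimal sketch of the kind of aggregation a public transparency report might involve. The enforcement log format — the `Removal` record and its fields — is invented for illustration, not any platform’s actual schema:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Removal:
    policy_area: str   # e.g. "hate_speech", "terrorism" (hypothetical labels)
    proactive: bool    # True if detected by the company before any user report

def transparency_summary(removals: list[Removal]) -> dict:
    """Aggregate an enforcement log into report-ready figures."""
    by_policy = Counter(r.policy_area for r in removals)
    proactive_count = sum(1 for r in removals if r.proactive)
    return {
        "total_removals": len(removals),
        "removals_by_policy": dict(by_policy),
        "proactive_detection_rate": proactive_count / len(removals) if removals else 0.0,
    }

# Example: two removals found proactively, one via a user report.
log = [
    Removal("hate_speech", proactive=True),
    Removal("terrorism", proactive=True),
    Removal("hate_speech", proactive=False),
]
print(transparency_summary(log))
```

Even a summary this simple would let governments and the public compare removals by policy area and see what share of enforcement was proactive rather than report-driven.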
Regulations of this sort should take into account a company’s size and reach,
as content regulation should not serve as a barrier to entry for new competitors
in the market.
An important benefit of this regulatory approach is that it incentivizes an
appropriate balancing of competing interests, such as freedom of expression, safety,
and privacy. Required transparency measures ensure that the companies’ balancing
efforts are laid bare for the public and governments to see.
If they pursued this approach, regulators would not be starting from scratch; they
would be expanding and adding regulatory effect to previous efforts of governments,
civil society, and industry. For example, the Global Network Initiative Principles4
and other civil society efforts have helped create baselines for due process and
transparency linked to human rights principles. The GNI has also created an
incentive structure for compliance. Similarly, the European Union Code of Conduct
on Countering Illegal Hate Speech Online has created a framework that has led to
all eight signatory companies meeting specific requirements for transparency and
due process.
In adopting this approach, however, regulators must be cautious. Specific product
design mandates—“put this button here,” “use this wording,” etc.—will make it harder
for internet companies to test what options work best for users. They will also make
it harder for companies to tailor the reporting experience to the device being used:
the best way to report content while using a mobile phone is likely to differ from
the best way to report content while using a laptop. In fact, we have found that
certain requests from regulators to structure our reporting mechanisms in specific
ways have actually reduced the likelihood that people will report dangerous
content to us quickly.
Another key component of procedural accountability regulation is transparency.
Transparency, such as required public reporting of enforcement efforts, would
certainly subject internet companies to more public scrutiny, as their work would be
available for governments and the public to grade. This scrutiny could have both
positive and negative effects.
The potential negative effects could arise from the way transparency requirements
shape company incentives. If a company feared that people would judge its efforts
harshly and would therefore stop using or advertising on its service, the company
might be tempted to sacrifice performance in unmeasured areas to help boost performance in
measured areas. For instance, if regulation required the measuring and public
annual reporting of the average time a company took to respond to user reports of
violating content, a company might sacrifice quality in its review of user reports in
order to post better numbers. If regulation required public reporting of how many
times users appealed content removal decisions, a company might make its appeal
channel difficult to find or use.
On the other hand, companies taking a long-term view of their business incentives
will likely find that public reactions to their transparency efforts—even if painful
at times—can lead them to invest in ways that will improve public and government
confidence in the longer term. For example, spurred by multi-stakeholder initiatives
like GNI or the EU Code of Conduct mentioned above, Facebook has increased
efforts to build accountability into our content moderation process by focusing on
consultation, transparency, and oversight. In response to reactions to our reports,
we have sought additional input from experts and organizations.
Going forward, we will continue to work with governments, academics, and civil
society to consider what more meaningful accountability could look like in this
space. As companies get better at providing this kind of transparency, people and
governments will have more confidence that companies are improving their
performance, and people will be empowered to make choices about which platform
to use—choices that will ultimately determine which services will be successful.
Question 3: Should regulation require internet companies to meet certain performance targets?
Governments could also approach accountability by requiring that companies meet
specific targets regarding their content moderation systems. Such an approach
would hold companies accountable for the outcomes they achieve rather than
for the processes they use to achieve them.
For example, regulators could say that internet platforms must publish annual data
on the “prevalence” of content that violates their policies, and that companies must
make reasonable efforts to ensure that the prevalence of violating content remains
below some standard threshold. If prevalence rises above this threshold, then the
company might be subject to greater oversight, specific improvement plans, or—in
the case of repeated systematic failures—fines.
Regulators pursuing this approach will need to decide which metrics to prioritize.
That choice will determine the incentives for internet companies in how they
moderate content. The potential for perverse incentives under this model is greater
than under the procedural accountability model, because companies with
unsatisfactory performance reports could face direct penalties in the form of fines
or other legal consequences, and will therefore have a greater incentive to shortcut
unmeasured areas where doing so could boost performance in measured areas.
Governments considering this approach may also want to consider whether and
how to establish common definitions—or encourage companies to do so—to be able
to compare progress across companies and ensure companies do not tailor their
definitions in such a way as to game the metrics.
With that in mind, perhaps the most promising areas for exploring the development
of performance targets would be the prevalence of violating content (how much
violating content is actually viewed on a site) and the time-to-action (how long the
content exists on the site before removal or other intervention by the platform);
a simple sketch of how both metrics might be computed follows the two definitions below.
Prevalence: Generally speaking, some kinds of content are harmful only to
the extent they are ever actually seen by people. A regulator would likely
care more about stopping one incendiary hateful post that would be seen by
millions of people than twenty hateful posts that are seen only by the person
who posted them. For this reason, regulators (and platforms) will likely be
best served by a focus on reducing the prevalence of views of harmful
content. Regulators might consider, however, the incentives they might be
inadvertently creating for companies to more narrowly define harmful
content or to take a minimalist approach in prevalence studies.
Time-to-Action: By contrast, some types of content could be harmful even
without an audience. In the case of real-time streaming of child sexual
exploitation, coordination of a violent attack, or an expressed plan to harm
oneself, quick action is often necessary to save lives, and the weighing of
values may be so different as to justify an approach that incentivizes fast
decision-making even where it may lead to more incorrect content removals
(thus infringing on legitimate expression) or more aggressive detection
regimes (such as in the case of using artificial intelligence to detect threats
of self harm, thus potentially affecting privacy interests).
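To make the two metrics concrete, here is a minimal sketch, assuming a simplified and hypothetical data model — the `Post` record and its fields are invented, not any platform’s actual schema. Note that prevalence is computed over views rather than posts, reflecting the point above that a post seen by millions matters more than twenty seen by no one:

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median
from typing import Optional

@dataclass
class Post:
    views: int                       # how many times the post was seen
    violating: bool                  # whether it violated the platform's policies
    uploaded_at: datetime
    actioned_at: Optional[datetime]  # when it was removed; None if never actioned

def prevalence(posts: list[Post]) -> float:
    """Share of all content views that were views of violating content."""
    total_views = sum(p.views for p in posts)
    violating_views = sum(p.views for p in posts if p.violating)
    return violating_views / total_views if total_views else 0.0

def median_time_to_action_hours(posts: list[Post]) -> float:
    """Median hours between upload and enforcement, over actioned violating posts."""
    hours = [
        (p.actioned_at - p.uploaded_at).total_seconds() / 3600
        for p in posts
        if p.violating and p.actioned_at is not None
    ]
    return median(hours) if hours else float("nan")
```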
There are significant trade-offs regulators must consider when identifying metrics
and thresholds. For example, a requirement that companies “remove all hate speech
within 24 hours of receiving a report from a user or government” may incentivize
platforms to cease any proactive searches for such content, and to instead use
those resources to more quickly review user and government reports on a first-
in-first-out basis. In terms of preventing harm, this shift would have serious costs.
The biggest internet companies have developed technology that allows them to
detect certain types of content violations with much greater speed and accuracy
than human reporting. For instance, from July through September 2019, the vast
majority of content Facebook removed for violating its hate speech, self-harm,
child exploitation, graphic violence, and terrorism policies was detected by the
company’s technology before anyone reported it. A regulatory focus on response
to user or government reports must take into account the cost it would pose to
these company-led detection efforts.
Even setting aside proactive detection, a model that focuses on average time-to-
action of reported content would discourage companies from focusing on removal
of the most harmful content. Companies may be able to predict the harmfulness of
posts by assessing the likely reach of content (through distribution trends and likely
virality), assessing the likelihood that a reported post violates (through review with
artificial intelligence), or assessing the likely severity of reported content (such as if
the content was reported by a safety organization with a proven record of reporting
only serious violations). Put differently, companies focused on average speed of
assessment would end up prioritizing review of posts unlikely to violate or unlikely
to reach many viewers, simply because those posts are closer to the 24-hour
deadline, even while other posts are going viral and reaching millions.
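The contrast between deadline-driven review and harm-based review can be sketched as two ways of ordering the same queue. The fields and the harm formula below are invented for illustration; a real system would weigh these signals very differently:

```python
from dataclasses import dataclass

@dataclass
class Report:
    hours_until_deadline: float  # time remaining under a fixed 24-hour rule
    predicted_reach: int         # estimated future views (distribution trends, virality)
    p_violating: float           # model-estimated probability the post violates
    severity: float              # e.g. weighted up for trusted-flagger reports

def deadline_order(queue: list[Report]) -> list[Report]:
    """What a 24-hour response rule incentivizes: review whatever expires soonest."""
    return sorted(queue, key=lambda r: r.hours_until_deadline)

def expected_harm(r: Report) -> float:
    """Rough expected harm: likely audience x likelihood of violation x severity."""
    return r.predicted_reach * r.p_violating * r.severity

def harm_order(queue: list[Report]) -> list[Report]:
    """Review where expected harm is greatest, regardless of deadline."""
    return sorted(queue, key=expected_harm, reverse=True)
```

Under the deadline ordering, a low-risk post an hour from its deadline jumps ahead of a likely-violating post that is going viral; the harm ordering reverses that.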
A regulation requiring companies to “remove all hate speech within 24 hours
of upload” would create still more perverse incentives. Companies would have a
strong incentive to turn a blind eye to content that is older than 24 hours (and
unlikely to be seen by a government), even though that content could be causing
harm. Companies would be disincentivized from developing technology that can
identify violating private content on the site, and from conducting prevalence
studies of the existing content on their site.
The potential costs of this sort of regulation can be illustrated through a review of
Facebook’s improvements in countering terror propaganda. When Facebook first
began measuring its average time to remove terror propaganda, it had technical
means to automatically identify some terror propaganda at the time of upload.
After considerable time and effort, Facebook engineers developed better detection
tools, including a means for identifying propaganda that had been on the site for
years. Facebook used this tool to find and remove propaganda, some of which had
long existed on the site without having been reported by users or governments.
This effort meant that Facebook’s measured average time on site for removed terror
propaganda went up considerably. If Facebook had faced consequences for an
increase in its “time-to-action,” it would have been discouraged from ever creating
such a tool. A toy calculation below illustrates the effect.
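The numbers here are invented, but the arithmetic shows why the metric moves in the wrong direction precisely when enforcement improves:

```python
# 100 new posts each removed ~2 hours after upload, plus 5 old posts
# surfaced by a new backlog tool after ~3 years on the site.
recent = [2.0] * 100
backlog = [3 * 365 * 24.0] * 5

avg_before = sum(recent) / len(recent)
avg_after = sum(recent + backlog) / len(recent + backlog)

print(f"average time-to-action before the tool: {avg_before:.1f} hours")  # 2.0
print(f"average time-to-action after the tool:  {avg_after:.1f} hours")   # ~1253.3
```

The average worsens by two orders of magnitude because old, long-lived violations were finally found and removed — a harm reduction that the metric records as failure.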
Whichever metrics regulators decide to pursue, they would be well served
to first identify the values that are at stake (such as safety, freedom of
expression, and privacy) and the practices they wish to incentivize, and then
structure their regulations in a way that minimizes the risk of perverse
incentives. This would require close coordination among various regulators
and policymaking bodies focused on sometimes competing interests, such as
privacy, law enforcement, consumer protection, and child safety. A regulation
that serves the regulatory interests of one body well may in fact harm the
interests of another.
Question 4: Should regulation define which “harmful content” should be prohibited on internet platforms?
Many governments already have in place laws and systems to define illegal speech
and notify internet platforms when illegal speech is present on their services.
Governments are also considering whether to develop regulations that specifically
define the “harmful content” internet platforms should remove—above and beyond
speech that is already illegal.
International human rights instruments provide the best starting point for analysis
of government efforts to restrict speech. Article 19 of the International Covenant
on Civil and Political Rights (ICCPR)5 holds:
1. Everyone shall have the right to hold opinions without interference.
2. Everyone shall have the right to freedom of expression; this right shall
include freedom to seek, receive and impart information and ideas of
all kinds, regardless of frontiers, either orally, in writing or in print, in the
form of art, or through any other media of his choice.
3. The exercise of the rights provided for in paragraph 2 of this article carries
with it special duties and responsibilities. It may therefore be subject
to certain restrictions, but these shall only be such as are provided by law
and are necessary:
(a) For respect of the rights or reputations of others;
(b) For the protection of national security or of public order
(ordre public), or of public health or morals.
Similarly, Article 10 of the European Convention on Human Rights carves out room
for governments to pass speech laws that are necessary for:
…[T]he interests of national security, territorial integrity or public safety,
for the prevention of disorder or crime, for the protection of health or morals,
for the protection of the reputation or rights of others, for preventing
the disclosure of information received in confidence, or for maintaining
the authority and impartiality of the judiciary.6
In countries with strong democratic institutions, laws governing speech are passed
as part of a transparent process that weighs different societal interests, with
due respect for human rights and with the check of an impartial judiciary. Drawing
a clear line between legal and illegal speech helps citizens to know what is and
is not permissible. Judicial checks and balances help to reduce the likelihood of
capricious restrictions on legitimate speech.
Despite these international human rights norms, laws affecting speech and
even criminalizing it vary widely both by jurisdiction and by the type of harm
contemplated, even among liberal democratic nations.
Among types of content most likely to be outlawed, such as content supporting
terrorism and content related to the sexual exploitation of children, there is
significant variance in laws and enforcement. In some jurisdictions, for instance,
images of child sexual exploitation are lawful if they are computer generated. In
others, praise of terror groups is permitted, and lists of proscribed terror groups
vary widely by country and region. Laws vary even more in areas such as hate
speech, harassment, or misinformation. Many governments are grappling with how
to approach the spread of misinformation, but few have outlawed it. As the UN
and others have noted, the general criminalization of sharing misinformation would
be “incompatible with international standards for restrictions on freedom of
expression.”7
Thresholds also vary for defamatory speech and offensive material.
The variation among laws is significant because it underscores the difficulty of
weighing competing values like safety and expression. If governments do not agree
on how to balance these interests when writing laws, it is likely they will have very
different ideas of how to define online content standards, which require even more
specificity than laws in order to ensure consistent application.
Laws restricting speech are generally implemented by law enforcement officials
and the courts. Fact finders investigate and due process is afforded to the speaker.
The penalty for illegal speech can include fines or imprisonment.
Internet content moderation is fundamentally different. Companies use a
combination of technical systems and employees and often have only the text in
a post to guide their assessment. The assessment must be made quickly, as people
expect violations to be removed within hours or days rather than the months
that a judicial process might take. The penalty is generally removal of the post
or the person’s account.
Given these complexities, governments that seek to define for internet companies
what content they should allow on their platforms should seek to:
- Create a standard that can be enforced practically, at scale, with limited context about the speaker and content, without undermining the goal of promoting expression.
- Take account of different user preferences and expectations for different sorts of internet services. Different definitions may be appropriate based on the nature of the service (search engines versus social media) or the nature of the audience (private chat versus public post, ephemeral versus more permanent, etc.).
- Take account of individual controls, preferences, settings, etc. that a company may provide its users for some kinds of potentially harmful content (e.g., nudity).
- Provide flexibility so that platforms can adapt policies to emerging language trends and adversarial efforts to avoid enforcement. For instance, hateful speech, bullying, and threats of self-harm are often expressed through a lexicon of words that fall in and out of favor or evolve over time. Similarly, proscribed terror groups and hate groups may rename themselves or fracture into different groups.
03 | Principles for Future Regulators
In some countries, a regulatory system for online content may require a new type
of regulator. Addressing online communication goes beyond simply adopting the
core capabilities of traditional regulators and channeling them towards the task
of effective oversight. Instead, any regulator in this space will need proficiency in
data, operations, and online content. Governments will also need to ensure that
regulatory authorities pay due regard to innovation and protect users’ rights online,
including consideration of the following:
Incentives
Ensuring accountability in companies’ content moderation systems
and procedures will be the best way to create the incentives for
companies to responsibly balance values like safety, privacy, and
freedom of expression.
The Global Nature of the Internet
Any national regulatory approach to addressing harmful content should
respect the global scale of the internet and the value of cross-border
communications. It should aim to increase interoperability among regulators
and regulations. However, governments should not impose their standards
onto other countries’ citizens through the courts or any other means.
Freedom of Expression
In addition to complying with Article 19 of the ICCPR (and related
guidance), regulators should consider the impacts of their decisions
on freedom of expression.
Technology
Regulators should develop an understanding of the capabilities and
limitations of technology in content moderation and allow internet
companies the flexibility to innovate. An approach that works for one
particular platform or type of content may be less effective (or even
counterproductive) when applied elsewhere.
Proportionality and Necessity
Regulators should take into account the severity and prevalence of
the harmful content in question, its status in law, and the efforts already
underway to address the content.
04 | The Beginning of a Conversation
We hope that this paper contributes to the ongoing conversations among
policymakers and other stakeholders. These are challenging issues and sustainable
solutions will require collaboration. As we learn more from interacting with experts
and refining our approach to content moderation, our thinking about the best ways
to regulate will continue to evolve. We welcome feedback on the issues discussed
in this paper.
End Notes
1. Zuckerberg, Mark. “The Internet Needs New Rules. Let’s Start in
These Four Areas.” Washington Post, 30 March 2019.
2. Bowers, John, and Jonathan Zittrain. “Answering Impossible
Questions: Content Governance in an Age of Disinformation.”
Harvard Kennedy School Misinformation Review, 2020.
3. As quoted in: Reynolds, Matt. “The Strange Story of Section 230, the Obscure Law That Created Our Flawed, Broken Internet.” Wired, 28 March 2019, www.wired.co.uk/article/section-230-communications-decency-act.
4. “The GNI Principles.” Global Network Initiative, 10 January 2020.
https://globalnetworkinitiative.org/gni-principles/.
5. UN General Assembly, International Covenant on Civil and Political
Rights, 16 December 1966, United Nations, Treaty Series, vol. 999, p. 171.
6. Council of Europe, European Convention for the Protection of Human
Rights and Fundamental Freedoms, as amended by Protocols Nos.
11 and 14, 4 November 1950.
7. “Joint Declaration on Freedom of Expression and ‘Fake News’, Disinformation and Propaganda.” The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information, 3 March 2017. As noted in the Joint Declaration, those international standards are: “that they be provided for by law, serve one of the legitimate interests recognised under international law, and be necessary and proportionate to protect that interest.”

Pharma Marketing: Get Started on Creating Great Customer Experiences with Jou...
 
White Paper: successful with marketing automation
White Paper: successful with marketing automationWhite Paper: successful with marketing automation
White Paper: successful with marketing automation
 

Dernier

Introducing the Analogic framework for business planning applications
Introducing the Analogic framework for business planning applicationsIntroducing the Analogic framework for business planning applications
Introducing the Analogic framework for business planning applicationsKnowledgeSeed
 
Planetary and Vedic Yagyas Bring Positive Impacts in Life
Planetary and Vedic Yagyas Bring Positive Impacts in LifePlanetary and Vedic Yagyas Bring Positive Impacts in Life
Planetary and Vedic Yagyas Bring Positive Impacts in LifeBhavana Pujan Kendra
 
Darshan Hiranandani [News About Next CEO].pdf
Darshan Hiranandani [News About Next CEO].pdfDarshan Hiranandani [News About Next CEO].pdf
Darshan Hiranandani [News About Next CEO].pdfShashank Mehta
 
Cybersecurity Awareness Training Presentation v2024.03
Cybersecurity Awareness Training Presentation v2024.03Cybersecurity Awareness Training Presentation v2024.03
Cybersecurity Awareness Training Presentation v2024.03DallasHaselhorst
 
EUDR Info Meeting Ethiopian coffee exporters
EUDR Info Meeting Ethiopian coffee exportersEUDR Info Meeting Ethiopian coffee exporters
EUDR Info Meeting Ethiopian coffee exportersPeter Horsten
 
digital marketing , introduction of digital marketing
digital marketing , introduction of digital marketingdigital marketing , introduction of digital marketing
digital marketing , introduction of digital marketingrajputmeenakshi733
 
Jewish Resources in the Family Resource Centre
Jewish Resources in the Family Resource CentreJewish Resources in the Family Resource Centre
Jewish Resources in the Family Resource CentreNZSG
 
20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf
20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf
20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdfChris Skinner
 
Memorándum de Entendimiento (MoU) entre Codelco y SQM
Memorándum de Entendimiento (MoU) entre Codelco y SQMMemorándum de Entendimiento (MoU) entre Codelco y SQM
Memorándum de Entendimiento (MoU) entre Codelco y SQMVoces Mineras
 
Traction part 2 - EOS Model JAX Bridges.
Traction part 2 - EOS Model JAX Bridges.Traction part 2 - EOS Model JAX Bridges.
Traction part 2 - EOS Model JAX Bridges.Anamaria Contreras
 
Driving Business Impact for PMs with Jon Harmer
Driving Business Impact for PMs with Jon HarmerDriving Business Impact for PMs with Jon Harmer
Driving Business Impact for PMs with Jon HarmerAggregage
 
Cyber Security Training in Office Environment
Cyber Security Training in Office EnvironmentCyber Security Training in Office Environment
Cyber Security Training in Office Environmentelijahj01012
 
Effective Strategies for Maximizing Your Profit When Selling Gold Jewelry
Effective Strategies for Maximizing Your Profit When Selling Gold JewelryEffective Strategies for Maximizing Your Profit When Selling Gold Jewelry
Effective Strategies for Maximizing Your Profit When Selling Gold JewelryWhittensFineJewelry1
 
How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...
How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...
How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...Hector Del Castillo, CPM, CPMM
 
Church Building Grants To Assist With New Construction, Additions, And Restor...
Church Building Grants To Assist With New Construction, Additions, And Restor...Church Building Grants To Assist With New Construction, Additions, And Restor...
Church Building Grants To Assist With New Construction, Additions, And Restor...Americas Got Grants
 
Entrepreneurship lessons in Philippines
Entrepreneurship lessons in  PhilippinesEntrepreneurship lessons in  Philippines
Entrepreneurship lessons in PhilippinesDavidSamuel525586
 
Healthcare Feb. & Mar. Healthcare Newsletter
Healthcare Feb. & Mar. Healthcare NewsletterHealthcare Feb. & Mar. Healthcare Newsletter
Healthcare Feb. & Mar. Healthcare NewsletterJamesConcepcion7
 
Appkodes Tinder Clone Script with Customisable Solutions.pptx
Appkodes Tinder Clone Script with Customisable Solutions.pptxAppkodes Tinder Clone Script with Customisable Solutions.pptx
Appkodes Tinder Clone Script with Customisable Solutions.pptxappkodes
 

Dernier (20)

Introducing the Analogic framework for business planning applications
Introducing the Analogic framework for business planning applicationsIntroducing the Analogic framework for business planning applications
Introducing the Analogic framework for business planning applications
 
Planetary and Vedic Yagyas Bring Positive Impacts in Life
Planetary and Vedic Yagyas Bring Positive Impacts in LifePlanetary and Vedic Yagyas Bring Positive Impacts in Life
Planetary and Vedic Yagyas Bring Positive Impacts in Life
 
Darshan Hiranandani [News About Next CEO].pdf
Darshan Hiranandani [News About Next CEO].pdfDarshan Hiranandani [News About Next CEO].pdf
Darshan Hiranandani [News About Next CEO].pdf
 
Cybersecurity Awareness Training Presentation v2024.03
Cybersecurity Awareness Training Presentation v2024.03Cybersecurity Awareness Training Presentation v2024.03
Cybersecurity Awareness Training Presentation v2024.03
 
EUDR Info Meeting Ethiopian coffee exporters
EUDR Info Meeting Ethiopian coffee exportersEUDR Info Meeting Ethiopian coffee exporters
EUDR Info Meeting Ethiopian coffee exporters
 
digital marketing , introduction of digital marketing
digital marketing , introduction of digital marketingdigital marketing , introduction of digital marketing
digital marketing , introduction of digital marketing
 
Jewish Resources in the Family Resource Centre
Jewish Resources in the Family Resource CentreJewish Resources in the Family Resource Centre
Jewish Resources in the Family Resource Centre
 
WAM Corporate Presentation April 12 2024.pdf
WAM Corporate Presentation April 12 2024.pdfWAM Corporate Presentation April 12 2024.pdf
WAM Corporate Presentation April 12 2024.pdf
 
20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf
20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf
20220816-EthicsGrade_Scorecard-JP_Morgan_Chase-Q2-63_57.pdf
 
Memorándum de Entendimiento (MoU) entre Codelco y SQM
Memorándum de Entendimiento (MoU) entre Codelco y SQMMemorándum de Entendimiento (MoU) entre Codelco y SQM
Memorándum de Entendimiento (MoU) entre Codelco y SQM
 
Traction part 2 - EOS Model JAX Bridges.
Traction part 2 - EOS Model JAX Bridges.Traction part 2 - EOS Model JAX Bridges.
Traction part 2 - EOS Model JAX Bridges.
 
Driving Business Impact for PMs with Jon Harmer
Driving Business Impact for PMs with Jon HarmerDriving Business Impact for PMs with Jon Harmer
Driving Business Impact for PMs with Jon Harmer
 
Cyber Security Training in Office Environment
Cyber Security Training in Office EnvironmentCyber Security Training in Office Environment
Cyber Security Training in Office Environment
 
Effective Strategies for Maximizing Your Profit When Selling Gold Jewelry
Effective Strategies for Maximizing Your Profit When Selling Gold JewelryEffective Strategies for Maximizing Your Profit When Selling Gold Jewelry
Effective Strategies for Maximizing Your Profit When Selling Gold Jewelry
 
How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...
How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...
How Generative AI Is Transforming Your Business | Byond Growth Insights | Apr...
 
Church Building Grants To Assist With New Construction, Additions, And Restor...
Church Building Grants To Assist With New Construction, Additions, And Restor...Church Building Grants To Assist With New Construction, Additions, And Restor...
Church Building Grants To Assist With New Construction, Additions, And Restor...
 
Entrepreneurship lessons in Philippines
Entrepreneurship lessons in  PhilippinesEntrepreneurship lessons in  Philippines
Entrepreneurship lessons in Philippines
 
Healthcare Feb. & Mar. Healthcare Newsletter
Healthcare Feb. & Mar. Healthcare NewsletterHealthcare Feb. & Mar. Healthcare Newsletter
Healthcare Feb. & Mar. Healthcare Newsletter
 
Appkodes Tinder Clone Script with Customisable Solutions.pptx
Appkodes Tinder Clone Script with Customisable Solutions.pptxAppkodes Tinder Clone Script with Customisable Solutions.pptx
Appkodes Tinder Clone Script with Customisable Solutions.pptx
 
The Bizz Quiz-E-Summit-E-Cell-IITPatna.pptx
The Bizz Quiz-E-Summit-E-Cell-IITPatna.pptxThe Bizz Quiz-E-Summit-E-Cell-IITPatna.pptx
The Bizz Quiz-E-Summit-E-Cell-IITPatna.pptx
 

Charting a Way Forward Online Content Regulation

Faced with concerns about bullying, child sexual exploitation, or terrorist recruitment, for instance, governments may find themselves doubting the effectiveness of a company's efforts to combat such abuse or, worse, unable to satisfactorily determine what those efforts are.

This tension has led some governments to pursue regulation of the speech allowed online and of companies' content moderation practices. The approaches under consideration depend on each country's unique legal and historical traditions. In the United States, for example, the First Amendment protects a citizen's ability to engage in dialogue online without government interference except in the narrowest of circumstances. Citizens in other countries often have different expectations about freedom of expression, and governments have different expectations about platform accountability. Unfortunately, some of the laws passed so far do not always strike the appropriate balance between speech and harm, unintentionally pushing platforms to err too far on the side of removing content.

Facebook has therefore joined the call for new regulatory frameworks for online content: frameworks that ensure companies make decisions about online speech in a way that minimizes harm while respecting the fundamental right to free expression. This balance is necessary to protect the open internet, which is increasingly threatened (even walled off) by some regimes. Facebook wants to be a constructive partner to governments as they weigh the most effective, democratic, and workable approaches to online content governance. As Mark Zuckerberg wrote in a recent op-ed:

"It's impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services—all with their own policies and processes—we need a more standardized approach … Regulation could set baselines for what's prohibited and require companies to build systems for keeping harmful content to a bare minimum."1
This paper explores possible regulatory structures for content governance outside the United States and identifies questions that require further discussion. It builds on recent developments on this topic, including legislation proposed or passed into law by governments, as well as scholarship explaining the various content governance approaches that have been adopted in the past and may be taken in the future.2 Its overall goal is to help frame a path forward, taking into consideration the views not only of policymakers and private companies but also of civil society and the people who use Facebook's platform and services.

This debate will be central to shaping the character of the internet for decades to come. If designed well, new frameworks for regulating harmful content can contribute to the internet's continued success by articulating clear, predictable, and balanced ways for governments, companies, and civil society to share responsibilities and work together. Designed poorly, these efforts risk stifling expression, slowing innovation, and creating the wrong incentives for platforms.
02
Four Key Questions

Since the invention of the printing press, developments in communications technology have always been met with calls for state action. In the case of internet companies, however, it is uniquely challenging to develop regulation that ensures accountability. There are four primary challenges.

1. Legal environments and speech norms vary. Many internet companies have a global user base with a broad spectrum of expectations about what expression should be permitted online and how decision makers should be held accountable. The cross-border nature of communication is also a defining feature of many internet platforms, so companies generally maintain one set of global policies rather than country-specific policies that would interfere with that experience.

2. Technology and speech are dynamic. Internet services are varied and ever changing. Some are focused on video or images, while others are primarily text. Some types of internet communication are one-to-one and analogous to the private sphere, while others are more like a town square or a broadcast, where thousands or millions of people can access the content in question. Some services are ephemeral, and some are permanent. Norms for acceptable speech vary across these interaction types: just as people may say things in the privacy of a family dinner conversation that they would not say at a town hall meeting, online communities cultivate their own norms, which are enforced both formally and informally. All are constantly changing as services compete to succeed.
3. Enforcement will always be imperfect. Given the dynamic nature and scale of online speech, the limits of enforcement technology, and the different expectations people have about their privacy and their experiences online, internet companies' enforcement of content standards will always be imperfect, and their ability to review speech for policy or legal violations will always be limited. Even in a hypothetical world where enforcement efforts could perfectly track language trends and identify likely policy violations, companies would still struggle to apply policies because they often lack necessary context, such as whether an exchange between two people is in jest or in anger, or whether graphic content is posted for sadistic pleasure or to call attention to an atrocity.

4. Companies are intermediaries, not speakers. Publisher liability laws that punish the publication of illegal speech are unsuitable for the internet landscape. Despite their best efforts to thwart the spread of harmful content, internet platforms are the intermediaries for such speech, not its speakers, and it would be impractical and harmful to require platforms to approve each post before it appears. The ability of individuals to communicate without journalistic intervention has been central to the internet's growth and positive impact; imposing traditional publisher liability would seriously curtail that impact by limiting the ability of individuals to speak. Companies seeking assurance that their users' speech is lawful would need to review and approve each post before allowing it to appear on the site. Companies that could afford to offer services under such a model would be incentivized to restrict service to a limited set of users and to err on the side of removing any speech close to legal lines. Legal experts have cautioned that holding internet companies liable for the speech of their users could lead to the end of these services altogether. As explained by Jennifer Granick, surveillance and cybersecurity counsel at the American Civil Liberties Union: "There is no way you can have a YouTube where somebody needs to watch every video. There is no way you can have a Facebook if somebody needs to watch every post. There would be no Google if someone had to check every search result."3 Such liability would stifle innovation as well as individuals' freedom of expression. The model's flaws appear all the clearer in light of the significant efforts many companies make to identify and remove harmful speech, efforts that often depend on the protective shield of intermediary liability protections. (Instead, as this paper explains, other models for regulating internet platforms would better empower and encourage them to address harmful content effectively.)
These challenges suggest that retrofitting the rules that regulate offline speech for the online world may be insufficient. Instead, new frameworks are needed. This paper will put forward four questions that underpin the debate about that framework:

01 How can content regulation best achieve the goal of reducing harmful speech while preserving free expression?

02 How should regulation enhance the accountability of internet platforms?

03 Should regulation require internet companies to meet certain performance targets?

04 Should regulation define which "harmful content" should be prohibited on internet platforms?
Question 1: How can content regulation best achieve the goal of reducing harmful speech while preserving free expression?

While governments have different thresholds for limiting speech, the general goal of most content regulation is to reduce harmful speech while preserving free expression. Broadly speaking, regulation could aim to achieve this goal in three ways: (1) holding internet companies accountable for having certain systems and procedures in place, (2) requiring companies to meet specific performance targets for content that violates their policies, or (3) requiring companies to restrict specific forms of speech, even if that speech is not illegal. The discussions of Questions 2, 3, and 4 examine these approaches in more detail.

Facebook generally takes the view that the first approach, requiring companies to maintain certain systems and procedures, is the best way to ensure an appropriate balancing of safety, freedom of expression, and other values. By requiring systems such as user-friendly channels for reporting content or external oversight of policies and enforcement decisions, and by requiring procedures such as periodic public reporting of enforcement data, regulation could give governments and individuals the information they need to judge social media companies' efforts accurately.

Governments could also consider requiring companies to hit specific performance targets, such as decreasing the prevalence of content that violates a site's hate speech policies, or maintaining a specified median response time to user or government reports of policy violations. While such targets may have benefits, they could also create perverse incentives for companies to find ways to decrease their enforcement burdens, perhaps by defining harmful speech more narrowly, making it harder for people to report possible policy violations, or abandoning efforts to proactively identify violating content that has not been reported.

Finally, regulators could require that internet companies remove certain content beyond what is already illegal. Regulators would need to define that content clearly, and the definitions would need to differ from traditional legal definitions, which are applied through a judicial process where there is more time, more context, and independent fact-finding.

These approaches are neither mutually exclusive nor the only options, but they represent some of the most common ideas we have heard in our conversations with policymakers, academics, civil society experts, and regulators. The approach a government pursues will depend on what sort of platform behavior it wants to incentivize. Ultimately, the most important elements of any system will be due regard for each of the human rights and values at stake, as well as clarity and precision in the regulation.
Question 2: How should regulation enhance the accountability of internet platforms to the public?

People in democratic societies are accustomed to holding their governments accountable for their decisions. When internet companies make decisions that affect people's daily lives, those people expect the same accountability. This is why internet companies, in their conversations with the people using their services, face questions like "Who decides your content standards?," "Why did you remove my post?," and "What can I do to get my post reinstated when you make a mistake?"

Regulation could help answer those questions by ensuring that internet content moderation systems are consultative, transparent, and subject to meaningful independent oversight. Specifically, procedural accountability regulations could include, at a minimum, requirements that companies publish their content standards, provide avenues for people to report content that appears to violate those standards, respond to such reports with a decision, and notify users when removing their content from the site. Such regulations could require a certain level of performance in these areas as a condition of certification or to avoid regulatory consequences.

Regulation could also incentivize, or where appropriate require, additional measures such as:

• Insight into a company's development of its content standards.

• A requirement to consult with stakeholders when making significant changes to standards.

• An avenue for users to provide their own input on content standards.

• A channel for users to appeal the company's removal (or non-removal) decision on a specific piece of content to a higher authority within the company or to a source of authority outside the company.

• Public reporting on policy enforcement, possibly including how much content was removed from the site and for what reasons, how much content the company identified through its own proactive means before users reported it, and how often violating content appears on the site (one illustrative shape for such a report is sketched below).

Regulations of this sort should take into account a company's size and reach; content regulation should not serve as a barrier to entry for new competitors in the market.
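For concreteness, the sketch below shows one shape such a public enforcement report could take. The field names, the figures, and the proactive-rate calculation are assumptions made for illustration only, not a schema any regulator or company has proposed:

```python
from dataclasses import dataclass

@dataclass
class EnforcementReport:
    """One reporting period of content-moderation data (illustrative fields only)."""
    period: str                  # e.g. "2019-Q3"
    policy_area: str             # e.g. "hate speech"
    items_actioned: int          # posts removed or otherwise actioned
    proactive_actioned: int      # actioned before any user or government report
    appeals_received: int        # user appeals of removal decisions
    appeals_restored: int        # removals reversed on appeal
    prevalence_views_pct: float  # estimated % of all content views that were violating

    @property
    def proactive_rate(self) -> float:
        """Share of actioned content found by the company's own detection systems."""
        return self.proactive_actioned / self.items_actioned

# Invented numbers, for illustration only.
report = EnforcementReport(
    period="2019-Q3", policy_area="hate speech",
    items_actioned=100_000, proactive_actioned=80_000,
    appeals_received=5_000, appeals_restored=900,
    prevalence_views_pct=0.11,
)
print(f"Proactive detection rate: {report.proactive_rate:.0%}")  # 80%
```

Reports in a consistent shape like this would let the public and regulators compare performance across periods, and across companies if definitions were harmonized.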
An important benefit of this regulatory approach is that it incentivizes an appropriate balancing of competing interests, such as freedom of expression, safety, and privacy. Required transparency measures ensure that companies' balancing efforts are laid bare for the public and governments to see.

Regulators pursuing this approach would not be starting from scratch; they would be expanding, and adding regulatory effect to, previous efforts by governments, civil society, and industry. For example, the Global Network Initiative Principles4 and other civil society efforts have helped create baselines for due process and transparency linked to human rights principles, and the GNI has created an incentive structure for compliance. Similarly, the European Union Code of Conduct on Countering Illegal Hate Speech Online has created a framework under which all eight signatory companies meet specific requirements for transparency and due process.

In adopting this approach, however, regulators must be cautious. Specific product design mandates ("put this button here," "use this wording," and so on) will make it harder for internet companies to test which options work best for users. They will also make it harder for companies to tailor the reporting experience to the device being used: the best way to report content from a mobile phone is likely to differ from the best way to report content from a laptop. In fact, we have found that certain requests from regulators to structure our reporting mechanisms in specific ways have actually reduced the likelihood that people will report dangerous content to us quickly.

Another key component of procedural accountability regulation is transparency. Required public reporting of enforcement efforts would subject internet companies to more public scrutiny, as their work would be available for governments and the public to grade. This scrutiny could have both positive and negative effects. The negative effects could arise from the way transparency requirements shape company incentives. If a company feared that people would judge its efforts harshly and stop using or advertising on its service, it might be tempted to cut corners in unmeasured areas to boost performance in measured ones. For instance, if regulation required measuring and publicly reporting the average time a company took to respond to user reports of violating content, a company might sacrifice quality in its review of those reports in order to post better numbers. If regulation required public reporting of how many times users appealed content removal decisions, a company might make its appeal channel difficult to find or use.

On the other hand, companies taking a long-term view of their business incentives will likely find that public reactions to their transparency efforts, even when painful, can lead them to invest in ways that improve public and government confidence over the longer term.
For example, spurred by multi-stakeholder initiatives like the GNI and the EU Code of Conduct mentioned above, Facebook has increased efforts to build accountability into our content moderation process by focusing on consultation, transparency, and oversight. In response to reactions to our reports, we have sought additional input from experts and organizations, and going forward we will continue to work with governments, academics, and civil society to consider what more meaningful accountability could look like in this space. As companies get better at providing this kind of transparency, people and governments will have more confidence that companies are improving their performance, and people will be empowered to make informed choices about which platforms to use, choices that will ultimately determine which services succeed.
Question 3: Should regulation require internet companies to meet certain performance targets?

Governments could also approach accountability by requiring companies to meet specific targets for their content moderation systems. Such an approach would hold companies accountable for the ends they achieve rather than for the validity of how they got there. For example, regulators could require internet platforms to publish annual data on the "prevalence" of content that violates their policies, and to make reasonable efforts to keep the prevalence of violating content below some standard threshold. If prevalence rises above that threshold, the company might be subject to greater oversight, specific improvement plans, or, in the case of repeated systematic failures, fines.

Regulators pursuing this approach will need to decide which metrics to prioritize, and that choice will determine the incentives internet companies face in how they moderate content. The potential for perverse incentives under this model is greater than under the procedural accountability model: companies with unsatisfactory performance reports could face direct penalties in the form of fines or other legal consequences, and will therefore have a greater incentive to shortcut unmeasured areas where doing so boosts performance in measured ones. Governments considering this approach may also want to consider whether and how to establish common definitions, or encourage companies to do so, in order to compare progress across companies and to ensure companies do not tailor their definitions to game the metrics.

With that in mind, perhaps the most promising performance targets to explore concern the prevalence of violating content (how much violating content is actually viewed on a site) and the time-to-action (how long content remains on the site before removal or other intervention by the platform); one way to compute both is sketched after these definitions.

Prevalence: Generally speaking, some kinds of content are harmful only to the extent they are actually seen by people. A regulator would likely care more about stopping one incendiary hateful post that would be seen by millions of people than twenty hateful posts seen only by the people who posted them. For this reason, regulators (and platforms) will likely be best served by a focus on reducing the prevalence of views of harmful content. Regulators should consider, however, the incentives they might inadvertently create for companies to define harmful content more narrowly or to take a minimalist approach in prevalence studies.

Time-to-Action: By contrast, some types of content can be harmful even without an audience. In cases such as real-time streaming of child sexual exploitation, coordination of a violent attack, or an expressed plan to harm oneself, quick action is often necessary to save lives, and the weighing of values may be so different as to justify an approach that incentivizes fast decision-making even where it leads to more incorrect content removals (thus infringing on legitimate expression) or more aggressive detection regimes (such as using artificial intelligence to detect threats of self-harm, thus potentially affecting privacy interests).
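To pin down what these two metrics measure, here is one plausible way to compute them from a platform's logs. The log entries, the totals, and the exact definitions (prevalence as the share of all content views that landed on violating content; time-to-action as the upload-to-removal delay) are assumptions for illustration, not standards any regulator has adopted:

```python
from datetime import datetime
from statistics import median

# Hypothetical log of items found to violate policy during one period.
violating_items = [
    {"views": 1_000_000, "uploaded": datetime(2020, 1, 1), "removed": datetime(2020, 1, 2)},
    {"views": 3,         "uploaded": datetime(2020, 1, 5), "removed": datetime(2020, 1, 5, 1)},
    {"views": 40_000,    "uploaded": datetime(2020, 1, 7), "removed": None},  # still on the site
]
total_content_views = 500_000_000  # assumed: all views of all content in the period

# Prevalence: what share of everything people actually saw was violating?
prevalence = sum(i["views"] for i in violating_items) / total_content_views

# Time-to-action: how long did removed items stay up before removal?
delays = [i["removed"] - i["uploaded"] for i in violating_items if i["removed"]]

print(f"prevalence of views: {prevalence:.4%}")      # ~0.2080%
print(f"median time-to-action: {median(delays)}")    # 12:30:00
```

Note how the single widely viewed post dominates the prevalence figure; that is precisely why a view-weighted metric tracks harm better than a simple count of violating posts.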
There are significant trade-offs regulators must consider when identifying metrics and thresholds. For example, a requirement that companies "remove all hate speech within 24 hours of receiving a report from a user or government" may incentivize platforms to cease proactive searches for such content, and to instead use those resources to review user and government reports more quickly on a first-in-first-out basis. In terms of preventing harm, this shift would have serious costs. The biggest internet companies have developed technology that allows them to detect certain types of content violations with much greater speed and accuracy than human reporting: from July through September 2019, for instance, the vast majority of content Facebook removed for violating its hate speech, self-harm, child exploitation, graphic violence, and terrorism policies was detected by the company's technology before anyone reported it. A regulatory focus on response to user or government reports must take into account the cost it would impose on these company-led detection efforts.

Even setting aside proactive detection, a model that focuses on the average time-to-action for reported content would discourage companies from prioritizing the most harmful content. Companies can often predict the harmfulness of posts by assessing a post's likely reach (through distribution trends and likely virality), the likelihood that it violates policy (through review with artificial intelligence), or its likely severity (for example, whether it was reported by a safety organization with a proven record of reporting only serious violations). Put differently, companies focused on average speed of assessment would end up prioritizing review of posts unlikely to violate or unlikely to reach many viewers, simply because those posts are closer to the 24-hour deadline, even while other posts are going viral and reaching millions.

A regulation requiring companies to "remove all hate speech within 24 hours of upload" would create still more perverse incentives. Companies would have a strong incentive to turn a blind eye to content older than 24 hours (and unlikely to be seen by a government), even though that content could be causing harm. They would also be disincentivized from developing technology that can identify violating private content on the site, and from conducting prevalence studies of existing content.
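A toy review queue makes the prioritization problem concrete. The reports, predicted view counts, and violation probabilities below are invented, and the expected-harm score (predicted views times the probability the post violates) is just one plausible heuristic, not a method any platform is known to use:

```python
# Toy report queue: hours remaining until a hypothetical 24-hour deadline,
# predicted views if the post stays up, and estimated probability it violates.
reports = [
    {"id": "A", "hours_to_deadline": 1,  "predicted_views": 50,        "p_violating": 0.10},
    {"id": "B", "hours_to_deadline": 2,  "predicted_views": 10,        "p_violating": 0.05},
    {"id": "C", "hours_to_deadline": 20, "predicted_views": 2_000_000, "p_violating": 0.90},
]

# Deadline-driven order: review whatever is closest to breaching the 24-hour rule.
by_deadline = sorted(reports, key=lambda r: r["hours_to_deadline"])

# Harm-driven order: review whatever prevents the most expected violating views.
by_expected_harm = sorted(
    reports, key=lambda r: r["predicted_views"] * r["p_violating"], reverse=True
)

print("deadline-driven:", [r["id"] for r in by_deadline])      # ['A', 'B', 'C']
print("harm-driven:   ", [r["id"] for r in by_expected_harm])  # ['C', 'A', 'B']
```

Under the deadline-driven ordering, the near-certain viral violation C is reviewed last, even though reviewing it first would prevent vastly more views of harmful content.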
The potential costs of this sort of regulation are illustrated by Facebook's improvements in countering terror propaganda. When Facebook first began measuring its average time to remove terror propaganda, it had technical means to automatically identify some terror propaganda at the time of upload. After considerable time and effort, Facebook engineers developed better detection tools, including a means of identifying propaganda that had been on the site for years. Facebook used this tool to find and remove propaganda, some of which had long existed on the site without ever being reported by users or governments. As a result, Facebook's measured average time on site for removed terror propaganda went up considerably. Had Facebook faced consequences for an increase in its "time-to-action," it would have been discouraged from ever creating such a tool.
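A back-of-the-envelope calculation, with invented numbers, shows how the metric can move against a company even as safety improves. Suppose a platform removes 1,000 fresh posts roughly an hour after upload, and a new detection tool then surfaces 50 posts that had sat undetected for about two years:

```python
# Invented numbers, for arithmetic only.
new_removals, new_hours_each = 1_000, 1          # fresh posts caught ~1 hour after upload
old_removals, old_hours_each = 50, 2 * 365 * 24  # long-undetected posts finally found

before = new_hours_each  # average time-on-site without the new tool
after = (new_removals * new_hours_each + old_removals * old_hours_each) / (
    new_removals + old_removals
)
print(f"average time-on-site: {before:.1f}h -> {after:.1f}h")  # 1.0h -> 835.2h
```

Removing the 50 old posts is an unambiguous safety gain, yet the headline average looks roughly 800 times worse; a penalty keyed to that average would have punished building the tool.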
Whichever metrics regulators decide to pursue, they would be well served first to identify the values at stake (such as safety, freedom of expression, and privacy) and the practices they wish to incentivize, and then to structure their regulations in a way that minimizes the risk of perverse incentives. This requires close coordination among the various regulators and policymaking bodies focused on sometimes-competing interests, such as privacy, law enforcement, consumer protection, and child safety; a regulation that serves the regulatory interests of one body well may in fact harm the interests of another.

Question 4: Should regulation define which "harmful content" should be prohibited on internet platforms?

Many governments already have laws and systems in place to define illegal speech and to notify internet platforms when illegal speech is present on their services. Governments are also considering whether to develop regulations that specifically define the "harmful content" internet platforms should remove, above and beyond speech that is already illegal.

International human rights instruments provide the best starting point for analyzing government efforts to restrict speech. Article 19 of the International Covenant on Civil and Political Rights (ICCPR)5 holds:

1. Everyone shall have the right to hold opinions without interference.

2. Everyone shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of his choice.

3. The exercise of the rights provided for in paragraph 2 of this article carries with it special duties and responsibilities. It may therefore be subject to certain restrictions, but these shall only be such as are provided by law and are necessary: (a) For respect of the rights or reputations of others; (b) For the protection of national security or of public order (ordre public), or of public health or morals.

Similarly, Article 10 of the European Convention on Human Rights carves out room for governments to pass speech laws that are necessary in:

…[T]he interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.6

In countries with strong democratic institutions, laws governing speech are passed through a transparent process that weighs different societal interests, with due respect for human rights and with the check of an impartial judiciary. Drawing a clear line between legal and illegal speech helps citizens know what is and is not permissible, and judicial checks and balances reduce the likelihood of capricious restrictions on legitimate speech.
Despite these international human rights norms, laws affecting speech, and even criminalizing it, vary widely both by jurisdiction and by the type of harm contemplated, even among liberal democracies. Even for the types of content most likely to be outlawed, such as content supporting terrorism and content related to the sexual exploitation of children, there is significant variance in laws and enforcement. In some jurisdictions, for instance, images of child sexual exploitation are lawful if they are computer generated. In others, praise of terror groups is permitted, and lists of proscribed terror groups vary widely by country and region. Laws vary even more in areas such as hate speech, harassment, and misinformation. Many governments are grappling with how to approach the spread of misinformation, but few have outlawed it; as the UN and others have noted, the general criminalization of sharing misinformation would be "incompatible with international standards for restrictions on freedom of expression."7 Thresholds also vary for defamatory speech and offensive material.

This variation among laws is significant because it underscores the difficulty of weighing competing values like safety and expression. If governments do not agree on how to balance these interests when writing laws, they will likely have very different ideas of how to define online content standards, which require even more specificity than laws in order to ensure consistent application.

Laws restricting speech are generally implemented by law enforcement officials and the courts: fact finders investigate, due process is afforded to the speaker, and the penalty for illegal speech can include fines or imprisonment. Internet content moderation is fundamentally different. Companies use a combination of technical systems and human reviewers, and often have only the text of a post to guide their assessment. The assessment must be made quickly, because people expect violations to be removed within hours or days rather than the months a judicial process might take, and the penalty is generally removal of the post or of the person's account.

Given these complexities, governments that seek to define for internet companies what content they should allow on their platforms should seek to:

• Create a standard that can be enforced practically, at scale, with limited context about the speaker and content, without undermining the goal of promoting expression.

• Take account of different user preferences and expectations for different sorts of internet services. Different definitions may be appropriate based on the nature of the service (search engine versus social media) or the nature of the audience (private chat versus public post, ephemeral versus more permanent, and so on).
• Take account of the individual controls, preferences, and settings a company may provide its users for some kinds of potentially harmful content (e.g., nudity).

• Provide flexibility so that platforms can adapt policies to emerging language trends and to adversarial efforts to avoid enforcement. Hateful speech, bullying, and threats of self-harm are often expressed through a lexicon of words that fall in and out of favor or evolve over time, and proscribed terror groups and hate groups may rename themselves or fracture into different groups (the sketch after this list shows how quickly a frozen rule decays).
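A trivial sketch, using a made-up placeholder term rather than real slurs, of why definitions frozen into regulation decay under adversarial pressure:

```python
blocklist = {"badword"}  # a hypothetical term frozen into an exact-match rule

posts = ["this badword post", "this b4dw0rd post", "this bad-word post"]

for post in posts:
    caught = any(term in post for term in blocklist)
    print(f"{post!r:24} -> {'caught' if caught else 'missed'}")

# Only the original spelling is caught; the adversarial respellings sail through
# until the rule is updated, an arms race that repeats indefinitely.
```

A regulation that hard-codes today's lexicon would lock platforms into the first line of this arms race; flexibility lets them keep pace with the evasions.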
03
Principles for Future Regulators

In some countries, a regulatory system for online content may require a new type of regulator. Addressing online communication goes beyond simply adopting the core capabilities of traditional regulators and channeling them toward effective oversight: any regulator in this space will need proficiency in data, operations, and online content. Governments will also need to ensure that regulatory authorities pay due regard to innovation and protect users' rights online, including by considering the following:

INCENTIVES
Ensuring accountability in companies' content moderation systems and procedures will be the best way to create incentives for companies to balance values like safety, privacy, and freedom of expression responsibly.

THE GLOBAL NATURE OF THE INTERNET
Any national regulatory approach to harmful content should respect the global scale of the internet and the value of cross-border communication, and should aim to increase interoperability among regulators and regulations. Governments should not, however, impose their standards on other countries' citizens through the courts or any other means.
FREEDOM OF EXPRESSION
In addition to complying with Article 19 of the ICCPR (and related guidance), regulators should consider the impacts of their decisions on freedom of expression.

TECHNOLOGY
Regulators should develop an understanding of the capabilities and limitations of technology in content moderation and allow internet companies the flexibility to innovate. An approach that works for one particular platform or type of content may be less effective (or even counterproductive) when applied elsewhere.

PROPORTIONALITY AND NECESSITY
Regulators should take into account the severity and prevalence of the harmful content in question, its status in law, and the efforts already underway to address it.
04
The Beginning of a Conversation

We hope that this paper contributes to the ongoing conversations among policymakers and other stakeholders. These are challenging issues, and sustainable solutions will require collaboration. As we learn more from interacting with experts and refining our approach to content moderation, our thinking about the best ways to regulate will continue to evolve. We welcome feedback on the issues discussed in this paper.
End Notes

1. Zuckerberg, Mark. "The Internet Needs New Rules. Let's Start in These Four Areas." Washington Post, 30 March 2019.

2. Bowers, John, and Jonathan Zittrain. "Answering Impossible Questions: Content Governance in an Age of Disinformation." Harvard Kennedy School Misinformation Review, 2020.

3. As quoted in: Reynolds, Matt. "The Strange Story of Section 230, the Obscure Law That Created Our Flawed, Broken Internet." Wired, 28 March 2019, www.wired.co.uk/article/section-230-communications-decency-act.

4. "The GNI Principles." Global Network Initiative, 10 January 2020, https://globalnetworkinitiative.org/gni-principles/.

5. UN General Assembly, International Covenant on Civil and Political Rights, 16 December 1966, United Nations, Treaty Series, vol. 999, p. 171.

6. Council of Europe, European Convention for the Protection of Human Rights and Fundamental Freedoms, as amended by Protocols Nos. 11 and 14, 4 November 1950.

7. "Joint Declaration on Freedom of Expression and 'Fake News', Disinformation and Propaganda." The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression, and the African Commission on Human and Peoples' Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information, 3 March 2017. As noted in the Joint Declaration, those international standards require "that they be provided for by law, serve one of the legitimate interests recognised under international law, and be necessary and proportionate to protect that interest."