Minutes of the 2021-07-21 IAB Teleconference

1. Administrivia

1.1. Attendance

Present:
  • Jari Arkko
  • Deborah Brungard
  • Ben Campbell
  • Lars Eggert (IETF Chair)
  • Wes Hardaker
  • Mirja Kühlewind (IAB Chair)
  • Zhenbin Li
  • Jared Mauch
  • Cindy Morgan (IAB Executive Administrative Manager)
  • Karen O’Donoghue (ISOC Liaison)
  • Tommy Pauly
  • Colin Perkins (IRTF Chair)
  • David Schinazi
  • Amy Vezza (IETF Secretariat)
  • Jiankang Yao
Guests:
  • Carl Gahnberg
  • Konstantinos Komaitis
Regrets:
  • Cullen Jennings
  • Martin Vigoureux (IESG Liaison)
  • Russ White
Observers:
  • Chris Box
  • Warren Kumari
  • Daniel Migault
  • Greg Wood

1.2. Agenda bash and announcements

Zhenbin Li asked to add a summary of the current IETF outreach work to the Joint IAB/IESG agenda for 2021-07-23.

2. Technical Discussion: Content Moderation, etc.

Konstantinos Komaitis and Carl Gahnberg from the Internet Society joined the IAB to discuss current activities around content moderation.

In the United States, Section 230 of the Communications Decency Act provides that:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

A number of proposals from both sides of the aisle have been made that would amend, limit, or repeal Section 230:

  1. Bill name: 21st Century Foundation for the Right to Express and Engage in Speech Act (21st Century FREE Speech Act)
    • Date introduced: April 27, 2021
    • Status: Referred to the Senate Committee on Commerce, Science, and Transportation
    • Category: Repeal; Good Samaritan
  2. Bill name: Protecting Americans From Dangerous Algorithms Act
    • Date introduced: March 23, 2021
    • Status: Referred to the House Committee on Energy and Commerce
    • Category: Limiting the Scope
  3. Bill name: Stop Shielding Culpable Platforms Act
    • Date introduced: March 18, 2021
    • Status: Referred to the House Committee on Energy and Commerce
    • Category: Imposing New Obligations
  4. Bill name: Platform Accountability and Consumer Transparency (PACT) Act
    • Date introduced: March 17, 2021
    • Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation
    • Category: Imposing New Obligations
  5. Bill name: Abandoning Online Censorship (AOC) Act
    • Date introduced: February 5, 2021
    • Status: Referred to the House Committee on Energy and Commerce
    • Category: Repeal
  6. Bill name: Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act
    • Date introduced: Announced February 5, 2021
    • Status: Announced but not yet introduced
    • Category: Limiting the Scope
  7. Bill name: See Something, Say Something Online Act of 2021
    • Date introduced: January 22, 2021
    • Status: Read twice and referred to the Senate Committee on Commerce, Science, and Transportation
    • Category: Imposing New Obligations
  8. Bill name: Curbing Abuse and Saving Expression in Technology (CASE-IT) Act
    • Date introduced: January 12, 2021
    • Status: Referred to the House Committee on Energy and Commerce
    • Category: Imposing New Obligations; Good Samaritan
  9. Bill name: Protecting Constitutional Rights From Online Platform Censorship Act
    • Date introduced: January 4, 2021
    • Status: Referred to the House Committee on Energy and Commerce
    • Category: Good Samaritan

Outside the United States, governments are grappling with digital sovereignty. This idea centers on imposing a national legal framework on the Internet, regardless of the effect that framework has on other jurisdictions.

In Australia, the Online Safety Bill 2021 establishes schemes to protect victims of cyber abuse by requiring social media companies, internet service providers, and hosting providers to remove harmful material within 24 hours.

In Canada, the proposed Online Harms legislation aims to amend the Canadian Human Rights Act to define a new discriminatory practice of communicating hate speech online and to provide individuals with additional remedies to address hate speech; it also requires harmful content to be removed within 24 hours.

In the United Kingdom, the Online Safety Bill imposes a duty of care on digital service providers to moderate user-generated content in a way that prevents users from being exposed to illegal and/or harmful content online. Harmful and/or illegal content must be removed within 24 hours.

In the European Union, the Terrorist Content Online Regulation (TERREG) compels online platforms to remove or block access to online content deemed “terrorist” in nature within 24 hours. A company that fails to do so risks a fine of up to 4% of its global turnover.

Also in the EU, the Digital Services Act is billed as a rebalancing “of the rights and responsibilities of users, intermediary platforms, and public authorities,” all based on “European values.” Articles 26 and 27 (risk assessment and mitigation) incentivize the swift removal of content, though no time limit is specified.

Jari Arkko asked whether these regulations apply only to the social media companies, or if they would also apply to things like cloud platforms and ISPs. Konstantinos Komaitis replied that TERREG would only apply to the social media platforms, but the DSA includes hosting providers and cloud services, although the final text is still being negotiated.

The common denominator in these proposals is the swift removal of content. Platforms have responded by deploying upload filters. Konstantinos Komaitis said that the Internet Society has started looking at these upload filters and what they could mean for the Internet.

User-generated-content platforms are inserting automated filtering mechanisms deep in their networks to deliver services to their users. Platforms with significant market power have convened a forum called the Global Internet Forum to Counter Terrorism (GIFCT), through which approved participants (but not everyone) collaborate to create shared upload filters. The idea is that these filters are interoperable amongst platforms, which, prima facie, is good for openness and inclusiveness. However, allowing the design choices of filters to be made by a handful of companies could turn them into de facto standards bodies. Upload filters are not based on collaborative, open, voluntary standards but on closed, proprietary ones, owned by specific companies. Unlike traditional building blocks, these upload filters would end up not being interoperable with other similar filters. Smaller online platforms will need to license them. New entrants may find the barriers to entry too high.
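
To make the shared-filter model concrete: the GIFCT approach is commonly described as hash sharing, in which each platform computes a digest of uploaded content and rejects anything whose digest appears in a consortium-maintained database. The sketch below is illustrative only; it uses an exact SHA-256 digest where real deployments use perceptual hashes so that near-duplicates also match, and SHARED_HASH_DB is a hypothetical stand-in for the shared database.

    import hashlib

    # Hypothetical stand-in for the consortium-maintained database of
    # digests of known prohibited content (GIFCT-style hash sharing).
    SHARED_HASH_DB: set[str] = set()

    def digest(content: bytes) -> str:
        """Exact-match digest. Real filters use perceptual hashes so that
        re-encoded or lightly edited copies of an item still match."""
        return hashlib.sha256(content).hexdigest()

    def allow_upload(content: bytes) -> bool:
        """Return False if the upload matches a shared blocklist entry."""
        return digest(content) not in SHARED_HASH_DB

The interoperability concern raised in the discussion falls out of this design: whoever controls the digest format and access to the shared database effectively sets the standard, and platforms outside the consortium cannot check uploads against it.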

Ben Campbell asked if the filter framework in GIFCT can handle having different policies in different jurisdictions. Konstantinos Komaitis replied that he does not think so; the idea is that the filters behave the same regardless of jurisdiction.

Jari Arkko asked what more the IAB can do to promote openness. Konstantinos Komaitis replied that the problem is that the companies aren’t willing to share their algorithms because they are proprietary information.

Jiankang Yao said that the Internet is a mirror for the real world, and when the real world has problems, they will show up on the Internet, even if the issue is not with the structure of the Internet itself. Different countries have different laws and customs, and the tools need to be able to reflect those differences. The IETF can help at the technology layer, but needs help from the Internet Society for the political layer.

Carl Gahnberg talked a bit about some of the related work that ISOC is doing in this space. The Internet Society’s 2019 Global Internet Report recognized that trends of consolidation in the Internet economy, particularly at the application layer and in web services, have spurred public debates on the need to regulate “big tech” and online platforms.

Among the measures proposed by policymakers, academics, and other thought leaders across the world is to legally require software services and systems to provide open interfaces and interoperability. There are three main motivations for this:

  1. Data portability: Users can move their data from Service A to Service B.
  2. Inter-service interoperability: Users can communicate across different service providers and their networks of users (sketched below).
  3. Data and functionality access: Service Provider B can access data and other resources from a dominant provider.
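
As an illustration of the second motivation, the sketch below models cross-provider messaging in the style of existing federated protocols such as SMTP or ActivityPub: the recipient’s home service is resolved from the domain in their address, and a common JSON payload is delivered to it. The /inbox endpoint, the address format, and the payload fields are all hypothetical.

    import json
    import urllib.request

    def deliver(sender: str, recipient: str, body: str) -> None:
        """Route a message across providers by the recipient's domain,
        the way federated protocols such as SMTP or ActivityPub do."""
        _, _, domain = recipient.partition("@")
        payload = json.dumps({"from": sender, "to": recipient,
                              "content": body}).encode("utf-8")
        req = urllib.request.Request(
            f"https://{domain}/inbox",  # hypothetical common endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    # A user on service-a.example messaging a user on service-b.example:
    # deliver("alice@service-a.example", "bob@service-b.example", "hello")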

ISOC does not have a position on whether these policies are good or bad at addressing competition concerns. However, given that the interfaces in question are de facto building blocks for Internet applications, and could become important pieces of infrastructure for online services, there is a risk that these mandates create undesirable consequences.

Provisions in the draft US ACCESS Act of 2019 apply to consumer-facing communications and information services providers that “[generate] income, directly or indirectly, from the collection, processing, sale, or sharing of user data” and have more than 100 million monthly active users in the US. The provisions include:

  • Portability: “A large communications platform provider shall, for each large communications platform it operates, maintain a set of transparent, third-party-accessible interfaces (including application programming interfaces) to initiate the secure transfer of user data to a user, or to a competing communications provider acting at the direction of a user, in a structured, commonly used, and machine-readable format.”
  • Interoperability: “A large communications platform provider shall, for each large communications platform it operates, maintain a set of transparent, third-party-accessible interfaces (including application programming interfaces) to facilitate and maintain technically compatible, interoperable communications with a user of a competing communications provider.”
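
Read literally, the portability provision calls for a third-party-accessible interface that exports a user’s data “in a structured, commonly used, and machine-readable format.” A minimal sketch of such an interface follows, using Python’s standard library and JSON as the machine-readable format; the endpoint path and data model are hypothetical, and a real interface would authenticate the user or the competing provider acting at the user’s direction.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical user data store; a real platform would query its backend.
    USER_DATA = {"alice": {"posts": ["first post"], "contacts": ["bob"]}}

    class ExportHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Hypothetical path: /export/<user>. Authentication of the
            # user (or their delegated competing provider) is omitted.
            parts = self.path.strip("/").split("/")
            if len(parts) == 2 and parts[0] == "export" and parts[1] in USER_DATA:
                body = json.dumps(USER_DATA[parts[1]]).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), ExportHandler).serve_forever()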

Mirja Kühlewind noted that the IAB has already had some discussion about interoperability between applications and is planning to have more discussion about it in the future, so this was good input.

Mirja Kühlewind asked the IAB if they think there is anything that the IAB can do related to content moderation.

Tommy Pauly replied that any sort of concrete response seems like it would be out of scope for the IAB, but that the topic does relate to other relevant work about the mechanisms that people use for content blocking and filtering.

Jared Mauch agreed, noting that the IETF has been reluctant in the past to work on topics like lawful intercept. He said that it is good for the IAB to be informed about what is happening in the public policy space, but that it is difficult to know what sort of technical work will come out of that.

Ben Campbell noted that the IAB has made statements about blocking and filtering in the past, e.g. RFC 7754: Technical Considerations for Internet Service Blocking and Filtering, and wondered if the IAB should think about re-stating some of those things in the context of this discussion.

Mirja Kühlewind agreed that reviewing what the IAB has already said is a good first step, and that the IAB should continue to discuss this more.