
Three legal changes that may reshape Big Tech in 2023

Olivier Kamanda

--

Two EU regulations & a US Supreme Court decision will land this year.

1. The Digital Services Act (“DSA”)

The European Union (“EU”) has taken the lead in regulating the technology industry. The 2016 General Data Protection Regulation (“GDPR”) was the landmark law that held technology platforms accountable for safeguarding EU consumers’ data and privacy. In addition to enshrining the ‘right to be forgotten’ and the right to data portability, among others, GDPR gives the EU’s 450 million residents control over what data they choose to provide online and a right to redress when that data is accessed without their permission.

The next chapter in this longstanding consumer protection effort is the DSA, which aims to set clear rules for online platforms that serve as intermediaries between businesses and consumers (think Amazon, YouTube, Facebook, Google, TikTok). In response to claims that these platforms play an outsized role in e-commerce, the DSA sets standards for transparency and accountability.

Changes

Clear policies and rights to appeal
Platforms (with more than 45 million monthly active users in the EU) must publish clearly articulated policies and terms of service for their business clients (including online advertisers). And if a business client is denied access to these services, the platform must explain why and give the business a chance to appeal the decision.

Advertising transparency
Platforms that allow advertising will also face more transparency requirements. Each online advertisement will need to display (i) a clear designation that the information is an advertisement, (ii) on whose behalf the ad is presented, (iii) who paid for the ad, and (iv) how the recipients of the ad were determined.

Illegal content reporting
Consumers are also entitled to flag any illegal content they come across on the platforms. When they do, the platforms are required to tell those consumers what action, if any, they’ve taken in response to the notice.

Penalties

The DSA entered into force in November 2022, and platforms will need to comply by February 2024. Platforms that don’t comply will face penalties of up to 6% of global turnover.

2. The Digital Markets Act (“DMA”)

The DMA is the antitrust counterpart to the DSA. Whereas the DSA aims to promote transparency of online platforms, the DMA aims to promote open competition with and among these platforms.

Changes

Like the DSA, the DMA applies to platforms with more than 45 million monthly active users in the EU. These ‘gatekeepers’ will face new rules to prevent them from locking in customers or unfairly competing against the businesses that use the gatekeeper’s platform.

Lowering switching costs
Gatekeepers will need to make it easier for businesses and consumers to leave these platforms. As a matter of principle, it should be as easy to unsubscribe from these services as it was to subscribe to them in the first place. This principle extends to changing software defaults and removing pre-installed applications. In addition, the DMA imposes new requirements of vertical and horizontal interoperability so that consumers aren’t locked into a closed ecosystem of software or hardware tools, which should allow third-party services to compete. For example, Apple users should be able to back up their data to Google Drive or Dropbox just as easily as they can to iCloud.

Promoting fair competition with clients
Businesses should be able to use the gatekeeper services without sacrificing their own data to the platforms. One of the practices that landed Amazon in hot water in the US was using data about the performance of sellers on its marketplace to compete against them, boosting the performance of Amazon’s own branded products at the expense of its business customers. That practice is a no-no under the DMA. In fact, the DMA goes further, requiring gatekeepers to give their business clients access to the data generated by their activity on the gatekeeper’s platform.

Penalties

The DMA will start to apply in May 2023 but won’t become fully applicable until February or March 2024. Platforms that don’t comply will face penalties of up to 10% of global turnover.

3. The Future of Section 230 of the Communications Decency Act

In contrast to the EU, which legislated its way to reining in big tech, the biggest game changer in US law may come from Supreme Court litigation. Two pivotal cases, Gonzalez v. Google LLC and Twitter, Inc. v. Taamneh, will determine whether online platforms will continue to be shielded from legal liability for the content that appears on their sites.

At the heart of these cases is Section 230 of the Communications Decency Act of 1996. The law, passed in the days of Prodigy and America Online, provides that an ‘interactive computer service’ shouldn’t be treated as the publisher of third-party content (e.g. user content like chat messages, comments or posts). As a result, the interactive computer service shouldn’t be liable for illegal content found on the platform (it also states that the interactive computer service shouldn’t be liable for restricting access to obscene or offensive content, even if that content would otherwise be constitutionally protected free speech).

Twenty-seven years later, every U.S. technology platform that features user-generated content — from Wikipedia to Yelp — relies on the protections provided by Section 230. As the Electronic Frontier Foundation argued in its Gonzalez v. Google LLC amicus brief, Section 230’s immunities “protect the architecture of the internet — the services that provide the ‘essential venues for public gatherings to celebrate some views, to protest others or simply to learn and inquire.’”

But the two cases argued on February 21st (Gonzalez) and February 22nd (Twitter) challenge these protections.

In Gonzalez, the plaintiffs argue that YouTube’s video recommendations don’t fall within the scope of protected content because YouTube creates the thumbnails that users see. Specifically, they claim that the thumbnails YouTube created for ISIS videos encouraged people to view the terrorist propaganda, subjecting Google (YouTube’s parent company) to liability under the Justice Against Sponsors of Terrorism Act for the death of an American victim of a terrorist attack.

In Twitter, which also names Google and Meta as defendants, the key question is whether a platform should be liable for aiding and abetting ISIS terrorism as a result of actions it did or did not take to remove terrorist content. As in Gonzalez, the family of a victim of a terrorist attack brought the lawsuit against the tech platforms, citing the platforms’ actions as ‘a’ cause of the victim’s death.

Even after the oral arguments, it is difficult to know where the Supreme Court will land. Legal experts will parse the justices’ questions for hints about the outcome. The Court could narrow the scope of Section 230, provide a definition of ‘recommendation’, send the cases back to the lower courts on a technicality, gut the law entirely, or do none of the above!

And the justices may decide that the future of Section 230 is best handled by Congress. As Justice Kagan noted, the current members of the Supreme Court “…are not, like, the nine greatest experts on the internet” (although the members of Congress probably aren’t the 535 greatest experts on the internet, either).

Adding to the uncertainty, we don’t know when the decisions in these cases will be announced. All we know is that they will be issued before the last day of the Court’s term in July. If the suspense is killing you, you’re not alone.

--

Olivier Kamanda

Product @Google. Term Member @Council on Foreign Relations; former White House Presidential Innovation Fellow; eng+law+policy