There was no fancy Hill hearing room for this all-virtual event, so Twitter CEO Jack Dorsey dialed in from… a kitchen.

A trio of major tech CEOs—Alphabet’s Sundar Pichai, Facebook’s Mark Zuckerberg, and Twitter’s Jack Dorsey—once again went before Congress this week to explain their roles in the social media ecosystem. The hearing nominally focused on disinformation and extremism, particularly in the wake of the January 6 events at the US Capitol. But as always, the members asking the questions frequently ventured far afield.

The hearing focused less on specific posts than previous congressional grillings have, but it was mainly an exercise in members staking out positions. Considered in totality, fairly little of substance was accomplished during the hearing’s six-hour runtime.

Nonetheless, a few important policy nuggets did manage to come up.

On Section 230

Section 230 is an enormously misunderstood snippet of law that has become a rallying cry for reformers in both parties. At a high level, it shields Internet companies from legal liability both for the content their users generate and for the moderation choices they do or don’t make around that content.

On the left, proposed Section 230 reforms are largely targeted at limiting abuse and disinformation. From the right, proposed Section 230 reforms, including repeal proposals, tend to focus more on claims of alleged “bias” among social media firms. All manner of bills to amend or repeal the law have been introduced in both the previous and current Congress, by both Republican and Democratic sponsors.

Zuckerberg set the stage ahead of time by including a plea to reform Section 230 in his written testimony (PDF).

“I believe that Section 230 would benefit from thoughtful changes to make it work better for people,” he wrote, adding:

We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content. Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Platforms should not be held liable if a particular piece of content evades its detection—that would be impractical for platforms with billions of posts per day—but they should be required to have adequate systems in place to address unlawful content.

Pichai’s prepared testimony (PDF), on the other hand, basically asked Congress to leave well enough alone. “Regulation has an important role to play in ensuring that we protect what is great about the open web, while addressing harm and improving accountability,” he wrote, adding:

We are concerned that many recent proposals to change Section 230—including calls to repeal it altogether—would not serve that objective well. In fact, they would have unintended consequences—harming both free expression and the ability of platforms to take responsible action to protect users in the face of constantly evolving challenges.

During the hearing, neither Pichai nor Dorsey seemed particularly inclined to back Zuckerberg’s take on what’s best for the future of the Internet. Pichai said he thought the accountability and transparency the Facebook CEO mentioned were “important principles” and that there were some legislative proposals floating around in Congress that Google would “welcome.”

Dorsey, however, pointed out that most platforms are not anything like the size of Facebook, which reaches about 2.8 billion monthly users. “I think it’s going to be very hard to determine what’s a large platform and a small platform, and it may incentivize the wrong things,” he cautioned.

On violence, Trump, and deplatforming

All three CEOs were also asked to say, yes or no, whether they felt their platforms had played a role in the violence of the Capitol riot.

“I think the responsibility lies with the people who took action to break the law and do the insurrection,” Zuckerberg replied. “And secondarily with the people who spread that content, including the former president.”

It’s something of a tightrope, Zuckerberg seemed to say, indicating that Facebook tried to act proactively in the fall “to secure the integrity of the election” against a likely tide of misinformation. “And then on January 6, President Trump gave a speech… calling on people to fight.”

Dorsey was the only witness to agree outright that his company played a role. “Yes,” he confirmed. “But you have to take into consideration the broader ecosystem [of misinformation]. It’s not just about the technological systems that we use.”

Republican members of the committee remained frustrated with the bans and suspensions former President Donald Trump earned from Facebook, Twitter, and YouTube in the wake of the January 6 insurrection at the Capitol.

Facebook has called in its Oversight Board to make the final ruling on Trump’s suspension, and Zuckerberg confirmed in the hearing that if the board says Trump’s account should be reinstated, “then we will honor that.”