YouTube CEO Says Trump Suspension to Last Until Violence Risk Decreases


YouTube’s chief executive said the company still sees a risk of violence in the aftermath of the Jan. 6 Capitol Hill riot and that it would only lift its suspension of former President Donald Trump from the video-sharing platform once it determines “the risk of violence has decreased.”

Susan Wojcicki, CEO of the Alphabet Inc. unit, on Thursday said, “Given just the warnings by the Capitol Police yesterday about a potential attack today, I think it’s pretty clear that that elevated violence risk still remains.” She was speaking at a virtual event hosted by the Atlantic Council, a nonpartisan think tank.

U.S. Capitol police on Wednesday warned of a potential plot by an identified militia group to breach the building.

Representatives for Mr. Trump didn’t immediately respond to a request for comment.

Former President Donald Trump spoke at the Conservative Political Action Conference in Orlando, Fla., on Feb. 28. Photo: Octavio Jones/Reuters

YouTube suspended Mr. Trump on Jan. 12 for violating its policies against content it believes could incite violence. The company said at the time that the suspension would last at least seven days.

YouTube was one of several social-media companies, including Twitter Inc. and Facebook Inc., that ejected Mr. Trump from their platforms this year. The moves came shortly after the attack on the U.S. Capitol by a pro-Trump mob. The actions by the companies stirred debate over whether they were preventing future violence or infringing on free speech. Mr. Trump was later accused of inciting the attack by both Democratic and Republican lawmakers, including then-Senate Majority Leader Mitch McConnell.

“We will lift the suspension of the Donald Trump channel when we determine the risk of violence has decreased,” Ms. Wojcicki said, adding it was unclear when that might occur. “Right now, where we stand, there still is that elevated risk of violence.”

Twitter said its decision to ban the former president was permanent. Facebook in late January assigned its outside oversight board to determine whether it should reinstate Mr. Trump’s Facebook and Instagram accounts. The former president last month submitted a statement to the board arguing the ban was unjustified.

At Thursday’s event, Ms. Wojcicki said YouTube looks to government warnings and increases in the presence of law enforcement around the country in determining whether the risk of violence remains high. She also said a rise in violent rhetoric on YouTube would serve as a red flag.

“We have an intelligence desk where we look and try to understand everything to get ahead of what’s happening on our platform,” she said.

Once YouTube reinstates Mr. Trump’s account, Ms. Wojcicki said, he will be subject to the same policies as every other user. YouTube generally employs a “three strikes” rule in which users get three warnings before an account is removed. But she said there are exceptions, such as violent extremism, in which an account is blocked more swiftly. When YouTube suspended Mr. Trump’s account in January, she said, it came after the first strike.

A reduced presence of law enforcement in U.S. capitals and fewer warnings from government agencies about violence could prompt YouTube to reinstate Mr. Trump’s account. “Those would be all signals to us that it would be safe to turn the channel back on,” she said.


Write to Sarah E. Needleman at sarah.needleman@wsj.com



