Prime News Ghana

TED 2019: Twitter boss offers to demote likes and follows

By Jennifer Kim Essel

Twitter co-founder Jack Dorsey has again admitted there is much work to do to improve Twitter and cut down on the amount of abuse and misinformation on the platform.

He said the firm might demote likes and follows, adding that in hindsight he would not have designed the platform to highlight these.

He said that Twitter currently incentivised people “to post outrage”.

Instead, he said it should invite people to unite around topics and communities.

“It may be best if it becomes an interest-based network,” he told TED curators, Chris Anderson and Whitney Pennington Rodgers.

Rather than focus on following individual accounts, users could be encouraged to follow hashtags, trends and communities.

Doing so would require a systematic change that represented a “huge shift” for Twitter.

On the topic of abuse, he admitted that it was happening “at scale”.
“We’ve seen harassment, manipulation and misinformation, which are dynamics we did not expect 13 years ago when we founded the company,” he told TED curator Chris Anderson.

“What worries me is how we address them in a systematic way.”

He has previously discussed the role played by likes and follows, which were designed to be prominent.

“One of the choices we made was to make the number of people that follow you big and bold. If I started Twitter now I would not emphasise follows and I would not create likes.

“We have to look at how we display follows and likes,” he added.

Ms Pennington Rodgers asked him why, according to Amnesty, women of colour on average received abuse in one in 10 tweets they posted.

“It’s a pretty terrible situation,” Mr Dorsey admitted.

“The dynamics of the system make it super-easy to harass others.”

He said that Twitter was increasingly using machine learning to spot abuse and claimed that 38% of abusive tweets were now identified by algorithms and then highlighted to humans, who decided whether to remove them from the platform.

He also said that the firm was working on making it easier to find its policies on abuse and was simplifying them.

Asked if he would show urgency in dealing with the issues, he replied simply: “Yes.”

Ask Jack

The TED audience were invited to contribute to the conversation via the hashtag #askJackatTED, which received more than 1,000 questions within 10 minutes of the talk starting.

One of the questions came from journalist Carole Cadwalladr who spoke at TED on Monday and called on the tech firms, including Twitter, to directly address the issue of misinformation being shared widely on their platforms.

But in her question to Mr Dorsey, she turned her attention to the abuse she has received on Twitter.

“I’d like to know why a video that showed me being beaten up and threatened with a gun, set to the soundtrack of the Russian anthem, stayed up for 72 hours despite thousands of complaints?” she wrote.

Mr Dorsey did not address that question, nor did he answer another about how to deal with the huge number of malicious bots posting misinformation.

He was also shown a graph created by Zignal Labs which showed the number of human tweets versus tweets from suspected bots talking about topics in the recent election campaign in Israel.

Bots seemed to dominate when it came to tweets about contender Benny Gantz, who was narrowly defeated by Benjamin Netanyahu.

Mr Dorsey was asked about this but did not answer.

Instead, he said that the company was in the middle of measuring the “conversational health” of the platform, using a number of metrics, including how toxic conversations were and how much people were exposed to a variety of opinions.

“We have to create a healthy contribution to the network and a healthy conversation. On Twitter right now you don’t necessarily walk away feeling you learned something.”


Source: BBC
