  • Bluesky CEO Jay Graber Says She Won’t ‘Enshittify the Network With Ads’

    Our goal is to combine both approaches—to run a moderation service that tries to provide a baseline and to also have an open ecosystem where anyone who wants to innovate can come in and start building. I think this is particularly useful around cases where information is really fast moving and there’s specialized knowledge. There are organizations out there already in the business of fact-checking, or figuring out if a verified account is actually a politician or not. They can start annotating and putting that information into the network, and we can build off that collective intelligence.

    Recently there was a very high-profile incident on X where deepfake porn of Taylor Swift started spreading and the platform was not super prompt at clamping down. What’s your approach to moderating deepfakes?

    From the start we’ve been using some AI-detection services—image labeling services—but this is an area where there’s a lot of innovation and we’ve been looking at other alternatives.

    This is also where a third-party labeling system could really come into play. We can move faster as an open collective of people—she has lots of fans who could help identify content like this very proactively.

    What are the benefits of federation—where a social network is decentralized, consisting of a bunch of independent servers instead of one central hub—for the casual internet user?

    The goals here are to give developers the freedom to build, and users the right to leave. The ability for people to host their own data means that users always have other alternatives, and that their experience doesn’t have to just come from us. For example, if a user wants to try a wholly different app, or a whole different experience, or they want to move to a parallel social network.

    If someone was to use your protocol and build, say, a Taylor Swift deepfake porn community, is there anything you could do to stop that?

    With the open web model, someone can always put their own website on the internet, but it doesn’t have to be indexed. We’re also playing a role in surfacing and indexing content. For really bad stuff out there, we’re trying to make sure that it never gets shown, by de-promoting it and not connecting to it.

    Can you explain your business model?

    We really think that money follows value. There’s been skepticism that this whole model of social can work. People are even wondering what it is. So, first of all, we’re trying to prove that this ecosystem has value to users and developers, and that it can kick off an era of open innovation.

    From there, we’re going to monetize while following our values. Early on, Twitter was very open and everyone built on it. But then they shut down at some point, right? They turned into much more of a platform, and less something that looked like a protocol.

    Our whole approach is getting back to protocols, not platforms, and there are certain guarantees that we’ve built into the protocol. It’s locked open. Once we have proven out this approach, I think there’s lots of ways that money is going to flow through the ecosystem. We’re going to start exploring some of those models this year.


  • Linda Yaccarino Says X Needs More Moderators After All

    When Elon Musk took over Twitter, since rebranded as X, his favorite letter of the alphabet, he went on a firing spree. Chief among those ejected were people working on trust and safety, the work of keeping harmful content, from hate speech to child exploitation, off the platform.

    In front of a US Senate committee today, X CEO Linda Yaccarino appeared to tacitly acknowledge that Musk went too far in tearing down the platform’s guardrails, indicating the company was partially reversing course. She said that X had increased its trust and safety staff by 10 percent in the past 14 months and planned to hire 100 new moderators in Austin focused on combating child sexual exploitation.

    Yaccarino spoke at a Senate hearing called to discuss social networks’ failure to curb child sexual abuse, alongside the CEOs of Meta, TikTok, Snap, and Discord. She also said multiple times that “less than one percent” of X users were under 18. That claim, along with her announcement that the company was now hiring new moderators after 14 months of Musk’s ownership and deep cuts to trust and safety, raised the eyebrows of social platform experts and former Twitter employees.

    Theodora Skeadas, a former member of Twitter’s trust and safety team laid off by Musk in November 2022, says that even after making the hires Yaccarino boasted of, X will still be woefully understaffed for a major social platform. “Unless their technical systems for flagging and removing content have really improved, 100 is not enough,” says Skeadas. “And that seems unlikely because they’ve fired so many engineers.” X did not immediately respond to a request for comment.

    Bonfire of the Mods

    Shortly after acquiring Twitter in October 2022, Musk laid off nearly half of Twitter’s employees, making deep cuts into the trust and safety teams. Researchers and civil society organizations that had built relationships with the platform’s trust and safety teams in order to alert them to hateful or problematic content quickly found themselves without anyone left at the platform to contact.

    The platform was nearly banned in Brazil in the run-up to the country’s 2022 presidential runoff, after the country’s Electoral Court worried that Musk would allow election-related lies to spread. A team of academic researchers found that hate speech spiked after Musk took the helm, and last September, ahead of a historic election year, X fired five of the remaining trust and safety workers focused on combating mis- and disinformation.

    Skeadas says that before Musk took over, there were about 400 Twitter staff working on trust and safety, plus some 5,000 contractors who helped review content on the platform. Most of those staffers and more than 4,000 of the contractors were laid off.

    Even after the more than 10 percent increase in trust and safety staff that Yaccarino claimed, the platform likely still has far fewer people working on keeping users safe than it did before Musk’s takeover. There’s “no way” the company has more trust and safety staff than it did before Musk, Skeadas says. “If there were twenty people left and they hired two people, then that is a ten percent increase, but that’s still nothing compared to before,” she says.
