Giver of skulls

  • 0 Posts
  • 304 Comments
Joined 102 years ago
Cake day: June 6th, 1923

  • Mastodon is just one of many applications that use AP for their own custom purposes. MissKey and derived software have an emoji reaction feature for posts that’s basically unimplemented anywhere else. Lemmy’s boosting trick for syncing comments makes interoperability with timeline-based social media a spamfest (roughly sketched below).

    Maybe I should check again, but last time I looked into it there were no commonly used ActivityPub-compliant servers. Everyone does their own thing just a little differently to make the protocol work for their purposes. Even similar tools (see: MissKey/Mastodon, Lemmy/Kbin) took a while to actually interoperate.

    As far as I can tell, the idea behind the original design, where servers are mostly content agnostic and clients decide on rendering content in specific ways, hasn’t been executed by anyone; servers and clients have been mixed together for practical reasons and that’s why we get these issues.
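
    For illustration, here’s roughly the shape of the Announce wrapper a Lemmy community actor uses to push a comment out to its subscribers. The exact fields are my own guess and vary between versions, but it shows why a timeline-oriented server that renders every Announce as a boost turns synced comments into a boost flood:

    ```python
    # Hypothetical sketch of a Lemmy-style comment sync event. The community
    # actor wraps the author's Create in an Announce so every subscribed
    # instance receives the comment; a microblogging server that renders
    # Announce as "X boosted Y" shows each synced comment as a boost.
    announce = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Announce",
        "actor": "https://lemmy.example/c/some_community",  # community actor (made-up URL)
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
        "object": {
            "type": "Create",
            "actor": "https://lemmy.example/u/some_user",
            "object": {
                "type": "Note",
                "content": "<p>A perfectly normal comment</p>",
                "inReplyTo": "https://lemmy.example/post/123",
            },
        },
    }
    ```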


  • Building trust is hard. It’s easier to trust a few companies than to trust a million unknown servers. It’s why I prefer Wikipedia over amazingnotskgeneratedatalltopicalinformarion.biz when I’m looking up simple facts.

    Furthermore, Facebook isn’t selling data directly. At least, not if they’re following the law. They got caught and fined for doing that once, and it’s not their main mode of operation. Like Google’s, their data is their gold mine; selling it directly would be corporate suicide. They simply provide advertisers with spots to put an ad, but when it comes to data processing, they’re doing all the work before advertisers get a chance to look at a user’s profile.

    On the other hand, scraping ActivityPub for advertisers would be trivial. It’d be silly to go through the trouble of setting up something like Threads if all you want is information; a basic AP server that follows every Lemmy community and soaks up gigabytes an hour can be written as a weekend project (something like the sketch at the end of this comment).

    Various Chinese data centers are scraping the hell out of my server, and they carry referer headers from other Fediverse servers. I’ve blocked half of East Asia and new IP addresses keep popping up. Whatever data you think Facebook may be selling, someone else is already selling based on your Fediverse behaviour. Whatever Petal Search and all the others are doing, I don’t believe for a second they’re being honest about it.

    Most Fediverse software defaults to federation and accepting inbound follow requests. At least, Mastodon, Lemmy, GoToSocial, Kbin, and one of those fish-named Mastodon-likes did. Profiles are often public by default too. The vulnerability applies to a large section of the Fediverse running default settings.

    I’d like to think people would switch to the Fediverse despite the paradigm shift. The privacy risks are there even when there’s only one company managing them, so I’d prefer it if people used appropriate tools for sharing private stuff. I think platforms like Circles (a Matrix-based social media system), which leverage encryption to ensure nobody can read things they shouldn’t be able to, are much more appropriate. Perhaps a similar system can be laid on top of ActivityPub as well (after all, every entity already has a public/private key pair).
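
    To put the “weekend project” claim in perspective, here’s a hedged sketch of the core of such a scraper: a bare inbox endpoint that dumps every activity it receives to disk. Sending the actual Follow requests needs HTTP signature signing, which I’ve left out, and every name and path here is made up for illustration:

    ```python
    # Minimal sketch of an ActivityPub "soak it all up" inbox (illustrative only).
    # After signed Follow activities are accepted by community/actor inboxes
    # (not shown), every federated activity arrives here as a plain POST.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class InboxHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)
            # Append the raw activity to a log; no validation, no rendering, just hoarding.
            with open("scraped-activities.jsonl", "ab") as log:
                log.write(body + b"\n")
            self.send_response(202)  # pretend to be a well-behaved server
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), InboxHandler).serve_forever()
    ```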


  • I don’t think dansup was in the wrong here. Yes, it’s a security issue I suppose, but the problem lies within the underlying protocol. Any server you interact with can ignore any privacy markers you add to posts (see the example below); it’s just not supposed to.

    Whether this is a 0day depends on what you expect out of the Fediverse. If you treat it like a medium where every user or server has the potential to be hostile, like you probably should, this is a mere validation logic bug. If you treat it like the social media many of its servers are trying to be, it’s a gross violation of your basic privacy expectations.
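
    To make “privacy markers” concrete: as far as I understand the protocol, a followers-only post is just a regular object addressed to your followers collection instead of the public collection, and honouring that addressing is entirely up to the receiving server. The URLs below are made up:

    ```python
    # Illustrative shape of a "followers-only" post (core fields per ActivityStreams;
    # implementation-specific extras omitted). Nothing technical stops a receiving
    # server from storing, indexing, or republishing it.
    followers_only_note = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Note",
        "attributedTo": "https://example.social/users/alice",
        "to": ["https://example.social/users/alice/followers"],  # not addressed to Public
        "cc": [],
        "content": "<p>Meant for followers only, enforced by politeness alone.</p>",
    }
    ```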


  • This is exactly why ActivityPub makes for such a mediocre replacement for the big social media apps. You have to let go of any assumptions that at least some of your data remains exclusive to the ad algorithm and accept that everything you post or look at or scroll past is being recorded by malicious servers. Which, in turn, kind of makes it a failure, as replacing traditional social media is exactly what it’s supposed to do.

    The Fediverse also lacks tooling to filter out the idiots and assholes. That kind of moderation is a lot easier when you have a centralised database and moderation staff on board, but a network of tiny servers, each with its own moderation capabilities, will promote the worst behaviour as much as the best.

    But really, the worst part is the UX for apps. Fediverse apps suck at setting expectations. Of course Lemmy publishes which posts you’ve upvoted; that’s essential for how the protocol works (see the example below), but what other Reddit clone has a public voting history? The same goes for anyone using any form of the word “private” or even “unlisted”, as those only apply in a perfect world where servers have no bugs and where there are no malicious servers.
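
    For reference, an upvote federates as a Like activity tied to your account, and every instance that receives it can log who voted on what. The shape below is my approximation of the general form, not Lemmy’s literal output:

    ```python
    # Approximate shape of a federated upvote. Any receiving server can record the
    # actor/object pair and rebuild a user's full voting history from these.
    upvote = {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Like",
        "actor": "https://lemmy.example/u/some_user",  # the voter, not anonymised
        "object": "https://lemmy.example/post/123",    # the post being upvoted
    }
    ```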


  • This issue/this issue at the corporate Element fork seem to indicate that the process is entirely manual. As in: make a client-side backup per account, make all accounts leave all rooms, log out all sessions, kill Dendrite, start Synapse, recreate the accounts, restore the backups, pray.

    Even if you don’t migrate the data, you still need to deactivate all the accounts (roughly sketched below). If you can’t keep the host’s signing key material, you also need to make sure all servers know that all users have left, or you’ll start getting tons of phantom events and possibly even make existing rooms impossible to join.

    As for client backups: I know Element can do backups/data dumps, but I’m not sure about restoring.

    Alternatively, you may be able to write SQL migrations to convert Dendrite to Synapse, but that sounds like absolute developer agony.
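
    As a rough sketch of what that per-account cleanup could look like against the standard Matrix client-server API, assuming you can obtain an access token for each user (the homeserver URL is a placeholder, and error handling, rate limits, and the user-interactive auth that deactivation usually requires are all glossed over):

    ```python
    # Hedged sketch: leave every joined room, then deactivate the account.
    import requests

    HOMESERVER = "https://matrix.example.org"  # placeholder

    def clean_up_account(access_token: str) -> None:
        headers = {"Authorization": f"Bearer {access_token}"}

        # 1. Leave every room the account is still joined to.
        joined = requests.get(f"{HOMESERVER}/_matrix/client/v3/joined_rooms",
                              headers=headers).json()["joined_rooms"]
        for room_id in joined:
            requests.post(f"{HOMESERVER}/_matrix/client/v3/rooms/{room_id}/leave",
                          headers=headers, json={})

        # 2. Deactivate the account; on most servers this also invalidates any
        #    remaining sessions, so a separate logout/all isn't strictly needed.
        requests.post(f"{HOMESERVER}/_matrix/client/v3/account/deactivate",
                      headers=headers, json={"erase": False})
    ```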


  • Heavy moderation. Some real people will be flagged as bots, but it’s the only way; a well-prepared prompt can be indistinguishable from human language. Registration questions only block bots in the sense that they stop bot makers from registering thousands of accounts at once.

    Servers with lax signup requirements will need to be blocked (temporarily?) when spammers find their way in. Detecting links and other promoted content can easily get rid of the commercial spammers (a trivial version of that check is sketched at the end of this comment), but it gets harder when you’re trying to block political bot farms. Even so, Fediverse software lacks any kind of standard “firewall” against spam, as a few Japanese teenagers demonstrated with a script not that long ago, and the trolls spamming child porn before them.

    On the wider Fediverse, it’s quite possible to set up a Lemmy server consisting entirely of bots, or a whole network of them, first only interacting with each other but soon becoming part of the wider Fediverse.

    Luckily, almost nobody cares about the Fediverse and the few bad actors I’ve seen don’t care enough to actually put in any effort. If the Fediverse does ever grow to become a sizable influence on the internet, we’re pretty much fucked.
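
    For what it’s worth, the “detect links and promoted content” part really is the easy bit; a toy heuristic along these lines (thresholds entirely invented) catches most commercial spam, and it’s the political and LLM-driven stuff that needs real moderators:

    ```python
    # Toy commercial-spam heuristic (illustrative only; thresholds are made up).
    # Flags posts from very young accounts that are mostly links, or that repeat
    # a URL the account has already been posting. Expects timezone-aware datetimes.
    import re
    from datetime import datetime, timezone

    URL_RE = re.compile(r"https?://\S+")

    def looks_like_commercial_spam(content: str, account_created: datetime,
                                   previously_posted_urls: set[str]) -> bool:
        urls = URL_RE.findall(content)
        account_age_days = (datetime.now(timezone.utc) - account_created).days
        mostly_links = bool(urls) and len(" ".join(urls)) > 0.5 * max(len(content), 1)
        repeat_offender = any(url in previously_posted_urls for url in urls)
        return (account_age_days < 2 and mostly_links) or repeat_offender
    ```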