

This is exactly what I wanted. I might make the switch now.
Alternate account for @[email protected]
Discord is a terrible format for managing large, complex communities and projects.
Terrible how, though? That’s exactly what it gets right. You have easy-to-set-up roles and channel access, onboarding experiences for people joining a larger server, a huge ecosystem of bots for various purposes, etc.
Okay, it’s bad that it isn’t indexable, but it’s good at what it does and it’s popular for a reason.
Articles like this constantly say “if only there was something we could do about it” while omitting the things people are actually doing about it, because the writer is too lazy to research properly.
Yeah, I remember voicing this concern when all online communities seemed to be moving to Discord, and people mostly laughed at me in response at the time.
Because there hasn’t been a single proper alternative until very recently, and even then they’re not as user-friendly.
Bookwyrm does have a feature to fetch book data from OpenLibrary, which is detailed enough. The problem, last I checked, is that the feature doesn’t replace the data of already-existing books, so even if a book exists in OL, the empty listing in Bookwyrm won’t be updated.
If that feature worked properly, or if the instance admin updated to the newest OpenLibrary database, it should work fine.
How does it compare to the regular DeepSeek distills, though?
The Middle East is very big, so you’ll need to be more specific. I live in the ME and it works fine on my end.
I’ll take a guess that they are going back to Reddit so they can control what gets posted and shown to others. They got railed on Mastodon following the drama, which is probably why they’re leaving.
It’s absolutely positioned to be a cheap AI PC. The Mac Studio started gaining popularity due to its shared RAM, Nvidia responded with their home server thing, and now AMD is responding with this.
It being way cheaper and potentially faster is huge. The bad news is that it’s probably going to be scalped and out of stock for some time.
Windows/Linux laptops with 128 GB of unified memory are gonna be soooooo good.
I just hope the other laptops coming out with the 395 won’t be as ridiculously priced.
Tom Cruise stunt double checking in
TL;DR: Lemmy was originally created as a Reddit alternative where communists could talk freely after they got tired of Reddit. Both .ml and Lemmygrad were created by the founders and were by far the most popular instances until we moved here.
Despite terrible moderation and really questionable activity, the devs never called .ml the official instance and encouraged people to spread out to other instances. It’s still known as the tankie instance today.
I understand it well. It’s still relevant to mention that you can run the distilled models on consumer hardware if you really care about privacy. 8 GB+ of VRAM isn’t crazy, especially if you have a ton of unified memory on MacBooks or on some of the Windows laptops releasing this year with 64+ GB of unified memory. There are also websites re-hosting various versions of DeepSeek, like Hugging Face hosting the 32B model, which is good enough for most people.
Instead, the article is written as if there is literally no way to use DeepSeek privately, which is just wrong.
DeepSeek is open source, meaning you can modify code on your own app to create an independent — and more secure — version. This has led some to hope that a more privacy-friendly version of DeepSeek could be developed. However, using DeepSeek in its current form — as it exists today, hosted in China — comes with serious risks for anyone concerned about their most sensitive, private information.
Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws, meaning that the Chinese government can demand access at any time.
What??? Whoever wrote this sounds like he has zero understanding of how it works. There is no “more privacy-friendly version” that could be developed; the models are already out, and you can run the entire model 100% locally. That’s as privacy-friendly as it gets.
“Any model trained or operated on DeepSeek’s servers is still subject to Chinese data laws”
Operated, yes. Trained, no. The model is MIT-licensed; China has nothing on you when you run it yourself. I expect better from a company whose whole business is built on privacy.
Yup, I spotted a few Lemmy accounts that were less than a week old recently. Very nice.
This place just needs better moderation. I’ve said this multiple times before, but there is a serious lack of moderation, and most admins go dark for long periods of time. Make it clear this behavior isn’t okay, ban people who run their mouths, and remove low-effort posts.
I literally just got called a snowflake in another thread for saying people should stop posting US politics in general communities. People still wonder why Lemmy has a bad reputation even within the wider Fediverse… Sometimes I wonder why I still bother here.
Everything about this looks like it’s part of a failed theme park that they salvaged and are trying to sell as a home
The slide next to the main stairs gets me every time.
It’s because you can actually have discussions on Lemmy, whereas microblogging like Mastodon is just “old man shouts at cloud” multiplied by 2 million people. I never understood the appeal.
My biggest issue with Piefed is how much space the UI uses. Last I checked, it didn’t have a “compact mode” like current Lemmy or Alexandrite does. Browsing communities is also a bit awkward, since it shows you so many topics without a way to sort or remove them.