Just install Chrome or Firefox. Problem solved.
weak. compile them
Yup I max out 32GB building librewolf from source
compile in tmpfs
I compile them in swap and swap is of course Google Drive
and a vm or 2
I have 32GB and regularly fill both that and my swap space to the point where my system freezes up and I have to restart.
I am quite tabby though. And VS Code has become quite a memory hog, and I usually have several of those open too as I work across different projects.
Even though I use Linux a lot, I still have this misunderstanding: when I work for a long time with a lot of things open, my RAM fills up and never goes back down.
I heard it had to do with swap. Can you quickly explain why?
It's more likely caching. The kernel keeps files you opened earlier cached in RAM so they're ready immediately if you need them again, and it gives that memory back the moment an application actually needs it. Also, unused RAM is wasted RAM.
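You can see it for yourself: most of what looks "used" is reclaimable cache. A rough sketch, assuming you're on Linux (it just parses /proc/meminfo):

```python
# Rough sketch, Linux only: compare "free" vs "available" memory.
# Values in /proc/meminfo are reported in kB.
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key] = int(rest.split()[0])
    return info

m = meminfo()
print(f"Total:     {m['MemTotal'] // 1024} MiB")
print(f"Free:      {m['MemFree'] // 1024} MiB  (looks scary low)")
print(f"Cached:    {m['Cached'] // 1024} MiB  (page cache, dropped on demand)")
print(f"Available: {m['MemAvailable'] // 1024} MiB  (what programs can actually get)")
```

MemAvailable is usually way bigger than MemFree, which is exactly the "my RAM never goes down" effect.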
The extra space is for two Electron apps of your choice.
discord and microsoft teams 😍
You picked two of the crappiest apps ever.
That’s the point
Teams in browser is okay
Let’s start with one and see how it goes.
Just like the human eye can only register 60fps and no more, your computer can only register 4gb of RAM and no more. Anything more than that is just marketing.
Fucking /S since you clowns can’t tell.
That’s not sarcasm, it’s misinformation. Not surprising that people downvoted you even though it was just a joke.
That’s what makes it a joke. Does anyone here unironically think the human eye can only see 60 fps, or that more than 4 gigs of RAM is just marketing?
Human eye can’t see more than 1080p anyway, so what’s the point
It doesn’t matter honestly, everyone knows humans can’t see screens at all
This is only true if you’re still using a 32-bit CPU, which almost nobody is. 64-bit CPUs can address up to 16 million TB of RAM.
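That figure really is just the size of a 64-bit address space; a purely back-of-the-envelope check:

```python
# Size of a full 64-bit address space, expressed in terabytes.
print(2**64 / 1e12)    # ~18.4 million TB (decimal terabytes)
print(2**64 // 2**40)  # 16,777,216 TiB, i.e. the "16 million TB" figure
```

Actual CPUs wire up far fewer address bits than 64, but the architectural limit really is in the millions of terabytes.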
Sorry I forgot to put my giant /s.
Joke's on you, because I looked into this once. I don’t remember the exact number of milliseconds the light-sensitive rods in the human eye need to refresh their photochemical, but it worked out to about 70 fps, so roughly 13 ms I guess (1000 ms / 13 ms ≈ 77; the color-sensitive cones are far slower). Psycho-optical effects can push that up to around 100 fps on LCD displays, though. And it looks like you can train yourself, with certain computer tasks, to follow movements with your eyes, which makes you far more sensitive to flickering.
It’s not about training, eye tracking is just that much more sensitive to pixels jumping
You can immediately see choppy movement when you look around in a 1st person view game. Or if it’s an RTS you can see the trail behind your mouse anyway
I can see this choppiness at 280 FPS. The only way to get rid of it is to turn on strobing, but that comes with double images at certain parts of the screen
Just give me a 480 FPS OLED with black frame insertion already, FFS
Well, I don’t follow movements with my eyes (I jump straight to the target), and I see no difference between 30 and 60 FPS; I comfortably run Ark Survival on my iGPU at 20 FPS. And I’m still pretty good in shooters.
Yeah, it’s a shame that our current tech stack doesn’t allow updating only the parts of the image where something actually changes.
According to this study, the eye can see a difference at up to 500 fps. While that’s a specific scenario, it’s one that could plausibly happen in a video game, so I guess we can go up to around 500 Hz monitors before more becomes unnecessary.
Does that refresh take place across the entire eye simultaneously or is each rod and/or cone doing its own thing?
Are your eyeballs progressive scan or interlaced, son?
You’ve clearly never lived with a cat. Your metaphor is crushed by the Kitty Expansion Theory: No piece of furniture is large enough for a cat and any other additional being.
Caching be like
Caching do indeed be like.
The Kitty Expansion Theory is incomplete: any piece of furniture is large enough for both a cat and an additional being, provided the additional being was there first.
Now snap some pics of this kitty lying in different places all over the couch; you now have a new meme: Address Space Layout Randomization.
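And if you want to watch the kitty pick a new cushion, here's a tiny sketch (Linux only, and it assumes ASLR is enabled, which it is by default on most distros): run it twice and the heap/stack base addresses come out different each time.

```python
# Print the heap and stack mappings of this very process.
# Run the script twice: with ASLR on, the base addresses differ between runs.
with open("/proc/self/maps") as f:
    for line in f:
        if "[heap]" in line or "[stack]" in line:
            print(line.strip())
```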
As somebody with a System76 laptop, I’m feeling personally attacked.
I installed 64gb of ram on my gaming laptop and Chrome took all of it.
I genuinely don’t know how people get their web browser to use so much RAM. How many tabs do you have open? Even at work, where I run a commercial loan origination system and our core customer system in a web browser, I’ll have at most 15-20 tabs open. I don’t know how people have dozens and dozens of tabs open to the point that they’re using 64 GB of RAM.
4GB of RAM: load a model into llama.cpp
Explodes
Apple be like: our 4gb is like 16gb from others
That’s right. Price-wise.
My 2010 ARM board with 256MB of RAM, running openmediavault and minidlna for music streaming. Still lots of RAM left.
mount $HOME in tmpfs
OP doesn’t run applications, just an OS…
About 10 years ago I was like “FINE, clearly 512MB of memory isn’t enough to avoid swapping hell, I’ll get 1 GB of extra memory.” …and that was that!
These days I’m like “4 GB on a single board computer? Oh that’s fine. You may need that much to run a browser. And who’s going to run a browser regularly on a SBC? …oh I’ve done it a lot of times and it’s… fine.”
The thing I learned is that you can run a whole bunch of SHIT HOT server software on a system with less than a gigabyte of memory. The moment you run a web browser? FUCK ALL THAT.
And that’s basically what I found out long ago. I had a laptop that had like 32 megs of memory. Could be a perfectly productive person with that. Emacs. Darcs. SSH over a weird USB Wi-Fi dongle. But running a web browser? Can’t do Firefox. Opera kinda worked. Wouldn’t work nowadays, no. But Emacs probably still would.
Someone clearly doesn’t play Cities: Skylines with mods
My current 4-year-old laptop with 128GB of ECC RAM is wonderful and is used all the time for simulations, LLMs, ML modelling, and the real heavy lifter: Google Chrome.