It took me a while to get a clearer picture of what seems to be happening, and I can only describe the symptoms, not the cause. I'm not sure whether this is normal behavior on Linux, but I can't see any reason for it. Here is the problem:
My machine has 9 GB of DDR3 memory (triple-channel, hence the odd amount) plus 8 GB of swap. Typically about 2 GB of RAM and 0 GB of swap are in use. Everything is 64-bit, including the processes involved.
If I open a process that uses a lot of RAM (anything over roughly 1 GB triggers this visibly), the system becomes very slow and some applications start to crash. The victims seem to be processes that themselves use a bit more memory (even around 300 MB seems enough to make them targets). Even after I close the process eating over 1 GB of RAM, the system stays slow for a while, which I assume is it freeing and reorganizing memory.
My most common test case is Blender 3D (the memory eater) and Firefox (the usual victim). If I open a blend file that makes Blender use more than 2 GB of RAM, Firefox occasionally crashes, and I think plasma-desktop does too. Opening and using other small applications (such as Dolphin or KWrite) is also very slow.
I'm confused because this is the behavior I'd expect when memory is full, yet it happens when the RAM is barely half used. I have 9 GB, of which about 2 GB are normally in use, so a new process eating 2 GB only brings memory usage up to 4 GB (leaving 5 GB free). Still, the system acts as if memory were full and barely copes. I don't remember seeing this back when I ran Windows on the same hardware either.
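To double-check those numbers while reproducing the problem, a quick sketch like the following should work (it's just an assumption on my part that watching the standard /proc/meminfo fields is enough to tell whether swap is actually being touched; it prints free RAM, available RAM, and swap usage once per second):

    #!/usr/bin/env python3
    # Quick sketch: print free RAM, available RAM and swap usage once per
    # second while reproducing the slowdown, using the standard
    # /proc/meminfo fields (values there are reported in kB).
    import time

    def meminfo():
        info = {}
        with open("/proc/meminfo") as f:
            for line in f:
                key, rest = line.split(":", 1)
                info[key] = int(rest.strip().split()[0])
        return info

    while True:
        m = meminfo()
        swap_used = m["SwapTotal"] - m["SwapFree"]
        print("MemFree: %5d MB  MemAvailable: %5d MB  SwapUsed: %5d MB" % (
            m["MemFree"] // 1024,
            m.get("MemAvailable", 0) // 1024,  # MemAvailable needs kernel 3.14+
            swap_used // 1024))
        time.sleep(1)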
Why does the system get so slow and unstable at this level of RAM usage? Are there any safe ways to tweak how the kernel handles memory? Could a BIOS setting be the cause, or perhaps a Linux issue with triple-channel RAM?