When working with datasets from PATHspider it’s very easy to max out your memory. While running an analysis recently, I found my computer locking up and becoming generally unresponsive. This new machine should have been able to cope, so this was odd. Then I realised I was running two Jupyter notebooks: the first had already loaded all the data into memory, so the second pushed me into swap. 6GB of memory got swapped out and the machine was near useless almost instantly.
Now that I have 24GB of RAM, swap isn’t actually useful to me. When I’m using too much memory, I’d rather the offending process be killed than have the whole machine grind to a halt in swap.
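The post doesn’t spell out the exact commands, but a minimal sketch of disabling swap on a typical Linux system (assuming swap is configured via `/etc/fstab`) might look something like this:

```shell
# Turn off all active swap devices and files immediately
sudo swapoff -a

# Stop swap coming back at the next boot by commenting out
# any swap entries in /etc/fstab (back up the file first)
sudo cp /etc/fstab /etc/fstab.bak
sudo sed -i '/\sswap\s/s/^#*/#/' /etc/fstab

# Verify nothing is swapped any more
swapon --show
free -h
```

With no swap available, the kernel’s OOM killer steps in when memory runs out and kills the process with the highest badness score, rather than letting the machine thrash.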
Hopefully now I won’t be able to accidentally lock up my machine again.