On 04/08/2015 01:03 AM, Bjørn Roald wrote:
> On 07. april 2015 22:33, Vladimir Prus wrote:
>> On 04/07/2015 10:42 PM, Bjørn Roald wrote:
>>> Also, in principle, bjam could check available free RAM before
>>> invoking a new parallel task. I think -j 4 does not mean it _has_ to
>>> run 4 tasks in parallel; rather, it means up to 4 tasks in parallel.
>>> An implicit -j certainly should throttle on system resources, so why
>>> not on available RAM as well as on available cores?
>>
>> I don't think it's easy. If I run 4 compilations in parallel and they
>> consume so much RAM and I/O that the computer becomes unresponsive,
>> it means the OS could not throttle these tasks effectively.
>
> The OS cannot throttle a running process' greediness for RAM, at least
> I don't think so. It could prevent new processes from starting, but
> that is also tricky for the OS in a general sense. It is, however,
> trivial for a build system when deciding whether it should start
> additional parallel compiler invocations, which are purely optional
> tuning for build speed. It makes no sense to start additional
> compilations in parallel if you see that physical RAM is exhausted.
> The tricky part is knowing when to stop adding parallel tasks, so as
> to avoid getting into an exhausted-RAM state in the first place, while
> hopefully still leaving ample room for the rest of the system to live.
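The RAM-aware throttling Bjørn describes could look roughly like the sketch below: before launching each optional extra job, consult the kernel's view of available memory. This is only an illustration, not anything bjam actually does; the function names, the per-job memory estimate, and the system reserve are hypothetical tuning knobs, while the /proc/meminfo format is the real Linux one.

```python
# Sketch of RAM-aware job throttling for a build scheduler.
# Hypothetical names and thresholds; not bjam's actual behaviour.

def parse_meminfo(text):
    """Parse /proc/meminfo-style text into a dict of kB values."""
    info = {}
    for line in text.splitlines():
        key, _, rest = line.partition(":")
        fields = rest.split()
        if fields:
            info[key.strip()] = int(fields[0])  # value is in kB
    return info

def may_start_extra_job(meminfo, est_job_kb=500_000, reserve_kb=1_000_000):
    """Allow an optional extra compile job only if the estimated job
    footprint plus a reserve for the rest of the system still fits."""
    # MemAvailable (Linux >= 3.14) already accounts for reclaimable
    # buffers/cache; fall back to MemFree on older kernels.
    avail = meminfo.get("MemAvailable", meminfo.get("MemFree", 0))
    return avail >= est_job_kb + reserve_kb

sample = """MemTotal:       8000000 kB
MemFree:         142000 kB
MemAvailable:   2500000 kB"""
print(may_start_extra_job(parse_meminfo(sample)))  # True: ~2.5G available
```

As the thread notes, the hard part is not reading the numbers but choosing the estimate and reserve so the scheduler stops *before* the machine is in trouble.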
On my system right now, there's 142M of free memory (and a similar amount of buffers). That does not sound sufficient for any compilation these days, yet a -j4 build is OK, since the OS discards most of that cached memory and swaps the rest quite easily. I am not sure we can answer the "is there enough RAM" question reliably. Likewise, the "will 4 jobs cause too much I/O" question is not easy to answer.

- Volodya

--
Vladimir Prus
CodeSourcery / Mentor Embedded
http://vladimirprus.com
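Vladimir's observation, that raw "free" memory understates what the OS can actually hand to new processes, can be put into numbers. The 142M free and 142M of buffers come from the message above; the page-cache figure is a made-up example, and the sum is only the rough pre-MemAvailable heuristic, not an exact kernel accounting:

```python
# Why "free" alone answers "is there enough RAM?" badly: buffers and
# page cache are reclaimable, so the kernel can hand most of them to a
# new compilation on demand. Figures in MB; free and buffers echo the
# message above, the cache figure is hypothetical.
free_mb = 142
buffers_mb = 142      # "similar amount of buffers"
cached_mb = 1800      # hypothetical reclaimable page cache

# Rough heuristic: free + reclaimable buffers/cache.
effectively_available_mb = free_mb + buffers_mb + cached_mb
print(effectively_available_mb)  # 2084
```

Even this sum ignores swap behaviour and non-reclaimable cache, which is part of why the question is hard to answer reliably.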