Hi all,
I want to confirm with you about the memory size exposed to Condor jobs.
My understanding is that the memory size exposed to Condor jobs equals (MEMORY - RESERVED_MEMORY)
if MEMORY is defined in Condor's config file, or (physical memory detected by Condor - RESERVED_MEMORY)
if MEMORY is not set. In both cases, RESERVED_MEMORY is treated as zero if it is not defined
in the config file. For example, if a machine's physical memory is 512MB, MEMORY is set
to 396MB, and RESERVED_MEMORY is unset, the maximum memory the Condor jobs can use is 396MB.
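The rule described above can be sketched in a few lines (a minimal illustration only; the function and parameter names are hypothetical stand-ins, not Condor's actual internals):

```python
def reported_memory(memory=None, reserved_memory=None, detected=512):
    """Compute the memory (in MB) Condor would report for a machine.

    memory          -- MEMORY from the config file, or None if unset
    reserved_memory -- RESERVED_MEMORY from the config file, or None if unset
    detected        -- physical memory Condor detects (stand-in value)
    """
    # Use MEMORY if configured, otherwise fall back to detected physical memory
    base = memory if memory is not None else detected
    # RESERVED_MEMORY defaults to zero when unset
    reserved = reserved_memory if reserved_memory is not None else 0
    return base - reserved

# The example above: 512MB physical, MEMORY = 396, RESERVED_MEMORY unset
print(reported_memory(memory=396))  # -> 396
```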
Thanks.
Xuehai
Condor does NOT enforce any VM-related splits...
A process started by Condor can use as much memory/disk etc. as the OS
will give it.
In the case of memory this is almost always "as much as it wants, up to
the OS limit".
These values just alter what condor *reports* to the outside world.
If your job's requirements indicate that it needs more than
the reported value, it will not bother using that machine...
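To illustrate the matching point (a sketch of a submit-file fragment; `Memory` is the machine ClassAd attribute Condor advertises, and the threshold is just the example figure from above):

```
# job submit file fragment (sketch)
Requirements = (Memory >= 512)
```

A job with this requirement would never match the example machine, since it reports only 396MB, even though a process on that machine could in practice use more.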