Dear Ben,

I am running on CentOS 7 machines; I can see el7.x86_64 from inside the job. The program does write files, but only about 200-300 MB. It definitely does not look like I am using more than a few hundred megabytes of memory.

Cheers.

On Tue, Mar 5, 2024 at 5:22 PM Ben Jones <ben.dylan.jones@xxxxxxx> wrote:

Do you know if you're running on an el9 machine or similar? I think that on cgroup v2 machines the memory reporting currently includes the file cache. Are you writing a large file?
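(For reference, the cgroup accounting Ben describes can be inspected from inside a job. The following is a minimal sketch, assuming a cgroup v2 host; the paths and counter names differ under cgroup v1, which is what an el7 machine would use, so treat it as illustrative only.)

```python
# Minimal sketch: report how much of the cgroup's charged memory is anonymous
# memory versus file (page) cache. Assumes a cgroup v2 unified hierarchy;
# on cgroup v1 (el7) the files live under /sys/fs/cgroup/memory/ with
# different names.
from pathlib import Path

def own_cgroup_dir() -> Path:
    # On cgroup v2, /proc/self/cgroup contains a single line "0::/some/path".
    rel = Path("/proc/self/cgroup").read_text().strip().split("::")[-1].lstrip("/")
    return Path("/sys/fs/cgroup") / rel

def memory_breakdown() -> dict:
    cg = own_cgroup_dir()
    stats = {}
    for line in (cg / "memory.stat").read_text().splitlines():
        key, value = line.split()
        stats[key] = int(value)
    stats["current"] = int((cg / "memory.current").read_text())
    return stats

if __name__ == "__main__":
    mem = memory_breakdown()
    mib = 2 ** 20
    print(f"charged to cgroup : {mem['current'] / mib:.1f} MiB")
    print(f"anonymous memory  : {mem.get('anon', 0) / mib:.1f} MiB")
    print(f"file cache        : {mem.get('file', 0) / mib:.1f} MiB")
```

If the file-cache line accounts for most of the total, that would be consistent with the reporting behaviour described above rather than with a genuine leak.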
On 5 Mar 2024, at 17:08, Angel Campoverde <angelfcampoverde@xxxxxxxxx> wrote:
Dear Condor experts,
I am running a job that seems to be going beyond 8 GB of memory usage. However, when I run it with memray (a memory profiler for Python projects), I see that my memory usage is only 107 MB, as you can see in the screenshot below:
[screenshot: memray output reporting ~107 MB of memory usage]
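(For context, memray can be attached either from its command line or through its Python API. The snippet below is a hypothetical sketch of the latter; the entry-point name and output path are placeholders, not taken from the original job.)

```python
# Hypothetical sketch of wrapping a job's entry point with memray's tracker.
# "run_analysis" and the output file name are placeholders for the real workload.
import memray

def run_analysis():
    # Stand-in for the actual job; allocates a modest amount of memory.
    data = [i * i for i in range(1_000_000)]
    return sum(data)

if __name__ == "__main__":
    with memray.Tracker("memray-output.bin"):
        run_analysis()
    # The capture file can then be rendered with, for example:
    #   memray flamegraph memray-output.bin
```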
I have contacted the Python mailing list:
and despite trying to turn off multithreading (one common approach is sketched below), I still get this high memory usage. Why is this happening, and how can I fix it? I cannot use 16 GB of memory in my jobs; very few machines have that much memory.
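(One common way to turn off multithreading in numerical Python stacks is to cap the thread-pool environment variables before the heavy imports. This is only a sketch of that approach; whether BLAS/OpenMP thread pools are actually the source of the extra memory in this job is an assumption.)

```python
# Sketch: cap common thread-pool sizes before importing numerical libraries.
# This only helps if the extra memory comes from per-thread arenas or
# BLAS/OpenMP worker pools; that is an assumption, not a confirmed cause here.
import os

for var in ("OMP_NUM_THREADS", "OPENBLAS_NUM_THREADS",
            "MKL_NUM_THREADS", "NUMEXPR_NUM_THREADS"):
    os.environ.setdefault(var, "1")

# Heavy numerical imports (numpy, etc.) should happen only after this point.
```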
Cheers.