On Thu, 22 Jul 2010, Carsten Aulbert wrote:
> Quite a few of our users need to read data from our data servers, and of course they would not like to thrash those servers by having too many jobs hit them at the same time.
I've been following this thread, but haven't had a chance to comment until now. If I understand the problem correctly, it could be solved by being able to force two consecutive DAG node jobs to run on the same machine, right? (In other words, you'd have a bunch of transfer/process job pairs, and each pair would be forced to run on the same machine.) You could then throttle the data transfer jobs with DAGMan's category throttles, which would let you control the load on the server; see the sketch below.
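For example, a DAG along these lines would cap the number of simultaneous transfer jobs (the node names, submit file names, and the throttle value of 4 are just placeholders for illustration):

    # Each transfer/process pair; the process node runs after its transfer node.
    JOB TransferA transfer_A.sub
    JOB ProcessA  process_A.sub
    PARENT TransferA CHILD ProcessA
    CATEGORY TransferA DataXfer

    JOB TransferB transfer_B.sub
    JOB ProcessB  process_B.sub
    PARENT TransferB CHILD ProcessB
    CATEGORY TransferB DataXfer

    # At most 4 jobs in the DataXfer category run at once,
    # which limits the load on the data server.
    MAXJOBS DataXfer 4

The missing piece is keeping each pair on the same machine, which is what the ticket below is about.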
You can take a look at our thoughts on this (gittrac #572, or https://condor-wiki.cs.wisc.edu/index.cgi/tktview?tn=572,4).
Kent Wenger
Condor Team