Re: [condor-users] controlling number of jobs running at once
- Date: Wed, 3 Dec 2003 00:38:36 -0600
- From: Erik Paulson <epaulson@xxxxxxxxxxx>
- Subject: Re: [condor-users] controlling number of jobs running at once
On Tue, Dec 02, 2003 at 12:08:48PM +0000, Paul Wilson wrote:
>
> Alternatively, but requiring some pre-submission admin work:
> condor dagman can do this sequential submission for you.
> You'd have to create 540 submit files, one for each run, then specify
> them in the DAG file you pass to condor_submit_dag and have something like:
>
> Parent a b c d e f g h i j Child k l m n o p q r s t
> Parent k l m n o p q r s t Child u v w x y z aa bb cc dd
> and so on until you have chained together all 54 batches of ten jobs.
>
> This way, the next ten jobs would only be submitted to the schedd when
> the previous ten have finished.
> Also, you could use DAGMan's PRE/POST script feature to automate any file
> admin you want to do between each batch of ten jobs.
>
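Spelled out, that chained approach is a DAG file along these lines (the node
names and the script name below are only placeholders, not anything from an
actual setup):

Job a a.submit
Job b b.submit
<...>
Job t t.submit
# the second batch of ten starts only after the first batch finishes
Parent a b c d e f g h i j Child k l m n o p q r s t
# an optional POST script on a node can handle file admin between batches
Script Post j cleanup_batch1.sh

with one Job line per submit file and one Parent/Child line per batch boundary.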
There's an easier way to do this with DAGMan - DAGMan can "throttle"
the number of jobs submitted at any given time.
If you create a flat DAG (i.e., something like
job one one.submit
job two two.submit
<...>
job five_hundred five_hundred.submit
with no parent/child relationships)
and then use
condor_submit_dag -maxjobs 25 everything.dag
DAGMan will not submit any more than 25 jobs at a time, and when there's
room, it will submit more.
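Generating the flat DAG is easy to script; a minimal shell sketch (the run$i
numbering and submit file names are assumptions, not from this thread, and the
submit files are assumed to exist already) would be:

# write one "job" line per submit file into a flat DAG
rm -f everything.dag
for i in `seq 1 540`; do
  echo "job run$i run$i.submit" >> everything.dag
done
# let DAGMan keep at most 25 jobs submitted at any one time
condor_submit_dag -maxjobs 25 everything.dag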
-Erik
Condor Support Information:
http://www.cs.wisc.edu/condor/condor-support/
To Unsubscribe, send mail to majordomo@xxxxxxxxxxx with
unsubscribe condor-users <your_email_address>