
Re: [HTCondor-users] d/l job logs from remote submit / schedd node with py bindings?



> This sounds like what I'm looking for, but I'm not sure how I would do this with the Python bindings - could you elaborate?

The API is perhaps a little clunky, but the general sequence is as follows:


import htcondor2

# Submit the job.
schedd = htcondor2.Schedd()
submit = htcondor2.Submit(...)
result = schedd.submit(submit, spool=True, count=1)

# Send the input files.
schedd.spool(result)

# Wait for the job to finish.
# (Code left as an exercise for the reader.)
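# One way to do that exercise is a simple polling loop (untested sketch):
# spooled jobs stay in the queue as Completed (JobStatus == 4) once they
# finish, so wait for that state before retrieving.
import time

while True:
    ads = schedd.query(
        constraint=f"ClusterId == {result.cluster()}",
        projection=["JobStatus"],
    )
    if ads and all(ad["JobStatus"] == 4 for ad in ads):
        break
    time.sleep(30)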

# Retrieve the output files.
schedd.retrieve(result.cluster())


As usual, the job's submit file specifies which files constitute the output;
I don't recall where in the process, if at all, the remaps take place, but
`retrieve()` should behave identically to the `condor_transfer_data`
command-line tool in that respect. (The default, IIRC, is to write the
output files to the directory the tool is invoked in.)
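
For completeness, a submit description with output transfer and a remap
might look something like this (the file names here are made up, just to
show the syntax):


submit = htcondor2.Submit({
    "executable": "run.sh",
    "should_transfer_files": "YES",
    "when_to_transfer_output": "ON_EXIT",
    # Only the files listed here come back with retrieve().
    "transfer_output_files": "out.dat",
    # Remap syntax is "source = destination"; note the embedded quotes.
    "transfer_output_remaps": '"out.dat = results/out.dat"',
    "output": "job.out",
    "error": "job.err",
    "log": "job.log",
})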

-- ToddM