Re: [HTCondor-users] d/l job logs from remote submit / schedd node with py bindings?
- Date: Wed, 1 Oct 2025 02:17:29 +0000
- From: Jaime Frey <jfrey@xxxxxxxxxxx>
- Subject: Re: [HTCondor-users] d/l job logs from remote submit / schedd node with py bindings?
> On Sep 30, 2025, at 4:16 PM, Gavin Price <gaprice@xxxxxxx> wrote:
>
> Hi Jaime, thanks for the response.
>
> On 9/30/25 13:56, Jaime Frey via HTCondor-users wrote:
>> The job submission interface (whether python bindings or condor_submit command-line tool) assumes that all files and directories mentioned in your submit description are accessible where you're submitting from.
>
> For my case, I'm making the executable and input files available via http d/l by HTC
>
>> 1) The schedd has access to the files/directories (i.e. it's running on the same machine).
>
> The schedd is not running on the same machine as the process using the python bindings, but the job successfully completes due to the above.
>
>> 2) You will use the spool option, in which case the submit operation includes reading the files locally and transferring them over the network to the schedd. After the job completes, you can retrieve the output files over the network from the schedd.
>
> This sounds like what I'm looking for but I'm not sure how I would do this with the python bindings - could you elaborate?
This code snippet shows how to do a submit with file spooling:

import htcondor

sub = htcondor.Submit({ ... })
schedd = htcondor.Schedd()

# Submit with spooling: the submit files are read locally and
# transferred over the network to the schedd.
submit_result = schedd.submit(sub, spool=True)
schedd.spool(submit_result)

# wait for job to complete

# Fetch the output files back from the schedd's spool directory.
schedd.retrieve(submit_result.cluster())
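One way to fill in the "wait for job to complete" step is to poll the schedd's job queue until every job in the cluster reaches JobStatus 4 (Completed); spooled jobs remain in the queue in that state until their output is retrieved. A minimal sketch, assuming a schedd-like object with the bindings' query() method; the wait_for_completion helper, its timeout parameter, and the polling strategy are illustrative, not part of the HTCondor API:

```python
import time

COMPLETED = 4  # JobStatus value for "Completed" in the job ClassAd

def wait_for_completion(schedd, cluster_id, poll_interval=30, timeout=None):
    """Poll the schedd until all jobs in cluster_id are Completed.

    Returns True when every job reports JobStatus == 4, or False if the
    optional timeout (in seconds) elapses first.
    """
    waited = 0
    while True:
        # Project only the attribute we need to keep the query cheap.
        ads = schedd.query(
            constraint=f"ClusterId == {cluster_id}",
            projection=["JobStatus"],
        )
        if ads and all(ad["JobStatus"] == COMPLETED for ad in ads):
            return True
        if timeout is not None and waited >= timeout:
            return False
        time.sleep(poll_interval)
        waited += poll_interval
```

With the spooled submit above, this would be called between schedd.spool(...) and schedd.retrieve(...), e.g. wait_for_completion(schedd, submit_result.cluster()).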
The command-line tool equivalent is this:
condor_submit -spool job.description
condor_transfer_data <cluster-id>
- Jaime