
[HTCondor-users] Dataflow job skips when executable is updated



Dear all,

I just discovered the massively useful "skip_if_dataflow" submit option (how did I miss this before?). Its documentation says the job will be skipped only if its outputs are newer than both its inputs and its executable. This works correctly for the inputs, but when I touch the executable the job is still skipped.
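
For reference, a minimal submit description along the lines of what I'm running (file names are just placeholders):

    executable              = process.sh
    arguments               = data.in data.out
    transfer_input_files    = data.in
    transfer_output_files   = data.out
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    skip_if_dataflow        = true
    queue

Touching data.in makes the job run again as expected; touching process.sh does not.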

A related item for the wish list: with "transfer_output_files = dir", if directory 'dir' already exists, its timestamp isn't updated when Condor transfers it back (at least on my file system), so the job will never be skipped.
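
Concretely, something like this (again placeholder names), where process.sh writes its results into results_dir/:

    transfer_output_files = results_dir
    skip_if_dataflow      = true

Since results_dir already exists from the previous run and only the files inside it are refreshed on transfer, the directory's own mtime stays old, so the outputs never look newer than the inputs.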

I'm aware that directory timestamp updates depend on file system and transfer mechanism, but for dataflow jobs it would be desirable if Condor explicitly touched the items in transfer_output_files upon return.
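
(My current workaround is to touch the directory on the submit side after the job finishes, e.g. with a DAGMan POST script; the node name 'A' and results_dir are placeholders:

    SCRIPT POST A /usr/bin/touch results_dir

but having Condor do this itself would obviously be cleaner.)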

Cheers
Marco