You can't have a single file as both your submit description file and your job executable, as you can with PBS or Slurm.
You can add or change any command in the submit file from the condor_submit command line with the -append option. To change the executable, you would do this:
condor_submit -append "executable=test_condor.py" m_job.submit
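The -append option may be given more than once, so (as a rough sketch, not tested against your setup) several submit commands could be overridden at submit time without editing m_job.submit; the request_cpus override here is just illustrative:

```
# Each -append adds one submit command on top of what m_job.submit contains.
condor_submit \
    -append "executable = test_condor.py" \
    -append "request_cpus = 4" \
    m_job.submit
```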
- Jaime
Apologies, I don't think I understand your response. I usually have a submission script, e.g. job.sub, that specifies the executable path. If I could instead use the actual executable script as the submission script, that would be great too; e.g. I usually do that with qsub as follows:
```
#!/homes/miranda9/.conda/envs/automl-meta-learning/bin/python
#PBS -V
#PBS -M me@xxxxxxxxx
#PBS -m abe
#PBS -lselect=1:ncpus=112
import sys
import os
for p in sys.path: print(p)
print(os.environ)
```
Alternatively, if I could do string manipulation and get the filename from the end of my Executable path, that would work too. Let me share my submission script.
```
####################
#
# Experiments script
# Simple HTCondor submit description file
#
# reference: https://gitlab.engr.illinois.edu/Vision/vision-gpu-servers/-/wikis/HTCondor-user-guide#submit-jobs
#
# chmod a+x test_condor.py
# chmod a+x experiments_meta_model_optimization.py
# chmod a+x meta_learning_experiments_submission.py
# chmod a+x download_miniImagenet.py
# chmod a+x ~/meta-learning-lstm-pytorch/main.py
# chmod a+x /home/miranda9/automl-meta-learning/automl-proj/meta_learning/datasets/rand_fc_nn_vec_mu_ls_gen.py
# chmod a+x /home/miranda9/automl-meta-learning/automl-proj/experiments/meta_learning/supervised_experiments_submission.py
# chmod a+x /home/miranda9/automl-meta-learning/results_plots/is_rapid_learning_real.py
# chmod a+x /home/miranda9/automl-meta-learning/test_condor.py
#
# condor_submit -i
# condor_submit job.sub
#
####################

# Executable = /home/miranda9/automl-meta-learning/automl-proj/experiments/meta_learning/supervised_experiments_submission.py
# Executable = /home/miranda9/automl-meta-learning/automl-proj/experiments/meta_learning/meta_learning_experiments_submission.py
# Executable = /home/miranda9/meta-learning-lstm-pytorch/main.py
# Executable = /home/miranda9/automl-meta-learning/automl-proj/meta_learning/datasets/rand_fc_nn_vec_mu_ls_gen.py
# Executable = /home/miranda9/automl-meta-learning/results_plots/is_rapid_learning_real.py
Executable = /home/miranda9/automl-meta-learning/test_condor.py

# Output files
Log = $(SUBMIT_FILE).log$(CLUSTER)
Output = $(SUBMIT_FILE).o$(CLUSTER)
Error = $(SUBMIT_FILE).e$(CLUSTER)

# Use this to make sure 1 GPU is available. The keywords are case insensitive.
# Request_gpus = 1
# requirements = (CUDADeviceName != "Tesla K40m")
# requirements = (CUDADeviceName == "Quadro RTX 6000")
# requirements = ((CUDADeviceName = "Tesla K40m")) && (TARGET.Arch == "X86_64") && (TARGET.OpSys == "LINUX") && (TARGET.Disk >= RequestDisk) && (TARGET.Memory >= RequestMemory) && (TARGET.Cpus >= RequestCpus) && (TARGET.gpus >= Requestgpus) && ((TARGET.FileSystemDomain == MY.FileSystemDomain) || (TARGET.HasFileTransfer))
# requirements = (CUDADeviceName == "Tesla K40m")
# requirements = (CUDADeviceName == "GeForce GTX TITAN X")

# Note: to use multiple CPUs instead of the default (one CPU), use request_cpus as well
# Request_cpus = 4
Request_cpus = 16

# E-mail option
Notify_user = me@xxxxxxxxx
Notification = always

Environment = MY_CONDOR_JOB_ID=$(CLUSTER)

# "Queue" adds everything set up to this point to the queue (needs to be at the end of the script).
Queue
```
Thanks for the help!
Sincerely, Brando
Use the "include command :" directive. You will probably need to run an in-line or external shell script to echo a submit key-value pair, such as:
DynamicFilename = my_input.txt
Ideally the command should put a double dash as the last line of its output, since that signals the end of the values. That's then available as $(DynamicFilename) in the rest of the submission.
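For instance, a minimal sketch (untested; the helper script name gen_submit_vars.sh and the use of $(DynamicFilename) as an argument are just illustrative assumptions):

```
# Hypothetical helper script gen_submit_vars.sh, which would print e.g.:
#   DynamicFilename = my_input.txt
#   --
# The trailing double dash marks the end of its output, as suggested above.
include command : ./gen_submit_vars.sh

Executable = test_condor.py
Arguments  = $(DynamicFilename)
Queue
```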
-Michael Pelletier.