Hello Derek,
At the moment I'm submitting jobs to the remote SGE cluster this way: I have a Condor job file that specifies the grid universe, the remote cluster user and host, and so on. The executable named in this Condor file is a bash script that starts a small workflow. The problem is that all the SGE parameters inside my bash script are being ignored, and I need a different SGE configuration for each workflow (different queue, slot count, maximum memory, etc.).

Here's the bash script called by the Condor job file:

    #!/bin/bash
    ## export environment variables
    #$ -V
    ## job name
    #$ -N aln_bosco
    ## run in the current working directory
    #$ -cwd
    ## merge stdout and stderr
    #$ -j y
    ## select the all.q queue
    #$ -q all.q

    cd /home/mastablasta
    bwa aln /home/mastablasta/ref/hg19.fa /home/mastablasta/input/HapMap_2.fastq -t 8 > /home/mastablasta/output/tmp/HapMap.right.sai

If custom scripts are simply echoed, I don't understand why this isn't working. Are custom submit properties simple variables? I don't understand the concept from the example in the manual:

    +remote_cerequirements = NumJobs == 100

Can I pass my script a custom submit property like this for SGE?

    +remote_cerequirements = Queue = all.q

Thank you.

Best regards,
Guillermo.

On 03/05/2013 05:49 PM, Derek Weitzel wrote:
> Hi Guillermo,
>
> Are you using the configuration described here:
> https://twiki.grid.iu.edu/bin/view/CampusGrids/BoscoInstall#7_3_Custom_submit_properties
>
> I'm not sure what you are referring to when you say "SGE job file configuration".
>
> -Derek
>
> On Mar 5, 2013, at 10:42 AM, Guillermo Marco Puche <guillermo.marco@xxxxxxxxxxxxxxxxxxxxx> wrote:
>
>> Hello again,
>>
>> My setup to the remote SGE cluster is finally working! However, Bosco (Condor) is still ignoring my SGE job file configuration. Bosco submits a job to a remote SGE cluster using a Condor job file. The executable inside the Condor file points to a wrapper script that has the SGE configuration inside (number of processors, selected queue, job name, current working directory), but Condor completely ignores these settings when submitting the job.
>>
>> Thank you.
Best regards,
Guillermo.

On 03/02/2013 02:22 PM, Guillermo Marco Puche wrote:
> Hello,
>
> That's exactly what I thought. I'll stick to Bosco then ;)
>
> Thanks.
>
> Best regards,
> Guillermo.
>
> On 01/03/2013 18:53, Jaime Frey wrote:
>> On Mar 1, 2013, at 5:00 AM, Guillermo Marco Puche <guillermo.marco@xxxxxxxxxxxxxxxxxxxxx> wrote:
>>
>>> I've been trying Bosco lately, and it seems to work pretty well for submitting to another SGE cluster on my LAN. For example:
>>>
>>>     $ condor_q
>>>     -- Submitter: brugal : <192.168.6.2:11000?sock=3072_dcd9_3> : brugal
>>>      ID      OWNER    SUBMITTED   RUN_TIME    ST PRI SIZE CMD
>>>      62.0    gmarco   3/1  04:43  0+00:14:32  R  0   0.0  bwa.sh
>>>
>>> I was then trying to achieve the same thing with my local Condor installation instead of the Condor pool inside Bosco, but I'm having no success submitting exactly the same Condor job file.
>>>
>>> As root I start Condor with "condor_master":
>>>
>>>     $ ps -ef | grep condor
>>>     condor   3850     1  0 05:05 ?      00:00:00 condor_master
>>>     condor   3851  3850  0 05:05 ?      00:00:00 condor_collector -f
>>>     condor   3853  3850  0 05:05 ?      00:00:00 condor_negotiator -f
>>>     condor   3854  3850  0 05:05 ?      00:00:00 condor_schedd -f
>>>     condor   3855  3850  0 05:05 ?      00:00:00 condor_startd -f
>>>     root     3856  3854  0 05:05 ?      00:00:00 condor_procd -A /var/run/condor/procd_pipe.SCHEDD -L /var/log/condor/ProcLog.SCHEDD -R 10000000 -S 60 -C 498
>>>     condor   3907  3855 87 05:05 ?      00:00:03 mips
>>>     root     3924  3758  0 05:05 pts/0  00:00:00 grep condor
>>>
>>> When I try to submit my job, it stays in the Idle state forever; with Bosco I don't have that problem:
>>>
>>>     $ condor_q
>>>     -- Submitter: brugal : <192.168.6.2:41257> : brugal
>>>      ID      OWNER    SUBMITTED   RUN_TIME    ST PRI SIZE CMD
>>>      26.0    gmarco   3/1  05:07  0+00:00:00  I  0   0.0  bwa.sh
>>>
>>> That's my job file:
>>>
>>>     universe = grid
>>>     grid_resource = batch sge gmarco@cacique
>>>     executable = bwa.sh
>>>     output = bwa.out
>>>     error = bwa.err
>>>     log = bwa.log
>>>     should_transfer_files = YES
>>>     transfer_output = true
>>>     stream_output = true
>>>     when_to_transfer_output = ON_EXIT_OR_EVICT
>>>     queue
>>
>> Submitting jobs to a remote cluster using a regular installation of HTCondor requires some manual configuration steps, which we don't have documented currently. This is one of the advantages of Bosco. Over time, we may make this kind of job submission easier to do with a regular HTCondor installation.
>>
>> Thanks and regards,
>> Jaime Frey
>> UW-Madison HTCondor Project

_______________________________________________
HTCondor-users mailing list
To unsubscribe, send a message to htcondor-users-request@xxxxxxxxxxx with a subject: Unsubscribe
You can also unsubscribe by visiting
https://lists.cs.wisc.edu/mailman/listinfo/htcondor-users
The archives can be found at:
https://lists.cs.wisc.edu/archive/htcondor-users/
--
G. Marco: Informatician at Sistemas Genómicos S.L.
phone: 0034635197460
web: www.sistemasgenomicos.com
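[Editorial note] The custom submit properties mechanism asked about in this thread can be sketched as follows. This is a reading of the linked Bosco documentation, not a verified interface: the names used in a +remote_cerequirements expression (e.g. NumJobs in the manual's example) are made available as environment variables to a local submit attributes script on the remote cluster, and whatever that script echoes is inserted into the generated SGE submit script as native #$ directives. The variable names SGEQueue and Slots and the helper function below are purely illustrative.

```shell
#!/bin/sh
# Sketch of a per-cluster "local submit attributes" script for SGE.
# ASSUMPTION (hedged): variables named in the job's +remote_cerequirements
# expression arrive here as environment variables; every line this script
# echoes is prepended to the generated SGE submit script as a "#$" directive.
# SGEQueue and Slots are illustrative names, not a documented interface.
emit_sge_directives() {
    if [ -n "$SGEQueue" ]; then
        # select the requested queue
        echo "#\$ -q $SGEQueue"
    fi
    if [ -n "$Slots" ]; then
        # request a slot count via a parallel environment ("smp" is a guess;
        # the PE name is site-specific)
        echo "#\$ -pe smp $Slots"
    fi
}

# Demo: values that would come from a submit file line such as
#   +remote_cerequirements = SGEQueue == "all.q" && Slots == 8
SGEQueue=all.q
Slots=8
emit_sge_directives
```

Under that reading, the #$ lines inside bwa.sh would indeed be ignored: the batch system runs a generated wrapper script, so directives inside the user's executable are seen only as bash comments, and per-workflow settings have to reach SGE through this mechanism instead.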