ah, sorry - forgot about the obvious and did not check the versions... :) Indeed, the broken 'new' CE was running Condor v8.8.9 with HTCondor-CE v4.4.0, while the working CE was on Condor v8.9.7 with HTCondor-CE v4.4.0 - so the Condor minor version (security?) apparently caused my troubles. Anyway, I fixed a minor bug in my repo/package deployment ;)

Cheers and thanks,
Thomas

[client]
condor-8.9.7-1.el7.x86_64
condor-classads-8.9.7-1.el7.x86_64
condor-external-libs-8.9.7-1.el7.x86_64
condor-procd-8.9.7-1.el7.x86_64
htcondor-ce-client-4.4.0-1.el7.noarch
python2-condor-8.9.7-1.el7.x86_64

[Condor CE broken]
condor-8.8.9-1.el7.x86_64
condor-classads-8.8.9-1.el7.x86_64
condor-external-libs-8.8.9-1.el7.x86_64
condor-procd-8.8.9-1.el7.x86_64
htcondor-ce-4.4.0-1.el7.noarch
htcondor-ce-apel-4.4.0-1.el7.noarch
htcondor-ce-bdii-4.4.0-1.el7.noarch
htcondor-ce-client-4.4.0-1.el7.noarch
htcondor-ce-condor-4.4.0-1.el7.noarch
htcondor-ce-view-4.4.0-1.el7.noarch
python2-condor-8.8.9-1.el7.x86_64

[Condor CE working]
condor-8.9.7-1.el7.x86_64
condor-classads-8.9.7-1.el7.x86_64
condor-external-libs-8.9.7-1.el7.x86_64
condor-procd-8.9.7-1.el7.x86_64
htcondor-ce-4.4.0-1.el7.noarch
htcondor-ce-apel-4.4.0-1.el7.noarch
htcondor-ce-bdii-4.4.0-1.el7.noarch
htcondor-ce-client-4.4.0-1.el7.noarch
htcondor-ce-condor-4.4.0-1.el7.noarch
htcondor-ce-view-4.4.0-1.el7.noarch
python2-condor-8.9.7-1.el7.x86_64

On 15/07/2020 20.31, Mátyás Selmeci wrote:
> Hi Thomas,
>
> What are the versions of condor and condor-ce on each machine?
>
> -Mat
>
> On 7/15/20 10:23 AM, Thomas Hartmann wrote:
>> Hi all,
>>
>> I have spawned a new CondorCE instance based on a working puppet
>> manifest - but condor_ce_trace fails to connect to the CE schedd.
>> So the puppet manifest might be incomplete - perhaps something I
>> forgot to add to it :-/
>>
>> On the broken CE, I see the client's request in the Collector [1] -
>> which differs from the working CE [2] by
>> filter_private_ads=1
>> in the Query info, and then
>> (Sending 0 ads in response to query) [broken]
>> vs.
>> (Sending 1 ads in response to query) [working]
>>
>> I guess that the 1 class ad returned to the client is some kind of
>> ACK, which is not returned in the broken case - or?
>>
>> The condor-ce/condor config files are the same on both machines, so
>> I am a bit lost here - what breaks the new instance?
>> Maybe some port I forgot about?
>>
>> Cheers,
>> Thomas
>>
>>
>> [1]
>> 07/15/20 16:39:54 (Sending 1 ads in response to query)
>> 07/15/20 16:39:54 Query includes collector's self ad
>> 07/15/20 16:39:54 Updating collector stats using a chained ad and config=
>> 07/15/20 16:39:54 Query info: matched=1; skipped=0; query_time=0.000117;
>> send_time=0.000237; type=Collector; requirements={((true))}; locate=0;
>> limit=0; from=TOOL; peer=<131.169.223.90:33765>; projection={}
>> 07/15/20 16:39:54 Got QUERY_SCHEDD_ADS
>> 07/15/20 16:39:54 (Sending 0 ads in response to query)
>> 07/15/20 16:39:54 Query info: matched=0; skipped=0; query_time=0.000022;
>> send_time=0.000029; type=Scheduler;
>> requirements={((stricmp(Name,"grid-htcondorce0.desy.de") == 0))};
>> locate=1; limit=0; from=TOOL; peer=<131.169.223.90:43887>;
>> projection={MyAddress AddressV1 CondorVersion CondorPlatform Name Machine}
>>
>>
>> [2]
>> 07/15/20 16:36:16 (Sending 1 ads in response to query)
>> 07/15/20 16:36:16 Query includes collector's self ad
>> 07/15/20 16:36:16 Updating collector stats using a chained ad and config=
>> 07/15/20 16:36:16 Query info: matched=1; skipped=0; query_time=0.000101;
>> send_time=0.000362; type=Collector; requirements={((true))}; locate=0;
>> limit=0; from=TOOL; peer=<131.169.223.90:41934>; projection={};
>> filter_private_ads=1
>> 07/15/20 16:36:16 Got QUERY_SCHEDD_ADS
>> 07/15/20 16:36:16 (Sending 1 ads in response to query)
>> 07/15/20 16:36:16 Query info: matched=1; skipped=0; query_time=0.000056;
>> send_time=0.000061; type=Scheduler;
>> requirements={((stricmp(Name,"grid-vm08.desy.de") == 0))}; locate=1;
>> limit=0; from=TOOL; peer=<131.169.223.90:40257>; projection={MyAddress
>> AddressV1 CondorVersion CondorPlatform Name Machine}; filter_private_ads=1
>> 07/15/20 16:36:16 SECMAN: Succesfully sent DC_SEC_QUERY classad to
>> <131.169.223.90:33652>!
>> AuthorizationSucceeded = true
>> 07/15/20 16:36:19 (Sending 0 ads in response to query)
>> 07/15/20 16:36:19 Query info: matched=0; skipped=0; query_time=0.000090;
>> send_time=0.000039; type=Negotiator; requirements={true}; locate=0;
>> limit=0; from=COLLECTOR; peer=<131.169.223.234:7993>; projection={};
>> filter_private_ads=0
>>
>>
>> _______________________________________________
>> HTCondor-users mailing list
>> To unsubscribe, send a message to htcondor-users-request@xxxxxxxxxxx
>> with a subject: Unsubscribe
>> You can also unsubscribe by visiting
>> https://lists.cs.wisc.edu/mailman/listinfo/htcondor-users
>>
>> The archives can be found at:
>> https://lists.cs.wisc.edu/archive/htcondor-users/
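[Editor's note: the fix above came down to comparing the installed package versions on the two CEs. A minimal sketch of that check, assuming RPM-based EL7 hosts as in the package lists above; the older_version helper is made up for illustration:]

```shell
# Inventory the HTCondor / HTCondor-CE packages on each host
# (RPM-based EL7 assumed, as in the listings above):
#   rpm -qa 'condor*' 'htcondor-ce*' | sort

# older_version: print the smaller of two version strings, using
# GNU sort's version ordering (-V). Helper name is hypothetical.
older_version() {
    printf '%s\n%s\n' "$1" "$2" | sort -V | head -n 1
}

# Comparing the two condor versions from this thread:
older_version 8.8.9 8.9.7   # prints 8.8.9 -> the broken CE runs the older build
```

Running this on both machines and diffing the output would have surfaced the 8.8.9-vs-8.9.7 mismatch immediately.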
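[Editor's note: the empty schedd-query responses that Thomas spotted in [1] can be pulled out of a collector log without reading it line by line. A sketch, assuming log lines formatted like the excerpts above; the function name is hypothetical, and the default HTCondor-CE log path may differ per install:]

```shell
# schedd_query_results FILE
# Print the "(Sending N ads ...)" line that follows each
# "Got QUERY_SCHEDD_ADS" entry in a collector log excerpt.
schedd_query_results() {
    grep -A1 'Got QUERY_SCHEDD_ADS' "$1" | grep 'Sending .* ads'
}

# Usage (path is a common HTCondor-CE default; adjust as needed):
#   schedd_query_results /var/log/condor-ce/CollectorLog
```

On the broken CE this would show only "(Sending 0 ads ...)" lines for schedd queries, matching excerpt [1].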