I've looked around for a while trying to track down a filled-out answers file for an Enterprise master, but I was only able to locate one for roughly version 2.0 (below) and am not certain what has changed since then. Can someone show me a (generic) master answers file that I can use to set up my master?
q_puppet_cloud_install=n
q_puppet_enterpriseconsole_auth_user_email=jd.daniel@mheducation.com
q_puppet_enterpriseconsole_auth_password=puppet
q_puppet_enterpriseconsole_auth_user=console
q_puppet_enterpriseconsole_database_install=y
q_puppet_enterpriseconsole_database_name=console
q_puppet_enterpriseconsole_database_password=puppet
q_puppet_enterpriseconsole_database_root_password=puppet
q_puppet_enterpriseconsole_database_user=console
q_puppet_enterpriseconsole_httpd_port=443
q_puppet_enterpriseconsole_install=y
q_puppet_enterpriseconsole_inventory_hostname=${HOSTNAME}
q_puppet_enterpriseconsole_inventory_port=8140
q_puppet_enterpriseconsole_master_hostname=${HOSTNAME}
q_puppet_symlinks_install=y
q_puppetagent_server=puppet
q_puppetagent_certname=puppet
q_puppetagent_install=y
q_puppetmaster_certname=puppet
q_puppetmaster_dnsaltnames=puppet
q_puppetmaster_enterpriseconsole_hostname=${HOSTNAME}
q_puppetmaster_enterpriseconsole_port=443
q_puppetmaster_forward_facts=n
q_puppetmaster_install=y
q_enable_future_parser=y
q_fail_on_unsuccessful_master_lookup=n
q_vendor_packages_install=y
q_install=y
Is there a way to query the board (DSXLE5) and request the current KeyMapping value?
This is the first time I've dealt with the DSX5 family. I've already identified that the firmware-change process is different: before, to change a DSXLE4 8-port card to be 2-in + 6-out, we used
$mvConnectorConfig -2in6out -sn=XXXXX
and now I see that the DSXLE5 needs to be pointed at the firmware file (*.pin) with the load parameter:
$mvConnectorConfig load -f=xmio5_x2_00i12o.pin -sn=xxxxx
I see from the help that we can use -KeyMapping to change the boards, but how does it work, and can we query the board to show which mapping it is currently using?
I'm new here, so I'm sorry if this isn't a 100% clear question.
Best Regards,
Rafael Girotto
I have been trying to run a Nutch 1.16 crawler using the code example and instructions from https://cwiki.apache.org/confluence/display/NUTCH/NutchTutorial but, no matter what, I seem to get stuck when initiating the actual crawl.
I'm running it through Cygwin64 on a Windows 10 machine, using a binary installation (though I have tried compiling one, with the same results). Initially, Nutch would throw an UnsatisfiedLinkError (NativeIO$Windows.access0), which I fixed by adding libraries from several other answers for the same issue. After that I could at least start a server, but trying to crawl through Nutch itself would return a NoSuchMethodError no matter what I did. nutch-site.xml contains only the http.agent.name and plugin.includes options, both taken from the same example.
The following is the error message (I also tried omitting seed.txt):
$ bin/nutch inject crawl/crawldb urls/seed.txt
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.commons.cli.OptionBuilder.withArgPattern(Ljava/lang/String;I)Lorg/apache/commons/cli/OptionBuilder;
at org.apache.hadoop.util.GenericOptionsParser.buildGeneralOptions(GenericOptionsParser.java:207)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:370)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:138)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:59)
at org.apache.nutch.crawl.Injector.main(Injector.java:534)
The following is the list of libraries currently present in the lib directory:
activation-1.1.jar
amqp-client-5.2.0.jar
animal-sniffer-annotations-1.14.jar
antlr-runtime-3.5.2.jar
antlr4-4.5.1.jar
aopalliance-1.0.jar
apache-nutch-1.16.jar
apacheds-i18n-2.0.0-M15.jar
apacheds-kerberos-codec-2.0.0-M15.jar
api-asn1-api-1.0.0-M20.jar
api-util-1.0.0-M20.jar
args4j-2.0.16.jar
ascii-utf-themes-0.0.1.jar
asciitable-0.3.2.jar
asm-3.3.1.jar
asm-7.1.jar
avro-1.7.7.jar
bootstrap-3.0.3.jar
cglib-2.2.1-v20090111.jar
cglib-2.2.2.jar
char-translation-0.0.2.jar
checker-compat-qual-2.0.0.jar
closure-compiler-v20130603.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.8.0.jar
commons-cli-1.2-sources.jar
commons-cli-1.2.jar
commons-codec-1.11.jar
commons-collections-3.2.2.jar
commons-collections4-4.2.jar
commons-compress-1.18.jar
commons-configuration-1.6.jar
commons-daemon-1.0.13.jar
commons-digester-1.8.jar
commons-el-1.0.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-jexl-2.1.1.jar
commons-lang-2.6.jar
commons-lang3-3.8.1.jar
commons-logging-1.1.3.jar
commons-math3-3.1.1.jar
commons-net-3.1.jar
crawler-commons-1.0.jar
curator-client-2.7.1.jar
curator-framework-2.7.1.jar
curator-recipes-2.7.1.jar
cxf-core-3.3.3.jar
cxf-rt-bindings-soap-3.3.3.jar
cxf-rt-bindings-xml-3.3.3.jar
cxf-rt-databinding-jaxb-3.3.3.jar
cxf-rt-frontend-jaxrs-3.3.3.jar
cxf-rt-frontend-jaxws-3.3.3.jar
cxf-rt-frontend-simple-3.3.3.jar
cxf-rt-security-3.3.3.jar
cxf-rt-transports-http-3.3.3.jar
cxf-rt-transports-http-jetty-3.3.3.jar
cxf-rt-ws-addr-3.3.3.jar
cxf-rt-ws-policy-3.3.3.jar
cxf-rt-wsdl-3.3.3.jar
dom4j-1.6.1.jar
ehcache-3.3.1.jar
elasticsearch-0.90.1.jar
error_prone_annotations-2.1.3.jar
FastInfoset-1.2.16.jar
geronimo-jcache_1.0_spec-1.0-alpha-1.jar
gora-hbase-0.3.jar
gson-2.2.4.jar
guava-25.0-jre.jar
guice-3.0.jar
guice-servlet-3.0.jar
h2-1.4.197.jar
hadoop-0.20.0-ant.jar
hadoop-0.20.0-core.jar
hadoop-0.20.0-examples.jar
hadoop-0.20.0-test.jar
hadoop-0.20.0-tools.jar
hadoop-annotations-2.9.2.jar
hadoop-auth-2.9.2.jar
hadoop-common-2.9.2.jar
hadoop-core-1.2.1.jar
hadoop-core_0.20.0.xml
hadoop-core_0.21.0.xml
hadoop-core_0.22.0.xml
hadoop-hdfs-2.9.2.jar
hadoop-hdfs-client-2.9.2.jar
hadoop-mapreduce-client-common-2.2.0.jar
hadoop-mapreduce-client-common-2.9.2.jar
hadoop-mapreduce-client-core-2.2.0.jar
hadoop-mapreduce-client-core-2.9.2.jar
hadoop-mapreduce-client-jobclient-2.2.0.jar
hadoop-mapreduce-client-jobclient-2.9.2.jar
hadoop-mapreduce-client-shuffle-2.2.0.jar
hadoop-mapreduce-client-shuffle-2.9.2.jar
hadoop-yarn-api-2.9.2.jar
hadoop-yarn-client-2.9.2.jar
hadoop-yarn-common-2.9.2.jar
hadoop-yarn-registry-2.9.2.jar
hadoop-yarn-server-common-2.9.2.jar
hadoop-yarn-server-nodemanager-2.9.2.jar
hbase-0.90.0-tests.jar
hbase-0.90.0.jar
hbase-0.92.1.jar
hbase-client-0.98.0-hadoop2.jar
hbase-common-0.98.0-hadoop2.jar
hbase-protocol-0.98.0-hadoop2.jar
HikariCP-java7-2.4.12.jar
htmlparser-1.6.jar
htrace-core-2.04.jar
htrace-core4-4.1.0-incubating.jar
httpclient-4.5.6.jar
httpcore-4.4.9.jar
httpcore-nio-4.4.9.jar
icu4j-61.1.jar
istack-commons-runtime-3.0.8.jar
j2objc-annotations-1.1.jar
jackson-annotations-2.9.9.jar
jackson-core-2.9.9.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.9.9.jar
jackson-dataformat-cbor-2.9.9.jar
jackson-jaxrs-1.9.13.jar
jackson-jaxrs-base-2.9.9.jar
jackson-jaxrs-json-provider-2.9.9.jar
jackson-mapper-asl-1.9.13.jar
jackson-module-jaxb-annotations-2.9.9.jar
jackson-xc-1.9.13.jar
jakarta.activation-api-1.2.1.jar
jakarta.ws.rs-api-2.1.5.jar
jakarta.xml.bind-api-2.3.2.jar
jasper-compiler-5.5.12.jar
jasper-runtime-5.5.12.jar
java-xmlbuilder-0.4.jar
javassist-3.12.1.GA.jar
javax.annotation-api-1.3.2.jar
javax.inject-1.jar
javax.persistence-2.2.0.jar
javax.servlet-api-3.1.0.jar
jaxb-api-2.2.2.jar
jaxb-impl-2.2.3-1.jar
jaxb-runtime-2.3.2.jar
jcip-annotations-1.0-1.jar
jersey-client-1.19.4.jar
jersey-core-1.9.jar
jersey-guice-1.9.jar
jersey-json-1.9.jar
jersey-server-1.9.jar
jets3t-0.9.0.jar
jettison-1.1.jar
jetty-6.1.26.jar
jetty-client-6.1.22.jar
jetty-continuation-9.4.19.v20190610.jar
jetty-http-9.4.19.v20190610.jar
jetty-io-9.4.19.v20190610.jar
jetty-security-9.4.19.v20190610.jar
jetty-server-9.4.19.v20190610.jar
jetty-sslengine-6.1.26.jar
jetty-util-6.1.26.jar
jetty-util-9.4.19.v20190610.jar
joda-time-2.3.jar
jquery-2.0.3-1.jar
jquery-selectors-0.0.3.jar
jquery-ui-1.10.2-1.jar
jquerypp-1.0.1.jar
jsch-0.1.54.jar
json-smart-1.3.1.jar
jsp-2.1-6.1.14.jar
jsp-api-2.1-6.1.14.jar
jsp-api-2.1.jar
jsr305-3.0.0.jar
junit-3.8.1.jar
juniversalchardet-1.0.3.jar
leveldbjni-all-1.8.jar
log4j-1.2.17.jar
lucene-analyzers-common-4.3.0.jar
lucene-codecs-4.3.0.jar
lucene-core-4.3.0.jar
lucene-grouping-4.3.0.jar
lucene-highlighter-4.3.0.jar
lucene-join-4.3.0.jar
lucene-memory-4.3.0.jar
lucene-queries-4.3.0.jar
lucene-queryparser-4.3.0.jar
lucene-sandbox-4.3.0.jar
lucene-spatial-4.3.0.jar
lucene-suggest-4.3.0.jar
maven-parent-config-0.3.4.jar
metrics-core-3.0.1.jar
modernizr-2.6.2-1.jar
mssql-jdbc-6.2.1.jre7.jar
neethi-3.1.1.jar
netty-3.6.2.Final.jar
netty-all-4.0.23.Final.jar
nimbus-jose-jwt-4.41.1.jar
okhttp-2.7.5.jar
okio-1.6.0.jar
org.apache.commons.cli-1.2.0.jar
ormlite-core-5.1.jar
ormlite-jdbc-5.1.jar
oro-2.0.8.jar
paranamer-2.3.jar
protobuf-java-2.5.0.jar
reflections-0.9.8.jar
servlet-api-2.5-20081211.jar
servlet-api-2.5.jar
skb-interfaces-0.0.1.jar
slf4j-api-1.7.26.jar
slf4j-log4j12-1.7.25.jar
snappy-java-1.0.5.jar
spatial4j-0.3.jar
spring-aop-4.0.9.RELEASE.jar
spring-beans-4.0.9.RELEASE.jar
spring-context-4.0.9.RELEASE.jar
spring-core-4.0.9.RELEASE.jar
spring-expression-4.0.9.RELEASE.jar
spring-web-4.0.9.RELEASE.jar
ST4-4.0.8.jar
stax-api-1.0-2.jar
stax-ex-1.8.1.jar
stax2-api-3.1.4.jar
t-digest-3.2.jar
tika-core-1.22.jar
txw2-2.3.2.jar
typeaheadjs-0.9.3.jar
warc-hadoop-0.1.0.jar
webarchive-commons-1.1.5.jar
wicket-bootstrap-core-0.9.2.jar
wicket-bootstrap-extensions-0.9.2.jar
wicket-core-6.17.0.jar
wicket-extensions-6.13.0.jar
wicket-ioc-6.17.0.jar
wicket-request-6.17.0.jar
wicket-spring-6.17.0.jar
wicket-util-6.17.0.jar
wicket-webjars-0.4.0.jar
woodstox-core-5.0.3.jar
wsdl4j-1.6.3.jar
xercesImpl-2.12.0.jar
xml-apis-1.4.01.jar
xml-resolver-1.2.jar
xmlenc-0.52.jar
xmlParserAPIs-2.6.2.jar
xmlschema-core-2.2.4.jar
zookeeper-3.4.6.jar
This is my Java version:
java version "1.8.0_241"
Java(TM) SE Runtime Environment (build 1.8.0_241-b07)
Java HotSpot(TM) 64-Bit Server VM (build 25.241-b07, mixed mode)
I'd also like to point out that, despite what another answer may have said, Nutch 1.4 (or any other version of Nutch, for that matter) did NOT resolve the issue, at least on Windows.
EDIT: The following answer worked for me, but I left the original one in place because it may still be useful to someone working with other versions of Nutch.
Again, thanks to Sebastian Nagel: to get around the NoSuchMethodError, edit ivy\ivy.xml to reference a different version of the Hadoop libraries. In my case I installed Hadoop 3.1.3 and also added the corresponding 3.1.3 versions of winutils.exe and hadoop.dll to the hadoop\bin directory referenced by HADOOP_HOME. Running bin/crawl now seems to work correctly.
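The edit described above amounts to bumping the Hadoop revision in Ivy. A sketch of what that looks like, assuming Ivy's standard dependency syntax; the exact module list and conf attributes should be copied from the stock ivy/ivy.xml rather than from here:

```xml
<!-- ivy/ivy.xml: bump the Hadoop artifacts to the version whose
     winutils.exe/hadoop.dll you installed (3.1.3 in this case). -->
<dependency org="org.apache.hadoop" name="hadoop-common" rev="3.1.3" conf="*->default"/>
<dependency org="org.apache.hadoop" name="hadoop-hdfs" rev="3.1.3" conf="*->default"/>
<dependency org="org.apache.hadoop" name="hadoop-mapreduce-client-core" rev="3.1.3" conf="*->default"/>
```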
Outdated answer: Okay, after working on the source code itself (courtesy of https://github.com/apache/commons-cli) at the suggestion of Sebastian Nagel, I was able to find the (very simple) implementation of the method (https://github.com/marcelmaatkamp/EntityExtractorUtils/blob/master/src/main/java/org/apache/commons/cli/OptionBuilder.java):
/**
 * The next Option created will have an argument pattern and
 * the number of pattern occurrences.
 *
 * @param argPattern string representing a pattern regex
 * @param limit the number of pattern occurrences in the argument
 * @return the OptionBuilder instance
 */
public static OptionBuilder withArgPattern(String argPattern,
                                           int limit)
{
    OptionBuilder.argPattern = argPattern;
    OptionBuilder.limit = limit;
    return instance;
}
Using Maven, I was then able to compile the code into its own jar file, which I then added to the lib folder of Apache Nutch.
This still did not completely resolve my problem, as the Nutch framework appears to use deprecated functions, which will probably mean even more work under similar circumstances (for instance, right after adding the new jar I got a NoSuchMethodError over org.apache.hadoop.mapreduce.Job.getInstance).
I leave this answer here as a temporary solution for anyone who may have gotten stuck on the same issue, but I wish there were an easier way of finding out which methods appear in which jar file without exploring their entire file structure, although it may just be me missing it.
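On that last point, one way to find which jar defines a class without unpacking everything by hand is to scan each archive's entry list. This is a sketch in Python; the lib directory and class name are just the ones from this question:

```python
import zipfile
from pathlib import Path


def jars_containing(class_path, lib_dir="lib"):
    """Return the names of jars under lib_dir whose entries include class_path."""
    hits = []
    for jar in sorted(Path(lib_dir).glob("*.jar")):
        try:
            with zipfile.ZipFile(jar) as zf:
                if class_path in zf.namelist():
                    hits.append(jar.name)
        except zipfile.BadZipFile:
            continue  # skip anything that isn't a valid archive
    return hits


if __name__ == "__main__":
    # The class from the NoSuchMethodError in the question.
    target = "org/apache/commons/cli/OptionBuilder.class"
    for name in jars_containing(target):
        print(name)
```

Run from the Nutch install directory, this would print every jar shipping its own copy of OptionBuilder, which is exactly the kind of duplicate that produces a NoSuchMethodError when the wrong one wins.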
I am getting an error when reading PDF files with a binary in /sbin/: it cannot find a font called N'Ko.
I searched many places online, used my GUI search tool, and did not find any results.
Please help me locate a URL for this font so I can install it.
Also, please let me know of any other fonts needed to read these types of files from the terminal or CLI.
The following is a list of my current fonts:
1a61f63f6dd3c0bac6e6737524af16c637b8131e-jomolhari-fonts-0.003-8.1.el6-noarch
250202b87e90a3dae239e372c581385dacd296bd-ctan-cm-lgc-sans-fonts-0.5-17.1.el6-noarch
d7472ff377cbad8c19dd5d88e562ce145ae05770-cjkuni-fonts-common-0.2.20080216.1-36.el6-noarch
02c4e9708abd7a4a272947688881cb24ed579f28-cjkuni-uming-fonts-0.2.20080216.1-36.el6-noarch
a22642b0f3b0e36b32b4f448a34b0187c83b4d8a-ctan-cm-lgc-typewriter-fonts-0.5-17.1.el6-noarch
083611c87fee17bf70562f4a5e22cae454bafde0-ctan-cm-lgc-fonts-common-0.5-17.1.el6-noarch
480bdfabd03ab1a24b995e596da668e3cf2404f2-ctan-cm-lgc-roman-fonts-0.5-17.1.el6-noarch
e2e7b21d2556800774be5347eec044f05c1b4121-smc-fonts-common-04.2-11.el6-noarch
50cf6d280b8b5efb15ab76f5566b7e88715d48bf-sil-padauk-fonts-2.6.1-1.el6-noarch
b0639f3a2a6892ff76440db01aeaeda940708526-smc-meera-fonts-04.2-11.el6-noarch
d5d083fc04f97ac1151957e86ec7f4b9fa5cbdcf-stix-fonts-0.9-13.1.el6-noarch
f174e6cdda7546672001da44a2e30f3fcc18ee72-xorg-x11-fonts-ISO8859-1-100dpi-7.2-11.el6-noarch
87fc988a182637e0db233ad0a33af5d20731dced-xorg-x11-fonts-Type1-7.2-11.el6-noarch
983c0992e7278550e716fbfcc6cda542e34feef2-xorg-x11-font-utils-7.2-11.el6-x86_64
e707cbd5a9930c060036fc89c067db3dc8d81756-xorg-x11-fonts-misc-7.2-11.el6-noarch
b6f400539062544da9d323e62f78ba8921cb830e-xorg-x11-fonts-100dpi-7.2-11.el6-noarch
d0d38a4960ca7b30d671ee93d04a9d184d47be5d-gnu-free-serif-fonts-20100919-3.el6-noarch
25f79cc4356c0b677dde6b81495393817eb26a2e-gnu-free-sans-fonts-20100919-3.el6-noarch
a87baa70c2c33e48c52a711e28bb72155022f897-ghostscript-fonts-5.50-23.2.el6-noarch
a059fe6beaab6056baf481f0fecf7015aaa517da-google-crosextra-caladea-fonts-1.002-0.3.20130214.el6.1-noarch
3f728e4c3f8463228a3d08cff9e6b6c6d1be5ba6-google-crosextra-carlito-fonts-1.103-0.1.20130920.el6.1-noarch
c6e2eb805a5f00c5767c9d6e33c50bc4328db42b-gnu-free-fonts-common-20100919-3.el6-noarch
8bd0fdea016b7993a427d1ccc4b957e9df2b5e87-liberation-mono-fonts-1.05.1.20090721-5.el6-noarch
44a9a9b994103ca3824bd84efc99ae4e903d2cb9-lohit-gujarati-fonts-2.4.4-4.el6-noarch
ca55f9717f1a8d8df766a91d548a1b970ad02d9b-lohit-bengali-fonts-2.4.3-6.el6-noarch
26f9c4b3433b37df272ec281ea8a7573ae88bf24-liberation-sans-fonts-1.05.1.20090721-5.el6-noarch
55c19776dc11427559938bebd4930818ad646bd9-lohit-punjabi-fonts-2.4.4-2.el6-noarch
fa787df1b84fe4b50ea12fda4b7114c94e136355-lohit-devanagari-fonts-2.4.3-7.el6-noarch
d62c5cd0eabc736d87bc669de1e3fd0eac2dd0fd-lohit-kannada-fonts-2.4.5-6.el6-noarch
f924a95845a7814449f59695cd41696e1dbf7df8-lohit-tamil-fonts-2.4.5-5.el6-noarch
f4a126559c1e3b23b073fd38e58a9c692f604271-libfontenc-1.0.5-2.el6-x86_64
ef306e034658efcb8a0e86f4ee7042d977cf2418-liberation-fonts-common-1.05.1.20090721-5.el6-noarch
c461a25faaf3272e7c288f3b6d1aa7ab0345f725-liberation-serif-fonts-1.05.1.20090721-5.el6-noarch
fd4a03290a91ed7ab7371fbe31ccafa2dbd7c251-lklug-fonts-0.6-4.20090803cvs.el6-noarch
e3c68385d75c8b6c61183fd63dcbd01e1e51432c-lohit-oriya-fonts-2.4.3-6.el6-noarch
f72d6108104fb88ee5c0efa227fba25a59f357f0-libXfont-1.4.5-5.el6_7-x86_64
15bb7f00d4a73d3ad3def2848ac75032051484f2-lohit-telugu-fonts-2.4.5-6.el6-noarch
da6a5e74cd37b0a4c0fc81a4fcbb460c6b8240df-lohit-assamese-fonts-2.4.3-5.el6-noarch
c7cc946c4970730ac96fc1698e617fbfcacd9cda-kurdit-unikurd-web-fonts-20020502-6.el6-noarch
61a68270ec91806e0a848cbd050345f789df170c-khmeros-fonts-common-5.0-9.el6-noarch
e1fe63e74f8f4133db2b08042fdd9142574058a6-khmeros-base-fonts-5.0-9.el6-noarch
1b14d59dfbaa419c8f3c8ebbf40eecc37571a654-fontconfig-devel-2.8.0-5.el6-x86_64
8ddcd1a064d26e97e9a51d898232fc4543d986a5-fontpackages-filesystem-1.41-1.1.el6-noarch
50f5be20479807113830f466211c1304b2af3306-fontconfig-2.8.0-5.el6-x86_64
6040a892621dbc546895a8ff2c3be91a86749992-thai-scalable-fonts-common-0.4.12-2.1.el6-noarch
f2e8d53b2e9234aec858e645ef35ae138c12c04b-thai-scalable-waree-fonts-0.4.12-2.1.el6-noarch
fa6344af32177d7a19c4d0fe08e133b533d9ab60-texlive-texmf-errata-fonts-2007-7.1.el6-noarch
bb6198a6d9d55e0f91f19a835ed8dd6399d619f5-tibetan-machine-uni-fonts-1.901-5.el6-noarch
b5b1bb8d09a48194b064d3d5412a8afc62635e7d-urw-fonts-2.4-10.el6-noarch
09637fec4d2f0d446731b568b0ecc3480779abaa-un-core-dotum-fonts-1.0.2-0.15.080608.el6-noarch
20bb98d0cabca85596da63c3621c741400462575-un-core-fonts-common-1.0.2-0.15.080608.el6-noarch
279150946d58035881830ed5ac20554e9db26969-wqy-zenhei-fonts-0.9.45-3.el6-noarch
4ecb7070809988e2a89a3ce3201efea1ec403b22-vlgothic-fonts-20091202-2.el6-noarch
8938d215aa76e8ed9b75df01d2fbbd61b48371cb-vlgothic-fonts-common-20091202-2.el6-noarch
2d0a75885e65ab1d2c49be2e6cbb81dfe2189afe-dejavu-fonts-common-2.33-1.el6-noarch
dd9c12cb845ef9d8abfb870f602451d20c8ef422-dejavu-sans-fonts-2.33-1.el6-noarch
88a85a6ce1f0f7dee0ef185843ff734efd8a0ed0-dejavu-sans-mono-fonts-2.33-1.el6-noarch
6cbf591866bcf9d651909a701ab243f4bfa6621f-dejavu-serif-fonts-2.33-1.el6-noarch
5ffb229900547dc846c06329013973bf527a3c4e-abyssinica-fonts-1.0-5.1.el6-noarch
5c151ed158d2efd8c9595d1d4022845e1a59d46a-madan-fonts-2.000-3.el6-noarch
f2521d94e4767594281da1efc9d7c39206afae4a-mplayer-fonts-1.1-3.0.rf-noarch
9b821dcad23236e7e8a4c85ccb406a67c58bb2b9-mph-2b-damase-fonts-002.000-3.el6-noarch
59072f74f42da013a4d90e556ca41bb5b70e6311-paktype-tehreer-fonts-2.0-8.el6-noarch
e4c4afc0e04fc5daae049e2d77f75fde9392b73e-paktype-fonts-common-2.0-8.el6-noarch
04acb7c2efaf10bef2eb3abda0ba8b693778c55b-paktype-naqsh-fonts-2.0-8.el6-noarch
I have CentOS 6.7, 32-bit.
Thank you for your help.
I'm trying to set up a cluster across machines on a PBS-managed cluster. I'm perfectly able to compute within one node by saying julia -p 12 (after having reserved one node with 12 CPUs).
I understand that to use several machines, I have to add them to the master process with addprocs. I was able to do that on a different cluster (SGE), but on this one something is going wrong.
You can see everything I'm doing, including submit scripts etc, on this branch of a github repo.
To get a list of machines, I parse the PBS_NODEFILE, which for a submit script with the option
#PBS -l nodes=2:ppn=12 # give me 2 nodes with 12 processors each
looks something like this:
red0004
red0004
...
red0004
red0347
...
red0347
I parse this file with bind_pe_procs() in sge.jl in the repo and give a vector of machine names to addprocs. When I submit this, I get an error; I put up a gist with the resulting SSH error. I don't know what it means.
Does this have to do with a system setting, i.e. do I have to talk to the sysadmin about SSH between machines? What are the right questions to ask?
I am unsure about what exactly I have to give to addprocs(). I don't want to add the master process (I don't want worker 1 SSHing into itself?), so I exclude ENV["HOST"] = node001 from my list. But what about all the processors with the same name, node002? Do I list all of those
machines = [ "red0347" for i=1:12]
or just once
machines = ["red0347"]
in addprocs(machines)
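To make the two options concrete, here is a small sketch (in Python, just for illustration; the repo's actual parsing is the Julia bind_pe_procs()) that tallies the nodefile into per-node slot counts, assuming the PBS_NODEFILE format shown above:

```python
from collections import Counter


def slot_counts(nodefile_lines):
    """Tally how many CPU slots PBS assigned on each node."""
    return Counter(line.strip() for line in nodefile_lines if line.strip())


# With nodes=2:ppn=12 the nodefile repeats each hostname 12 times:
counts = slot_counts(["red0004"] * 12 + ["red0347"] * 12)
print(counts["red0347"])  # 12
print(sorted(counts))     # ['red0004', 'red0347']
```

Repeating a hostname once per slot corresponds to the first option (one worker per reserved CPU); the unique names correspond to the second.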
Thanks!
My FIX engine keeps rejecting messages, and I was hoping someone could help me figure out why. I'm receiving the following sample message:
8=FIXT.1.1 9=518 35=AE 34=4 1128=8 49=XXXXXXX 56=YYYYYYY 52=20130322-17:58:37 552=1 54=1 37=Z00097H4ON 11=NOREF 826=0 78=1 79=NOT SPECIFIED 80=100000.000000 5967=129776.520000 453=5 448=BCART6 452=3 447=D 448=BARX 452=1 447=D 448=BARX 452=16 447=D 448=bcart6 452=11 447=D 448=ABCDEFGHI 452=12 447=D 571=6611540 150=F 17=Z00097H4ON 32=100000.000000 38=100000.000000 15=EUR 1056=129776.520000 31=1.2977652 194=1.298120 195=-3.5480 64=20130409 63=W2 60=20130322-17:26:50 75=20130322 1057=Y 460=4 167=FOR 65=OR 55=EUR/USD 10=121
8=FIXT.1.1 9=124 35=3 34=4 49=XXXXXXX 52=20130322-17:58:37.917 56=YYYYYYY 45=4 58=Tag appears more than once 371=448 372=AE 373=13 10=216
But as you can see, it's being rejected by the QuickFIX engine. I am using the 5.0 SP1 data dictionary and have configured it in my config file:
[DEFAULT]
ConnectionType=initiator
HeartBtInt=30
ReconnectInterval=10
SocketReuseAddress=Y
FileStorePath=D:\XXX\Interface\ReutersStore
FileLogPath=D:\XXX\Interface\ReutersLog
[SESSION]
BeginString=FIXT.1.1
SenderCompID=XXXXX
TargetCompID=YYYYY
DefaultApplVerID=FIX.5.0
UseDataDictionary=Y
AppDataDictionary=FIX50SP1.xml
StartDay=sunday
StartTime=20:55:00
EndTime=06:05:00
EndDay=saturday
SocketConnectHost=A.B.C.D
SocketConnectPort=123
Does anyone have any idea why the engine would be rejecting this message? I know that QuickFIX is normally able to handle messages with repeating groups; is it a config thing? Any help would be greatly appreciated!
Your message seems to be in order. Try putting this in your config file:
ValidateFieldsOutOfOrder=N
QuickFIX by default sets that to Y, and the underlying structure storing the tag and field values cannot handle a repeated tag unless the group count (453) has been seen before the repeated 448 entries.
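To illustrate why the repeats matter (a toy Python sketch, not QuickFIX internals; the field slice and fixed 448/452 ordering below are simplified assumptions for illustration): tag 448 appears once per party, so a flat tag-to-value map silently drops all but the last entry, and a parser has to use the NoPartyIDs count in 453 to group them.

```python
# Simplified party-group slice of the rejected message (tag 447 omitted).
raw = "453=5 448=BCART6 452=3 448=BARX 452=1 448=BARX 452=16 448=bcart6 452=11 448=ABCDEFGHI 452=12"
pairs = [field.split("=", 1) for field in raw.split()]

# A flat tag->value map loses the repeats: only the last 448 survives.
flat = dict(pairs)
print(flat["448"])  # ABCDEFGHI

# Honouring the 453 count instead yields one entry per party
# (each party here is a 448/452 pair, so stride 2 after the count).
count = int(flat["453"])
parties = [pairs[1 + 2 * i][1] for i in range(count)]
print(parties)  # ['BCART6', 'BARX', 'BARX', 'bcart6', 'ABCDEFGHI']
```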
As a side note, always check these fields; they should point you to the source of the problem.
58=Tag appears more than once
371=448
Maybe it's a shot in the dark, but I had a similar problem when using a 5.0 SP2 dictionary.
I resolved it by using an updated version of the QuickFIX library compiled from the library's SVN repository. If I remember correctly, this was the bug.
It seems that the QuickFIX library has not been updated in a long time, and for newer versions of FIX I suggest you use the trunk of the repo.
I had the same problem and I resolved it by tweaking my DataDictionary as follows in message AE TradeCaptureReport