How can we integrate Black Duck license scanning with GitLab CI? This is what I get when I run Detect:
$ bash <(curl -s -L https://detect.synopsys.com/detect.sh)
Detect Shell Script
Detect Shell Script 2.4.0
Will look for : https://sig-repo.synopsys.com/bds-integrations-release/com/synopsys/integration/synopsys-detect/6.5.0/synopsys-detect-6.5.0.jar
You have already downloaded the latest file, so the local file will be used.
Java Source: PATH
running Detect: "java" -jar "/root/synopsys-detect/download/synopsys-detect-6.5.0.jar"
______ _ _
| _ \ | | | |
| | | |___| |_ ___ ___| |_
| | | / _ \ __/ _ \/ __| __|
| |/ / __/ || __/ (__| |_
|___/ \___|\__\___|\___|\__|
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.codehaus.groovy.reflection.CachedClass (jar:file:/root/synopsys-detect/download/synopsys-detect-6.5.0.jar!/BOOT-INF/lib/groovy-all-2.4.12.jar!/) to method java.lang.Object.finalize()
WARNING: Please consider reporting this to the maintainers of org.codehaus.groovy.reflection.CachedClass
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Detect Version: 6.5.0
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- Current property values:
2020-09-08 11:16:06 INFO [main] --- --property = value [notes]
2020-09-08 11:16:06 INFO [main] --- ------------------------------------------------------------
2020-09-08 11:16:06 INFO [main] --- ------------------------------------------------------------
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- Tilde's will be automatically resolved to USER HOME.
2020-09-08 11:16:06 INFO [main] --- Source directory: /home/siddharth.sharma2
2020-09-08 11:16:06 INFO [main] --- Output directory: /root/blackduck
2020-09-08 11:16:06 INFO [main] --- Run directory: /root/blackduck/runs/2020-09-08-05-46-05-916
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 ERROR [main] --- Your environment was not sufficiently configured to run Black Duck or Polaris. Please configure your environment for at least one product.
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- Creating status file: /root/blackduck/runs/2020-09-08-05-46-05-916/status/status.json
2020-09-08 11:16:06 INFO [main] --- Status file has been deleted. To preserve status file, turn off cleanup actions.
2020-09-08 11:16:06 INFO [main] --- Cleaning up directory: /root/blackduck/runs/2020-09-08-05-46-05-916
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- ======== Detect Issues ========
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- EXCEPTIONS:
2020-09-08 11:16:06 INFO [main] --- Your environment was not sufficiently configured to run Black Duck or Polaris. Please configure your environment for at least one product.
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- ======== Detect Status ========
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- Overall Status: FAILURE_CONFIGURATION - Detect was unable to start due to issues with it's configuration. Check and fix your configuration.
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- ===============================
2020-09-08 11:16:06 INFO [main] ---
2020-09-08 11:16:06 INFO [main] --- Detect duration: 00h 00m 01s 034ms
2020-09-08 11:16:06 ERROR [main] --- Exiting with code 7 - FAILURE_CONFIGURATION
Result code of 7, exiting
I have tried simply fetching the detect.sh file, and there is also no documentation available for integrating Black Duck license scanning with GitLab CI.
Try passing --detect.source.path along with the server details:
bash <(curl -s -L https://detect.synopsys.com/detect.sh) --blackduck.url=<your_blackduck_server_url> --blackduck.api.token=<your_api_token> --detect.source.path=<source_folder_to_scan>
To get debug output, add:
--logging.level.com.synopsys.integration=DEBUG
I got "environment was not sufficiently configured" error because I did not add the blackduck.url.
Weird behavior for a cli tool to not display the full help on -h/--help or when no proper arguments are given.
Consider joining the Black Duck Community:
https://community.synopsys.com/s/
Consider running
bash <(curl -s -L https://detect.synopsys.com/detect.sh) -hv
See online help here:
https://synopsys.atlassian.net/wiki/spaces/INTDOCS/pages/62423113/Synopsys+Detect
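To wire this into GitLab CI, a minimal .gitlab-ci.yml job could look like the sketch below. This is only an illustration, not an official recipe: it assumes BLACKDUCK_URL and BLACKDUCK_API_TOKEN are defined as masked CI/CD variables in the project settings, and that the job image provides bash, curl and a Java runtime (Detect needs Java).

blackduck_scan:
  stage: test
  image: openjdk:11
  script:
    - bash <(curl -s -L https://detect.synopsys.com/detect.sh) --blackduck.url="$BLACKDUCK_URL" --blackduck.api.token="$BLACKDUCK_API_TOKEN" --detect.source.path="$CI_PROJECT_DIR"

$CI_PROJECT_DIR is a predefined GitLab variable pointing at the checked-out repository, so the scan runs against the pipeline's own sources.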
I have configured the Kerberos Authentication Module (Windows Desktop SSO Node) with AWS Aurora Kerberos details.
I have followed this doc - https://backstage.forgerock.com/marketplace/entry/AWyLw-zpDPiiBBbH4Pu-
Below are the errors from the logs.
I have followed this doc to resolve it – https://backstage.forgerock.com/knowledge/kb/article/a62965844 – but couldn't resolve the issue after following the solutions.
The keytab file was created with this command: ktpass -out fileName.keytab -princ HTTP/openam.forgerock.com@AD_DOMAIN.COM -pass +rdnPass -maxPass 256 -mapuser amKerberos@frdpcloud.com -crypto AES256-SHA1 -ptype KRB5_NT_PRINCIPAL -kvno 0
The SPN is: HTTP/danvledwse01.xyz.com@XYZ.COM
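Before digging into OpenAM itself, one quick sanity check (a suggestion, reusing the keytab name and SPN from above) is to inspect the keytab and try to obtain a ticket with it directly using the MIT Kerberos tools:

klist -k -t -e fileName.keytab
kinit -kt fileName.keytab HTTP/danvledwse01.xyz.com@XYZ.COM

If kinit fails here, the problem is in the keytab/KDC setup rather than in the OpenAM module configuration.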
Kerberos Configuration
18-Oct-2021 13:27:20.840 SEVERE [main] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [openam] created a ThreadLocal with key of type [java.lang.ThreadLocal.SuppliedThreadLocal] (value [java.lang.ThreadLocal$SuppliedThreadLocal@7bbd38c1]) and a value of type [org.forgerock.openam.audit.context.AuditRequestContext] (value [org.forgerock.openam.audit.context.AuditRequestContext@407cbfdc]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
18-Oct-2021 13:27:20.840 SEVERE [main] org.apache.catalina.loader.WebappClassLoaderBase.checkThreadLocalMapForLeaks The web application [openam] created a ThreadLocal with key of type [java.lang.ThreadLocal.SuppliedThreadLocal] (value [java.lang.ThreadLocal$SuppliedThreadLocal@79238e6a]) and a value of type [org.forgerock.opendj.ldap.AttributeDescription$1] (value [{objectclass=Pair [Schema Core Schema-0 mr=773 syntaxes=45 at=109, objectclass]}]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak.
18-Oct-2021 13:27:20.874 INFO [main] org.apache.coyote.AbstractProtocol.stop Stopping ProtocolHandler ["https-jsse-nio-8444"]
18-Oct-2021 13:27:20.926 INFO [main] org.apache.coyote.AbstractProtocol.destroy Destroying ProtocolHandler ["https-jsse-nio-8444"]
18-Oct-2021 13:27:48.078 WARNING [main] org.apache.tomcat.util.digester.SetPropertiesRule.begin Match [Server/Service/Connector] failed to set property [sslVerifyClient] to [optional]
18-Oct-2021 13:27:48.157 WARNING [main] org.apache.tomcat.util.net.SSLHostConfig.setProtocols The protocol [TLSv1.1] was added to the list of protocols on the SSLHostConfig named [_default_]. Check if a +/- prefix is missing.
18-Oct-2021 13:27:48.157 WARNING [main] org.apache.tomcat.util.net.SSLHostConfig.setProtocols The protocol [SSLv2Hello] was added to the list of protocols on the SSLHostConfig named [_default_]. Check if a +/- prefix is missing.
18-Oct-2021 13:27:48.193 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version name: Apache Tomcat/9.0.52
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server built: Jul 31 2021 04:12:17 UTC
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version number: 9.0.52.0
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Name: Linux
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Version: 4.18.0-305.17.1.el8_4.x86_64
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Architecture: amd64
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Java Home: /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.302.b08-0.el8_4.x86_64/jre
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Version: 1.8.0_302-b08
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Vendor: Red Hat, Inc.
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_BASE: /home/forgerock/data/stage/apache-tomcat-9.0.52
18-Oct-2021 13:27:48.194 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_HOME: /home/forgerock/data/stage/apache-tomcat-9.0.52
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.config.file=/home/forgerock/data/stage/apache-tomcat-9.0.52/conf/logging.properties
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djdk.tls.ephemeralDHKeySize=2048
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.protocol.handler.pkgs=org.apache.catalina.webresources
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dorg.apache.catalina.security.SecurityListener.UMASK=0027
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dsun.security.krb5.debug=true
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dsun.security.jgss.debug=true
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dsun.security.spnego.debug=true
18-Oct-2021 13:27:48.196 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dignore.endorsed.dirs=
18-Oct-2021 13:27:48.197 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.base=/home/forgerock/data/stage/apache-tomcat-9.0.52
18-Oct-2021 13:27:48.197 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.home=/home/forgerock/data/stage/apache-tomcat-9.0.52
18-Oct-2021 13:27:48.197 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.io.tmpdir=/home/forgerock/data/stage/apache-tomcat-9.0.52/temp
18-Oct-2021 13:27:48.198 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The Apache Tomcat Native library which allows using OpenSSL was not found on the java.library.path: [/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib]
18-Oct-2021 13:27:48.718 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["https-jsse-nio-8444"]
18-Oct-2021 13:27:49.135 INFO [main] org.apache.catalina.startup.Catalina.load Server initialization in [1192] milliseconds
18-Oct-2021 13:27:49.174 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service [Catalina]
18-Oct-2021 13:27:49.175 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet engine: [Apache Tomcat/9.0.52]
18-Oct-2021 13:27:50.326 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deploying web application archive [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/openam.war]
18-Oct-2021 13:28:02.421 INFO [main] org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time.
Starting up OpenAM at Oct 18, 2021 1:28:05 PM
SLF4J: Failed to load class "org.slf4j.impl.StaticMDCBinder".
SLF4J: Defaulting to no-operation MDCAdapter implementation.
SLF4J: See http://www.slf4j.org/codes.html#no_static_mdc_binder for further details.
18-Oct-2021 13:28:14.928 INFO [main] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/openam.war] has finished in [24,602] ms
18-Oct-2021 13:28:14.985 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/ROOT]
18-Oct-2021 13:28:15.004 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/ROOT] has finished in [19] ms
18-Oct-2021 13:28:15.004 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/docs]
18-Oct-2021 13:28:15.020 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/docs] has finished in [16] ms
18-Oct-2021 13:28:15.021 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/examples]
18-Oct-2021 13:28:15.535 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/examples] has finished in [514] ms
18-Oct-2021 13:28:15.535 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/host-manager]
18-Oct-2021 13:28:15.568 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/host-manager] has finished in [32] ms
18-Oct-2021 13:28:15.568 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/manager]
18-Oct-2021 13:28:15.600 INFO [main] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory [/home/forgerock/data/stage/apache-tomcat-9.0.52/webapps/manager] has finished in [32] ms
18-Oct-2021 13:28:15.612 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["https-jsse-nio-8444"]
18-Oct-2021 13:28:15.626 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in [26490] milliseconds
>>> KeyTabInputStream, readName(): XYZ.COM
>>> KeyTabInputStream, readName(): HTTP
>>> KeyTabInputStream, readName(): danvledwse01.xyz.com
>>> KeyTab: load() entry length: 72; type: 23
Looking for keys for: HTTP/danvledwse01.xyz.com@XYZ.COM
Java config name: /home/forgerock/openam/krb5.conf
Loaded from Java config
Added key: 23version: 0
>>> KdcAccessibility: reset
Looking for keys for: HTTP/danvledwse01.xyz.com@XYZ.COM
Added key: 23version: 0
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23.
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=172.20.2.196 UDP:88, timeout=30000, number of retries =3, #bytes=153
>>> KDCCommunication: kdc=172.20.2.196 UDP:88, timeout=30000,Attempt =1, #bytes=153
>>> KrbKdcReq send: #bytes read=175
>>>Pre-Authentication Data:
PA-DATA type = 11
PA-ETYPE-INFO etype = 23, salt =
>>>Pre-Authentication Data:
PA-DATA type = 19
PA-ETYPE-INFO2 etype = 23, salt = null, s2kparams = null
>>>Pre-Authentication Data:
PA-DATA type = 2
PA-ENC-TIMESTAMP
>>>Pre-Authentication Data:
PA-DATA type = 16
>>>Pre-Authentication Data:
PA-DATA type = 15
>>> KdcAccessibility: remove 172.20.2.196
>>> KDCRep: init() encoding tag is 126 req type is 11
>>>KRBError:
sTime is Mon Oct 18 13:28:37 UTC 2021 1634563717000
suSec is 124522
error code is 25
error Message is Additional pre-authentication required
sname is krbtgt/XYZ.COM@XYZ.COM
eData provided.
msgType is 30
>>>Pre-Authentication Data:
PA-DATA type = 11
PA-ETYPE-INFO etype = 23, salt =
>>>Pre-Authentication Data:
PA-DATA type = 19
PA-ETYPE-INFO2 etype = 23, salt = null, s2kparams = null
>>>Pre-Authentication Data:
PA-DATA type = 2
PA-ENC-TIMESTAMP
>>>Pre-Authentication Data:
PA-DATA type = 16
>>>Pre-Authentication Data:
PA-DATA type = 15
KrbAsReqBuilder: PREAUTH FAILED/REQ, re-send AS-REQ
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23.
Looking for keys for: HTTP/danvledwse01.xyz.com@XYZ.COM
Added key: 23version: 0
Looking for keys for: HTTP/danvledwse01.xyz.com@XYZ.COM
Added key: 23version: 0
Using builtin default etypes for default_tkt_enctypes
default etypes for default_tkt_enctypes: 18 17 16 23.
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KrbAsReq creating message
>>> KrbKdcReq send: kdc=172.20.2.196 UDP:88, timeout=30000, number of retries =3, #bytes=235
>>> KDCCommunication: kdc=172.20.2.196 UDP:88, timeout=30000,Attempt =1, #bytes=235
>>> KrbKdcReq send: #bytes read=90
>>> KrbKdcReq send: kdc=172.20.2.196 TCP:88, timeout=30000, number of retries =3, #bytes=235
>>> KDCCommunication: kdc=172.20.2.196 TCP:88, timeout=30000,Attempt =1, #bytes=235
>>>DEBUG: TCPClient reading 1474 bytes
>>> KrbKdcReq send: #bytes read=1474
>>> KdcAccessibility: remove 172.20.2.196
Looking for keys for: HTTP/danvledwse01.xyz.com@XYZ.COM
Added key: 23version: 0
>>> EType: sun.security.krb5.internal.crypto.ArcFourHmacEType
>>> KrbAsRep cons in KrbAsReq.getReply HTTP/danvledwse01.xyz.com
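For reference, the Java config the log points at (/home/forgerock/openam/krb5.conf) would typically look something like the sketch below; the realm and KDC address are taken from the log, everything else is a plain-vanilla assumption:

[libdefaults]
    default_realm = XYZ.COM

[realms]
    XYZ.COM = {
        kdc = 172.20.2.196
    }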
EDIT:
There was an update to the plugin some weeks ago, and now I get in the Jenkins log that:
Aug 14, 2018 8:57:26 AM WARNING com.sonyericsson.hudson.plugins.gerrit.trigger.playback.GerritMissedEventsPlaybackManager performCheck
Missed Events Playback used to be NOT supported. now it IS!
Aug 14, 2018 8:57:26 AM INFO com.sonymobile.tools.gerrit.gerritevents.GerritConnection run
And in the GERRIT_SITE/logs/error_log it says plugin is loaded:
[2018-08-14 10:56:57,213] [ShutdownCallback] INFO com.google.gerrit.pgm.Daemon : caught shutdown, cleaning up
[2018-08-14 10:56:57,380] [ShutdownCallback] INFO org.eclipse.jetty.server.AbstractConnector : Stopped ServerConnector@500beb9f{HTTP/1.1,[http/1.1]}{127.0.0.1:8081}
[2018-08-14 10:56:57,403] [ShutdownCallback] INFO org.eclipse.jetty.server.handler.ContextHandler : Stopped o.e.j.s.ServletContextHandler@3437fc4f{/,null,UNAVAILABLE}
[2018-08-14 10:56:57,469] [ShutdownCallback] WARN org.apache.sshd.server.channel.ChannelSession : doCloseImmediately(ChannelSession[id=1, recipient=1]-ServerSessionIm$
[2018-08-14 10:56:57,508] [ShutdownCallback] INFO com.google.gerrit.sshd.SshDaemon : Stopped Gerrit SSHD
[2018-08-14 10:57:21,044] [main] WARN com.google.gerrit.sshd.SshDaemon : Cannot format SSHD host key [EdDSA]: invalid key type
[2018-08-14 10:57:21,061] [main] WARN com.google.gerrit.server.config.GitwebCgiConfig : gitweb not installed (no /usr/lib/cgi-bin/gitweb.cgi found)
[2018-08-14 10:57:22,289] [main] INFO org.eclipse.jetty.util.log : Logging initialized @15822ms
[2018-08-14 10:57:22,430] [main] INFO com.google.gerrit.server.git.LocalDiskRepositoryManager : Defaulting core.streamFileThreshold to 1339m
[2018-08-14 10:57:22,784] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loading plugins from /opt/gerrit/plugins
[2018-08-14 10:57:23,056] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loaded plugin delete-project, version v2.13-61-g8d6b23b122
[2018-08-14 10:57:23,500] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loaded plugin events-log, version v2.13-66-ge95af940c6
[2018-08-14 10:57:24,150] [main] INFO com.google.gerrit.server.git.GarbageCollectionRunner : Ignoring missing gc schedule configuration
[2018-08-14 10:57:24,151] [main] INFO com.google.gerrit.server.config.ScheduleConfig : accountDeactivation schedule parameter "accountDeactivation.interval" is not co$
[2018-08-14 10:57:24,151] [main] INFO com.google.gerrit.server.change.ChangeCleanupRunner : Ignoring missing changeCleanup schedule configuration
[2018-08-14 10:57:24,295] [main] INFO com.google.gerrit.sshd.SshDaemon : Started Gerrit SSHD-CORE-1.6.0 on *:29418
[2018-08-14 10:57:24,298] [main] INFO org.eclipse.jetty.server.Server : jetty-9.3.18.v20170406
[2018-08-14 10:57:25,454] [main] INFO org.eclipse.jetty.server.handler.ContextHandler : Started o.e.j.s.ServletContextHandler@73f0b216{/,null,AVAILABLE}
[2018-08-14 10:57:25,475] [main] INFO org.eclipse.jetty.server.AbstractConnector : Started ServerConnector@374013e8{HTTP/1.1,[http/1.1]}{127.0.0.1:8081}
[2018-08-14 10:57:25,476] [main] INFO org.eclipse.jetty.server.Server : Started @19011ms
[2018-08-14 10:57:25,478] [main] INFO com.google.gerrit.pgm.Daemon : Gerrit Code Review 2.15.1 ready
So now this is solved.
I am trying to solve the issue with Missed Events Playback warning I get in Jenkins.
I've enabled the REST API in Jenkins with my generated http password from Gerrit web UI.
So my issue is with the events-log plugin.
I've installed the events-log.jar plugin under GERRIT_SITE/gerrit/plugins
This directory has drwxr-xr-x as permission settings.
GERRIT_SITE/gerrit/logs/error_log gives me this when restarting:
[2018-06-21 13:40:34,678] [main] WARN com.google.gerrit.sshd.SshDaemon : Cannot format SSHD host key [EdDSA]: invalid key type
[2018-06-21 13:40:34,697] [main] WARN com.google.gerrit.server.config.GitwebCgiConfig : gitweb not installed (no /usr/lib/cgi-bin/gitweb.cgi found)
[2018-06-21 13:40:35,761] [main] INFO org.eclipse.jetty.util.log : Logging initialized @11099ms
[2018-06-21 13:40:35,925] [main] INFO com.google.gerrit.server.git.LocalDiskRepositoryManager : Defaulting core.streamFileThreshold to 1339m
[2018-06-21 13:40:36,410] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Removing stale plugin file: plugin_events-log_180621_1333_5163201567282630382.jar
[2018-06-21 13:40:36,410] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loading plugins from /opt/gerrit/plugins
[2018-06-21 13:40:36,528] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loaded plugin delete-project, version v2.13-61-g8d6b23b122
[2018-06-21 13:40:36,614] [main] WARN com.google.gerrit.server.plugins.PluginLoader : Cannot load plugin events-log
java.lang.NoSuchMethodError: com.google.gerrit.server.git.WorkQueue.createQueue(ILjava/lang/String;)Ljava/util/concurrent/ScheduledThreadPoolExecutor;
at com.ericsson.gerrit.plugins.eventslog.EventQueue.start(EventQueue.java:35)
at com.google.gerrit.lifecycle.LifecycleManager.start(LifecycleManager.java:92)
at com.google.gerrit.server.plugins.ServerPlugin.startPlugin(ServerPlugin.java:251)
at com.google.gerrit.server.plugins.ServerPlugin.start(ServerPlugin.java:192)
at com.google.gerrit.server.plugins.PluginLoader.runPlugin(PluginLoader.java:491)
at com.google.gerrit.server.plugins.PluginLoader.rescan(PluginLoader.java:419)
at com.google.gerrit.server.plugins.PluginLoader.start(PluginLoader.java:324)
at com.google.gerrit.lifecycle.LifecycleManager.start(LifecycleManager.java:92)
at com.google.gerrit.pgm.Daemon.start(Daemon.java:349)
at com.google.gerrit.pgm.Daemon.run(Daemon.java:256)
at com.google.gerrit.pgm.util.AbstractProgram.main(AbstractProgram.java:61)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.gerrit.launcher.GerritLauncher.invokeProgram(GerritLauncher.java:223)
at com.google.gerrit.launcher.GerritLauncher.mainImpl(GerritLauncher.java:119)
at com.google.gerrit.launcher.GerritLauncher.main(GerritLauncher.java:63)
at Main.main(Main.java:24)
[2018-06-21 13:40:36,687] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loaded plugin gitiles, version dd264dd2d4
[2018-06-21 13:40:36,728] [main] INFO com.google.gerrit.server.plugins.PluginLoader : Loaded plugin its-jira, version v2.15
[2018-06-21 13:40:37,034] [main] INFO com.google.gerrit.server.git.GarbageCollectionRunner : Ignoring missing gc schedule configuration
[2018-06-21 13:40:37,034] [main] INFO com.google.gerrit.server.config.ScheduleConfig : accountDeactivation schedule parameter "accountDeactivation.interval" is not configured
[2018-06-21 13:40:37,034] [main] INFO com.google.gerrit.server.change.ChangeCleanupRunner : Ignoring missing changeCleanup schedule configuration
[2018-06-21 13:40:37,060] [main] INFO com.google.gerrit.sshd.SshDaemon : Started Gerrit SSHD-CORE-1.6.0 on *:29418
[2018-06-21 13:40:37,074] [main] INFO org.eclipse.jetty.server.Server : jetty-9.3.18.v20170406
[2018-06-21 13:40:38,104] [main] INFO org.eclipse.jetty.server.handler.ContextHandler : Started o.e.j.s.ServletContextHandler@2c8469fe{/,null,AVAILABLE}
[2018-06-21 13:40:38,113] [main] INFO org.eclipse.jetty.server.AbstractConnector : Started ServerConnector@3803bc1a{HTTP/1.1,[http/1.1]}{127.0.0.1:8081}
[2018-06-21 13:40:38,115] [main] INFO org.eclipse.jetty.server.Server : Started @13456ms
[2018-06-21 13:40:38,118] [main] INFO com.google.gerrit.pgm.Daemon : Gerrit Code Review 2.15.1 ready
I would like some help on why this plugin is not loading/enabled when the other plugins are working.
Note 1: Jenkins v2.107.2 and Gerrit v2.15.1 are installed on different Linux-based servers, and I am able to trigger a build from Gerrit.
Note 2: I tried both the plugin-manager (uninstalled for now) and wget https://gerrit-ci.gerritforge.com/view/Plugins-stable-2.15/job/plugin-events-log-bazel-stable-2.15/lastSuccessfulBuild/artifact/bazel-genfiles/plugins/events-log/events-log.jar, which is the way I'm doing it now.
Note 3: events-log in gerrit.config looks like this:
[plugin "events-log"]
maxAge = 20
returnLimit = 10000
storeDriver = org.postgresql.Driver
storeUsername = gerrit
storeUrl = jdbc:postgresql:/var/lib/postgresql/9.5/main
urlOptions = loglevel=INFO
urlOptions = logUnclosedConnections=true
copyLocal = true
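One way to confirm whether the plugin actually loaded after a restart (a suggestion; replace the host with your Gerrit server, 29418 is the SSH port from the logs above) is Gerrit's SSH plugin listing:

ssh -p 29418 admin@your-gerrit-host gerrit plugin ls

If events-log is missing or shows as disabled there, the load failure from error_log is still in effect.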
I know this is one of the most repeated questions. I have looked almost everywhere, and none of the resources could resolve the issue I am facing.
Below is a simplified version of my problem statement. The actual data is a little more complex, so I have to use a UDF.
My input File: (input.txt)
NotNeeded1,NotNeeded11;Needed1
NotNeeded2,NotNeeded22;Needed2
I want the output to be
Needed1
Needed2
So, I am writing the below UDF (Java code):
package com.company.pig;
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;
public class myudf extends EvalFunc<String> {
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        String s = (String) input.get(0);
        String str = s.split("\\,")[1];
        String str1 = str.split("\\;")[1];
        return str1;
    }
}
And packaging it into
rollupreg_extract-jar-with-dependencies.jar
Below is my pig shell code
grunt> REGISTER /pig/rollupreg_extract-jar-with-dependencies.jar;
grunt> DEFINE myudf com.company.pig.myudf;
grunt> data = LOAD 'hdfs://sandbox.hortonworks.com:8020/pig_hdfs/input.txt' USING PigStorage(',');
grunt> extract = FOREACH data GENERATE myudf($1);
grunt> DUMP extract;
And I get the below error:
2017-05-15 15:58:15,493 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2017-05-15 15:58:15,577 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
2017-05-15 15:58:15,659 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-05-15 15:58:15,774 [main] INFO org.apache.pig.impl.util.SpillableMemoryManager - Selected heap (PS Old Gen) of size 699400192 to monitor. collectionUsageThreshold = 489580128, usageThreshold = 489580128
2017-05-15 15:58:15,865 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-05-15 15:58:15,923 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-05-15 15:58:15,923 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-05-15 15:58:16,184 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:16,196 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:16,396 [main] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:16,576 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-05-15 15:58:16,580 [main] WARN org.apache.pig.tools.pigstats.ScriptState - unable to read pigs manifest file
2017-05-15 15:58:16,584 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-05-15 15:58:16,588 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - This job cannot be converted run in-process
2017-05-15 15:58:17,258 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Added jar file:/pig/rollupreg_extract-jar-with-dependencies.jar to DistributedCache through /tmp/temp-1119775568/tmp-858482998/rollupreg_extract-jar-with-dependencies.jar
2017-05-15 15:58:17,276 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2017-05-15 15:58:17,294 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2017-05-15 15:58:17,295 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2017-05-15 15:58:17,295 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Setting key [pig.schematuple.classes] with classes to deserialize []
2017-05-15 15:58:17,354 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2017-05-15 15:58:17,510 [JobControl] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:17,511 [JobControl] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:17,511 [JobControl] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:17,753 [JobControl] WARN org.apache.hadoop.mapreduce.JobResourceUploader - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2017-05-15 15:58:17,820 [JobControl] INFO org.apache.pig.builtin.PigStorage - Using PigTextInputFormat
2017-05-15 15:58:17,830 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2017-05-15 15:58:17,830 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
2017-05-15 15:58:17,884 [JobControl] INFO com.hadoop.compression.lzo.GPLNativeCodeLoader - Loaded native gpl library
2017-05-15 15:58:17,889 [JobControl] INFO com.hadoop.compression.lzo.LzoCodec - Successfully loaded & initialized native-lzo library [hadoop-lzo rev 7a4b57bedce694048432dd5bf5b90a6c8ccdba80]
2017-05-15 15:58:17,922 [JobControl] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2017-05-15 15:58:18,525 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2017-05-15 15:58:18,692 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_1494853652295_0023
2017-05-15 15:58:18,879 [JobControl] INFO org.apache.hadoop.mapred.YARNRunner - Job jar is not present. Not adding any jar to the list of resources.
2017-05-15 15:58:18,973 [JobControl] INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl - Submitted application application_1494853652295_0023
2017-05-15 15:58:19,029 [JobControl] INFO org.apache.hadoop.mapreduce.Job - The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1494853652295_0023/
2017-05-15 15:58:19,030 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_1494853652295_0023
2017-05-15 15:58:19,030 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Processing aliases data,extract
2017-05-15 15:58:19,030 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - detailed locations: M: data[2,7],extract[3,10] C: R:
2017-05-15 15:58:19,044 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2017-05-15 15:58:19,044 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Running jobs are [job_1494853652295_0023]
2017-05-15 15:58:29,156 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2017-05-15 15:58:29,156 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_1494853652295_0023 has failed! Stop running all dependent jobs
2017-05-15 15:58:29,157 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2017-05-15 15:58:29,790 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:29,791 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:29,793 [main] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:30,311 [main] INFO org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl - Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
2017-05-15 15:58:30,312 [main] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8050
2017-05-15 15:58:30,313 [main] INFO org.apache.hadoop.yarn.client.AHSProxy - Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
2017-05-15 15:58:30,465 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
2017-05-15 15:58:30,467 [main] WARN org.apache.pig.tools.pigstats.ScriptState - unable to read pigs manifest file
2017-05-15 15:58:30,472 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.7.3.2.5.0.0-1245 root 2017-05-15 15:58:16 2017-05-15 15:58:30 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_1494853652295_0023 data,extract MAP_ONLY Message: Job failed! hdfs://sandbox.hortonworks.com:8020/tmp/temp-1119775568/tmp-1619300225,
Input(s):
Failed to read data from "/pig_hdfs/input.txt"
Output(s):
Failed to produce result in "hdfs://sandbox.hortonworks.com:8020/tmp/temp-1119775568/tmp-1619300225"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_1494853652295_0023
2017-05-15 15:58:30,472 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2017-05-15 15:58:30,499 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias extract
Details at logfile: /pig/pig_1494863836458.log
I know it complains that
Failed to read data from "/pig_hdfs/input.txt"
But I am sure this is not the actual issue. If I don't use the UDF and directly dump the data, I get the output. So, this is not the issue.
First, you do not need a UDF to get the desired output. You can use a semicolon as the delimiter in the LOAD statement and get the needed column.
data = LOAD 'hdfs://sandbox.hortonworks.com:8020/pig_hdfs/input.txt' USING PigStorage(';');
extract = FOREACH data GENERATE $1;
DUMP extract;
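With the two-line input.txt shown earlier, DUMP should then print the wanted column as one tuple per record:

(Needed1)
(Needed2)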
If you insist on using a UDF, then you will have to load the record into a single field and then use the UDF. Also, your UDF is incorrect: you should split the string s with ';' as the delimiter, since that is what is passed from the Pig script.
String s = (String)input.get(0);
String str1 = s.split("\\;")[1];
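Put together, the corrected UDF would look something like this (a sketch; returning null for records without a ';' part is an assumption about how you want to treat malformed input):

package com.company.pig;

import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

public class myudf extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0)
            return null;
        // The whole record arrives as a single field, so only the ';' split is needed.
        String s = (String) input.get(0);
        String[] parts = s.split("\\;");
        // Assumption: skip records that have nothing after the ';'.
        return parts.length > 1 ? parts[1] : null;
    }
}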
And in your Pig script, you need to load the entire record into one field and use the UDF on field $0.
REGISTER /pig/rollupreg_extract-jar-with-dependencies.jar;
DEFINE myudf com.company.pig.myudf;
data = LOAD 'hdfs://sandbox.hortonworks.com:8020/pig_hdfs/input.txt' AS (f1:chararray);
extract = FOREACH data GENERATE myudf($0);
DUMP extract;
package com.example;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        System.out.println("xxxx");
        SpringApplication.run(DemoApplication.class, args);
    }
}
The other class:
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SampleController {
    @RequestMapping("/")
    public String index() {
        return "Greetings from Spring Boot!";
    }
}
I made Tomcat work on port 8181, because when I used 8080 and ran it in IntelliJ, it said 8080 was already in use and could not start.
So I use 8181, and after executing it opens the localhost:8181 page, but it is a white page with nothing on it.
These are the output logs:
06-Mar-2016 14:38:16.383 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /opt/tomcat/webapps/manager
06-Mar-2016 14:38:16.977 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /opt/tomcat/webapps/manager has finished in 593 ms
This is the Catalina log:
06-Mar-2016 14:38:05.878 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version: Apache Tomcat/8.0.32
06-Mar-2016 14:38:05.887 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server built: Feb 2 2016 19:34:53 UTC
06-Mar-2016 14:38:05.888 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server number: 8.0.32.0
06-Mar-2016 14:38:05.888 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Name: Linux
06-Mar-2016 14:38:05.891 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Version: 4.2.0-30-generic
06-Mar-2016 14:38:05.892 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Architecture: amd64
06-Mar-2016 14:38:05.893 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Java Home: /usr/lib/jvm/java-8-oracle/jre
06-Mar-2016 14:38:05.893 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Version: 1.8.0_74-b02
06-Mar-2016 14:38:05.894 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Vendor: Oracle Corporation
06-Mar-2016 14:38:05.894 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_BASE: /home/caneraydin/.IntelliJIdea16/system/tomcat/Unnamed_Last5
06-Mar-2016 14:38:05.895 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_HOME: /opt/tomcat
06-Mar-2016 14:38:05.896 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.config.file=/home/caneraydin/.IntelliJIdea16/system/tomcat/Unnamed_Last5/conf/logging.properties
06-Mar-2016 14:38:05.897 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
06-Mar-2016 14:38:05.898 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcom.sun.management.jmxremote=
06-Mar-2016 14:38:05.899 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcom.sun.management.jmxremote.port=1099
06-Mar-2016 14:38:05.900 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcom.sun.management.jmxremote.ssl=false
06-Mar-2016 14:38:05.900 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcom.sun.management.jmxremote.authenticate=false
06-Mar-2016 14:38:05.901 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.rmi.server.hostname=127.0.0.1
06-Mar-2016 14:38:05.901 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.endorsed.dirs=/opt/tomcat/endorsed
06-Mar-2016 14:38:05.902 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.base=/home/caneraydin/.IntelliJIdea16/system/tomcat/Unnamed_Last5
06-Mar-2016 14:38:05.905 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.home=/opt/tomcat
06-Mar-2016 14:38:05.905 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.io.tmpdir=/opt/tomcat/temp
06-Mar-2016 14:38:05.906 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /home/caneraydin/Downloads/idea-IU-144.4199.23/bin::/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
06-Mar-2016 14:38:06.265 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio-8181"]
06-Mar-2016 14:38:06.296 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
06-Mar-2016 14:38:06.302 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["ajp-nio-34294"]
06-Mar-2016 14:38:06.304 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
06-Mar-2016 14:38:06.305 INFO [main] org.apache.catalina.startup.Catalina.load Initialization processed in 1704 ms
06-Mar-2016 14:38:06.353 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service Catalina
06-Mar-2016 14:38:06.353 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet Engine: Apache Tomcat/8.0.32
06-Mar-2016 14:38:06.370 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-8181"]
06-Mar-2016 14:38:06.433 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["ajp-nio-34294"]
06-Mar-2016 14:38:06.448 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 142 ms
06-Mar-2016 14:38:16.383 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /opt/tomcat/webapps/manager
06-Mar-2016 14:38:16.977 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /opt/tomcat/webapps/manager has finished in 593 ms
What am I doing wrong?
Change the port in the application.properties file to 8181; by default it is 8080.
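For example, in src/main/resources/application.properties:

server.port=8181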
Regards,
Nitin
I am running the Pig example from DataStax: http://www.datastax.com/docs/datastax_enterprise3.1/solutions/about_pig#pig-read-write. I am using DataStax Enterprise 3.1.2. But when I want to save the data back to Cassandra with:
grunt> STORE insertformat INTO
'cql://cql3ks/test?output_query=UPDATE+cql3ks.test+set+b+%3D+%3F'
USING CqlStorage;
I get the following output:
2014-03-11 10:14:38,383 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: UNKNOWN
2014-03-11 10:14:38,440 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2014-03-11 10:14:38,442 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2014-03-11 10:14:38,442 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2014-03-11 10:14:38,451 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added to the job
2014-03-11 10:14:38,452 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2014-03-11 10:14:38,452 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - creating jar file Job1332293282461754849.jar
2014-03-11 10:14:40,560 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - jar file Job1332293282461754849.jar created
2014-03-11 10:14:40,569 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2014-03-11 10:14:40,597 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2014-03-11 10:14:41,111 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2014-03-11 10:14:43,934 [Thread-10] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
2014-03-11 10:14:45,547 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId: job_201403091619_0036
2014-03-11 10:14:45,547 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - More information at: http://127.0.0.1:50030/jobdetails.jsp?jobid=job_201403091619_0036
2014-03-11 10:17:52,330 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job job_201403091619_0036 has failed! Stop running all dependent jobs
2014-03-11 10:17:52,330 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2014-03-11 10:17:52,334 [main] ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: java.io.IOException: InvalidRequestException(why:Expected 4 or 0 byte int (11))
2014-03-11 10:17:52,335 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!
2014-03-11 10:17:52,335 [main] INFO org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
1.0.4.8 0.9.2 root 2014-03-11 10:14:38 2014-03-11 10:17:52 UNKNOWN
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
job_201403091619_0036 insertformat,moretestvalues MAP_ONLY Message: Job failed! Error - # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201403091619_0036_m_000000 cql://cql3ks/test?output_query=UPDATE+cql3ks.test+set+b+%3D+%3F,
Input(s):
Failed to read data from "cql://cql3ks/moredata/"
Output(s):
Failed to produce result in "cql://cql3ks/test?output_query=UPDATE+cql3ks.test+set+b+%3D+%3F"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
job_201403091619_0036
2014-03-11 10:17:52,335 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
The Log-File is:
Backend error message
---------------------
java.io.IOException: InvalidRequestException(why:Expected 4 or 0 byte int (11))
at org.apache.cassandra.hadoop.cql3.CqlRecordWriter$RangeClient.run(CqlRecordWriter.java:248)
Caused by: InvalidRequestException(why:Expected 4 or 0 byte int (11))
at org.apache.cassandra.thrift.Cassandra$execute_prepared_cql3_query_result.read(Cassandra.java:41868)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
at org.apache.cassandra.thrift.Cassandra$Client.recv_execute_prepared_cql3_query(Cassandra.java:1689)
at org.apache.cassandra.thrift.Cassandra$Client.execute_prepared_cql3_query(Cassandra.java:1674)
at org.apache.cassandra.hadoop.cql3.CqlRecordWriter$RangeClient.run(CqlRecordWriter.java:232)
What am I doing wrong? To me it looks like a bug, because when I use strings instead of integers in CQL when creating the table, the example works fine.
Thank you
I just tested it with a fresh install of DSE 3.1.2; it works for me. You may need to re-install DSE and re-create the tables to test it again.
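If re-creating the tables does not help, it may also be worth making sure the value bound to column b is explicitly an int on the Pig side before the STORE. A hypothetical sketch (the aliases come from the Failed Jobs line above; the field name y is an assumption about the tutorial's schema):

insertformat = FOREACH moretestvalues GENERATE TOTUPLE(TOTUPLE('a',x)), TOTUPLE((int)y);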