I am trying to combine log files from S3 to S3 using the following command:
s3-dist-cp --src s3://path/to/ym=2020/ --dest s3://path/to/ym=2020/ --groupBy='.*/(\d{8}).+(\.json)' --deleteOnSuccess
I have the following files in the source path:
s3://path/to/ym=2020/20200802010010.json
s3://path/to/ym=2020/20200802010020.json
s3://path/to/ym=2020/20200802010030.json
s3://path/to/ym=2020/20200802010040.json
s3://path/to/ym=2020/20200802010050.json
s3://path/to/ym=2020/20200803010010.json
s3://path/to/ym=2020/20200803010020.json
s3://path/to/ym=2020/20200803010030.json
s3://path/to/ym=2020/20200803010040.json
s3://path/to/ym=2020/20200803010050.json
The expected result is:
s3://path/to/ym=2020/20200802.json
s3://path/to/ym=2020/20200803.json
but I get the following error:
20/08/04 03:29:14 INFO s3distcp.S3DistCp: Created 1 files to copy 856 files
Exception in thread "main" java.lang.NullPointerException
at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.mkdirs(S3NativeFileSystem.java:1052)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1961)
at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.mkdirs(EmrFileSystem.java:443)
at com.amazon.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:893)
at com.amazon.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:728)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at com.amazon.elasticmapreduce.s3distcp.Main.main(Main.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
The following command works (when the destination is changed to a different path):
s3-dist-cp --src s3://path/to/ym=2020/ --dest s3://path/to/archives/ym=2020/ --groupBy='.*/(\d{8}).+(\.json)' --deleteOnSuccess
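Since the failure only occurs when --src and --dest point at the same prefix, one possible workaround is to group into a scratch prefix first and then move the merged files back. A minimal sketch, assuming the AWS CLI is available and that s3://path/to/tmp/ is a scratch prefix you are free to use:
# step 1: group the small files into a scratch prefix (same pattern as the working command above)
s3-dist-cp --src s3://path/to/ym=2020/ --dest s3://path/to/tmp/ym=2020/ --groupBy='.*/(\d{8}).+(\.json)' --deleteOnSuccess
# step 2: move the merged files back into the original prefix
aws s3 mv --recursive s3://path/to/tmp/ym=2020/ s3://path/to/ym=2020/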
Related
I am using Flink 1.6.1 and Hadoop 2.7.5. First I start a Flink session on YARN:
bin/yarn-session.sh -n 2 -jm 1024 -tm 1024 -d
and then submit a job:
./bin/flink run ./examples/batch/WordCount.jar -input hdfs://CS-201:9000/LICENSE -output hdfs://CS-201:9000/wordcount-result.txt
I get this error:
[root@CS-201 flink-1.6.1]# ./bin/flink run ./examples/batch/WordCount.jar -input hdfs://CS-201:9000/LICENSE -output hdfs://CS-201:9000/wordcount-result.txt
2019-05-19 15:31:11,357 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - Found Yarn properties file under /tmp/.yarn-properties-root.
2019-05-19 15:31:11,737 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - YARN properties set default parallelism to 2
YARN properties set default parallelism to 2
2019-05-19 15:31:11,777 INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at CS-201/192.168.1.201:8032
2019-05-19 15:31:11,887 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
2019-05-19 15:31:11,891 WARN org.apache.flink.yarn.AbstractYarnClusterDescriptor - Neither the HADOOP_CONF_DIR nor the YARN_CONF_DIR environment variable is set. The Flink YARN Client needs one of these to be set to properly load the Hadoop configuration for accessing YARN.
2019-05-19 15:31:11,979 INFO org.apache.flink.yarn.AbstractYarnClusterDescriptor - Found application JobManager host name 'cs-202' and port '52389' from supplied application id 'application_1558248666499_0003'
Starting execution of program
------------------------------------------------------------
The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: 471f0c2d047aba74ea621c5bfe782cbf)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:260)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:486)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:474)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:85)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:804)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:280)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:379)
at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$5(FutureUtils.java:213)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
at java.util.concurrent.CompletableFuture$UniRelay.tryFire(CompletableFuture.java:899)
... 12 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Exception is not retryable.
... 10 more
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: [Job submission failed.]
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:911)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:953)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Job submission failed.]
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:310)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:294)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
... 5 more
Why does this happen, and how can I fix it?
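One clue in the log itself: it warns that neither HADOOP_CONF_DIR nor YARN_CONF_DIR is set, so the Flink client may not be loading the Hadoop configuration it needs to talk to YARN. A minimal sketch of setting it before starting the session; /etc/hadoop/conf is an assumption, point it at your cluster's actual configuration directory:
# assumption: Hadoop config lives in /etc/hadoop/conf -- adjust for your cluster
export HADOOP_CONF_DIR=/etc/hadoop/conf
bin/yarn-session.sh -n 2 -jm 1024 -tm 1024 -d
./bin/flink run ./examples/batch/WordCount.jar -input hdfs://CS-201:9000/LICENSE -output hdfs://CS-201:9000/wordcount-result.txt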
I am getting the error below when trying to create a BAR file using this command.
Command: D:\ApacheAnt\apache-ant-1.9.7-bin\apache-ant-1.9.7\bin\ant -f build.xml
Error:
D:\Ant Build and Deployment Scripts\AntBuild\Scripts>D:\ApacheAnt\apache-ant-1.9.7-bin\apache-ant-1.9.7\bin\ant -f build.xml
Buildfile: D:\Ant Build and Deployment Scripts\AntBuild\Scripts\build.xml
runScript:
buildWorkspace:
[echo] The applications [MF_HTB_B1_HTTPSOAPINPUT] will be checkout from SVN under the URL [https://punitp134088l.ad.infosys.com/svn/ESB_Rep/branches/ESB_FrameWork/ESB_FRAMEWORK/CODE_BASE/HTTP%20Based%20Framework/MF_HTB_B1]
[mkdir] Created dir: D:\Ant Build and Deployment Scripts\AntBuild\BuildPackage\tempWorkspace\MF_HTB_B1_HTTPSOAPINPUT
svnCheckout:
[echo] Checking-out MF_HTB_B1_HTTPSOAPINPUT from SVN
[svn] <Checkout> started ...
[svn] Command completed abnormally.
[svn] <Checkout> finished.
[echo] Checkout completed
buildBarfiles:
[echo] Compile and Create Bar files for [MF_HTB_B1_HTTPSOAPINPUT] applications
[mkdir] Created dir: D:\Ant Build and Deployment Scripts\AntBuild\BuildPackage\BarFiles
[echo] D:\Ant Build and Deployment Scripts\AntBuild\Scripts
[echo] Compiling Application [MF_HTB_B1_HTTPSOAPINPUT] for any errors
[exec] Result: 1
[delete] Deleting: D:\Ant Build and Deployment Scripts\AntBuild\BuildPackage\Logs\MF_HTB_B1_HTTPSOAPINPUT.log
[echo] Failed to create MF_HTB_B1_HTTPSOAPINPUT.bar, Please check ..\BuildPackage\Logs\BuildPackage.log for logs
applyBarOverride:
[echo] Executing [mqsiapplybaroverride] for Application [MF_HTB_B1_HTTPSOAPINPUT]
BUILD FAILED
D:\Ant Build and Deployment Scripts\AntBuild\Scripts\build.xml:42: The following error occurred while executing this line:
D:\Ant Build and Deployment Scripts\AntBuild\Scripts\build.xml:175: The following error occurred while executing this line:
D:\Ant Build and Deployment Scripts\AntBuild\Scripts\build.xml:182: Execute failed: java.io.IOException: Cannot run program "mqsiapplybaroverride" (in directory "D:\Ant Build and Deployment Scripts\AntBuild\Scripts"): CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1059)
at java.lang.Runtime.exec(Runtime.java:629)
at org.apache.tools.ant.taskdefs.launcher.Java13CommandLauncher.exec(Java13CommandLauncher.java:58)
at org.apache.tools.ant.taskdefs.Execute.launch(Execute.java:426)
at org.apache.tools.ant.taskdefs.Execute.execute(Execute.java:440)
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:629)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:670)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:496)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
at java.lang.reflect.Method.invoke(Method.java:620)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:68)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
at java.lang.reflect.Method.invoke(Method.java:620)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:396)
at net.sf.antcontrib.logic.ForDelegate.doSequentialIteration(ForDelegate.java:228)
at net.sf.antcontrib.logic.ForDelegate.doTheTasks(ForDelegate.java:253)
at net.sf.antcontrib.logic.ForDelegate.execute(ForDelegate.java:213)
at net.sf.antcontrib.logic.For.execute(For.java:166)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
at java.lang.reflect.Method.invoke(Method.java:620)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:435)
at org.apache.tools.ant.Target.performTasks(Target.java:456)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:441)
at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
at java.lang.reflect.Method.invoke(Method.java:620)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:68)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:56)
at java.lang.reflect.Method.invoke(Method.java:620)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:435)
at org.apache.tools.ant.Target.performTasks(Target.java:456)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1405)
at org.apache.tools.ant.Project.executeTarget(Project.java:1376)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1260)
at org.apache.tools.ant.Main.runBuild(Main.java:854)
at org.apache.tools.ant.Main.startAnt(Main.java:236)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:285)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:112)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessImpl.create(Native Method)
at java.lang.ProcessImpl.<init>(ProcessImpl.java:397)
at java.lang.ProcessImpl.start(ProcessImpl.java:148)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1040)
... 62 more
Total time: 10 seconds
I am running the command D:\ApacheAnt\apache-ant-1.9.7-bin\apache-ant-1.9.7\bin\ant -f build.xml to execute the Ant script, which is meant to create a BAR file using Apache Ant. The build.xml script creates a BuildPackage.zip file and includes the BAR files, deployment scripts, and environment-specific property files in it. But the BAR file is not created, and I get the above error. Please assist.
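Note that the last cause in the trace is CreateProcess error=2 while running mqsiapplybaroverride, which means Windows could not find that executable on the PATH of the console running Ant. A minimal sketch of setting up the environment first; the install path is an assumption, substitute your actual IBM Integration Bus / Message Broker installation directory:
REM assumption: IIB installed under C:\Program Files\IBM\IIB\10.0.0.10 -- adjust to your install
call "C:\Program Files\IBM\IIB\10.0.0.10\bin\mqsiprofile.cmd"
REM the mqsi* commands are now on PATH for this console, so run the build from the same console
D:\ApacheAnt\apache-ant-1.9.7-bin\apache-ant-1.9.7\bin\ant -f build.xml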
Assuming you are trying to create a .jar file using the Ant script, try using a JDK as the installed JRE in Eclipse (if you are building from Eclipse), and set the JAVA_HOME variable to the JDK path.
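A minimal sketch of what that looks like from a Windows console before invoking Ant; the JDK path is an assumption, substitute your own install:
REM assumption: JDK installed under C:\Program Files\Java\jdk1.8.0_201
set JAVA_HOME=C:\Program Files\Java\jdk1.8.0_201
set PATH=%JAVA_HOME%\bin;%PATH%
D:\ApacheAnt\apache-ant-1.9.7-bin\apache-ant-1.9.7\bin\ant -f build.xml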
Here is the Pig stack trace. I run my code over each day's data individually; most days finish within 5 minutes, but some days fail. I have about 10 GROUP ALL ... PARALLEL 1 statements at the end to do some counts. Could this be the reason for the error?
Backend error message
Error: Java heap space
Pig Stack Trace
ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.getStats(MapReduceLauncher.java:819)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:452)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:282)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1431)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1416)
at org.apache.pig.PigServer.execute(PigServer.java:1405)
at org.apache.pig.PigServer.executeBatch(PigServer.java:456)
at org.apache.pig.PigServer.executeBatch(PigServer.java:439)
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:171)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:234)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:495)
at org.apache.pig.Main.main(Main.java:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 2997: Unable to recreate exception from backed error: Error: Java heap space
at org.apache.pig.backend.hadoop.executionengine.Launcher.getErrorMessages(Launcher.java:215)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.getStats(MapReduceLauncher.java:803)
... 19 more
================================================================================
Pig Stack Trace
ERROR 2244: Job failed, hadoop does not return any error message
org.apache.pig.backend.executionengine.ExecException: ERROR 2244: Job failed, hadoop does not return any error message
at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:179)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:234)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:205)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:81)
at org.apache.pig.Main.run(Main.java:495)
at org.apache.pig.Main.main(Main.java:170)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
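Since the backend failure is Error: Java heap space, and GROUP ALL with PARALLEL 1 funnels an entire day's data through a single reducer, larger days plausibly exhaust that reducer's heap. A first thing to try is giving the tasks more memory. A minimal sketch, assuming a Hadoop 2.x cluster whose containers may grow to 4 GB; daily_counts.pig is a placeholder name and the sizes should be tuned to what your cluster allows:
# assumption: cluster allows 4 GB containers; daily_counts.pig is a placeholder script name
pig -Dmapreduce.map.memory.mb=4096 -Dmapreduce.map.java.opts=-Xmx3686m \
    -Dmapreduce.reduce.memory.mb=4096 -Dmapreduce.reduce.java.opts=-Xmx3686m \
    daily_counts.pig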
I am using Android Eclipse. I developed a project and added the Google Play Services and AppCompat v7 dependencies to the build path. When I run it as a normal Android project it executes perfectly. I added a Maven pom.xml file to generate the APK, but when I run the Maven install it gives an error.
Error:
[ERROR] Error when generating sources.
org.apache.maven.plugin.MojoExecutionException:
at com.jayway.maven.plugins.android.phase01generatesources.GenerateSourcesMojo.generateR(GenerateSourcesMojo.java:608)
at com.jayway.maven.plugins.android.phase01generatesources.GenerateSourcesMojo.execute(GenerateSourcesMojo.java:229)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:106)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:317)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:152)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:555)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:214)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:158)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
at org.codehaus.classworlds.Launcher.main(Launcher.java:46)
Caused by: com.jayway.maven.plugins.android.ExecutionException: ANDROID-040-001: Could not execute: Command = cmd.exe /X /C "D:\NagarjunaWork\android\adt-bundle-windows-x86_64-20140321\sdk\build-tools\android-4.4.2\aapt.exe package -f --no-crunch -I D:\NagarjunaWork\android\adt-bundle-windows-x86_64-20140321\sdk\platforms\android-19\android.jar -M D:\5-8-2014\QuickRideApp\QuickRide\AndroidManifest.xml -S D:\5-8-2014\QuickRideApp\QuickRide\res -A D:\5-8-2014\QuickRideApp\QuickRide\target\generated-sources\combined-assets -m -J D:\5-8-2014\QuickRideApp\QuickRide\target\generated-sources\r --output-text-symbols D:\5-8-2014\QuickRideApp\QuickRide\target --auto-add-overlay", Result = -1073741819
at com.jayway.maven.plugins.android.CommandExecutor$Factory$DefaultCommandExecutor.executeCommand(CommandExecutor.java:252)
at com.jayway.maven.plugins.android.phase01generatesources.GenerateSourcesMojo.generateR(GenerateSourcesMojo.java:604)
... 23 more
This can be caused by problems in your XML files, such as referring to a resource ID that no longer exists or never existed. See the question "MojoExecutionException: Maven with Android and aapt.exe has stopped working" for possible solutions.
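One way to see what aapt is actually failing on is to re-run, in a console, the exact aapt.exe command from the error output; the command below is copied verbatim from the trace above, so the paths match that machine. Run standalone, it usually prints the specific resource it cannot resolve instead of the bare exit code:
D:\NagarjunaWork\android\adt-bundle-windows-x86_64-20140321\sdk\build-tools\android-4.4.2\aapt.exe package -f --no-crunch -I D:\NagarjunaWork\android\adt-bundle-windows-x86_64-20140321\sdk\platforms\android-19\android.jar -M D:\5-8-2014\QuickRideApp\QuickRide\AndroidManifest.xml -S D:\5-8-2014\QuickRideApp\QuickRide\res -A D:\5-8-2014\QuickRideApp\QuickRide\target\generated-sources\combined-assets -m -J D:\5-8-2014\QuickRideApp\QuickRide\target\generated-sources\r --output-text-symbols D:\5-8-2014\QuickRideApp\QuickRide\target --auto-add-overlay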
While loading files into Hadoop via Hive, I got the following error:
Failed with exception org.apache.hadoop.hdfs.server.namenode.NotReplicatedYetException: Not replicated yet:/tmp/hive-hadoop/hive_2012-11-22_19-31-25_550_6464715632657097841/-ext-10000/outlog_11_oct12.csv
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1257)
at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:422)
at sun.reflect.GeneratedMethodAccessor17.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.CopyTask
And according to other threads, it's a DataNode issue, but all of my DataNodes are up and running.
I think the problem may be a lack of write permissions on the /tmp directory. Try:
hadoop dfs -chmod g+w /tmp
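A quick way to check both theories from the shell; these are standard Hadoop CLI commands, and /tmp is the path taken from the error above:
# check the current permissions on /tmp in HDFS
hadoop fs -ls /
# confirm the DataNodes really are registered and have free space
# (NotReplicatedYetException often points at missing or full DataNodes)
hadoop dfsadmin -report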