Starting Ignite cluster from command line - ignite

I am trying to start an Ignite cluster from the command line on Windows. This is what I did:
Downloaded the Ignite binary distribution and kept it on the C drive.
Set the environment variable IGNITE_HOME to that folder location.
In the command line I opened the directory:
C:\apache-ignite-fabric-2.2.0-bin\bin
Then from that directory:
C:\apache-ignite-fabric-2.2.0-bin\bin>sh ignite.sh examples/config/example-ignite.xml
I am getting the following error:
Failed to create Ignite component (consider adding ignite-spring module to classpath) [component=SPRING, cls=org.apache.ignite.internal.processors.spring.IgniteSpringProcessorImpl]
What can be the reason for this error?
I found the solution: it needs to be run with the .bat file, not the .sh file:
C:\apache-ignite-fabric-2.2.0-bin\bin>ignite.bat examples/config/example-ignite.xml

If you're on Windows, I imagine you should try ignite.bat.
ignite.sh might have problems building the classpath when run on Windows, which would explain the missing ignite-spring component error.
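For reference, the full Windows sequence looks like this (a minimal sketch; the C:\ unpack location matches the question, adjust to your own install):
set IGNITE_HOME=C:\apache-ignite-fabric-2.2.0-bin
cd %IGNITE_HOME%\bin
ignite.bat examples/config/example-ignite.xml
ignite.bat builds the Java classpath with Windows path separators, which is presumably why it succeeds where ignite.sh fails.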

Related

Redis .conf file problem when running a slave Redis instance

The problem is that my Windows 10 machine does not understand Redis commands. I downloaded the CLI and server .msi files and installed them under D:/Program Files/Redis.
I run with command:
redis-server D:/Program Files/Redis/redis-slave.windows.conf
and expect to get a Redis slave instance with the configuration provided in the course's .conf file, but I get the error:
Invalid argument during startup: Failed to open the .conf file: Files/Redis/redis-slave.windows.conf CWD=D:\Program Files\Redis
The problem is not a wrong configuration, because I get the same error with a copy of the default Redis .conf file.
Another problem, not as important to me as the one above, but similar: when I try to run a cluster, Windows 10 does not know how to open the file. I run the command:
D:\Program Files\Redis2\redis-7.0.4\utils\create-cluster>./create-cluster start
and get a window asking me to choose a program to run this file, which presumably does not happen in the macOS case. This create-cluster file has no extension, so I do not know what to do to make it run.
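One thing worth noting: the error above shows the path being cut at the space ("Failed to open the .conf file: Files/Redis/..."), so quoting the path is the likely fix (an assumption, not confirmed in the thread):
redis-server "D:/Program Files/Redis/redis-slave.windows.conf"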

Start Node in Ignite

I want to start an Ignite node with a configuration named example-igfs.xml. I have altered this configuration to use IGFS as a cache layer for HDFS, but when I execute the start command for the Ignite node I encounter this error:
java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:361)
at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:374)
at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:456)
at org.apache.ignite.internal.processors.hadoop.impl.HadoopUtils.safeCreateConfiguration(HadoopUtils.java:334)
at org.apache.ignite.internal.processors.hadoop.impl.delegate.HadoopBasicFileSystemFactoryDelegate.start(HadoopBasicFileSystemFactoryDelegate.java:129)
A java.lang.NoClassDefFoundError usually occurs when Ignite can't find required libraries (JARs) on the classpath.
In your case, you have to move the JARs into the $IGNITE_HOME\libs folder.
Create a folder in the libs directory, let's say hadoop-libs, and move all required JARs into this folder.
I am not a Hadoop expert, but it seems you are missing the Hadoop client and its dependent Google Guava libraries.
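For example, something along these lines (a sketch, assuming a Unix shell and that Guava ships in your Hadoop distribution's common lib folder; exact paths vary by version):
mkdir $IGNITE_HOME/libs/hadoop-libs
cp $HADOOP_HOME/share/hadoop/common/lib/guava-*.jar $IGNITE_HOME/libs/hadoop-libs/
cp $HADOOP_HOME/share/hadoop/common/hadoop-common-*.jar $IGNITE_HOME/libs/hadoop-libs/
Everything under $IGNITE_HOME/libs is added to the classpath by the startup scripts, which is why dropping the missing JARs there resolves the NoClassDefFoundError.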

Aerospike docker - (100L, 'UDF: Execution Error 1')

I deployed an Aerospike container using the official docker hub image. When I try to execute test_list = client.llist(key, 'test_list'), my Python client script returns the following error:
exception.UDFError: (100L, 'UDF: Execution Error 1', 'src/main/llist/llist_operations.c', 93)
I looked at the Aerospike logs and found that each time this code is executed, the error below gets printed:
: WARNING (udf): (src/main/mod_lua.c:599) Lua Create Error: module 'llist' not found:
no field package.preload['llist']
no file './llist.lua'
no file '/usr/local/share/luajit-2.0.3/llist.lua'
no file '/usr/local/share/lua/5.1/llist.lua'
no file '/usr/local/share/lua/5.1/llist/init.lua'
no file '/opt/aerospike/sys/udf/lua/llist.lua'
no file '/opt/aerospike/sys/udf/lua/external/llist.lua'
no file '/opt/aerospike/usr/udf/lua/llist.lua'
no file './llist.so'
no file '/usr/local/lib/lua/5.1/llist.so'
no file '/usr/local/lib/lua/5.1/loadall.so'
no file '/opt/aerospike/sys/udf/lua/llist.so'
no file '/opt/aerospike/sys/udf/lua/external/llist.so'
no file '/opt/aerospike/usr/udf/lua/llist.so'
: INFO (udf): (udf.c:954) lua error, ret:1
I could not find the relevant lua files or a lua installation in the container. I have my code working fine when I run it directly on the host. Is there some extra configuration that needs to be done to the container?
LDTs (Large Data Types) were dropped in server version 3.15.
https://www.aerospike.com/docs/guide/ldt_guide.html
Excerpt:
Aerospike has removed the Large Data Type feature as of server version 3.15 after deprecating this functionality 12 months earlier. Please see the removal notice and deprecation notice. The features listed below are no longer in Aerospike servers.
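This also explains the difference in behaviour between the container and your host: the official Docker Hub image tracks recent server releases, so the container almost certainly runs a post-3.15 server without LDT support, while your host install is older and still has it. If you need llist temporarily, you could try pinning a pre-3.15 image tag (the exact tag below is an assumption; check Docker Hub for what is available):
docker run -d --name aerospike aerospike/aerospike-server:3.14.1
The long-term fix is to migrate off LDTs, as the removal notice advises.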

Pentaho: Error while running pan.bat file in cmd line

I am trying to run my .ktr file in cmd line. I have my data-integration setup in this path:
C:\Users\dhamodharan.a\Desktop\pdi-ce-4.4.0-stable\data-integration
and my .ktr file in this path:
C:\Users\dhamodharan.a\Desktop\test.ktr
While trying to run it in the command line I get the following error:
DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files (x86)\Java\jre7
DEBUG: _PENTAHO_JAVA=C:\Program Files (x86)\Java\jre7\bin\java.exe
WARN 11-08 11:47:09,728 - Unable to load Hadoop Configuration from "file:///C:/Users/dhamodharan.a/Desktop/pdi-ce-4.4.0-stable/data-integration/plugins/pentaho-big-data-plugin/hadoop-configurations/mapr". For more information enable debug logging.
INFO 11-08 11:47:09,759 - Pan - Start of run.
ERROR: No repository provided, can't load transformation.
C:\Users\dhamodharan.a\Desktop\pdi-ce-4.4.0-stable\data-integration>e:C:\Users\dhamodharan.a\Desktop.test.ktr /level:Basic
I am trying to run a transformation that reads an Excel input file and writes an Excel output. Do I also need to create a repository for that?
When I looked at the create-repository option, I only saw choices for a DBMS, not for Excel.
Make sure the environment variable PENTAHO_JAVA_HOME is set correctly, and then it'll work.
For some reason the Java install is not on your path. But if Spoon works, you must have it somewhere.
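A quick way to check and set it for the current cmd session (the JRE path below is taken from the DEBUG output above; adjust it to your actual Java install):
echo %PENTAHO_JAVA_HOME%
set PENTAHO_JAVA_HOME=C:\Program Files (x86)\Java\jre7
pan.bat then resolves %PENTAHO_JAVA_HOME%\bin\java.exe, as the _PENTAHO_JAVA debug line shows.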
I set the JAVA_HOME environment variable, and now pan.bat and kitchen.bat work fine.
Here is the command:
pan.bat /file:C:\Users\dhamodharan.a\Desktop\dhamu\test.ktr /level:Basic > C:\Users\dhamodharan.a\Desktop\dhamu\test.log
Thanks.

Apache oozie sharedlib is showing a blank list

I am relatively new to Apache Oozie and did an installation on Ubuntu 14.04 with Hadoop 2.6.0 and JDK 1.8. I was able to install Oozie, and the web console is visible on port 11000 of my server.
Now, when I copy the examples bundled with Oozie and try to run them, I run into an error that says no sharelib exists.
I installed the sharelib as below:
bin/oozie-setup.sh sharelib create -fs hdfs://localhost:54310
(my NameNode is running on localhost:54310 and the JobTracker on localhost:54311)
hadoop fs -ls /user/hduser/share/lib shows the shared library created as per the oozie-site.xml file. However, when I check the shared library using the command
oozie admin -oozie http://localhost:11000/oozie -shareliblist
the list is blank, and jobs are failing for the same reason.
Any clues on how I should approach this problem?
Thanks.
The sharelib create command looks fine.
If you haven't done so already, copy the core-site.xml from your Hadoop installation folder into $OOZIE_HOME/conf/hadoop-conf/.
There might already be a "placeholder" core-site.xml in the hadoop-conf folder; delete or rename that one. Oozie doesn't get its Hadoop configuration directly from your Hadoop install (like Hive does, for example) but from the core-site.xml you place in that hadoop-conf folder.
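Roughly like this, assuming a standard Hadoop 2.x layout (your core-site.xml may live elsewhere):
mv $OOZIE_HOME/conf/hadoop-conf/core-site.xml $OOZIE_HOME/conf/hadoop-conf/core-site.xml.bak
cp $HADOOP_HOME/etc/hadoop/core-site.xml $OOZIE_HOME/conf/hadoop-conf/
With the real core-site.xml in place, Oozie can resolve the HDFS address (hdfs://localhost:54310 in your case) and look for the sharelib on HDFS instead of the local filesystem.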
Okay, I found a solution for this.
When I created the sharelib directory it was created on HDFS, but while running the job a local path was being referred to. So I extracted the oozie-sharelib tar.gz file into my local /user/hduser/share/lib directory, and it is working now.
But I did not figure out the reason, so it is still an open question.
I encountered the same issue, and it turned out that Oozie was not able to communicate with HDFS, because it could not find the location of core-site.xml (or any other Hadoop configuration), which has to be declared inside oozie-site.xml.
The corresponding property in oozie-site.xml is oozie.service.HadoopAccessorService.hadoop.configurations.
This property was defined wrongly in my case. I changed it to point to where my Hadoop configuration XMLs are present, and then Oozie started communicating with HDFS and was able to locate the sharelib on HDFS.
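For reference, the property looks something like this in oozie-site.xml (the /etc/hadoop/conf value is an assumption; point it at wherever your Hadoop configuration XMLs live):
<property>
  <name>oozie.service.HadoopAccessorService.hadoop.configurations</name>
  <value>*=/etc/hadoop/conf</value>
</property>
The *= prefix maps all JobTracker/NameNode authorities to that configuration directory.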