Error while connecting to Documentum source

I am new to Documentum. Can anyone help me resolve the following error?
DfException:: THREAD: Thread-694; MSG: [DFC_BOF_CLASS_CACHE_INIT_ERROR] Failed to initialize class cache.; ERRORCODE: ff; NEXT: null
at com.documentum.fc.client.impl.bof.cache.NullClassCacheManager.checkCacheConsistency(NullClassCacheManager.java:46)
at com.documentum.fc.client.impl.bof.compoundclass.CompoundClassMgr.getImpClass(CompoundClassMgr.java:66)
at com.documentum.fc.client.impl.objectmanager.AbstractPersistentObjectFactory.getCompoundClassEntry(AbstractPersistentObjectFactory.java:39)
at com.documentum.fc.client.impl.objectmanager.PObjectFactoryWithAspects.makeObject(PObjectFactoryWithAspects.java:49)
at com.documentum.fc.client.impl.objectmanager.PersistentObjectManager.getObjectFromServer(PersistentObjectManager.java:356)
at com.documentum.fc.client.impl.objectmanager.PersistentObjectManager.getObject(PersistentObjectManager.java:311)
at com.documentum.fc.client.impl.session.Session.getObject(Session.java:963)
at com.documentum.fc.client.impl.session.SessionHandle.getObject(SessionHandle.java:653)
at com.gsk.rd.datacoe.documentum.DocumentumBot.processBotLogic(DocumentumBot.java:193)
at com.gsk.rd.datacoe.bots.BaseBot.processMessage(BaseBot.java:542)
at com.gsk.rd.datacoe.bots.BaseBot.fetchMessages(BaseBot.java:460)
at com.gsk.rd.datacoe.bots.BaseBot$1.run(BaseBot.java:185)
at java.lang.Thread.run(Thread.java:745)
I am using dfc.properties. Can anyone tell me what is causing the above error?
My dfc.properties file is:
dfc.data.dir=./dfc
dfc.registry.mode=file
dfc.search.ecis.enable=false
dfc.search.ecis.host=
dfc.search.ecis.port=
dfc.tokenstorage.dir=./dfc/apptoken
dfc.tokenstorage.enable=false
dfc.docbroker.host[0]=random host_name
dfc.docbroker.port[0]=1489
#Additions 3/22
dfc.security.keystore.file=./dfc/dfc.keystore
dfc.session.secure_connect_default=try_native_first
dfc.globalregistry.repository[0]=random docbase
dfc.globalregistry.username=username
dfc.globalregistry.password=password
#dfc.cache.object.size=100
#dfc.session.pool.expiration_interval=10
Is any value in dfc.properties causing the above error? Please help me.

Just to be sure: do you have actual values for host and global registry properties?

A DFC client has only one global registry, so this line is wrong:
dfc.globalregistry.repository[0]=random docbase
It should be:
dfc.globalregistry.repository=random docbase
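Once dfc.properties is fixed, a quick way to confirm the DFC client can actually reach the repository is a small standalone check along these lines. This is only a rough sketch; the docbase name and credentials below are placeholders, not values from your environment:
import com.documentum.com.DfClientX;
import com.documentum.fc.client.IDfClient;
import com.documentum.fc.client.IDfSession;
import com.documentum.fc.client.IDfSessionManager;
import com.documentum.fc.common.IDfLoginInfo;

public class DfcConnectionCheck {
    public static void main(String[] args) throws Exception {
        DfClientX clientX = new DfClientX();
        IDfClient client = clientX.getLocalClient();      // reads dfc.properties from the classpath
        IDfSessionManager sessionManager = client.newSessionManager();

        IDfLoginInfo login = clientX.getLoginInfo();
        login.setUser("username");                        // placeholder credentials
        login.setPassword("password");
        sessionManager.setIdentity("docbase", login);     // placeholder repository name

        IDfSession session = sessionManager.getSession("docbase");
        try {
            System.out.println("Connected to " + session.getDocbaseName());
        } finally {
            sessionManager.release(session);              // always release the session
        }
    }
}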

Related

CloudReports (CloudSim extension) gives a NullPointerException on cloudlet migrations

I'm fairly new to CloudSim and the CloudReports extension, so I don't know why
running the CloudReports simulator gives this error:
NullPointerException at org.cloudbus.cloudsim.power.PowerDatacenter.processCloudletSubmit(PowerDatacenter.java:269)
I was adding a cloudlet scheduling algorithm to the extension.
All I can see is that the error happens with cloudlet migration.
I tried searching a lot for how to fix it but didn't find anything that helped.
The error is like this:
java.lang.NullPointerException
at org.cloudbus.cloudsim.Datacenter.processCloudletSubmit(Datacenter.java:761)
at org.cloudbus.cloudsim.power.PowerDatacenter.processCloudletSubmit(PowerDatacenter.java:269)
at org.cloudbus.cloudsim.Datacenter.processEvent(Datacenter.java:159)
at org.cloudbus.cloudsim.core.SimEntity.run(SimEntity.java:406)
at org.cloudbus.cloudsim.core.CloudSim.runClockTick(CloudSim.java:471)
at org.cloudbus.cloudsim.core.CloudSim.run(CloudSim.java:835)
at org.cloudbus.cloudsim.core.CloudSim.startSimulation(CloudSim.java:151)
at cloudreports.simulation.Simulation.runSimulation(Simulation.java:157)
at cloudreports.simulation.Simulation.runAllSimulations(Simulation.java:129)
at cloudreports.simulation.Simulation.run(Simulation.java:98)
at java.lang.Thread.run(Thread.java:748)
Please advise.
Regards.
I ran into the same problem while following the simulation examples provided in the CloudSim project. After debugging, I figured out that the problem was with setting the user ID of the created cloudlet. I was setting a random value as the userId, while the simulation retrieves the cloudlet using the brokerId. Setting the userId to the brokerId of the broker that was created did the trick for me.
// dataCenterBroker is the instance of the broker which you created
int brokerId = dataCenterBroker.getId();
//setting the userId after creating the cloudlet
cloudLet.setUserId(brokerId);
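To make that concrete, here is a rough sketch of how the cloudlet can be created and tied to the broker. The id, length and file sizes are placeholder values, not taken from your setup:
import java.util.Arrays;
import org.cloudbus.cloudsim.Cloudlet;
import org.cloudbus.cloudsim.UtilizationModel;
import org.cloudbus.cloudsim.UtilizationModelFull;

// dataCenterBroker is the instance of the broker which you created
int brokerId = dataCenterBroker.getId();

UtilizationModel utilization = new UtilizationModelFull();
Cloudlet cloudLet = new Cloudlet(
        0,           // cloudlet id (placeholder)
        400000,      // length in MI (placeholder)
        1,           // number of PEs
        300,         // input file size (placeholder)
        300,         // output file size (placeholder)
        utilization, utilization, utilization);

// bind the cloudlet to the broker instead of a random user id
cloudLet.setUserId(brokerId);
dataCenterBroker.submitCloudletList(Arrays.asList(cloudLet));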

Error when connecting Hive with Kibi?

I am using the kibi-community-demo-full-4.6.4-linux-x64 version.
In the datasource configuration:
"connection_string": "jdbc:hive://localhost:10000/root",
"libpath": "/home/pare/Downloads/jar/",
"drivername": "org.apache.hadoop.hive.jdbc.HiveDriver",
"libs": "hive-jdbc-0.11.0.jar,hive-metastore-0.11.0.jar,libthrift-0.9.1.jar,hive-service-0.13.1.jar,hive-jdbc-1.2.1.2.3.2.0-2950-standalone.jar,hadoop-common-2.7.1.2.3.2.0-2950.jar",
After that, when I write a query in the Queries editor, it shows an error like:
Queries Editor: Error 400 Bad Request: Error running static method java.lang.IllegalArgumentException: Bad URL format at org.apache.hive.jdbc.Utils.parseURL(Utils.java:185) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:84)
What is the error? Can anyone explain to me how to solve it?
I was able to connect after changing the jar versions, and I also changed the driver name to "org.apache.hive.jdbc.HiveDriver".
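For what it's worth, org.apache.hive.jdbc.HiveDriver is the HiveServer2 driver, and it expects a jdbc:hive2:// URL (the jdbc:hive:// form is what makes Utils.parseURL throw the "Bad URL format" error). A quick standalone check of the connection could look roughly like the sketch below; the host and port come from the question, but the database name and credentials are assumptions:
import java.sql.Connection;
import java.sql.DriverManager;

public class HiveJdbcCheck {
    public static void main(String[] args) throws Exception {
        // HiveServer2 driver, matching the drivername used in the Kibi datasource
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // jdbc:hive2:// is the scheme this driver parses; jdbc:hive:// fails in Utils.parseURL
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "user", "");
        System.out.println("Connected: " + !conn.isClosed());
        conn.close();
    }
}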

Pig: STORE with MongoInsertStorage doesn't work

I'm executing this simple code in a Pig script:
REGISTER /home/myuser/mongodb/mongo-2.10.1.jar
REGISTER /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/mongo-hadoop-cdh4-1.2.0/mongo-hadoop-core_cdh4.3.0-1.2.0.jar
REGISTER /opt/cloudera/parcels/CDH-4.5.0-1.cdh4.5.0.p0.30/lib/mongo-hadoop-cdh4-1.2.0/mongo-hadoop-pig_cdh4.3.0-1.2.0.jar
set mapred.map.tasks.speculative.execution false;
set mapred.reduce.tasks.speculative.execution false;
col = LOAD 'mongodb://localhost:27017/mydb.mycollection' using com.mongodb.hadoop.pig.MongoLoader ('id:chararray, companyId:chararray, ts:chararray', 'id');
STORE col INTO 'mongodb://localhost:27017/mydb.mycollection2' USING com.mongodb.hadoop.pig.MongoInsertStorage ('', '');
It returns the following error:
Location Config: Configuration: For URI: file:/tmp/temp449583595/tmp-109467318
2014-04-04 14:30:40,913 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2017: Internal error creating job configuration.
Details at logfile: /home/myuser/pig/pig_1396614639609.log
The end of the file pig_1396614639609.log:
... at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.IllegalArgumentException: Invalid URI Format. URIs must begin with a mongodb:// protocol string.
at com.mongodb.hadoop.pig.MongoInsertStorage.setStoreLocation(MongoInsertStorage.java:159)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:576)
... 17 more
I don't know where the error is, since the mongodb:// protocol string is written correctly.
I have a similar issue when running LOAD and STORE using mongo-hadoop on the same Pig script.
It throws
java.net.UnknownHostException: localhost:27017 is not a valid Inet address
at org.apache.hadoop.net.NetUtils.verifyHostnames(NetUtils.java:587)
at org.apache.hadoop.mapred.JobInProgress.initTasks(JobInProgress.java:734)
at org.apache.hadoop.mapred.JobTracker.initJob(JobTracker.java:3890)
at org.apache.hadoop.mapred.EagerTaskInitializationListener$InitJob.run(EagerTaskInitializationListener.java:79)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
I didn't investigate further, but it is either a bug or some parameter related to locking; I don't know.
If I run the same code but do the loading and storing in different scripts, it runs without a problem.

apache.commons.configuration Unable to load the configuration from the URL

For a project I'm working on, I need to change some property values at run time and save them. Looking for a solution, I found Apache Commons Configuration.
I have looked at some other topics solving my first problems, but now I get an error that says:
org.apache.commons.configuration.ConfigurationException: Unable to load the configuration from the URL file:/C:/Users/sensor/Documents/NetBeansProjects/HermanWijn/dist/run1776677076/HermanWijn.jar!/Database/DataProptester.properties
Looking at this topic: Property file not reflecting the modified changes using Apache Commons Configuration
my code should be working, but for some reason I get an error.
JDBCWijnDAO is a class in the same package as the properties file. Of course I want to do more than just load the properties file, but at the moment creating the new PropertiesConfiguration gives an error which I need to solve.
code:
URL resource = JDBCWijnDAO.class.getResource("DataProptester.properties");
PropertiesConfiguration config = new PropertiesConfiguration(resource.getPath());
Okay, I think I "know" what the problem is, but I don't really know how to fix it. In the trace there is a:
Caused by: java.io.FileNotFoundException: C:\Users\Sensor\Documents\NetBeansProjects\HermanWijn\dist\run1776677076\HermanWijn.jar!\Database\DataProptester.properties (The system cannot find the path specified)
I assume there is something wrong with the full path.
Errors:
org.apache.commons.configuration.ConfigurationException: Unable to load the configuration from the URL file:/C:/Users/Sander/Documents/NetBeansProjects/HermanWijn/dist/run1776677076/HermanWijn.jar!/Database/DataProptester.properties
at org.apache.commons.configuration.DefaultFileSystem.getInputStream(DefaultFileSystem.java:86)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:323)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:261)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:238)
at org.apache.commons.configuration.AbstractFileConfiguration.<init>(AbstractFileConfiguration.java:158)
at org.apache.commons.configuration.PropertiesConfiguration.<init>(PropertiesConfiguration.java:253)
at hermanwijn.knophandlers.DataBestandLocatieSelectieHandler.handle(DataBestandLocatieSelectieHandler.java:49)
at hermanwijn.knophandlers.DataBestandLocatieSelectieHandler.handle(DataBestandLocatieSelectieHandler.java:29)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:69)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:217)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:170)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:38)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:37)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:53)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:28)
at javafx.event.Event.fireEvent(Event.java:171)
at javafx.scene.Node.fireEvent(Node.java:6863)
at javafx.scene.control.Button.fire(Button.java:179)
at com.sun.javafx.scene.control.behavior.ButtonBehavior.mouseReleased(ButtonBehavior.java:193)
at com.sun.javafx.scene.control.skin.SkinBase$4.handle(SkinBase.java:336)
at com.sun.javafx.scene.control.skin.SkinBase$4.handle(SkinBase.java:329)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:64)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:217)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:170)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:38)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:37)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:53)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:33)
at javafx.event.Event.fireEvent(Event.java:171)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3324)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3164)
at javafx.scene.Scene$MouseHandler.access$1900(Scene.java:3119)
at javafx.scene.Scene.impl_processMouseEvent(Scene.java:1559)
at javafx.scene.Scene$ScenePeerListener.mouseEvent(Scene.java:2261)
at com.sun.javafx.tk.quantum.GlassViewEventHandler.handleMouseEvent(GlassViewEventHandler.java:228)
at com.sun.glass.ui.View.handleMouseEvent(View.java:528)
at com.sun.glass.ui.View.notifyMouse(View.java:922)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.access$100(WinApplication.java:29)
at com.sun.glass.ui.win.WinApplication$2$1.run(WinApplication.java:67)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.FileNotFoundException: C:\Users\Sander\Documents\NetBeansProjects\HermanWijn\dist\run1776677076\HermanWijn.jar!\Database\DataProptester.properties (The system cannot find the path specified)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:97)
at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
at java.net.URL.openStream(URL.java:1035)
at org.apache.commons.configuration.DefaultFileSystem.getInputStream(DefaultFileSystem.java:82)
Fixed: the problem was that the package path was not included. It must be:
URL resource = JDBCWijnDAO.class.getResource("/Database/DataProptester.properties");
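As a side note, when the properties file is resolved from inside the jar, resource.getPath() produces a jar-style path containing "!" that is not a real filesystem path, which is exactly what the FileNotFoundException complains about. Passing the URL itself to PropertiesConfiguration avoids that. A rough sketch using the class and resource names from the question:
import java.net.URL;
import org.apache.commons.configuration.PropertiesConfiguration;

// Resolve the resource from the classpath root, including the package path
URL resource = JDBCWijnDAO.class.getResource("/Database/DataProptester.properties");

// Hand the URL straight to Commons Configuration instead of resource.getPath(),
// so it can open the stream even when the file sits inside the jar
PropertiesConfiguration config = new PropertiesConfiguration(resource);
Keep in mind that a file packaged inside the jar can be read this way, but changes cannot be saved back to it; to save property values at run time, the file has to live outside the jar.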

Lucene Search Error Stack

I am seeing the following error when trying to search using Lucene (version 1.4.3). Any ideas as to why I could be seeing this and how to fix it?
Caused by: java.io.IOException: read past EOF
at org.apache.lucene.store.InputStream.refill(InputStream.java:154)
at org.apache.lucene.store.InputStream.readByte(InputStream.java:43)
at org.apache.lucene.store.InputStream.readVInt(InputStream.java:83)
at org.apache.lucene.index.FieldInfos.read(FieldInfos.java:195)
at org.apache.lucene.index.FieldInfos.<init>(FieldInfos.java:55)
at org.apache.lucene.index.SegmentReader.initialize(SegmentReader.java:109)
at org.apache.lucene.index.SegmentReader.<init>(SegmentReader.java:89)
at org.apache.lucene.index.IndexReader$1.doBody(IndexReader.java:118)
at org.apache.lucene.store.Lock$With.run(Lock.java:109)
at org.apache.lucene.index.IndexReader.open(IndexReader.java:111)
at org.apache.lucene.index.IndexReader.open(IndexReader.java:106)
at org.apache.lucene.search.IndexSearcher.<init>(IndexSearcher.java:43)
In this same environment I also see the following error:
Caused by: java.io.IOException: Lock obtain timed out:
Lock#/tmp/lucene-3ec31395c8e06a56e2939f1fdda16c67-write.lock
at org.apache.lucene.store.Lock.obtain(Lock.java:58)
at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:223)
at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:213)
The same code works in a test environment but not in production. I cannot identify any obvious differences between the two environments.
Either the file permissions are wrong (the process needs write permission), or you are not able to access a locked file that the current process needs.
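If the lock timeout comes from a stale write lock left behind in /tmp (for example after a crashed indexing process), it can be detected and cleared in Lucene 1.4.x with something roughly like the sketch below. The index path is a placeholder, and unlocking is only safe when you are certain no other process is still writing to the index:
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class ClearStaleLock {
    public static void main(String[] args) throws Exception {
        // Placeholder index path; the write lock file itself lives under java.io.tmpdir
        Directory dir = FSDirectory.getDirectory("/path/to/index", false);
        if (IndexReader.isLocked(dir)) {
            // Only clear the lock when no other writer is active on this index
            IndexReader.unlock(dir);
        }
        dir.close();
    }
}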