GlassFish startup failure

I have set up a GlassFish server to learn about it. After installing and configuring it according to the quickstart guide, I was able to run the server and domain1 without any problems. After some time, it started to log the lines below:
[#|2013-01-11T15:43:45.246+0800|WARNING|glassfish3.1.2|java.util.prefs|_ThreadID=105;_ThreadName=Thread-2;|Could not lock User prefs. Unix error code 5.|#]
[#|2013-01-11T15:43:45.246+0800|WARNING|glassfish3.1.2|java.util.prefs|_ThreadID=105;_ThreadName=Thread-2;|Couldn't flush user prefs: java.util.prefs.BackingStoreException: Couldn't get file lock.|#]
I did some googling about this, found this link, and applied the option recommended there. After restarting GlassFish, although the server log says it started, I am seeing this on the command line:
./asadmin start-domain domain1
Waiting for domain1 to start .............Error starting domain domain1.
The server exited prematurely with exit code 1.
Before it died, it produced the following output:
Launching GlassFish on Felix platform
ERROR: Error creating bundle cache. (java.lang.Exception: Unable to lock bundle cache: java.io.IOException: Input/output error)
java.lang.Exception: Unable to lock bundle cache: java.io.IOException: Input/output error
at org.apache.felix.framework.cache.BundleCache.<init>(BundleCache.java:176)
at org.apache.felix.framework.Felix.init(Felix.java:629)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiFrameworkLauncher$1.run(OSGiFrameworkLauncher.java:88)
Exception in thread "Thread-1" java.lang.RuntimeException: org.osgi.framework.BundleException: Error creating bundle cache.
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiFrameworkLauncher$1.run(OSGiFrameworkLauncher.java:90)
Caused by: org.osgi.framework.BundleException: Error creating bundle cache.
at org.apache.felix.framework.Felix.init(Felix.java:634)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiFrameworkLauncher$1.run(OSGiFrameworkLauncher.java:88)
Caused by: java.lang.Exception: Unable to lock bundle cache: java.io.IOException: Input/output error
at org.apache.felix.framework.cache.BundleCache.<init>(BundleCache.java:176)
at org.apache.felix.framework.Felix.init(Felix.java:629)
... 1 more
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain.main(GlassFishMain.java:97)
at com.sun.enterprise.glassfish.bootstrap.ASMain.main(ASMain.java:55)
Caused by: org.glassfish.embeddable.GlassFishException: java.lang.NullPointerException
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntimeBuilder.build(OSGiGlassFishRuntimeBuilder.java:164)
at org.glassfish.embeddable.GlassFishRuntime._bootstrap(GlassFishRuntime.java:157)
at org.glassfish.embeddable.GlassFishRuntime.bootstrap(GlassFishRuntime.java:110)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain$Launcher.launch(GlassFishMain.java:112)
... 6 more
Caused by: java.lang.NullPointerException
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntimeBuilder.newFramework(OSGiGlassFishRuntimeBuilder.java:230)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntimeBuilder.build(OSGiGlassFishRuntimeBuilder.java:133)
... 9 more
Error stopping framework: java.lang.NullPointerException
java.lang.NullPointerException
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain$Launcher$1.run(GlassFishMain.java:203)
Command start-domain failed.
I have tried to find a solution, removing the cache folder in the domain directory and changing access permissions, but the problem keeps occurring and I can't start my domain.
Any ideas how to fix this problem?

I had the same I/O error as in that stack trace after installing GlassFish and found out the following:
GlassFish 3.1.2 uses the Apache Felix library for its OSGi support, and Felix locks files using the core Java method java.nio.channels.FileChannel.tryLock(). This appears not to work when the file to be locked resides on certain kinds of NAS filesystems, and it leads to an I/O error after a long timeout.
Make sure to install the critical parts, or all of GlassFish, on local disks and this error will disappear.
The error can easily be reproduced by running the following Java class:
import java.io.File;
import java.io.FileOutputStream;
import java.nio.channels.FileChannel;

public class TryLock {

    /**
     * @param args the name of the file to lock is the only parameter
     */
    public static void main(String[] args) {
        File lockFile = new File(args[0]);
        FileOutputStream fos = null;
        try {
            fos = new FileOutputStream(lockFile);
            FileChannel fc = fos.getChannel();
            // This is the call that fails on some NAS filesystems (low-level operation?):
            fc.tryLock();
            System.out.println("Success");
        } catch (Throwable th) {
            th.printStackTrace();
        } finally {
            if (fos != null) {
                try {
                    fos.close();
                } catch (Exception ignored) {
                }
            }
        }
    }
}
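To check a suspect mount, compile this class and run it with the path of a file on that filesystem as its only argument. If the NAS is the culprit, you should see the IOException stack trace (typically after a long delay) instead of the "Success" output.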

Related

Is Kafka SSL connection supported in native mode?

I want to connect to a Kafka topic with SSL using smallrye-kafka in Quarkus.
My code works when executing mvn compile quarkus:dev.
mvn clean package -Pnative is successful, but when I run the native binary it fails because it can't find SaslClientCallbackHandler.
I followed the Quarkus guides for building a native image with SSL and for configuring Kafka with the SmallRye extension.
My application.properties file:
mp.messaging.outgoing.stock-topic.connector=smallrye-kafka
mp.messaging.outgoing.stock-topic.value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
mp.messaging.outgoing.stock-topic.key.serializer=org.apache.kafka.common.serialization.IntegerSerializer
mp.messaging.outgoing.stock-topic.topic=topic
mp.messaging.outgoing.stock-topic.bootstrap.servers=${KAFKA_HOST}:${KAFKA_PORT}
mp.messaging.outgoing.stock-topic.sasl.mechanism=PLAIN
mp.messaging.outgoing.stock-topic.security.protocol=SASL_SSL
mp.messaging.outgoing.stock-topic.ssl.protocol=TLSv1.2
mp.messaging.outgoing.stock-topic.ssl.enabled.protocols=TLSv1.2
mp.messaging.outgoing.stock-topic.ssl.endpoint.identification.algorithm=HTTPS
mp.messaging.outgoing.stock-topic.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="${KAFKA_USER}" password="${KAFKA_PASS}";
quarkus.ssl.native=true
quarkus.native.enable-all-security-services=true
I expect the native binary to work the same way. When the application.properties settings above are commented out, the native run doesn't throw exceptions.
Error log:
Exception in thread "main" java.lang.RuntimeException: Failed to start quarkus
at io.quarkus.runner.ApplicationImpl.doStart(ApplicationImpl.zig:185)
at io.quarkus.runtime.Application.start(Application.java:94)
at io.quarkus.runtime.Application.run(Application.java:218)
at io.quarkus.runner.GeneratedMain.main(GeneratedMain.zig:41)
Caused by: javax.enterprise.inject.spi.DeploymentException: org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at io.quarkus.smallrye.reactivemessaging.runtime.SmallRyeReactiveMessagingLifecycle.onApplicationStart(SmallRyeReactiveMessagingLifecycle.java:22)
at io.quarkus.smallrye.reactivemessaging.runtime.SmallRyeReactiveMessagingLifecycle_Observer_onApplicationStart_4e8937813d9e8faff65c3c07f88fa96615b70e70.notify(SmallRyeReactiveMessagingLifecycle_Observer_onApplicationStart_4e8937813d9e8faff65c3c07f88fa96615b70e70.zig:51)
at io.quarkus.arc.impl.EventImpl$Notifier.notify(EventImpl.java:224)
at io.quarkus.arc.impl.EventImpl.fire(EventImpl.java:65)
at io.quarkus.arc.runtime.LifecycleEventRunner.fireStartupEvent(LifecycleEventRunner.java:23)
at io.quarkus.arc.runtime.ArcRecorder.handleLifecycleEvents(ArcRecorder.java:108)
at io.quarkus.deployment.steps.LifecycleEventsBuildStep$startupEvent27.deploy_0(LifecycleEventsBuildStep$startupEvent27.zig:58)
at io.quarkus.deployment.steps.LifecycleEventsBuildStep$startupEvent27.deploy(LifecycleEventsBuildStep$startupEvent27.zig:77)
at io.quarkus.runner.ApplicationImpl.doStart(ApplicationImpl.zig:161)
... 3 more
Caused by: org.apache.kafka.common.KafkaException: Failed to construct kafka producer
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:431)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:271)
at io.vertx.kafka.client.producer.impl.KafkaWriteStreamImpl.create(KafkaWriteStreamImpl.java:52)
at io.vertx.kafka.client.producer.KafkaWriteStream.create(KafkaWriteStream.java:92)
at io.smallrye.reactive.messaging.kafka.KafkaSink.<init>(KafkaSink.java:50)
at io.smallrye.reactive.messaging.kafka.KafkaConnector.getSubscriberBuilder(KafkaConnector.java:73)
at io.smallrye.reactive.messaging.kafka.KafkaConnector_ClientProxy.getSubscriberBuilder(KafkaConnector_ClientProxy.zig:283)
at io.smallrye.reactive.messaging.impl.ConfiguredChannelFactory.createSubscriberBuilder(ConfiguredChannelFactory.java:156)
at io.smallrye.reactive.messaging.impl.ConfiguredChannelFactory.lambda$register$5(ConfiguredChannelFactory.java:124)
at java.util.HashMap.forEach(HashMap.java:1289)
at io.smallrye.reactive.messaging.impl.ConfiguredChannelFactory.register(ConfiguredChannelFactory.java:124)
at io.smallrye.reactive.messaging.impl.ConfiguredChannelFactory.initialize(ConfiguredChannelFactory.java:118)
at io.smallrye.reactive.messaging.impl.ConfiguredChannelFactory_ClientProxy.initialize(ConfiguredChannelFactory_ClientProxy.zig:195)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:647)
at io.smallrye.reactive.messaging.extension.MediatorManager.initializeAndRun(MediatorManager.java:132)
at io.smallrye.reactive.messaging.extension.MediatorManager_ClientProxy.initializeAndRun(MediatorManager_ClientProxy.zig:100)
at io.quarkus.smallrye.reactivemessaging.runtime.SmallRyeReactiveMessagingLifecycle.onApplicationStart(SmallRyeReactiveMessagingLifecycle.java:20)
... 11 more
Caused by: org.apache.kafka.common.KafkaException: org.apache.kafka.common.KafkaException: Could not find a public no-argument constructor for org.apache.kafka.common.security.authenticator.SaslClientCallbackHandler
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:160)
at org.apache.kafka.common.network.ChannelBuilders.create(ChannelBuilders.java:146)
at org.apache.kafka.common.network.ChannelBuilders.clientChannelBuilder(ChannelBuilders.java:67)
at org.apache.kafka.clients.ClientUtils.createChannelBuilder(ClientUtils.java:99)
at org.apache.kafka.clients.producer.KafkaProducer.newSender(KafkaProducer.java:439)
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:420)
... 29 more
Caused by: org.apache.kafka.common.KafkaException: Could not find a public no-argument constructor for org.apache.kafka.common.security.authenticator.SaslClientCallbackHandler
at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:314)
at org.apache.kafka.common.network.SaslChannelBuilder.createClientCallbackHandler(SaslChannelBuilder.java:289)
at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:127)
... 34 more
Caused by: java.lang.NoSuchMethodException: org.apache.kafka.common.security.authenticator.SaslClientCallbackHandler.<init>
at java.lang.Class.getConstructor0(DynamicHub.java:3082)
at java.lang.Class.getDeclaredConstructor(DynamicHub.java:2178)
at org.apache.kafka.common.utils.Utils.newInstance(Utils.java:312)
... 36 more
To reproduce:
Make a new Quarkus project with the quarkus-smallrye-reactive-messaging-kafka extension.
Configure application.properties as above.
Build the native binary and run it.
Note: I don't have any classes in src/main/java; the native binary fails as soon as the SSL configuration is present in application.properties.
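A note on the stack trace: the NoSuchMethodException comes from Kafka's Utils.newInstance(), which instantiates SaslClientCallbackHandler reflectively, so in a native image that class (and its no-argument constructor) needs to be registered for reflection at build time. Below is a minimal sketch of such a registration using the Quarkus @RegisterForReflection annotation; the holder class name is made up, the Kafka class names are taken from the trace, and I have not verified that this alone fixes the build:
import io.quarkus.runtime.annotations.RegisterForReflection;

// Empty marker class: the annotation asks the native-image build to retain these
// classes and their public no-argument constructors, which Kafka creates reflectively.
@RegisterForReflection(targets = {
        org.apache.kafka.common.security.authenticator.SaslClientCallbackHandler.class,
        org.apache.kafka.common.security.plain.PlainLoginModule.class
})
public class KafkaSaslReflection {
}
Newer versions of the Quarkus Kafka extension register these SASL classes themselves, so upgrading Quarkus may also make the error go away.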

Not able to start Datastax Cassandra due to missing SystemInfo class

I have installed DataStax Cassandra 5.0.4 on my Mac, but when I try to start it, it gives me an error for the SystemInfo class.
Here is the related log:
Exception (com.google.inject.ProvisionException) encountered during startup: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NoClassDefFoundError: com/datastax/hadoop/hive/util/SystemInfo
at com.datastax.bdp.plugin.HiveMetaStorePlugin.<init>(HiveMetaStorePlugin.java:39)
at com.datastax.bdp.DseHiveModule.configure(Unknown Source) (via modules: com.datastax.bdp.DseModule -> com.datastax.bdp.DseAnalyticsModule -> com.datastax.bdp.DseHiveModule)
while locating com.datastax.bdp.plugin.HiveMetaStorePlugin
1 error
com.google.inject.ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NoClassDefFoundError: com/datastax/hadoop/hive/util/SystemInfo
at com.datastax.bdp.plugin.HiveMetaStorePlugin.<init>(HiveMetaStorePlugin.java:39)
at com.datastax.bdp.DseHiveModule.configure(Unknown Source) (via modules: com.datastax.bdp.DseModule -> com.datastax.bdp.DseAnalyticsModule -> com.datastax.bdp.DseHiveModule)
while locating com.datastax.bdp.plugin.HiveMetaStorePlugin
1 error
at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1025)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1051)
at com.datastax.bdp.plugin.PluginManager.register(PluginManager.java:139)
at com.datastax.bdp.plugin.PluginManager.postSetup(PluginManager.java:68)
at com.datastax.bdp.server.DseDaemon.postSetup(DseDaemon.java:824)
at com.datastax.bdp.server.DseDaemon.setup(DseDaemon.java:447)
at org.apache.cassandra.service.CassandraDaemon.activate(CassandraDaemon.java:568)
at com.datastax.bdp.DseModule.main(DseModule.java:91)
Caused by: java.lang.NoClassDefFoundError: com/datastax/hadoop/hive/util/SystemInfo
at com.datastax.bdp.plugin.HiveMetaStorePlugin.<init>(HiveMetaStorePlugin.java:40)
at com.datastax.bdp.plugin.HiveMetaStorePlugin$$FastClassByGuice$$7e2893c3.newInstance()
at com.google.inject.internal.cglib.reflect.$FastConstructor.newInstance(FastConstructor.java:40)

Spark application development on a local cluster with IntelliJ

I tried many things to execute the application on a local cluster, but it did not work.
I am using CDH 5.7 and Spark version 1.6.
I am trying to create a DataFrame from Hive on CDH 5.7.
If I use spark-shell, all the code works really well. However, I have no idea how to set up my IntelliJ configuration for an efficient development environment.
Here is my code:
import org.apache.spark.{SparkConf, SparkContext}

object DataFrame {
  def main(args: Array[String]): Unit = {
    println("Hello DataFrame")
    val conf = new SparkConf() // skip loading external settings
      .setMaster("local") // could be "local[4]" for 4 threads
      .setAppName("DataFrame-Example")
      .set("spark.logConf", "true")
    val sc = new SparkContext(conf)
    sc.setLogLevel("WARN")
    println(s"Running Spark Version ${sc.version}")
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    sqlContext.sql("From src select key, value").collect().foreach(println)
  }
}
When I run this program from IntelliJ, the error messages are as follows:
Hello DataFrame
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/05/29 11:30:57 INFO Slf4jLogger: Slf4jLogger started
Running Spark Version 1.6.0
16/05/29 11:31:02 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.1.0
16/05/29 11:31:02 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
Exception in thread "main" java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:249)
at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:329)
at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:239)
at org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:459)
at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:459)
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:458)
at org.apache.spark.sql.hive.HiveContext$$anon$3.<init>(HiveContext.scala:475)
at org.apache.spark.sql.hive.HiveContext.analyzer$lzycompute(HiveContext.scala:475)
at org.apache.spark.sql.hive.HiveContext.analyzer(HiveContext.scala:474)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:34)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:133)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:52)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:817)
at org.corus.spark.example.DataFrame$.main(DataFrame.scala:25)
at org.corus.spark.example.DataFrame.main(DataFrame.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:539)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
... 24 more
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:624)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:573)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:517)
... 25 more
Process finished with exit code 1
Does anyone know a solution?
Thanks.
I found several resources about this problem, but none of them worked.
https://www.linkedin.com/pulse/develop-apache-spark-apps-intellij-idea-windows-os-samuel-yee
https://blog.cloudera.com/blog/2014/06/how-to-create-an-intellij-idea-project-for-apache-hadoop/
Thanks all. I solved the problem by myself.
The problem is that the local Spark (the Maven version) did not know about the Hive metastore on our cluster.
The solution is very simple.
Just add the following lines to your source code:
conf.set("spark.sql.hive.thriftServer.singleSession", "true")
System.setProperty("hive.metastore.uris","thrift://hostname:serviceport")
It works!
Let's play with Spark.
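A possible alternative, which I have not verified against CDH 5.7: instead of hard-coding the metastore URI, put the cluster's hive-site.xml on the application classpath (for example under src/main/resources) so the locally launched HiveContext picks up the Hive configuration itself.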

"mfp push" throws NullPointerException while deploying adapter (MobileFirst Platform 7.1)

Using MobileFirst Platform CLI version 7.1.0.00.20151227-1730, I suddenly get the following error when trying to push an update I made to an adapter:
Preparing for push...
Verifying Server Configuration...
Runtime 'localMFP' will be used to push the project into.
[Error:
BUILD FAILED
/Applications/IBM/MobileFirst-CLI/mobilefirst-cli/node_modules/generator-worklight-server/lib/build.xml:497: com.worklight.upgrader.UpgradeEngineException: java.lang.NullPointerException
at com.worklight.upgrader.WLUpgradeEngine.<init>(WLUpgradeEngine.java:142)
at com.worklight.upgrader.WLUpgradeEngine.<init>(WLUpgradeEngine.java:147)
at com.worklight.upgrader.ant.UpgraderTask.execute(UpgraderTask.java:100)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:435)
at org.apache.tools.ant.Target.performTasks(Target.java:456)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1393)
at org.apache.tools.ant.Project.executeTarget(Project.java:1364)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1248)
at org.apache.tools.ant.Main.runBuild(Main.java:851)
at org.apache.tools.ant.Main.startAnt(Main.java:235)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:280)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:109)
Caused by: java.lang.NullPointerException
at java.text.MessageFormat.applyPattern(MessageFormat.java:436)
at java.text.MessageFormat.<init>(MessageFormat.java:362)
at java.text.MessageFormat.format(MessageFormat.java:840)
at com.worklight.upgrader.WLUpgradeEngine.findProjectVersion(WLUpgradeEngine.java:602)
at com.worklight.upgrader.WLUpgradeEngine.<init>(WLUpgradeEngine.java:133)
... 18 more
Total time: 3 seconds
]
Error: Sorry an error has occurred. Please check the stack above for details.
I have tried cleaning up the project, removing what was already deployed, reverting my changes to what I had when the deployment last succeeded, and re-installing the mfp CLI, but I still have the issue.
Any hint on what I could do to get rid of the exception?
Thanks!
The failure is coming from the upgrader code path, as if something is missing in the adapter files.
My suggestion is to create a new adapter and verify that it deploys. Then start adding your code back; maybe you will find the failing part.

Unable to start GlassFish 4 server in Eclipse using Java 8

I am configuring my computer to work with PrimeFaces and PrimeFaces Mobile. I installed Apache Tomcat, GlassFish 4, and Java 8 (JDK 1.8). I configured GlassFish in Eclipse, but when I try to start it, I receive this message:
Starting GlassFish 4 at localhost [domains1] has encountered a problem :
Unable to start server due following issues:
Launch process failed with exit code 1
and the log file gives me this:
Launching GlassFish on Felix platform
ERROR: Unable to create cache directory: C:\Program Files\glassfish-4.1\glassfish\domains\domain1\osgi-cache\felix
ERROR: Error creating bundle cache. (java.lang.RuntimeException: Unable to create cache directory.)
java.lang.RuntimeException: Unable to create cache directory.
at org.apache.felix.framework.cache.BundleCache.<init>(BundleCache.java:131)
at org.apache.felix.framework.Felix.init(Felix.java:640)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiFrameworkLauncher$1.run(OSGiFrameworkLauncher.java:88)
Exception in thread "Thread-1" java.lang.RuntimeException: org.osgi.framework.BundleException: Error creating bundle cache.
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiFrameworkLauncher$1.run(OSGiFrameworkLauncher.java:90)
Caused by: org.osgi.framework.BundleException: Error creating bundle cache.
at org.apache.felix.framework.Felix.init(Felix.java:645)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiFrameworkLauncher$1.run(OSGiFrameworkLauncher.java:88)
Caused by: java.lang.RuntimeException: Unable to create cache directory.
at org.apache.felix.framework.cache.BundleCache.<init>(BundleCache.java:131)
at org.apache.felix.framework.Felix.init(Felix.java:640)
... 1 more
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain.main(GlassFishMain.java:97)
at com.sun.enterprise.glassfish.bootstrap.ASMain.main(ASMain.java:54)
Caused by: org.glassfish.embeddable.GlassFishException: java.lang.NullPointerException
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntimeBuilder.build(OSGiGlassFishRuntimeBuilder.java:170)
at org.glassfish.embeddable.GlassFishRuntime._bootstrap(GlassFishRuntime.java:157)
at org.glassfish.embeddable.GlassFishRuntime.bootstrap(GlassFishRuntime.java:110)
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain$Launcher.launch(GlassFishMain.java:112)
... 6 more
Caused by: java.lang.NullPointerException
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntimeBuilder.newFramework(OSGiGlassFishRuntimeBuilder.java:241)
at com.sun.enterprise.glassfish.bootstrap.osgi.OSGiGlassFishRuntimeBuilder.build(OSGiGlassFishRuntimeBuilder.java:135)
... 9 more
Error stopping framework: java.lang.NullPointerException
java.lang.NullPointerException
at com.sun.enterprise.glassfish.bootstrap.GlassFishMain$Launcher$1.run(GlassFishMain.java:203)
Java HotSpot(TM) Client VM warning: ignoring option MaxPermSize=192m; support was removed in 8.0
Can anyone suggest a solution, please?