I am trying to upload a file to the database as a blob, and during the upload the file upload code writes a temporary file to disk.
I'm using the WSO2 Stratos application server (based on Tomcat), which prevents such temp files from being written to disk for security reasons. I have attached the stack trace of the error.
I'm using the Apache Commons FileUpload library. Here is my upload class http://paste.org/47685 and the error is thrown from line 57. I need to avoid writing temp files. How can I solve this problem?
This is my error log:
java.security.AccessControlException: access denied (java.io.FilePermission F:\WSO2ST~1.2\bin\..\tmp\upload_4e2fd9dc_1368bb5a330__7ffa_00000002.tmp write)
at java.security.AccessControlContext.checkPermission(AccessControlContext.java:323)
at java.security.AccessController.checkPermission(AccessController.java:546)
at java.lang.SecurityManager.checkPermission(SecurityManager.java:532)
at java.lang.SecurityManager.checkWrite(SecurityManager.java:962)
at java.io.FileOutputStream.<init>(FileOutputStream.java:169)
at java.io.FileOutputStream.<init>(FileOutputStream.java:131)
at org.apache.commons.io.output.DeferredFileOutputStream.thresholdReached(DeferredFileOutputStream.java:178)
at org.apache.commons.io.output.ThresholdingOutputStream.checkThreshold(ThresholdingOutputStream.java:224)
at org.apache.commons.io.output.ThresholdingOutputStream.write(ThresholdingOutputStream.java:128)
at org.apache.commons.fileupload.util.Streams.copy(Streams.java:103)
at org.apache.commons.fileupload.util.Streams.copy(Streams.java:66)
at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:366)
at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:126)
at controler.UploadDocumentServlet.doPost(UploadDocumentServlet.java:62)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:641)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
at sun.reflect.GeneratedMethodAccessor31.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.security.SecurityUtil$1.run(SecurityUtil.java:273)
at org.apache.catalina.security.SecurityUtil$1.run(SecurityUtil.java:270)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAsPrivileged(Subject.java:517)
at org.apache.catalina.security.SecurityUtil.execute(SecurityUtil.java:305)
at org.apache.catalina.security.SecurityUtil.doAsPrivilege(SecurityUtil.java:165)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:298)
at org.apache.catalina.core.ApplicationFilterChain.access$000(ApplicationFilterChain.java:57)
at org.apache.catalina.core.ApplicationFilterChain$1.run(ApplicationFilterChain.java:193)
at org.apache.catalina.core.ApplicationFilterChain$1.run(ApplicationFilterChain.java:189)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:240)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:164)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:462)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:164)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:100)
at org.wso2.carbon.server.CarbonStuckThreadDetectionValve.invoke(CarbonStuckThreadDetectionValve.java:154)
at org.wso2.carbon.server.TomcatServer$1.invoke(TomcatServer.java:254)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:563)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:399)
at org.apache.coyote.http11.Http11NioProcessor.process(Http11NioProcessor.java:396)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:356)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1534)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:705)
So you don't have permission to write to F:\WSO2ST~1.2\tmp\upload_4e2fd9dc_1368bb5a330__7ffa_00000002.tmp. Do you have permission to write to any directory on the file system? (If that tmp folder doesn't exist, that might be your problem.)
If so, you just need to configure the temp directory of your factory to be a directory you can write to (there should be a temp folder for the active user where you can store files, like C:\Documents and Settings\MyUser\Temp, or something like that):
DiskFileItemFactory.setRepository(File)
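For example, a minimal sketch of that approach (the attribute-based temp directory below is an assumption; any directory the server process is allowed to write to will do):

import java.io.File;
import java.util.List;
import javax.servlet.ServletException;
import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.FileUploadException;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

// Inside doPost(HttpServletRequest request, HttpServletResponse response):
// the servlet container's work directory is normally writable.
File repository = (File) getServletContext().getAttribute("javax.servlet.context.tempdir");

DiskFileItemFactory factory = new DiskFileItemFactory();
factory.setRepository(repository);            // temp files now land in a writable directory

ServletFileUpload upload = new ServletFileUpload(factory);
try {
    List<FileItem> items = upload.parseRequest(request);
    // ... read each item's stream and store it as a blob ...
} catch (FileUploadException e) {
    throw new ServletException(e);
}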
I figured out how to solve this temp file issue.
By default the DiskFileItemFactory() size threshold is 10,240 bytes; if a file exceeds that amount, a temp file is created to store it. That was the bug: my files are larger than 10,240 bytes. Increasing the threshold of the file item factory solves the problem. Refer to this link:
http://www.techiepark.com/tutorials/file-upload-using-java
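A minimal sketch of that fix (the 20 MB threshold below is an assumption; pick a value above your largest expected upload):

import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

DiskFileItemFactory factory = new DiskFileItemFactory();
// The default threshold is 10,240 bytes; anything larger is spilled to a temp file.
// Raising the threshold keeps whole uploads in memory, so no temp file is written.
factory.setSizeThreshold(20 * 1024 * 1024);   // e.g. 20 MB (assumption)

ServletFileUpload upload = new ServletFileUpload(factory);
// upload.parseRequest(request) now buffers files below the threshold in memory.

Keep in mind that files above the threshold would still spill to disk, and large in-memory uploads cost heap, so combining this with setRepository(...) pointed at a writable directory is the safer general fix.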
We have a DataFrame which we want to write to S3 in Parquet format and in overwrite mode.
Every time we write the DataFrame, it is always to a new folder. The code that writes to the S3 location is as follows:
df.write
.option("dateFormat", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")
.option("timestampFormat", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")
.option("maxRecordsPerFile", maxRecordsPerFile)
.mode("overwrite")
.format(format)
.save(output)
What we observe is that at times we get a FileNotFoundException (full trace below). Can somebody help me understand:
When I am writing to a new S3 location (meaning nobody is reading from that location), why does the writing program throw the exception below?
How do I fix it? I see a couple of Stack Overflow posts pointing to this exception, but they say it happens when you try to read while a write is happening. My case is not like that; I don't read while the write happens.
My Spark is 2.3.2; EMR 5.18.1; the code is written in Scala.
I am using s3:// as the output folder path. Should I change it to s3n or s3a? Will that help?
Caused by: java.io.FileNotFoundException: No such file or directory 's3://BUCKET/snapshots/FOLDER/_bid_9223370368440344985/part-00020-693dfbcb-74e9-45b0-b892-0b19fa92365c-c000.snappy.parquet'
It is possible the underlying files have been updated. You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved.
at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.org$apache$spark$sql$execution$datasources$FileScanRDD$$anon$$readCurrentFile(FileScanRDD.scala:131)
at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.nextIterator(FileScanRDD.scala:182)
at org.apache.spark.sql.execution.datasources.FileScanRDD$$anon$1.hasNext(FileScanRDD.scala:109)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:461)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec$$anonfun$doExecute$1$$anonfun$4.apply(HashAggregateExec.scala:104)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec$$anonfun$doExecute$1$$anonfun$4.apply(HashAggregateExec.scala:101)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:853)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:853)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:49)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:49)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
I finally was able to solve the problem.
The df: DataFrame was built from the same S3 folder to which it was being written in overwrite mode.
So during the overwrite the source folder was getting cleared, which resulted in the error.
Hope this helps somebody.
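For anyone hitting the same thing, a hedged sketch of one workaround (Java API; the bucket, paths, and staging prefix are assumptions): materialize the result somewhere else first, then re-read it before overwriting the original folder, so the write no longer depends on the files it is deleting.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SafeOverwrite {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder().appName("safe-overwrite").getOrCreate();

        String sourcePath  = "s3://BUCKET/snapshots/FOLDER/";          // also the output path
        String stagingPath = "s3://BUCKET/snapshots/FOLDER_staging/";  // hypothetical staging prefix

        // The DataFrame is built from the same folder it is about to overwrite.
        Dataset<Row> df = spark.read().parquet(sourcePath);

        // 1) Materialize the result under a staging prefix first.
        df.write().mode("overwrite").parquet(stagingPath);

        // 2) Re-read from staging; this plan no longer references sourcePath,
        //    so overwriting sourcePath cannot delete files the job still needs.
        spark.read().parquet(stagingPath)
             .write().mode("overwrite").parquet(sourcePath);

        spark.stop();
    }
}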
I added a load balancer to proxy HTTPS requests to EMR (6.2.0) running PrestoSQL 343. I added the following in config.properties:
http-server.process-forwarded=true
http-server.authentication.type=password
I created a password-authenticator.properties file with the following contents.
password-authenticator.name=file
file.password-file=/home/hadoop/password.db
file.refresh-period=1m
file.auth-token-cache.max-size=1000
When I view the web UI, I can see the login page, but when I log in, I get either the following error:
java.lang.IllegalStateException: authenticator was not loaded
at com.google.common.base.Preconditions.checkState(Preconditions.java:508)
at io.prestosql.server.security.PasswordAuthenticatorManager.getAuthenticator(PasswordAuthenticatorManager.java:88)
at io.prestosql.server.ui.PasswordManagerFormAuthenticator.isValidCredential(PasswordManagerFormAuthenticator.java:67)
at io.prestosql.server.ui.FormWebUiAuthenticationFilter.checkLoginCredentials(FormWebUiAuthenticationFilter.java:221)
at io.prestosql.server.ui.LoginResource.login(LoginResource.java:93)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:200)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:272)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:268)
at org.glassfish.jersey.internal.Errors.process(Errors.java:316)
at org.glassfish.jersey.internal.Errors.process(Errors.java:298)
at org.glassfish.jersey.internal.Errors.process(Errors.java:268)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:416)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:370)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:389)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:342)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:229)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:755)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1617)
at io.airlift.http.server.TraceTokenFilter.doFilter(TraceTokenFilter.java:63)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
at io.airlift.http.server.TimingFilter.doFilter(TimingFilter.java:51)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:717)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:173)
at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:59)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:500)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:335)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:135)
at org.eclipse.jetty.http2.HTTP2Connection.produce(HTTP2Connection.java:170)
at org.eclipse.jetty.http2.HTTP2Connection.onFillable(HTTP2Connection.java:125)
at org.eclipse.jetty.http2.HTTP2Connection$FillableCallback.succeeded(HTTP2Connection.java:348)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
at java.base/java.lang.Thread.run(Thread.java:829)
Or a 502 Bad Gateway page. I followed the methods specified here.
This part of the source code https://github.com/trinodb/trino/blob/343/presto-main/src/main/java/io/prestosql/server/security/PasswordAuthenticatorManager.java#L62-L84 loads the config.
So search for "-- Loading password authenticator --" in the log; that should give you more information about what could have gone wrong.
Also check whether the file contents are being loaded properly; sometimes that is the cause of the error.
I got the following exception while creating a new service provider with permissions. Following is some portion of the code:
iManagementServiceStub = new IdentityApplicationManagementServiceStub();
iManagementServiceStub.createApplication(createApplication);
Following is the exception I am getting on the client side:
identity.IdentityApplicationManagementServiceIdentityApplicationManagementException: Error while storing permissions for application sp3
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at identity.IdentityApplicationManagementServiceStub.createApplication(IdentityApplicationManagementServiceStub.java:1007)
at identity.AddServiceProvider.main(AddServiceProvider.java:92)
Following is the exception on the server side:
Caused by: org.wso2.carbon.registry.core.exceptions.RegistryException: The path '/_system/governance/permission/applications/sp3/org.wso2.carbon.identity.application.common.model.ApplicationPermission#12809798' contains one or more illegal characters (~!##;%^*()+={}|\<>"',)
at org.wso2.carbon.registry.core.jdbc.Repository.put(Repository.java:262)
at org.wso2.carbon.registry.core.jdbc.EmbeddedRegistry.put(EmbeddedRegistry.java:717)
at org.wso2.carbon.registry.core.caching.CacheBackedRegistry.put(CacheBackedRegistry.java:591)
at org.wso2.carbon.registry.core.session.UserRegistry.putInternal(UserRegistry.java:828)
at org.wso2.carbon.registry.core.session.UserRegistry.putInternal(UserRegistry.java:796)
at org.wso2.carbon.registry.core.session.UserRegistry.access$900(UserRegistry.java:61)
at org.wso2.carbon.registry.core.session.UserRegistry$10.run(UserRegistry.java:786)
at org.wso2.carbon.registry.core.session.UserRegistry$10.run(UserRegistry.java:783)
at java.security.AccessController.doPrivileged(Native Method)
at org.wso2.carbon.registry.core.session.UserRegistry.put(UserRegistry.java:783)
at org.wso2.carbon.identity.application.mgt.ApplicationMgtUtil.storePermissions(ApplicationMgtUtil.java:299)
... 64 more
Please suggest.
When analyzing the error log, you can see that there are illegal characters in your permission:
Caused by: org.wso2.carbon.registry.core.exceptions.RegistryException: The path '/_system/governance/permission/applications/sp3/org.wso2.carbon.identity.application.common.model.ApplicationPermission#12809798' contains one or more illegal characters (~!##;%^*()+={}|\<>"',)
Please check the permission name. Basically, those characters are reserved and have a specific use, so you cannot use them elsewhere without encoding them [1].
[1] https://en.wikipedia.org/wiki/Percent-encoding
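For illustration only, a hedged sketch of checking a permission name on the client side before calling createApplication (the sample value and the replacement rule are assumptions; the simplest fix is to choose a name that avoids the reserved characters altogether):

import java.util.regex.Pattern;

public class PermissionNameCheck {
    // Characters the registry rejects, taken from the error message above.
    private static final Pattern RESERVED = Pattern.compile("[~!#;%^*()+={}|\\\\<>\"',]");

    public static void main(String[] args) {
        String rawPermission = "manage#users";   // hypothetical value with a reserved character

        if (RESERVED.matcher(rawPermission).find()) {
            // Either reject the name up front...
            System.out.println("Permission contains reserved characters: " + rawPermission);
            // ...or replace them with something harmless before building the service provider.
            String sanitized = RESERVED.matcher(rawPermission).replaceAll("_");
            System.out.println("Sanitized: " + sanitized);   // prints "Sanitized: manage_users"
        }
    }
}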
I am trying to import a JSON file, which has been uploaded into S3, into DynamoDB.
I followed the tutorial Amazon has given:
http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-importexport-ddb-console-start.html
But when I try to activate the pipeline, the component TableLoadActivity fails and DDBDestinationTable says CASCADE_FAILED.
Both give the error:
at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:520)
at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:512)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(
Any help would be appreciated.
I discovered that you have to select the inner folder, where 'manifest' is present.
This is usually the folder named with the backup's date and time.
I received this error when I had selected the parent folder, the one I had given my table name.
For a project I'm working on I need to change some property values at run time and save them. Looking for a solution, I found Apache Commons Configuration.
I have looked at some other topics solving my first problems, but now I get an error that says:
org.apache.commons.configuration.ConfigurationException: Unable to load the configuration from the URL file:/C:/Users/sensor/Documents/NetBeansProjects/HermanWijn/dist/run1776677076/HermanWijn.jar!/Database/DataProptester.properties
Looking at this topic: Property file not reflecting the modified changes using Apache Commons Configuration
My code should be working, but for some reason I get an error.
JDBCWijnDAO is a class in the same package as the properties file. Of course I want to do more than just load the properties file, but at the moment creating the new PropertiesConfiguration gives an error which I need to solve.
code:
URL resource = JDBCWijnDAO.class.getResource("DataProptester.properties");
PropertiesConfiguration config = new PropertiesConfiguration(resource.getPath());
Okay, I think I "know" what the problem is, but I don't really know how to fix it. In the trace there is a:
Caused by: java.io.FileNotFoundException: C:\Users\Sensor\Documents\NetBeansProjects\HermanWijn\dist\run1776677076\HermanWijn.jar!\Database\DataProptester.properties (The system cannot find the path specified)
I assume there is something wrong with the full path.
errors:
org.apache.commons.configuration.ConfigurationException: Unable to load the configuration from the URL file:/C:/Users/Sander/Documents/NetBeansProjects/HermanWijn/dist/run1776677076/HermanWijn.jar!/Database/DataProptester.properties
at org.apache.commons.configuration.DefaultFileSystem.getInputStream(DefaultFileSystem.java:86)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:323)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:261)
at org.apache.commons.configuration.AbstractFileConfiguration.load(AbstractFileConfiguration.java:238)
at org.apache.commons.configuration.AbstractFileConfiguration.<init>(AbstractFileConfiguration.java:158)
at org.apache.commons.configuration.PropertiesConfiguration.<init>(PropertiesConfiguration.java:253)
at hermanwijn.knophandlers.DataBestandLocatieSelectieHandler.handle(DataBestandLocatieSelectieHandler.java:49)
at hermanwijn.knophandlers.DataBestandLocatieSelectieHandler.handle(DataBestandLocatieSelectieHandler.java:29)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:69)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:217)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:170)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:38)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:37)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:53)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:28)
at javafx.event.Event.fireEvent(Event.java:171)
at javafx.scene.Node.fireEvent(Node.java:6863)
at javafx.scene.control.Button.fire(Button.java:179)
at com.sun.javafx.scene.control.behavior.ButtonBehavior.mouseReleased(ButtonBehavior.java:193)
at com.sun.javafx.scene.control.skin.SkinBase$4.handle(SkinBase.java:336)
at com.sun.javafx.scene.control.skin.SkinBase$4.handle(SkinBase.java:329)
at com.sun.javafx.event.CompositeEventHandler.dispatchBubblingEvent(CompositeEventHandler.java:64)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:217)
at com.sun.javafx.event.EventHandlerManager.dispatchBubblingEvent(EventHandlerManager.java:170)
at com.sun.javafx.event.CompositeEventDispatcher.dispatchBubblingEvent(CompositeEventDispatcher.java:38)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:37)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.BasicEventDispatcher.dispatchEvent(BasicEventDispatcher.java:35)
at com.sun.javafx.event.EventDispatchChainImpl.dispatchEvent(EventDispatchChainImpl.java:92)
at com.sun.javafx.event.EventUtil.fireEventImpl(EventUtil.java:53)
at com.sun.javafx.event.EventUtil.fireEvent(EventUtil.java:33)
at javafx.event.Event.fireEvent(Event.java:171)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3324)
at javafx.scene.Scene$MouseHandler.process(Scene.java:3164)
at javafx.scene.Scene$MouseHandler.access$1900(Scene.java:3119)
at javafx.scene.Scene.impl_processMouseEvent(Scene.java:1559)
at javafx.scene.Scene$ScenePeerListener.mouseEvent(Scene.java:2261)
at com.sun.javafx.tk.quantum.GlassViewEventHandler.handleMouseEvent(GlassViewEventHandler.java:228)
at com.sun.glass.ui.View.handleMouseEvent(View.java:528)
at com.sun.glass.ui.View.notifyMouse(View.java:922)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.access$100(WinApplication.java:29)
at com.sun.glass.ui.win.WinApplication$2$1.run(WinApplication.java:67)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.FileNotFoundException: C:\Users\Sander\Documents\NetBeansProjects\HermanWijn\dist\run1776677076\HermanWijn.jar!\Database\DataProptester.properties (The system cannot find the path specified)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:97)
at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
at java.net.URL.openStream(URL.java:1035)
at org.apache.commons.configuration.DefaultFileSystem.getInputStream(DefaultFileSystem.java:82)
Fixed: the problem was that the package name was not included. It must be:
URL resource = JDBCWijnDAO.class.getResource("/Database/DataProptester.properties");
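One extra note, in case the app is later run from the packaged jar: resource.getPath() then yields a file:...jar!/... path that FileInputStream cannot open. A small sketch (assuming Commons Configuration 1.x) that passes the URL itself instead of the path:

import java.net.URL;
import org.apache.commons.configuration.PropertiesConfiguration;

URL resource = JDBCWijnDAO.class.getResource("/Database/DataProptester.properties");
// Letting Commons Configuration open the stream from the URL also works when the
// properties file sits inside the jar (the constructor throws ConfigurationException).
PropertiesConfiguration config = new PropertiesConfiguration(resource);

Saving modified values back will still only work when the file lives on the file system, not inside a jar.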