I have a Maven web application that has the database EJB jar as a dependency.
The database EJB module is the one that holds all the JPA entities and the persistence.xml file, so it is responsible for all database operations.
I just read http://arquillian.org/guides/testing_java_persistence/ and it explains how to test persistence using Arquillian.
The tutorial assumes that the persistence.xml file lives in the webapp itself, so it simply adds META-INF/persistence.xml as a resource.
So I'd like to know: how do I add the database module's persistence.xml when running the Arquillian tests from my webapp? Is that possible?
Maybe the answer comes a little late, but anyhow I hope it's still of value for you:
You have two options: either read the archive from a file (e.g. the one generated by mvn package) or create the archive yourself using ShrinkWrap:
Option (1), called from somewhere annotated with @Deployment:
/** maven did it for us .. we just have to read the file */
private static Archive<?> getArchiveFromFile() {
    JavaArchive artifact = ShrinkWrap.create(ZipImporter.class, ARCHIVE_NAME)
            .importFrom(ARCHIVE_FILE)
            .as(JavaArchive.class);
    return artifact;
}
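For reference, the two constants used above are not shown in the snippet; they might look like this (the jar name and target path are only assumptions about what mvn package produces for your EJB module):

private static final String ARCHIVE_NAME = "database-ejb.jar";                 // assumed jar name
private static final File ARCHIVE_FILE = new File("target/" + ARCHIVE_NAME);   // assumed build output path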
Option (2); I found it useful to inspect the archive from time to time, so there's also an option to write it to the file system:
/** doing it the hard way ... guess you won't like it as EVERY class plus related stuff needs to be specified */
private static Archive<?> getArchiveManually() {
    // creating archive manually
    JavaArchive artifact = ShrinkWrap.create(JavaArchive.class, ARCHIVE_NAME)
            .addPackage(FooServiceBean.class.getPackage())
            .addPackage(Foo.class.getPackage())
            .addPackage(FooService.class.getPackage())
            .addAsResource("META-INF/persistence.xml")
            .addAsResource("META-INF/beans.xml");
    // so we might write it for further inspection
    if (WRITE_ARCHIVE) {
        artifact.as(ZipExporter.class).exportTo(new File("D:/abc/xyz/tmp/" + ARCHIVE_NAME), true);
    }
    return artifact;
}
So your answer is included within the second option ;-)
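Since you are testing from the webapp, a further variation (just a sketch; the war name is an assumption) is to wrap the imported EJB jar in a ShrinkWrap WebArchive and add it as a library, so its META-INF/persistence.xml ends up inside the deployment exactly as it does in the real war:

// additional imports needed: org.jboss.shrinkwrap.api.spec.WebArchive,
// org.jboss.shrinkwrap.api.asset.EmptyAsset
@Deployment
public static Archive<?> createWebDeployment() {
    return ShrinkWrap.create(WebArchive.class, "test.war")            // assumed war name
            .addAsLibrary(getArchiveFromFile())                       // EJB jar from option (1), persistence.xml included
            .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml");   // enable CDI in the test war
}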
I am working on Puppet's Learning VM, which uses Ruby, which I am not too familiar with. I am stuck on Exercise 5, Manifests and classes, Task 2: https://kjhenner.gitbooks.io/puppet-quest-guide/content/quests/manifests_and_classes.html
In the previous task I created cowsay.pp:
class cowsayings::cowsay {
  package { 'cowsay':
    ensure   => present,
    provider => 'gem',
  }
}
Then in Task 2 I am supposed to create the same file in another location, with these instructions:
Task 2:
If you were going to apply this code to your production infrastructure, you would use the console's node classifier to classify any nodes that needed cowsay installed with your cowsay class. As you're working on a module, however, it's useful to apply a class directly. By convention, these test manifests are kept in an examples directory. (You may also sometimes see these manifests in the tests directory.)
To actually declare the class, create a cowsay.pp test in the examples directory.
vim cowsayings/examples/cowsay.pp
In this manifest, declare the cowsay class with the include keyword.
include cowsayings::cowsay
I am not sure how to create this second file and where to add this line. I have tried both:
class cowsayings::coway {
  include cowsayings::cowsay
  package { 'cowsay':
    ensure   => present,
    provider => 'gem',
  }
}
and
class cowsayings {
  include cowsayings::cowsay
}
Neither seems to be working, though: when I run it, cowsay is not installed correctly in Task 3 (in the link I posted above).
The manifest in the examples directory just needs that one line with "include cowsayings::cowsay".
There are two things that have to happen with Puppet: "defining" classes and "declaring" them. cowsayings/manifests/cowsay.pp contains the definition, but you need to actually declare the class in order to make something happen.
That's what puppet apply cowsayings/examples/cowsay.pp does: it declares the class.
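So the whole test manifest is nothing more than the single line already quoted in the task, which you then apply:

# cowsayings/examples/cowsay.pp
include cowsayings::cowsay

puppet apply cowsayings/examples/cowsay.pp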
I have REST Assured working in one of our JAR projects. Now I'm trying to add a similar test class in our WAR project.
I added REST Assured to the WAR project:
<dependency conf="test->default" org="com.jayway.restassured"
name="rest-assured" rev="1.8.1"/>
I also have ASM on the test classpath (asm-4.0, asm-analysis-4.0, asm-commons-4.0, asm-tree-4.0, asm-util-4.0); I mention this since the only search results on my problem suggested a relationship with ASM.
When I run my test, it gives the following error:
java.lang.NoSuchMethodError: org.codehaus.groovy.runtime.ScriptBytecodeAdapter.castToType(Ljava/lang/Object;Ljava/lang/Class;)Ljava/lang/Object;
at com.jayway.restassured.internal.ResponseParserRegistrar.<init>(ResponseParserRegistrar.groovy)
at com.mycompany.testSomething(SomethingTest.java:50)
I've minimized my test to the following:
@Test
public void testSomething() {
    ResponseParserRegistrar r = new ResponseParserRegistrar();
}
Obviously I have no direct need to create a ResponseParserRegistrar, but this is what REST Assured does and fails on when I use REST Assured.
Your help would be much appreciated!
Have a look at FAQ #2 at https://code.google.com/p/rest-assured/wiki/FAQ, that would solve your classpath issues. Also I would encourage you to upgrade to a newer version since 1.8.1 is really old.
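For what it's worth, a castToType NoSuchMethodError is the typical symptom of two different Groovy versions ending up on the (test) classpath. A rough sketch of an Ivy-side fix, assuming the extra Groovy is pulled in transitively; the module names in the excludes are the usual ones and should be checked against your own dependency report:

<dependency conf="test->default" org="com.jayway.restassured" name="rest-assured" rev="1.8.1">
    <!-- keep a single Groovy on the classpath -->
    <exclude org="org.codehaus.groovy" module="groovy"/>
    <exclude org="org.codehaus.groovy" module="groovy-all"/>
</dependency>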
There is a possibility that the runtime cannot find the method you are calling, or that it expects different parameters.
NoSuchMethodError is thrown when a program tries to call a class method that does not exist at runtime, typically because the class on the classpath is a different version from the one the code was compiled against. The method can be static or an instance method.
I have been trying to refactor our Activiti implementation to use CDI but ran into a number of problems. I've spent way too much time trying to resolve this already, but I just can't let it go... I think I've now pinned the problem down by setting up a cleanly structured war without involving Activiti, and I have been able to reproduce what I think is the main problem.
Basically I have jar1 and jar2, both CDI-enabled by including META-INF/beans.xml. Both jars specify a class in META-INF/services/test.TheTest pointing to an implementation local to the respective jar. jar1 depends on jar2. Also, both jars register an implementation of javax.enterprise.inject.spi.Extension, which triggers the scenario. In each implementation of Extension, I have a method like:
public void afterDeploymentValidation(@Observes AfterDeploymentValidation event, BeanManager beanManager) {
    System.out.println("In jar1 extension");
    ServiceLoader<TheTest> loader = ServiceLoader.load(TheTest.class);
    Iterator<TheTest> serviceIterator = loader.iterator();
    List<TheTest> discoveredLookups = new ArrayList<TheTest>();
    while (serviceIterator.hasNext()) {
        TheTest serviceInstance = (TheTest) serviceIterator.next();
        discoveredLookups.add(serviceInstance);
        System.out.println(serviceInstance.getClass().getName());
    }
}
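For completeness, the service registration files in each jar look roughly like this (the implementation class names are assumptions; each jar lists its own local classes):

# jar1: META-INF/services/test.TheTest
test.jar1.TheTestJar1Impl

# jar1: META-INF/services/javax.enterprise.inject.spi.Extension
test.jar1.Jar1Extension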
Now, my problem is that the ServiceLoader does not see any implementations in either case when running on WebLogic 12c. The same code works perfectly fine in both JBoss 7.1.1 and GlassFish, listing both implementations of the test.TheTest interface.
Is it fair to assume that this is indeed a problem in WebLogic 12c, or am I doing something wrong? Please bear in mind that I am simply trying to emulate the production setup we use when incorporating Activiti.
Regards,
/Petter
There is a Classloader Analysis Tool provided with WLS; have you checked whether it helps with diagnosing your issue?
You can access this tool at ip:port/wls-cat/index.jsp, where port is the port of the managed server where your application is deployed.
I need a little help getting started. I have a new JSF 2 web application that I intend to deploy under GlassFish 3.1 (or higher). Normally the server stores all its log files as text in one of its private directories, which also includes the logging I do with either System.out.println(..) or something like java.util.logging.Logger.getLogger(...).
What I want to do, instead of having those logging entries go to the text file, is capture them and store them in my SQL database. I can then add table columns for the timestamp and key values so the log can be easily searched from the admin page of the application, rather than having to go to the admin console for it. It would also be possible to expose some of that data to users.
Can this be done and how?
Follow up question: could this be done in a way that would be portable to Tomcat or another container?
You will need to write a custom log handler. A custom log handler is a class that extends java.util.logging.Handler:
package test.stackoverflow;

import java.util.logging.Handler;
..

public class AlanHandler extends Handler {
    ..
    @Override
    public void publish(LogRecord record) {
        //CODE THAT STORES LOG RECORD INTO THE DATABASE
    }
}
Additionally, you will have to slightly change the logging.properties file:
handlers=java.util.logging.ConsoleHandler, test.stackoverflow.AlanHandler
Deploy the JAR containing AlanHandler on GlassFish (as a library), restart the server, and that should do it.
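To make the idea more concrete, here is a minimal sketch of what the handler could do with plain JDBC; the JNDI data-source name, table name and columns are assumptions you would replace with your own:

package test.stackoverflow;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.logging.ErrorManager;
import java.util.logging.Handler;
import java.util.logging.LogRecord;

import javax.naming.InitialContext;
import javax.sql.DataSource;

public class AlanHandler extends Handler {

    @Override
    public void publish(LogRecord record) {
        if (!isLoggable(record)) {
            return;
        }
        try {
            // "jdbc/appLog" is an assumed JNDI name for a data source you configure yourself
            DataSource ds = (DataSource) new InitialContext().lookup("jdbc/appLog");
            try (Connection con = ds.getConnection();
                 PreparedStatement ps = con.prepareStatement(
                         "INSERT INTO app_log (logged_at, level, logger, message) VALUES (?, ?, ?, ?)")) {
                ps.setTimestamp(1, new java.sql.Timestamp(record.getMillis()));
                ps.setString(2, record.getLevel().getName());
                ps.setString(3, record.getLoggerName());
                ps.setString(4, record.getMessage());
                ps.executeUpdate();
            }
        } catch (Exception e) {
            // don't log through JUL from inside a handler; report the error instead
            reportError("Could not persist log record", e, ErrorManager.WRITE_FAILURE);
        }
    }

    @Override
    public void flush() {
        // nothing buffered in this sketch
    }

    @Override
    public void close() throws SecurityException {
        // nothing to release in this sketch
    }
}

As for the follow-up question: since this relies only on java.util.logging, the handler class itself should be portable; what differs per container is where you register it (GlassFish's logging.properties vs. Tomcat's JULI configuration, for example).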
I'm trying to use Groovy scripts in my application. The problem is that GroovyScriptEngine#run always compiles the script, even if it was compiled in previous runs and hasn't changed since, and even if I set a physical output folder in the configuration to save the compilation results.
What is the best way of working around this? Ideally I'd be able to ship the script together with a folder containing the precompiled results, and no compilation would be done (unless the script is modified, of course).
Grails 1.3.5 is using Groovy 1.7.5. In that Groovy version, GroovyScriptEngine.run(..) calls the following methods: createScript(String, Binding) --> loadScriptByName(String) --> isSourceNewer(ScriptCacheEntry).
isSourceNewer(ScriptCacheEntry) is defined as (unfortunately, I didn't find a matching source file on the web):
protected boolean isSourceNewer(ScriptCacheEntry entry) throws ResourceException {
    // ...
    for (String scriptName : entry.dependencies) {
        // ...
        return true; // without any further condition!
    }
    return false;
}
This implements the (queer) logic "if a script has dependencies, it is newer than the cached script (and needs to be re-compiled)". That's not what the code is supposed to do; it's supposed to decide by modification time.
In newer versions of GroovyScriptEngine this has been corrected (there have been massive changes to the logic), but for now you'd need to subclass GroovyScriptEngine and override isSourceNewer(ScriptCacheEntry) to fix the logic yourself.
Edit: The bug has been reported and fixed in Groovy 1.7.6. - So try using Groovy 1.7.6 in your Grails lib folder.
The solution (hack) I used in the end was to stream out the scriptCache variable using XStream, then read it back in and set it on the object.
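Roughly, that hack could look like the sketch below; it relies on reflective access to the engine's internal scriptCache field (the field name comes from the answer above, but its exact type and accessibility depend on the Groovy version, so treat this as an assumption):

import java.lang.reflect.Field;

import com.thoughtworks.xstream.XStream;

import groovy.util.GroovyScriptEngine;

public class ScriptCachePersistence {

    /** Serialize the engine's internal script cache to XML. */
    public static String dumpCache(GroovyScriptEngine engine) throws Exception {
        Field cacheField = GroovyScriptEngine.class.getDeclaredField("scriptCache"); // assumed field name
        cacheField.setAccessible(true);
        return new XStream().toXML(cacheField.get(engine));
    }

    /** Restore a previously dumped cache into another engine instance. */
    public static void restoreCache(GroovyScriptEngine engine, String xml) throws Exception {
        Field cacheField = GroovyScriptEngine.class.getDeclaredField("scriptCache");
        cacheField.setAccessible(true);
        cacheField.set(engine, new XStream().fromXML(xml));
    }
}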
Not sure if this helps you, but you can alter GroovyScriptEngine's behaviour using CompilerConfiguration (see GroovyScriptEngine.setConfig). There's an option CompilerConfiguration.setRecompileGroovySource, which can be used to set whether the sources will be reloaded and recompiled if they change. You can read more about CompilerConfiguration here (page 282).
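A minimal sketch of wiring that up (the script root, target directory and script name are assumptions):

import groovy.lang.Binding;
import groovy.util.GroovyScriptEngine;

import org.codehaus.groovy.control.CompilerConfiguration;

public class EngineSetup {

    public static void main(String[] args) throws Exception {
        GroovyScriptEngine gse = new GroovyScriptEngine(new String[] { "scripts" }); // assumed script root

        CompilerConfiguration config = new CompilerConfiguration();
        config.setRecompileGroovySource(false);      // don't recompile sources that haven't changed
        config.setTargetDirectory("precompiled");    // assumed folder for compiled classes
        gse.setConfig(config);

        gse.run("hello.groovy", new Binding());      // assumed script name
    }
}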