We're finding it very cumbersome to develop automated NUnit integration tests that require domain objects to be registered (e.g. in order to load and save projects containing those objects), since things like DataSourceFactories need to be registered in the Integrate or Initialize phase.
Registered plugins seem to be ignored in test mode.
Essentially, it seems modules need to be registered the old-fashioned way (in the test assembly's .config), which becomes a pain to maintain when we have a significant number of modules across various plugins.
Is there a way to make Petrel load plugins in test mode?
Is there a way to programmatically register modules during startup of Petrel in test mode?
Are there any best or suggested practices for structuring and automating unit/integration tests in a Petrel environment, where tests ideally should run with plugins and their modules having been loaded and initialized, as close to the end-user experience as possible?
The only way to load plug-ins when running Petrel in test mode is to register them in the test assembly's .config. As for structuring your plug-ins' tests, I can suggest keeping several copies of the .config file, each containing registration entries for different plugins/modules, and substituting the test assembly's .config with the one you need when running a particular test.
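If you go that route, the substitution itself can be automated in the test project's build. This is only a sketch: the Configs folder, the file names, and the TestConfig property are all assumptions, not part of the Ocean SDK.

<!-- In the NUnit test project file: after each build, copy the chosen
     module-registration config over the test assembly's .config.
     Configs\*.config and the TestConfig property are hypothetical. -->
<Target Name="SwapTestConfig" AfterTargets="Build" Condition="'$(TestConfig)' != ''">
  <Copy SourceFiles="Configs\$(TestConfig).config"
        DestinationFiles="$(OutputPath)$(AssemblyName).dll.config" />
</Target>

Running msbuild MyTests.csproj /p:TestConfig=PluginA would then put the PluginA registrations in place before the tests run.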
We are building a shop for a customer on Shopware 6.3.5.2 and want to use tests to
1. ensure that core functionality is not broken by our customizations (static plugins), and
2. write new tests for new functionality.
There is the "Running End-to-End Tests" guide, but it seems to be aimed at core development and uses psh.phar, which is not available in the production template.
How should this be done?
Edit: This question is meant a bit more broadly and also concerns unit tests.
Actually, you can use the E2E tests of the platform project, as Cypress itself doesn't care what it runs against. However, as you already noticed, you cannot use the psh commands to run them. You can run the tests through the basic Cypress commands instead, setting your shop's URL as the baseUrl of the tests, for example via this command:
./node_modules/.bin/cypress run --config baseUrl="<your-url>"
It works with cypress open as well.
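For the interactive runner the flag is the same, e.g.:
./node_modules/.bin/cypress open --config baseUrl="<your-url>"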
The only thing that may become troublesome is the setToInitialState command used in most of the tests, which unfortunately takes care of cleaning up Shopware's database using psh scripts. You may need to adjust it by overriding the command so that it resets the database of the production template instead.
I hope I was able to help a bit. 🙏
There are actually two parts here:
1. ensure that core functionality is not broken by our customizations (static plugins)
2. write new tests for new functionality
re 1: For regression tests like this I would suggest end-to-end tests: either test through the UI with tools like Selenium, or test through the HTTP API (I don't know whether the Shopware API is sufficient for extensive regression tests).
re 2: Since plugins do not run on their own, I would extract all relevant functionality into plain old PHP classes that are independent of Shopware and test those in isolation. Explore whether some of that functionality can be exposed through an API, and test the plugin integration through that. Depending on the actual plugin, you might have to resort to UI tests again.
I have experience swapping business logic in .NET by loading assemblies and using reflection to find an implemented interface. This enabled behaviour composition at runtime, simply by distributing DLL files and placing them in the application's working directory. How can I achieve the same in Clojure?
I have been told I could compile my Leiningen project without AOT compilation, with a dependency on a class which the JVM will search for, I assume, in sibling JAR files? I've also seen that Java 9 has a solution called Jigsaw, and there are other projects such as lein-jlink too. I'm unsure whether those are suitable.
I'd really appreciate an article/tutorial, a working example, or a good few hints on how to do this, as I'm new to the JVM as well.
My project in particular would involve a business-logic model "module" loaded at startup, consuming messages and producing messages in return. It's meant to be somewhat of a black box.
An alternative route I'd like to avoid is an MQTT-style approach where distributed modules are relatively heavyweight standalone programs.
Thank you for your time.
In plain Java you can use the same approach as you did in C#: you develop a core and provide interfaces that can be used for extensions, then you inspect the classpath (using reflection) for implementations of the interface in JAR files (the same idea as DLLs). The Java classpath is either an environment variable or a command-line parameter holding a list of paths to search for JAR files.
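A minimal sketch of that pattern using the JDK's built-in java.util.ServiceLoader (the interface and class names here are made up): each plugin JAR declares its implementation in a META-INF/services descriptor, and everything found on the classpath is discovered at runtime.

// MessageHandler.java -- hypothetical extension point, shipped in the core JAR.
public interface MessageHandler {
    String handle(String message);
}

// PluginLoader.java -- in the core application. Each plugin JAR contains a
// text file META-INF/services/com.example.MessageHandler listing its
// implementation class (one fully qualified name per line); ServiceLoader
// then finds every implementation present on the classpath.
import java.util.ServiceLoader;

public class PluginLoader {
    public static void main(String[] args) {
        ServiceLoader<MessageHandler> handlers =
                ServiceLoader.load(MessageHandler.class);
        for (MessageHandler handler : handlers) {
            System.out.println(handler.handle("ping"));
        }
    }
}

Clojure code compiled ahead of time with :gen-class can implement such an interface and plug into the same mechanism.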
In Clojure, you have the advantage that you can distribute libraries either as compiled code or as source code which the Clojure runtime will load. I'd recommend looking into the Deps and CLI guide, because it will give you good guidance on how to:
add dependencies through a configuration file by various means, including loading dependencies from private repos, or even depending on a git repo at an exact commit (see the sketch after this list)
launch your code with the various switches or configuration you might need, so that you can change behaviour by editing a config file
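For instance, a deps.edn entry that pulls one "module" straight from a git repo at an exact commit could look like this (the coordinates, URL, and sha are placeholders):

;; deps.edn -- library coordinate, URL, and sha below are placeholders
{:deps {com.example/pricing-module
        {:git/url "https://github.com/example/pricing-module.git"
         :sha     "2f5c8e9a0b1d3e4f5a6b7c8d9e0f1a2b3c4d5e6f"}}}

Swapping behaviour then amounts to editing this file and restarting, rather than dropping DLLs into a directory.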
I need an automation tool for my web application, which is developed with the Unity Web Player. I am a beginner in Unity development, so a complete step-by-step tutorial would be awesome.
You can use the Integration Test Framework in the Unity Test Tools.
Integration Tests allow you to automate the verification process of your assets directly in a scene. They are designed to be used on existing content, directly within the Editor, to build tests which verify the behaviour of single assets or the interaction between them.
You can read how to use the Integration Test Framework here: https://bitbucket.org/Unity-Technologies/unitytesttools/wiki/IntegrationTestsRunner
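As a rough illustration (not a full tutorial): an integration test in that framework is a scene object whose components report success or failure. A hypothetical check component like the one below could pass a test once the object under test has moved; the class name and distance threshold are made up.

using UnityEngine;
using UnityTest;

// Attach under a test object created by the Integration Test runner.
public class MovesForwardCheck : MonoBehaviour
{
    private Vector3 start;

    void Start()
    {
        // Remember where the object under test began.
        start = transform.position;
    }

    void Update()
    {
        if (Vector3.Distance(start, transform.position) > 1f)
        {
            // Marks the containing integration test as passed.
            IntegrationTest.Pass(gameObject);
        }
    }
}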
So I'm looking to bring web application testing into our .NET environment with a framework such as Selenium. At first it'll probably be the developers writing the tests, but later it may be just the QA team. I'm wondering where the tests should actually live: in the same solution as the web application, or in a completely separate solution just for the tests? Please note these are regression tests that will be run by automating a web browser, so access to the web app's assemblies is not required. The answer probably depends on the environment and other factors, but I'm curious what other people have done in this situation.
Regression Testing covers both Unit and Functional Tests. Functional tests exercise the complete program with various inputs. Unit tests exercise individual functions, subroutines, or object methods.
Unit tests are part of the solution's code and should live with the primary code, as in Microsoft MVC. Since functional tests examine the whole system and not just its components, they can live anywhere; however, since your functional tests are automated scripts, they should be included in the solution as well.
The advantage of having both functional and unit tests live with the code is project management: keeping all project-related files in one repository ties the code version to the test version. Testing scripts need to be stored in a version control system just like any other project code, so it is good to keep them with the solution.
That way the test team can do white-box testing (testing with access to the code) by checking out the solution just like a developer. Their work can be saved, shared, and documented inside Visual Studio, and Microsoft even includes some web-based management tools with Team Foundation Server that can be used for managing the testing, with open communication between the test team and the developers.
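For what it's worth, a browser-driven regression test of this kind can sit in its own test project inside the same solution. A minimal sketch with NUnit and Selenium WebDriver; the URL and expected title are placeholders:

using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestFixture]
public class HomePageRegressionTests
{
    private IWebDriver driver;

    [SetUp]
    public void StartBrowser()
    {
        driver = new ChromeDriver();
    }

    [Test]
    public void HomePage_ShowsExpectedTitle()
    {
        // Placeholder URL and title for your deployed site.
        driver.Navigate().GoToUrl("https://staging.example.com/");
        StringAssert.Contains("My Shop", driver.Title);
    }

    [TearDown]
    public void StopBrowser()
    {
        driver.Quit();
    }
}

Because it drives the deployed site over HTTP, this project needs no reference to the web app's assemblies, which keeps the decision about where it lives purely organizational.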
We wish to introduce Selenium testing to our maven build process. Happily, there is a ton of information available on how to do this, but I'm having trouble figuring out how to handle one of our requirements.
In an effort to separate our testing layers, we want to use mock service objects for the UI tests. All of these objects are already defined in the Spring configuration files that we use in unit tests. Wiring these services into a unit test is easy (we're using @ContextConfiguration), but I don't know how to handle this configuration swap when we deploy the WAR to Jetty for the Selenium tests.
We're using:
Spring MVC 3.0
Maven
Hudson
Worst: introduce a special user/interface parameter/checkbox/role, and remember to use mocks for this special case everywhere in the code. Horrible to maintain, error-prone and, let's face it, pretty lame. Most common, though...
Easiest solution: develop conditional includes in your Spring application context:
<import resource="services-${env}.xml"/>
where ${env} comes from pom.xml:
<properties>
<env>prd</env> <!-- or test depending on build profile -->
</properties>
Remember to turn on resource filtering and use build profiles to switch ${env} to test when doing Selenium tests. Switching can be done during the Maven build or by some other filtering tool. Both files (services-prd.xml and services-test.xml) define the same beans (same interfaces and/or ids), but of course the latter uses mock implementations.
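A sketch of the corresponding pom.xml wiring (the profile id is arbitrary):

<!-- pom.xml: filter the Spring context files so ${env} gets substituted -->
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <filtering>true</filtering>
    </resource>
  </resources>
</build>

<!-- activate with: mvn verify -Pselenium -->
<profiles>
  <profile>
    <id>selenium</id>
    <properties>
      <env>test</env> <!-- pulls in services-test.xml with the mocks -->
    </properties>
  </profile>
</profiles>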
Best (IMHO): if you need to change the implementation at runtime, AOP + JMX will be great. Just wrap your real services with aspects and, depending on some flag (accessible via JMX), use the real services or the mocks. Very clean and noninvasive.
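A rough sketch of that idea with Spring AOP and Spring's JMX annotations; the service, bean names, and pointcut are made up, and the flag is flipped from a JMX console such as jconsole:

import java.lang.reflect.Method;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.reflect.MethodSignature;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jmx.export.annotation.ManagedAttribute;
import org.springframework.jmx.export.annotation.ManagedResource;
import org.springframework.stereotype.Component;

// Hypothetical service interface, defined elsewhere in the application.
interface PriceService {
    java.math.BigDecimal priceFor(String sku);
}

// Routes calls to the real service or the mock, depending on a flag
// that is exposed (and togglable) over JMX.
@Aspect
@Component
@ManagedResource(objectName = "myapp:name=ServiceSwitch")
public class ServiceSwitchAspect {

    private volatile boolean useMocks;

    @Autowired
    @Qualifier("mockPriceService") // assumed mock bean defined elsewhere
    private PriceService mock;

    @ManagedAttribute
    public boolean isUseMocks() { return useMocks; }

    @ManagedAttribute
    public void setUseMocks(boolean useMocks) { this.useMocks = useMocks; }

    @Around("execution(* com.example.service.PriceServiceImpl.*(..))")
    public Object route(ProceedingJoinPoint pjp) throws Throwable {
        if (!useMocks) {
            return pjp.proceed(); // normal path: call the real service
        }
        // mock path: invoke the same method on the mock implementation
        Method method = ((MethodSignature) pjp.getSignature()).getMethod();
        return method.invoke(mock, pjp.getArgs());
    }
}

This assumes <aop:aspectj-autoproxy/> and <context:mbean-export/> are enabled in the application context so the aspect is applied and the flag is exported over JMX.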