I am trying to fix an SSL bug in a product, and I noticed that although the code sets SSL to true, on the very next line the property still reads false. I wrote a unit test for this, and the unit test confirms my suspicion.
[TestMethod]
public void SecureSocketLayerSetToTrue()
{
    var ldapConnection = new LdapConnection(
        new LdapDirectoryIdentifier("ldap.test.com", 636));
    ldapConnection.SessionOptions.SecureSocketLayer = true;
    Assert.IsTrue(ldapConnection.SessionOptions.SecureSocketLayer);
}
The test fails. Is there something here that I am missing?
It turns out that System.DirectoryServices.Protocols implements its LDAP calls by passing them through to a low-level LDAP API, and that low-level API is what gets queried when you read the property.
The low-level API is only updated when the methods are executed. You can think of it as building command-line arguments for an executable that hasn't been launched yet.
When a call like Bind() is made, the executable is launched and the properties will report the correct value.
So even though the property was reporting false, the connection will use the true value when it is actually needed.
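As an illustration of that behaviour, here is an untested sketch: it reuses the host name from the question and assumes a reachable server and valid credentials, so treat it as the shape of the test rather than something that runs as-is.
[TestMethod]
public void SecureSocketLayerReportsTrueAfterBind()
{
    var ldapConnection = new LdapConnection(
        new LdapDirectoryIdentifier("ldap.test.com", 636));
    ldapConnection.SessionOptions.SecureSocketLayer = true;

    // Before Bind() the low-level LDAP session does not exist yet,
    // so the getter may still report false at this point.
    ldapConnection.Bind(); // needs a reachable server and valid credentials

    // After Bind() the underlying session exists and reflects the option.
    Assert.IsTrue(ldapConnection.SessionOptions.SecureSocketLayer);
}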
We are using unit / integration tests during Shopware 6 development.
One technique we use is to disable the database transaction behaviour so that we can see the results (for example, of fixtures) in the admin panel, which makes debugging and understanding easier:
trait IntegrationTestBehaviour
{
    use KernelTestBehaviour;
    // use DatabaseTransactionBehaviour;
    use FilesystemBehaviour;
    use CacheTestBehaviour;
    use BasicTestDataBehaviour;
    use SessionTestBehaviour;
    use RequestStackTestBehaviour;
}
Similarly, it would be helpful to send out actual emails during some tests (only for development, not in CI and so on).
It is already possible to automatically test emails like this:
$eventDidRun = false;
$listenerClosure = function (MailSentEvent $event) use (&$eventDidRun): void {
    $eventDidRun = true;
};
$this->addEventListener($dispatcher, MailSentEvent::class, $listenerClosure);
// do something that sends an email
static::assertTrue($eventDidRun, 'The mail.sent Event did not run');
But sometimes we want to manually see the actual email.
The .env.test already contains a valid mailer URL:
MAILER_URL=smtp://x:y@smtp.mailtrap.io:2525?encryption=tls&auth_mode=login
But still no mails get sent during the tests.
While I guess this is intentional, is there some way to work around this blockage and have mails actually sent during testing?
The reason is that the MAILER_URL variable is pre-set to null://localhost in the phpunit.xml.dist of the platform repository:
<server name="MAILER_URL" value="null://localhost"/>
You could set the MAILER_URL environment variable yourself before the tests of the class are executed:
/**
 * @beforeClass
 */
public static function setMailerUrl(): void
{
    $_SERVER['MAILER_URL'] = 'smtp://x:y@smtp.mailtrap.io:2525?encryption=tls&auth_mode=login';
}
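Alternatively, as a sketch only (assuming you run the test suite with your own phpunit.xml copied from the dist file, rather than with phpunit.xml.dist itself), you could override the value in the PHPUnit configuration instead of in code; only the relevant fragment is shown:
<!-- local phpunit.xml override (not committed): point MAILER_URL at a real SMTP server -->
<php>
    <server name="MAILER_URL" value="smtp://x:y@smtp.mailtrap.io:2525?encryption=tls&amp;auth_mode=login"/>
</php>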
I am working on a test automation script using Java and Selenium WebDriver.
My test runs in a cloud environment (crossbrowsertesting.com).
There is a feature to take snapshots of the browser window.
When I was using RemoteWebDriver this line of code worked fine, but I needed to replace it with WebDriver because I was not able to get the window handles.
But now I am getting the following error, stating
"The method getSessionId() is undefined for the type WebDriver"
snapshotHash = myTest.takeSnapshot(driver.getSessionId().toString());

// takeSnapshot method:
public String takeSnapshot(String seleniumTestId) throws UnirestException {
    System.out.println("Screen Shots Taken.");
    /*
     * Takes a snapshot of the screen for the specified test.
     * The output of this function can be used as a parameter for setDescription()
     */
    HttpResponse<JsonNode> response = Unirest.post("http://crossbrowsertesting.com/api/v3/selenium/{seleniumTestId}/snapshots")
            .basicAuth(username, api_key)
            .routeParam("seleniumTestId", seleniumTestId)
            .asJson();

    // grab the snapshot "hash" out of the response
    snapshotHash = (String) response.getBody().getObject().get("hash");
    return snapshotHash;
}
I don't quite understand why you need to use WebDriver in place of RemoteWebDriver. RemoteWebDriver is the mother of all WebDriver implementations, and it should be good enough to work with any remote grid environment. I don't see why you need to switch to a WebDriver reference, which is one of the interfaces that RemoteWebDriver implements. getSessionId() is NOT part of any interface specification; it is a direct implementation that RemoteWebDriver provides.
getWindowHandles() is part of the WebDriver interface specification, so you should still be able to use it.
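To make that concrete, here is a minimal sketch (the hub URL and the empty capabilities are placeholders for your CrossBrowserTesting setup, not values from the question): you can keep a WebDriver-typed reference for the interface methods and cast to RemoteWebDriver only where getSessionId() is needed.
import java.net.URL;
import java.util.Set;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class SessionIdExample {
    public static void main(String[] args) throws Exception {
        // Placeholder hub URL and capabilities for the remote grid.
        DesiredCapabilities caps = new DesiredCapabilities();
        WebDriver driver = new RemoteWebDriver(
                new URL("http://hub.crossbrowsertesting.com:80/wd/hub"), caps);

        // getWindowHandles() is declared on the WebDriver interface,
        // so a WebDriver reference is enough here.
        Set<String> handles = driver.getWindowHandles();

        // getSessionId() is defined on RemoteWebDriver, not on WebDriver,
        // so cast (or keep a RemoteWebDriver-typed variable) when you need it.
        String sessionId = ((RemoteWebDriver) driver).getSessionId().toString();
        System.out.println(handles.size() + " window(s), session " + sessionId);

        driver.quit();
    }
}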
What I want to do
I want to create and consume Java objects in PowerBuilder and call methods on them. This should happen with as little overhead as possible.
I do not want to consume Java web services!
So I have a working sample in which I can create a Java object, call a method on it, and output the result of the called method.
Everything is working as expected. I'm using Java 1.8.0_31.
But now I want to attach my Java IDE (IntelliJ) to the running JVM (started by PowerBuilder) to debug the Java code that gets called by PowerBuilder.
And now my question.
How do I tell PowerBuilder to add special options when starting the JVM?
Specifically, I want to add the following option(s) in some way:
-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005
The JVM is created like this:
LONG ll_result

inv_java = CREATE JavaVM
ll_result = inv_java.CreateJavaVM("C:\Development\tms java\pbJavaTest", FALSE)

CHOOSE CASE ll_result
    CASE 1
    CASE 0
    CASE -1
        MessageBox("", "jvm.dll was not found in the classpath.")
    CASE -2
        MessageBox("", "pbejbclient90.jar file was not found.")
    CASE ELSE
        MessageBox("", "Unknown result (" + String(ll_result) + ")")
END CHOOSE
In the PowerBuilder help I found something about overriding the static registry classpath. There is something written about custom properties which sounds like what I'm looking for.
But there's no example on how to add JVM options to override default behavior.
Does anyone have a clue on how to tell PowerBuilder to use my options?
Or does anyone have any advice which could guide me in the right direction?
Update 1
I found an old post which solved my initial issue.
If someone else wants to know how it works, take a look at this post:
http://nntp-archive.sybase.com/nntp-archive/action/article/%3C46262213.6742.1681692777#sybase.com%3E
Hi, you need to set some Windows registry entries.
Under HKEY_LOCAL_MACHINE\SOFTWARE\Sybase\PowerBuilder\9.0\Java, there are two folders: PBIDEConfig and PBRTConfig. The first one is used when you run your application from within the IDE, and the latter is used when you run your compiled application. Those two folders can have PBJVMconfig and PBJVMprops folders within them.
PBJVMconfig is for JVM configuration options such as -Xms. You have to specify incremental key values starting from "0" by one, plus one special key "Count" to tell PowerBuilder how many options exist to enumerate.
PBJVMprops is for all -D options. You do not need to specify -D for PBJVMprops, just the name of the property and its value, and as many properties as you wish.
Let me give some examples:
Windows Registry Editor Version 5.00
[HKEY_LOCAL_MACHINE\SOFTWARE\Sybase\PowerBuilder\9.0\Java\PBIDEConfig\PBJVMprops]
"java.security.auth.login.config"="auth.conf"
"user.language"="en"
[HKEY_LOCAL_MACHINE\SOFTWARE\Sybase\PowerBuilder\9.0\Java\PBRTConfig\PBJVMconfig]
"0"="-client"
"1"="-Xms128m"
"2"="-Xmx512m"
"Count"="3"
[HKEY_LOCAL_MACHINE\SOFTWARE\Sybase\PowerBuilder\9.0\Java\PBRTConfig\PBJVMprops]
"java.security.auth.login.config"="auth.conf"
"user.language"="en"
Regards,
Gokhan Demir
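Applied to my case, a registry sketch like the following (untested, and assuming the same key layout as in the quoted post) should pass the debug agent option from above to the JVM started when running from the PowerBuilder IDE:
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Sybase\PowerBuilder\9.0\Java\PBIDEConfig\PBJVMconfig]
"0"="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"
"Count"="1"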
But now there's another issue...
PB isn't able to create EJB proxies for my sample class (which is really simple) with Java 1.8.0_31. They were created fine with the default version, which is 1.6.0_24.
public class Simple
{
    public Simple()
    {
    }

    public static String getValue()
    {
        return "blubber";
    }

    public int getInt32Value()
    {
        return 123456;
    }

    public double getDoubleVaue()
    {
        return 123.123;
    }

    public static void main(String[] args)
    {
        System.out.println(Simple.getValue());
    }
}
The error is the following. :D
---------- Deploy: Deploy of project p_genapp_ejbclientproxy (15:35:18)
Retrieving PowerBuilder Proxies from EJB...
Generation Errors: Error: class not found: (
Deployment Error: No files returned for package/component 'Simple'. Error code: Unknown. Proxy was not created.
Done.
---------- Finished Deploy of project p_genapp_ejbclientproxy (15:35:19)
So this whole approach isn't an option, because we do not want to change the Java settings in PB back and forth just to generate new EJB proxies for changed Java objects in the future...
So one option to test will be creating COM wrappers for the Java classes in order to use them in PB...
I am trying to create an AWS instance using jclouds 1.9.0 and then run a script on it (via SSH). I am following the example located here, but I am getting authentication failed errors when the client (a Java program) tries to connect to the instance. The AWS console shows that the instance is up and running.
The example creates a LoginCredentials object
String user = System.getProperty("user.name");
String privateKey = Files.toString(new File(System.getProperty("user.home") + "/.ssh/id_rsa"), UTF_8);
return LoginCredentials.builder().user(user).privateKey(privateKey).build();
which is later used by the SSH client
responses = compute.runScriptOnNodesMatching(
        inGroup(groupName),              // predicate used to select nodes
        exec(command),                   // what you actually intend to run
        overrideLoginCredentials(login)  // use my local user & ssh key
            .runAsRoot(false)            // don't attempt to run as root (sudo)
            .wrapInInitScript(false));
Some login information is injected into the instance with the following commands:
Statement bootInstructions = AdminAccess.standard();
templateBuilder.options(runScript(bootInstructions));
Since I am on a Windows machine, the creation of the LoginCredentials 'fails', and thus I altered the code to
String user = "ec2-user";
String privateKey = "-----BEGIN RSA PRIVATE KEY-----.....-----END RSA PRIVATE KEY-----";
return LoginCredentials.builder().user(user).privateKey(privateKey).build();
I also tried to define the credentials while building the template, as described in the "EC2: In Depth" guide, but with no luck.
An alternative is to build the instance and inject the key pair as follows, but this implies that I need to have the SSH key stored in my AWS console, which is not currently the case, and it also breaks the functionality of running a script (via SSH), since I cannot infer the NodeMetadata from a RunningInstance object.
RunInstancesOptions options = RunInstancesOptions.Builder
        .asType("t2.micro")
        .withKeyName(keypair)
        .withSecurityGroup(securityGroup)
        .withUserData(script.getBytes());
Any suggestions??
Note: While I am currently testing this on AWS, I want to keep the code as decoupled from the provider as possible.
Update 26/10/2015
Based on @Ignasi Barrera's answer, I changed my implementation by adding .init(new MyAdminAccessConfiguration()) while creating the bootInstructions:
Statement bootInstructions = AdminAccess.standard().init(new MyAdminAccessConfiguration());
templateBuilder.options(runScript(bootInstructions));
where MyAdminAccessConfiguration is my own implementation of the AdminAccessConfiguration interface, as @Ignasi Barrera described.
I think the issue lies in the fact that the jclouds code runs on a Windows machine and jclouds makes some Unix assumptions by default.
There are two different things here: first, the AdminAccess.standard() is used to configure a user in the deployed node once it boots, and later the LoginCredentials object passed to the run script method is used to authenticate against the user that has been created with the previous statement.
The issue here is that AdminAccess.standard() reads the "current user" information and assumes a Unix system. That user information is provided by this Default class, and in your case I'm pretty sure it will fall back to the catch block and return an auto-generated SSH key pair. That means AdminAccess.standard() is creating a user on the node with an auto-generated (random) SSH key, but the LoginCredentials you are building don't match those keys, hence the authentication failure.
Since the AdminAccess entity is immutable, the better and cleaner approach to fix this is to create your own implementation of the AdminAccessConfiguration interface. You can just copy the entire Default class and change the Unix specific bits to accommodate the SSH setup in your Windows machine. Once you have the implementation class, you can inject it by creating a Guice module and passing it to the list of modules provided when creating the jclouds context. Something like:
// Create the custom module to inject your implementation
Module windowsAdminAccess = new AbstractModule() {
    @Override
    protected void configure() {
        bind(AdminAccessConfiguration.class).to(YourCustomWindowsImpl.class).in(Scopes.SINGLETON);
    }
};

// Provide the module in the module list when creating the context
ComputeServiceContext context = ContextBuilder.newBuilder("aws-ec2")
        .credentials("api-key", "api-secret")
        .modules(ImmutableSet.<Module>of(windowsAdminAccess, new SshjSshClientModule()))
        .buildView(ComputeServiceContext.class);
I have a dependency with a parameterized constructor. When I call the action more than once, it shows this error:
Error activating IValidationPurchaseService
More than one matching bindings are available.
Activation path:
1) Request for IValidationPurchaseService
Suggestions:
1) Ensure that you have defined a binding for IValidationPurchaseService only once.
public ActionResult Detalhes(string regionUrl, string discountUrl, DetalhesModel detalhesModel)
{
    var validationPurchaseDTO = new ValidationPurchaseDTO {...}

    KernelFactory.Kernel.Bind<IValidationPurchaseService>().To<ValidationPurchaseService>()
        .WithConstructorArgument("validationPurchaseDTO", validationPurchaseDTO)
        .WithConstructorArgument("confirmPayment", true);

    this.ValidationPurchaseService = KernelFactory.Kernel.Get<IValidationPurchaseService>();

    ...
}
I'm not sure what you are trying to achieve with the code you cited. The error is raised because you bind the same service more than once, so when you try to resolve it, the container can't choose one (identical) binding over another. This is not how a DI container is supposed to be used. In your example you are not taking advantage of DI at all. You can replace this code:
KernelFactory.Kernel.Bind<IValidationPurchaseService>().To<ValidationPurchaseService>()
    .WithConstructorArgument("validationPurchaseDTO", validationPurchaseDTO)
    .WithConstructorArgument("confirmPayment", true);

this.ValidationPurchaseService = KernelFactory.Kernel.Get<IValidationPurchaseService>();
With this:
this.ValidationPurchaseService = new ValidationPurchaseService(validationPurchaseDTO: validationPurchaseDTO, confirmPayment: true);
If you could explain what you are trying to achieve by using Ninject in this scenario, the community will be able to assist further.
Your KernelFactory probably returns the same kernel (a singleton) on each successive call to the controller, which is why you add another identical binding every time you hit the URL that activates this controller. So it probably works the first time and starts failing from the second request on.
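If you do want to keep configuring the binding inside the action for now, one minimal change, offered only as a sketch and not something from the answers above, is to use Rebind instead of Bind, which replaces any existing binding for the service rather than adding a second one:
// Sketch only: Rebind() replaces the previous binding for IValidationPurchaseService,
// so repeated requests no longer accumulate duplicate bindings.
KernelFactory.Kernel.Rebind<IValidationPurchaseService>().To<ValidationPurchaseService>()
    .WithConstructorArgument("validationPurchaseDTO", validationPurchaseDTO)
    .WithConstructorArgument("confirmPayment", true);

this.ValidationPurchaseService = KernelFactory.Kernel.Get<IValidationPurchaseService>();
That said, registering bindings per request is still working against the container; binding once at application start, or simply newing the service up as suggested above, is the cleaner route.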