The following is an addition program that adds two numbers. My server-side and client-side code is as follows. It throws an error like:
ReferenceError: com is not defined at (compiled_code):24
To work with a Java adapter, an HTTP adapter is mandatory.
Server.js and client.js are as follows:
package com.mss;

public class Calculator {
    // Parses both strings and returns their sum as a string.
    // (The original declared a return type of int while returning a String,
    // which would not compile; the return type is corrected here.)
    public String addTwoIntegers(String first, String second) {
        int c = Integer.parseInt(first) + Integer.parseInt(second);
        return Integer.toString(c);
    }
}
function addTwoIntegers() {
    alert("hi");
    var calcInstance = new com.mss.Calculator();
    return {
        result : calcInstance.addTwoIntegers("1", "2")
    };
}
"To work with a Java adapter, an HTTP adapter is mandatory"
The above sentence is false. In MFP 7.0 you have both JavaScript adapters and Java adapters. To use a Java adapter you are not required to use an HTTP adapter; that doesn't make sense. They are two different types of adapters.
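For context, a Java adapter in MFP is not a Java class called from adapter JavaScript but a JAX-RS resource. A minimal sketch (illustrative names only, not the OP's project):

package com.mss;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.QueryParam;

// A minimal MFP Java adapter resource (illustrative sketch).
@Path("/calculator")
public class CalculatorResource {

    // GET /calculator/add?first=1&second=2 returns "3".
    @GET
    @Path("/add")
    public String addTwoIntegers(@QueryParam("first") int first,
                                 @QueryParam("second") int second) {
        return String.valueOf(first + second);
    }
}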
Read the following tutorials: Server-side development
Have you taken a look at the UsingJavaInAdapter adapter in the Adapters sample? It demonstrates exactly what you're trying to do.
Did you actually create such a com.mss Java class and place it in the server\java folder of your MFP project?
The question is just missing information.
Read the Java in JavaScript adapters tutorials.
Java class
package com.sample.customcode;

public class Calculator {
    // Add two integers.
    public static int addTwoIntegers(int first, int second) {
        return first + second;
    }

    // Subtract two integers.
    public int subtractTwoIntegers(int first, int second) {
        return first - second;
    }
}
Adapter implementation
function addTwoIntegers(a, b) {
    return {
        result : com.sample.customcode.Calculator.addTwoIntegers(a, b)
    };
}

function subtractTwoIntegers(a, b) {
    var calcInstance = new com.sample.customcode.Calculator();
    return {
        result : calcInstance.subtractTwoIntegers(a, b)
    };
}
I am exploring gRPC by downloading and following the PortfoliosSample from here.
The sample code all works fine. When I tried to create my own simple service and client by following the sample, however, I noticed that the generated code on the client side doesn't include the class and functions needed for accessing the service.
In the PortfoliosSample, the client-side code generated from portfolios.proto includes a class named PortfoliosClient (in PortfoliosGrpc.cs):
public partial class PortfoliosClient : grpc::ClientBase<PortfoliosClient>
Various functions (such as Get) are available in the class for the client-side program to use to invoke the service.
In my generated code, BrokerGrpc.cs, there is no "GroupClient" class or anything similar. As a result, my client-side code cannot use the generated code to access the service. What am I missing?
Here is the TSAPIBroker.proto file defined on the server
syntax = "proto3";

option csharp_namespace = "Test.API.TSAPIBroker.Protos";

package TSAPIBroker;

message Group {
    int32 id = 1;
    string name = 2;
}

message Groups {
    repeated Group group = 1;
}

message GetRequest {
    int32 groupId = 1;
}

message GetResponse {
    Group group = 1;
}

service GroupService {
    rpc Get(GetRequest) returns (GetResponse);
}
And here is the generated TSAPIBrokerGrpc.cs
// <auto-generated>
//     Generated by the protocol buffer compiler. DO NOT EDIT!
//     source: TSAPIBroker.proto
// </auto-generated>
#pragma warning disable 0414, 1591
#region Designer generated code

using grpc = global::Grpc.Core;

namespace Test.API.TSAPIBroker.Protos {
  public static partial class GroupService
  {
    static readonly string __ServiceName = "TSAPIBroker.GroupService";

    static readonly grpc::Marshaller<global::Test.API.TSAPIBroker.Protos.GetRequest> __Marshaller_TSAPIBroker_GetRequest = grpc::Marshallers.Create((arg) => global::Google.Protobuf.MessageExtensions.ToByteArray(arg), global::Test.API.TSAPIBroker.Protos.GetRequest.Parser.ParseFrom);
    static readonly grpc::Marshaller<global::Test.API.TSAPIBroker.Protos.GetResponse> __Marshaller_TSAPIBroker_GetResponse = grpc::Marshallers.Create((arg) => global::Google.Protobuf.MessageExtensions.ToByteArray(arg), global::Test.API.TSAPIBroker.Protos.GetResponse.Parser.ParseFrom);

    static readonly grpc::Method<global::Test.API.TSAPIBroker.Protos.GetRequest, global::Test.API.TSAPIBroker.Protos.GetResponse> __Method_Get = new grpc::Method<global::Test.API.TSAPIBroker.Protos.GetRequest, global::Test.API.TSAPIBroker.Protos.GetResponse>(
        grpc::MethodType.Unary,
        __ServiceName,
        "Get",
        __Marshaller_TSAPIBroker_GetRequest,
        __Marshaller_TSAPIBroker_GetResponse);

    /// <summary>Service descriptor</summary>
    public static global::Google.Protobuf.Reflection.ServiceDescriptor Descriptor
    {
      get { return global::Test.API.TSAPIBroker.Protos.TSAPIBrokerReflection.Descriptor.Services[0]; }
    }

    /// <summary>Base class for server-side implementations of GroupService</summary>
    [grpc::BindServiceMethod(typeof(GroupService), "BindService")]
    public abstract partial class GroupServiceBase
    {
      public virtual global::System.Threading.Tasks.Task<global::Test.API.TSAPIBroker.Protos.GetResponse> Get(global::Test.API.TSAPIBroker.Protos.GetRequest request, grpc::ServerCallContext context)
      {
        throw new grpc::RpcException(new grpc::Status(grpc::StatusCode.Unimplemented, ""));
      }
    }

    /// <summary>Creates service definition that can be registered with a server</summary>
    /// <param name="serviceImpl">An object implementing the server-side handling logic.</param>
    public static grpc::ServerServiceDefinition BindService(GroupServiceBase serviceImpl)
    {
      return grpc::ServerServiceDefinition.CreateBuilder()
          .AddMethod(__Method_Get, serviceImpl.Get).Build();
    }

    /// <summary>Register service method with a service binder with or without implementation. Useful when customizing the service binding logic.
    /// Note: this method is part of an experimental API that can change or be removed without any prior notice.</summary>
    /// <param name="serviceBinder">Service methods will be bound by calling <c>AddMethod</c> on this object.</param>
    /// <param name="serviceImpl">An object implementing the server-side handling logic.</param>
    public static void BindService(grpc::ServiceBinderBase serviceBinder, GroupServiceBase serviceImpl)
    {
      serviceBinder.AddMethod(__Method_Get, serviceImpl == null ? null : new grpc::UnaryServerMethod<global::Test.API.TSAPIBroker.Protos.GetRequest, global::Test.API.TSAPIBroker.Protos.GetResponse>(serviceImpl.Get));
    }
  }
}
#endregion
Using the container image mcr.microsoft.com/dotnet/sdk:5.0, I'm able to use your proto to generate both files:
TSAPIBroker.cs
TSAPIBrokerGrpc.cs
Repro:
dotnet new console
dotnet add package Grpc --version 2.33.1
dotnet add package Grpc.Tools --version 2.33.1
dotnet add package Google.Api.CommonProtos --version 2.2.0
Reference your proto from the project file and then build.
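For reference, the code generation is controlled from the project file; a minimal sketch of that reference (the GrpcServices value is a suggestion of what to check, not taken from the question):

<ItemGroup>
  <!-- GrpcServices controls which stubs Grpc.Tools emits: Both (the default),
       Client, Server, or None. If this is set to Server or None, no
       GroupServiceClient class will be generated. -->
  <Protobuf Include="TSAPIBroker.proto" GrpcServices="Both" />
</ItemGroup>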
My generated *Grpc.cs contains a GroupServiceClient class.
Note: the message Groups is defined but never used.
I have an application that acts as a SOAP server; how do I write a PHPUnit test for it?
The SOAP extension reads its data from the PHP input stream. You can provide your own data there and create integration/unit tests for your API.
Take a look at the signature of the SoapServer::handle() method. It takes as an argument a string which is the request itself. The parameter is optional; if you don't pass anything in, PHP will read the data from the input stream itself. But you can simply override it.
I used streams to do it. First you wrap the SoapServer with your own class like this:
class MyServer
{
    /** @var \SoapServer */
    private $soapServer;

    public function __construct(\SoapServer $soapServer)
    {
        $this->soapServer = $soapServer;
    }

    public function handle(\Psr\Http\Message\StreamInterface $inputStream): void
    {
        // getContents() returns the remaining stream contents as a string
        $this->soapServer->handle($inputStream->getContents());
    }
}
Now you are ready to mock the request.
In your test you can do:
class MyTest extends TestCase
{
    public function testMyRequest(): void
    {
        $mySoapServer = $this->createMySoapServer();
        $request = $this->createRequest();

        $mySoapServer->handle($request);
    }

    private function createRequest(): \Psr\Http\Message\StreamInterface
    {
        $requestString = '<soap:Envelope></soap:Envelope>';
        $fh = fopen('php://temp', 'rw');
        fwrite($fh, $requestString);
        rewind($fh);
        // PSR-7 defines only the interface; any implementation works,
        // guzzlehttp/psr7 is used here as an example
        return new \GuzzleHttp\Psr7\Stream($fh);
    }

    private function createMySoapServer(): MyServer
    {
        // In non-WSDL mode SoapServer requires the 'uri' option
        return new MyServer(new \SoapServer(null, ['uri' => 'http://localhost/']));
    }
}
One thing to keep in mind: this test will generate output. You may want to test that output or ignore it, depending on your use case.
Another side note: what you are asking for has really nothing to do with PHPUnit. It's just a matter of designing your SOAP server correctly.
If you are wondering how to set up the stream when you have a live request, this is really simple:
$server->handle(new \GuzzleHttp\Psr7\Stream(fopen('php://input', 'r+')));
Given this entry in application.properties:
server.port=0
which causes Spring Boot to choose a random available port, and testing a Spring Boot web application using Spock, how can the Spock code know which port to hit?
Normal injection like this:
@Value("${local.server.port}")
int port;
doesn't work with Spock.
You can find the port using this code:
int port = context.embeddedServletContainer.port
For those interested, the Java equivalent is:
int port = ((TomcatEmbeddedServletContainer)((AnnotationConfigEmbeddedWebApplicationContext)context).getEmbeddedServletContainer()).getPort();
Here's an abstract class that you can extend, which wraps up this initialization of the Spring Boot application and determines the port:
import java.util.concurrent.Callable
import java.util.concurrent.Executors
import java.util.concurrent.Future
import java.util.concurrent.TimeUnit

import org.springframework.boot.SpringApplication
import org.springframework.context.ConfigurableApplicationContext

import spock.lang.AutoCleanup
import spock.lang.Shared
import spock.lang.Specification

abstract class SpringBootSpecification extends Specification {
    @Shared
    @AutoCleanup
    ConfigurableApplicationContext context

    int port = context.embeddedServletContainer.port

    void launch(Class clazz) {
        Future future = Executors.newSingleThreadExecutor().submit(
            new Callable() {
                @Override
                public ConfigurableApplicationContext call() throws Exception {
                    return (ConfigurableApplicationContext) SpringApplication.run(clazz)
                }
            })
        context = future.get(20, TimeUnit.SECONDS)
    }
}
Which you can use like this:
class MySpecification extends SpringBootSpecification {
    void setupSpec() {
        launch(MyLauncher.class)
    }

    String getBody(someParam) {
        ResponseEntity entity = new RestTemplate().getForEntity("http://localhost:${port}/somePath/${someParam}", String.class)
        return entity.body
    }
}
The injection will work with Spock, as long as you've configured your spec class correctly and have spock-spring on the classpath. There's a limitation in Spock Spring which means it won't bootstrap your Boot application if you use @SpringApplicationConfiguration. You need to use @ContextConfiguration and configure it manually instead. See this answer for the details.
The second part of the problem is that you can't use a GString for the @Value. You could escape the $, but it's easier to use single quotes:
@Value('${local.server.port}')
private int port;
Putting this together, you get a spec that looks something like this:
@ContextConfiguration(loader = SpringApplicationContextLoader, classes = SampleSpockTestingApplication.class)
@WebAppConfiguration
@IntegrationTest("server.port=0")
class SampleSpockTestingApplicationSpec extends Specification {

    @Value("\${local.server.port}")
    private int port;

    def "The index page has the expected body"() {
        when: "the index page is accessed"
        def response = new TestRestTemplate().getForEntity(
                "http://localhost:$port", String.class);

        then: "the response is OK and the body is welcome"
        response.statusCode == HttpStatus.OK
        response.body == 'welcome'
    }
}
Also note the use of @IntegrationTest("server.port=0") to request that a random port be used. It's a nice alternative to configuring it in application.properties.
You could do this too (note that it's local.server.port that holds the port actually chosen, since server.port is configured as 0):
@Autowired
private org.springframework.core.env.Environment springEnv;
...
springEnv.getProperty("local.server.port");
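Putting that together, a minimal sketch of the Environment approach (the class name is hypothetical; local.server.port only has a value once the embedded container has started):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.env.Environment;

public class PortAwareClient {

    @Autowired
    private Environment springEnv;

    // Reads the port actually chosen at runtime, not the configured
    // server.port (which is 0).
    public int port() {
        return Integer.parseInt(springEnv.getProperty("local.server.port"));
    }
}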
I would like to pass a complete JSON object to a Java adapter in Worklight. This adapter will call multiple other remote resources to fulfill the request. I would like to pass the JSON structure instead of listing out all of the parameters, for a number of reasons. Invoking the Worklight procedure works well. I pass the following as the parameter:
{ "parm1": 1, "parm2" : "hello" }
Which the tool is fine with. When it calls my Java code, I see an object of type JSObjectConverter$1 being passed. In Java debug, I can see the values in the object, but I do not see any documentation on how to work with it. If memory serves me, the $1 says that it is an anonymous inner class being passed. Is there a better way to pass a JSON object/structure in adapters?
Let's assume you have this in your adapter code:
function test() {
    var jsonObject = { "param1": 1, "param2" : "hello" };
    var param2value = com.mycode.MyClass.parseJsonObject(jsonObject);
    return {
        result : param2value
    };
}
It doesn't really matter where you're getting jsonObject from; it may come as a param from the client. Worklight uses the Rhino JS engine, therefore the com.mycode.MyClass.parseJsonObject() function will get jsonObject as an org.mozilla.javascript.NativeObject. You can easily get the object's properties like this:
package com.mycode;

import org.mozilla.javascript.NativeObject;

public class MyClass {

    public static String parseJsonObject(NativeObject obj) {
        String param2 = (String) NativeObject.getProperty(obj, "param2");
        return param2;
    }
}
To better explain what I'm doing here: I wanted to be able to pass a JavaScript object into an adapter and have it return an updated JavaScript object. It looks like there are two ways. The first is what I answered above a few days ago, serializing and deserializing the JavaScript object. The other is using the ScriptableObject class. What I wanted in the end was to use the adapter framework as described to pass in the JavaScript object. In doing so, this is what the adapter JS-impl code looks like:
function add2(a) {
    return {
        result : com.ibm.us.roberso.Calculator.add2(a)
    };
}
The JavaScript code in the client application calls the above adapter. Note that I have a function to test passing the JavaScript object as a parameter to the adapter framework. See invocationData.parameters below:
function runAdapterCode2() {
    // x+y=z
    var jsonObject = { "x": 1, "y" : 2, "z" : "?" };

    var invocationData = {
        adapter : "CalculatorAdapter",
        procedure : 'add2',
        parameters : [jsonObject]
    };

    var options = {
        onSuccess : success2,
        onFailure : failure,
        invocationContext : { 'action' : 'add2 test' }
    };

    WL.Client.invokeProcedure(invocationData, options);
}
In runAdapterCode2(), the JavaScript object is passed as you would pass any parameter to the adapter. When Worklight tries to execute the Java method, it looks for a method signature taking either an Object or a ScriptableObject (not a NativeObject). I used the Java reflection API to verify the class and hierarchy being passed in. Using the static methods on ScriptableObject you can query and modify the values in the object. At the end of the method, you can have it return a ScriptableObject. Doing this will give you a JavaScript object back in the invocationResults.result field. Below is the Java code supporting this. Please note that a good chunk of the code is there as part of the investigation into what object type is really being passed; at the bottom of the method are the few lines actually needed to work with the JavaScript object.
@SuppressWarnings({ "unused", "rawtypes" })
public static ScriptableObject add2(ScriptableObject obj) {
    // Code to determine the object class being passed in and its hierarchy.
    String result = "";
    Class objClass = obj.getClass();
    result = "objClass = " + objClass.getName() + "\r\n";
    result += "implements=";
    Class[] interfaces = objClass.getInterfaces();
    for (Class classInterface : interfaces) {
        result += " " + classInterface.getName();
    }
    result += "\r\nsuperclasses=";
    Class superClass = objClass.getSuperclass();
    while (superClass != null) {
        result += " " + superClass.getName();
        superClass = superClass.getSuperclass();
    }

    // The actual code working with the JavaScript object.
    String a = (String) ScriptableObject.getProperty((ScriptableObject) obj, "z");
    ScriptableObject.putProperty((ScriptableObject) obj, "z", new Long(3));
    return obj;
}
Note that for a JavaScript object, a numeric value comes through as a Long, not an int. Strings are still Strings.
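A tiny sketch of that mapping (the class and property names are hypothetical, following the pattern of the code above):

import org.mozilla.javascript.ScriptableObject;

public final class TypeMappingExample {

    // Per the note above, numeric JS properties arrive as a java.lang.Long;
    // casting via Number is a defensive choice.
    public static int readY(ScriptableObject obj) {
        Number y = (Number) ScriptableObject.getProperty(obj, "y");
        return y.intValue();
    }
}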
Summary
There are two ways to pass in a JavaScript object that I've found so far:
1. Convert it to a string in JavaScript, pass the string to Java, and have Java reconstitute it into a JSONObject (see the sketch after this list).
2. Pass the JavaScript object and use the ScriptableObject classes to manipulate it on the Java side.
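A minimal sketch of option 1 (the class name and the org.json dependency are assumptions for illustration; the original post does not name a JSON library):

package com.mycode;

import org.json.JSONObject;

public class JsonStringParser {

    // Receives the string produced by JSON.stringify({...}) on the JS side,
    // updates it, and returns the serialized result for JSON.parse on the client.
    public static String add2(String json) {
        JSONObject obj = new JSONObject(json);
        obj.put("z", obj.getInt("x") + obj.getInt("y"));
        return obj.toString();
    }
}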
As explained in these questions, I'm trying to build an application that consists of a host and multiple task-processing clients. With some help I have figured out how to discover and serialize part definitions so that I can store those definitions without having to have the actual runtime type loaded.
The next step I want to achieve (or next two steps, really) is to split the composition of parts from the actual creation and connection of the objects (represented by those parts). So if I have a set of parts, I would like to be able to do the following (in pseudo-code):
public sealed class Host
{
    public CreationScript Compose()
    {
        CreationScript result;
        var container = new DelayLoadCompositionContainer(
            s => result = s);
        container.Compose();
        return result;
    }

    public static void Main()
    {
        var script = new Host().Compose();

        // Send the script to the client application
        SendToClient(script);
    }
}
// Lives inside other application
public sealed class Client
{
    public void Load(CreationScript script)
    {
        var container = new ScriptLoader(script);
        container.Load();
    }

    public static void Main(string scriptText)
    {
        var script = new CreationScript(scriptText);
        new Client().Load(script);
    }
}
So that way I can compose the parts in the host application, but actually load the code and execute it in the client application. The goal is to put all the smarts of deciding what to load in one location (the host) while the actual work can be done anywhere (by the clients).
Essentially what I'm looking for is some way of getting the ComposablePart graph that MEF implicitly creates.
Now my question is if there are any bits in MEF that would allow me to implement this kind of behaviour? I suspect that the provider model may help me with this but that is a rather large and complex part of MEF so any guidelines would be helpful.
From lots of investigation it seems that it is not possible to separate the composition process from the instantiation process in MEF, so I have had to create my own approach for this problem. The solution assumes that the scanning of plugins results in the type, import, and export data being stored somehow.
In order to compose parts you need to keep track of each part instance and how it is connected to other part instances. The simplest way to do this is to make use of a graph data structure that keeps track of which import is connected to which export.
public sealed class CompositionCollection
{
    // (Initializers added for completeness; a default constructor on the
    // graph type is assumed.)
    private readonly Dictionary<PartId, PartDefinition> m_Parts
        = new Dictionary<PartId, PartDefinition>();

    private readonly Graph<PartId, PartEdge> m_PartConnections
        = new Graph<PartId, PartEdge>();

    public PartId Add(PartDefinition definition)
    {
        var id = new PartId();
        m_Parts.Add(id, definition);
        m_PartConnections.AddVertex(id);
        return id;
    }

    public void Connect(
        PartId importingPart,
        MyImportDefinition import,
        PartId exportingPart,
        MyExportDefinition export)
    {
        // Assume that edges point from the export to the import
        m_PartConnections.AddEdge(
            new PartEdge(
                exportingPart,
                export,
                importingPart,
                import));
    }
}
Note that before connecting two parts it is necessary to check whether the import can be connected to the export. Normally MEF does that for us, but in this case we'll need to do it ourselves. An example of how to approach that is:
public bool Accepts(
    MyImportDefinition importDefinition,
    MyExportDefinition exportDefinition)
{
    if (!string.Equals(
        importDefinition.ContractName,
        exportDefinition.ContractName,
        StringComparison.OrdinalIgnoreCase))
    {
        return false;
    }

    // Determine what the actual type is we're importing. MEF provides us with
    // that information through the RequiredTypeIdentity property. We'll
    // get the type identity first (e.g. System.String)
    var importRequiredType = importDefinition.RequiredTypeIdentity;

    // Once we have the type identity we need to get the type information
    // (still in serialized format of course)
    var importRequiredTypeDef =
        m_Repository.TypeByIdentity(importRequiredType);

    // Now find the type we're exporting
    var exportType = ExportedType(exportDefinition);
    if (AvailableTypeMatchesRequiredType(importRequiredType, exportType))
    {
        return true;
    }

    // The import and export can't directly be mapped so maybe the import is a
    // special case. Try those
    Func<TypeIdentity, TypeDefinition> toDefinition =
        t => m_Repository.TypeByIdentity(t);

    if (ImportIsCollection(importRequiredTypeDef, toDefinition)
        && ExportMatchesCollectionImport(
            importRequiredType,
            exportType,
            toDefinition))
    {
        return true;
    }

    if (ImportIsLazy(importRequiredTypeDef, toDefinition)
        && ExportMatchesLazyImport(importRequiredType, exportType))
    {
        return true;
    }

    if (ImportIsFunc(importRequiredTypeDef, toDefinition)
        && ExportMatchesFuncImport(
            importRequiredType,
            exportType,
            exportDefinition))
    {
        return true;
    }

    if (ImportIsAction(importRequiredTypeDef, toDefinition)
        && ExportMatchesActionImport(importRequiredType, exportDefinition))
    {
        return true;
    }

    return false;
}
Note that the special cases (like IEnumerable<T>, Lazy<T>, etc.) require determining whether the importing type is based on a generic type, which can be a bit tricky.
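The code above is C#/MEF, but as an illustration of the generic-type inspection that note refers to, here is a rough Java-reflection analogue (all names hypothetical):

import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.Collection;

public final class GenericImportCheck {

    // True when the imported type is a parameterized collection (the analogue
    // of an IEnumerable<T> import in the MEF case).
    public static boolean importIsCollection(Type importType) {
        if (!(importType instanceof ParameterizedType)) {
            return false;
        }
        Type raw = ((ParameterizedType) importType).getRawType();
        return raw instanceof Class
                && Collection.class.isAssignableFrom((Class<?>) raw);
    }
}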
Once all the composition information is stored, it is possible to do the instantiation of the parts at any point in time, because all the required information is available. Instantiation requires a generous helping of reflection combined with the use of the trusty Activator class, and will be left as an exercise to the reader.