RAD web service generator and *_Deser.java, *_Ser.java - rad

When I generate a web service from my DAO in RAD, it creates Helper.java, Deser.java and Ser.java classes. When I check this code into version control (ClearCase), I notice that these files get hijacked whenever a call is made to my service.
Is there a way to avoid using these generated classes? Thanks.
The service methods I have return custom object arrays.

Generally those files should not be added to source control, as they are generated at runtime. The only files that should be added to source control are the input/output/proxy/SOAP files.

You asked if there's a way to avoid them. Yes: stop using JAX-RPC and switch to JAX-WS 2.0.


When I use WsdlImporter I get different data than with Add Web Reference

I have to connect to a service provided by a third party. The issue is doing that dynamically. When I generate the proxy statically with Add Web Reference, everything is OK. With WsdlImporter and CodeDom I get some strange generated classes (for instance, for the proxy client).
Then I saw that with Add Service Reference I got the same values as with WsdlImporter. My conclusion: WsdlImporter is used by svcutil.exe.
Does anyone know what is so different here?
The service is using SOAP 1.1.
They're just two different tools. svcutil.exe actually uses WsdlImporter under the covers (which is why the two outputs are the same). Add Web Reference uses the same classes as the tool wsdl.exe (I don't know which class they use internally, but you can use a tool such as ILSpy or Reflector to see what wsdl.exe uses).
"Add Web Reference" is part of the legacy ASMX support, not part of WCF. Don't use it if you have a choice.
The solution to this problem is to use ServiceDescriptionImporter, which works the same way as wsdl.exe.
Additionally, the XSD schemas have to be imported (also take care of nested schemas). A great sample for this is at the following link:
http://forums.asp.net/post/1740748.aspx
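For reference, a rough sketch of that approach (the URL and namespace are placeholders, and error handling plus downloading of externally referenced nested schemas are left out - the linked sample covers those):

// Sketch: generate client code from a WSDL at runtime with ServiceDescriptionImporter
// (the same classes wsdl.exe uses). The URL and namespace name are placeholders.
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using System.Net;
using System.Web.Services.Description;

public static class ProxyGenerator
{
    public static string GenerateSource(string wsdlUrl)
    {
        using (var stream = new WebClient().OpenRead(wsdlUrl))
        {
            ServiceDescription description = ServiceDescription.Read(stream);

            var importer = new ServiceDescriptionImporter
            {
                ProtocolName = "Soap",                       // the service uses SOAP 1.1
                Style = ServiceDescriptionImportStyle.Client,
                CodeGenerationOptions = System.Xml.Serialization.CodeGenerationOptions.GenerateProperties
            };
            importer.AddServiceDescription(description, null, null);

            // XSDs embedded in the WSDL; externally referenced (nested) schemas
            // would have to be fetched and added to importer.Schemas as well.
            foreach (System.Xml.Schema.XmlSchema schema in description.Types.Schemas)
            {
                importer.Schemas.Add(schema);
            }

            var codeNamespace = new CodeNamespace("GeneratedProxy");  // placeholder namespace
            var compileUnit = new CodeCompileUnit();
            compileUnit.Namespaces.Add(codeNamespace);

            ServiceDescriptionImportWarnings warnings = importer.Import(codeNamespace, compileUnit);
            // Check warnings here; a non-zero value usually means schemas are missing.

            using (var writer = new StringWriter())
            {
                CodeDomProvider.CreateProvider("CSharp")
                    .GenerateCodeFromCompileUnit(compileUnit, writer, new CodeGeneratorOptions());
                return writer.ToString();
            }
        }
    }
}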
Thank you all, it works now

Automatic serialization

I want to download XSD specifications from a web service and automatically convert (serialize) these schemas to classes (Visual Studio - VB.NET). If the organization responsible for the XSD schemas alters them in a way that only my class corresponding to the XSD has to change (not the rest of my code), I would like to automatically update that class. Is this possible? If so, can somebody tell me how to do it?
Thanks!
I use VS2010. What I want to do is call a web service where I can pass an input parameter that specifies the XSD I want to retrieve (the service is GetShemaDefenition and returns an object with the schema specification in a string property). I then have to read the XSD string from that property and convert it to a class representation of the XSD specification. Is it possible to do this automatically? I have done it manually by using xsd.exe. If the owner organization of the XSD has altered the specification, I have to test whether there is a new specification, and if there is, I have to build a new class representation of that XSD. Is it possible to do what I want? And how would I know if there has been a big change in the XSD which also affects other parts of my code, not just the class representation of the XSD?
Thanks a lot for your reply! So what you are saying, if I understand you correctly, is that there is no good solution for automating this, because if the XSD changes I will most likely (on some occasions) have to change my code manually? So I have to choose where to make that change, either in my application or in my intermediate service? But then what is the purpose of providing the XSD in a web service? What can I use the web service for? I am just wondering; maybe it is obvious, but I am new to web services and eager to learn more.
Update:
Thanks! But can you explain a little bit more? What I have to do is this: I use one web service where one of the properties is a string. The string is XML inside a CDATA block. The organization which provides the web service will not parse the XML inside the CDATA block but will instead forward it to another organization that will use the XML data. The organization which uses the XML data specifies the XSD schema that I have to follow to generate my XML correctly. This is the XSD schema I can get from another web service. I don't really understand what I can do with this XSD file from the web service. What can I do with it, and why would I want to download it from the web service when I can't use it automatically? Since I have to make the changes manually when the XSD changes, I can just as easily download the XSD schema from the organization's home page and make the new class with xsd.exe. I understand there is something I don't understand :o), can you please clarify?
Which Visual Studio version are you using? Normally you can right-click the project's references and add a web service reference. In this case Visual Studio automatically creates the objects required to consume the service, and you can update them at any time by right-clicking the reference.
However, if it is likely to change often, one solution is to implement an adapter class: create an interface that provides the same functionality and calls the actual web service. In your application you use only this adapter class and not the web service directly. Later, when the web service interface changes, all you have to do is change the internals of this intermediate class.
Update:
You can use this tool (xsd.exe) to create your object model in code. Then you can compile the new object model and use it in your application. There are many complications in what you want to do, and the bottom line is: when the object model changes, your code will fail. There is no way to anticipate how the interface will change, so while you can do all of that automatically, there is nothing you can do if, for example, the name of a function changes.
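To sketch what that tool does programmatically (so it could run automatically whenever a new schema string comes back from the service), something along these lines could work; all names here are placeholders, and only a single, self-contained schema is handled:

// Sketch: turn an XSD string into VB source at runtime, roughly what xsd.exe does.
// The namespace name is a placeholder; imported/nested schemas are not handled here.
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using System.Xml.Schema;
using System.Xml.Serialization;

public static class SchemaClassGenerator
{
    public static string GenerateSource(string xsdText, string language = "VisualBasic")
    {
        var schema = XmlSchema.Read(new StringReader(xsdText), null);
        var schemas = new XmlSchemas { schema };
        schemas.Compile(null, true);

        var importer = new XmlSchemaImporter(schemas);
        var codeNamespace = new CodeNamespace("GeneratedSchemaTypes"); // placeholder
        var exporter = new XmlCodeExporter(codeNamespace);

        // Generate a class for every global element declared in the schema.
        foreach (XmlSchemaElement element in schema.Elements.Values)
        {
            XmlTypeMapping mapping = importer.ImportTypeMapping(element.QualifiedName);
            exporter.ExportTypeMapping(mapping);
        }

        using (var writer = new StringWriter())
        {
            CodeDomProvider.CreateProvider(language)
                .GenerateCodeFromNamespace(codeNamespace, writer, new CodeGeneratorOptions());
            return writer.ToString();
        }
    }
}

The generated source would still have to be compiled and loaded before use, which is exactly where the "your code will fail if the shape changes" caveat above applies.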
However, the answer to your situation is indirection. If you can't guarantee the stability of an external service, why not create a stable intermediate service that calls the actual one? That way you don't need to touch your application in the future; all you have to do is modify the intermediate service while keeping its interface compatible.

questions about using MEF in a WCF service

I'm just starting to play with MEF and have a couple questions.
1) I wrote a WCF service that takes in some XML and passes it off to a parser. The parsers are composed using MEF (metadata in the XML lets me determine which parser to use). I can add a new parser, and support new XML, by just dropping the DLL in a directory. That part all works. But WCF services can be instantiated multiple times, and I want my parser catalog to be static; that is, if multiple instances of my service are spun up and they get the same XML, I only need one instance of the parser running (they are written to be thread safe). I can't seem to configure MEF to do this. Anyone know how?
2) I can drop a new parser into the directory and a catalog refresh will automatically discover it; that works great. But if I try to drop a modified DLL into the directory and that parser has been activated in the service, I get an error saying the file is in use. Is there a way around this?
1) It sounds like you should make your MEF container and catalogs static so they only get created once. Make sure you specify that the CompositionContainer should be thread safe by using the constructor with the isThreadSafe parameter and setting it to true.
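A minimal sketch of that, assuming a plug-in folder and the System.ComponentModel.Composition types (the path and parser contract name are hypothetical):

// Sketch: one static, thread-safe CompositionContainer shared by every service instance.
using System;
using System.ComponentModel.Composition.Hosting;

public static class ParserContainer
{
    // Lazy<T> guarantees the container is built only once per AppDomain,
    // no matter how many WCF service instances are created.
    private static readonly Lazy<CompositionContainer> instance =
        new Lazy<CompositionContainer>(() =>
        {
            var catalog = new DirectoryCatalog(@"C:\MyService\Parsers"); // hypothetical plug-in folder
            // isThreadSafe: true makes pulling exports safe from multiple threads.
            return new CompositionContainer(catalog, isThreadSafe: true);
        });

    public static CompositionContainer Instance
    {
        get { return instance.Value; }
    }
}

The service instances would then call something like ParserContainer.Instance.GetExportedValues<IXmlParser>() (IXmlParser being your hypothetical parser contract) instead of composing their own container.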
2) You can enable shadow copying which will prevent the file from being locked when the DLL is loaded. However, you can't unload DLLs from an AppDomain in .NET, and furthermore it is not safe to recompose a CompositionContainer that can be used on multiple threads. In other words, using the isThreadSafe parameter only makes the container thread-safe for "reading"/pulling exports from the container, not modifying it via composition/recomposition.
So if you want to add a new parser it's probably best to restart the service.

WCF service reference update

Right now we have around 5 service references added to our projects in a single solution.
I am forced to add a service reference even for projects that only have indirect dependencies on the service methods. Is there a way to get around this situation?
For every single change in a service method, I have to update every single service reference to pick up those changes. It is very time consuming too.
I am just wondering, is there any way I can manage these things globally by making a single service reference for the whole solution?
Help appreciated.....:)
You should be able to use the svcutil.exe command line utility to generate a single service file (a .cs file, for example) from multiple service URLs. The nice thing about this is that you can share client-side DTOs and message types across services if they have the same schema.
SvcUtil Reference: http://msdn.microsoft.com/en-us/library/aa347733.aspx
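For illustration only (the addresses and namespace are placeholders), a single invocation like the following emits one proxy file for two endpoints; the /reference: switch can additionally point at an assembly of shared DTO types so they are reused rather than regenerated:

svcutil http://server/Orders.svc?wsdl http://server/Billing.svc?wsdl /out:ServiceProxies.cs /namespace:*,MyCompany.ServiceProxies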
In regards to the requirement of adding the service reference to projects with indirect dependencies. You should probably not consume the service reference and related types directly from your service client. To improve maintainability and adaptability, you should wrap your service reference(s) in a facade. The facade would map between local types and service reference types, and give you much more agility in terms of responding to service changes. You would then only need to have the service references in a single location (preferably an independent project) along with the facade. The facade, which should change infrequently, will buffer you from the issues you are currently having with your service references.
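As a rough illustration of such a facade (every type name here is hypothetical, not taken from your actual references):

// Sketch: a facade that hides the generated service reference behind a local interface.
public class OrderSummary                     // local type used by the rest of the solution
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public interface IOrderFacade
{
    OrderSummary GetOrder(int id);
}

public class OrderFacade : IOrderFacade
{
    public OrderSummary GetOrder(int id)
    {
        // OrdersClient and its return type are stand-ins for whatever the service reference generates.
        using (var client = new OrdersServiceReference.OrdersClient())
        {
            var dto = client.GetOrder(id);
            // Map the generated DTO onto the local type so consuming projects
            // never take a dependency on the service reference itself.
            return new OrderSummary { Id = dto.Id, Total = dto.Total };
        }
    }
}

Only the project containing the facade needs the service reference; when the contract changes, the mapping inside the facade is the only code that has to be touched.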
You won't be able to get a single reference if you have multiple services, unfortunately. I stand corrected - see jrista's answer.
What you could do is create and update the service references automatically: instead of adding them manually in Visual Studio using Add Service Reference, check out the svcutil.exe command line tool, which will basically do the same thing.
Since it's a command line tool, you can have it run as e.g. part of your continuous build and update the necessary proxy client files every time you build the app.
Check out these additional links for tutorials and explanations about the details of using svcutil.exe:
http://msdn.microsoft.com/en-us/library/ms734712.aspx
http://asadsiddiqi.wordpress.com/2008/10/25/how-to-generate-wcf-client-proxy-class-using-svcutilexe/
http://www.xvpj.net/2008/03/08/wcf-step-by-step-tutorial/
Marc

WCF ChannelFactory vs generating proxy

Just wondering under what circumstances you would prefer to generate a proxy from a WCF service when you can just invoke calls using the ChannelFactory?
That way you won't have to generate a proxy or worry about regenerating it when the server is updated.
Thanks
There are 3 basic ways to create a WCF client:
Let Visual Studio generate your proxy. This auto generates code that connects to the service by reading the WSDL. If the service changes for any reason you have to regenerate it. The big advantage of this is that it is easy to set up - VS has a wizard and it's all automatic. The disadvantage is that you're relying on VS to do all the hard work for you, and so you lose control.
Use ChannelFactory with a known interface. This relies on you having local interfaces that describe the service (the service contract). The big advantage is that you can manage change much more easily - you still have to recompile and fix changes, but now you're not regenerating code, you're referencing the new interfaces. Commonly this is used when you control both server and client, as both can be much more easily mocked for unit testing. However, the interfaces can be written for any service, even REST ones - take a look at this Twitter API. (A minimal sketch of this option appears at the end of this answer.)
Write your own proxy - this is fairly easy to do, especially for REST services, using the HttpClient or WebClient. This gives you the most fine-grained control, but at the cost of lots of the service API being in strings. For instance: var content = new HttpClient().Get("http://yoursite.com/resource/id").Content; - if the details of the API change you won't encounter an error until runtime.
Personally I've never liked option 1 - relying on the auto-generated code is messy and loses too much control. Plus it often creates serialisation issues - I end up with two identical classes (one in the server code, one auto-generated) which can be tidied up but is a pain.
Option 2 should be perfect, but Channels are a little too limiting - for instance they completely lose the content of HTTP errors. That said having interfaces that describe the service is much easier to code with and maintain.
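A minimal sketch of option 2, assuming a shared contract assembly; the contract, binding and address below are placeholders:

// Sketch: calling a service through ChannelFactory<T> with a shared contract interface.
using System.ServiceModel;

[ServiceContract]
public interface IMyService
{
    [OperationContract]
    string Echo(string text);
}

public static class MyServiceClient
{
    public static string Echo(string text)
    {
        var factory = new ChannelFactory<IMyService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://localhost/MyService.svc")); // placeholder address
        IMyService channel = factory.CreateChannel();
        try
        {
            return channel.Echo(text);
        }
        finally
        {
            ((IClientChannel)channel).Close();   // channels also implement IClientChannel
            factory.Close();
        }
    }
}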
I use ChannelFactory along with MetadataResolver.Resolve method. Client configuration is a bother, so I get my ServiceEndpoint from the server.
When you use ChannelFactory(Of T), T is either the original contract, which you can get from a reference in your project, or a generated contract. In some projects I generated the code from a service reference because I could not add a reference to the contract DLL. You can even generate an async contract with the service reference and use that contract interface with ChannelFactory.
The main point of using ChannelFactory for me was to get rid of the WCF client config information. In the sample code below, you can see how to achieve a WCF client without config.
Dim fixedAddress = "net.tcp://server/service.svc/mex"
Dim availableBindings = MetadataResolver.Resolve(GetType(ContractAssembly.IContractName), New EndpointAddress(fixedAddress))
factoryService = New ChannelFactory(Of ContractAssembly.IContractName)(availableBindings(0))
accesService = factoryService.CreateChannel()
In my final project, the availableBindings are checked to use net.tcp or net.pipe if available. That way, I can use the best available binding for my needs. I only rely on the fact that a metadata endpoint exists on the server.
I hope this helps
BTW, this is done using .NET 3.5. However, it also works with 4.0.
Well in order to use ChannelFactory<T> you must be willing to share contract assemblies between the service and the client. If this is okay with you then ChannelFactory<T> can save you some time.
The proxy will build async functions for you, which is kind of nice.
My answer is a kind of summary of Keith's and Andrew Hare's answers.
If you do not control the server but have only the WSDL/URL, generate the proxy using Visual Studio or svcutil. (Note that Visual Studio sometimes fails where svcutil works better.)
When you control both server and client, share interfaces/contracts and use ChannelFactory.
It's not just a matter of time saved. Using the WSDL-generated proxy is dangerous because if you forget to update the service reference you can leave the solution in an inconsistent state: everything compiles, but the service contract is broken. I definitely suggest using a ChannelFactory whenever possible; it makes your life much easier.
A possible alternative could be to write a prebuild script that calls the SvcUtil utility to create the proxy every time you build your project, but ChannelFactory is much neater and more elegant anyway.