Custom Tool Warning: Cannot import wsdl:portType - wcf

I am not sure what this error means, so I thought I would ask you guys on Stack Overflow. I had to change the contract on my service, and on a test client that I use I updated the service reference. Now I am getting this warning. How can I resolve this issue?

I found the answers in What does this WCF error mean: "Custom tool warning: Cannot import wsdl:portType" helpful. In my case, unticking the 'Re-use types' box solved it.
Additional Thoughts: SOA, Distributed Objects, & Coupling
The "Service Oriented" vision implied by a WSDL and the WS-* standards is that the WSDL itself tells your client everything it needs to know to use the service. In this vision, unticking the 'Re-use types' box is the correct approach: you shouldn't be reusing types from anywhere except the WSDL.
Ticking the 're-use types' box is more of a "Distributed Objects" approach: your client and service become coupled through the types in a shared DLL. This is a strong distributed dependency. If the shared objects are updated, the service and all its clients must be updated, all in sync with each other. This is one reason why distributed objects fell very much out of favour and SOA took over.
Unless your company has chosen (possibly accidentally, by sharing libraries on a NuGet feed) a distributed-objects architecture, and understands the costs, I would always untick the re-use types option.
It reduces coupling.
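To make the coupling concrete, here is a minimal sketch (hypothetical assembly and type names) of the difference between the two approaches from the client's point of view:

// Sketch only - hypothetical names. With "Re-use types" ticked, both sides
// reference the same contracts assembly, so a change here forces the service
// and every client to be rebuilt and redeployed together.
namespace Shared.Contracts
{
    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }
}

// With "Re-use types" unticked, the client instead compiles against a proxy
// type generated purely from the WSDL by Add Service Reference / svcutil,
// so the only contract shared with the service is the WSDL itself.
namespace MyClient.ServiceReference
{
    public class Order
    {
        public int Id { get; set; }
        public decimal Total { get; set; }
    }
}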

This was the first answer I found when searching for a similar problem, but my issue was a [DataContract] attribute applied to an enum without any [EnumMember] attributes on its values, making an empty data contract.
I used this as a resource:
http://www.lukepuplett.com/2010/02/empty-datacontract-causes-misleading.html
It appears as though it is advisable to let WCF infer a data contract for enums.
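To illustrate the pattern (a minimal sketch with hypothetical enum names; the problematic shape is the first one):

using System.Runtime.Serialization;

// Problematic: [DataContract] on an enum whose values carry no [EnumMember]
// attributes exports an empty data contract, which can surface on the client
// as the misleading "Cannot import wsdl:portType" warning.
[DataContract]
public enum OrderStatus
{
    Pending,
    Shipped
}

// Alternative 1: leave the enum unattributed and let WCF infer the contract.
public enum InferredOrderStatus
{
    Pending,
    Shipped
}

// Alternative 2: attribute every value explicitly.
[DataContract]
public enum ExplicitOrderStatus
{
    [EnumMember] Pending,
    [EnumMember] Shipped
}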

I find that this can also be solved by using ?singleWsdl instead of just ?wsdl at the end of the service URL. The plain ?wsdl output links out to several other WSDL/XSD documents, so a tool too naive to follow those links can throw errors.

We had this problem today when trying to convert an old WSDL to a .cs file with dotnet-svcutil.
We solved it by replacing:
<xsd:complexContent>
  <xsd:restriction base="xsd:anyType" />
</xsd:complexContent>
with:
<xsd:complexContent>
  <xsd:sequence>
    <xsd:element name="deleteThisField" type="xsd:string" />
  </xsd:sequence>
</xsd:complexContent>
and then manually deleting the deleteThisField member in the generated code.

Related

Unable to derive module descriptor for legacy signed JAR

I'm trying to update a software system to JDK-11 using modules, and everything was going just fine right up until I slammed head-on into the aforementioned issue.
I have a legacy signed JAR that I need to incorporate for interaction with legacy systems. There's no way to update the JAR and no way to get a new version. The JAR must be signed in order to be usable (the whole "trusted code" deal and whatnot). The problem is that the JAR contains classes in the unnamed (root) package. Yeah. Stupid. Bad practice. Blablabla. It's still there, and I still need to use it.
I've not found any documentation or answers anywhere that would remotely suggest that what I need is possible. In fact, the opposite is true: everyone is adamant that in the "new"(ish) module system, no class may reside in the unnamed package.
Needless to say, I'm unable either to modify the contents of the JAR or to get at the sources to build a new one - and that's without even considering the issue of the signature...
That said: I refuse to believe the folks at Oracle would leave such a glaring oversight with regards to legacy code. As we all know, a lot of the time we have no choice but to use it for legitimate reasons, and we can't do anything to fix/update/refactor/etc... I would have hoped there was a mechanism added to the module system to support this, albeit for extreme cases only, etc...etc...
Disclaimer: I do fully understand why this isn't meant to be supported. What I'm having a hard time with is the lack of a workaround...
Thanks!
I've already tried:
creating a facade module that transitively adds the offending module (obviously no dice, same problem)
unpacking-and-repacking the module while temporarily disabling signature validation in a test env (fails because the class is apparently referenced within many other, properly-organized classes)
finding an updated module (no luck here, either)
beheading a chicken and roasting it over a pentagram while invoking the aid of ancient pagan gods (tasty, but didn't fix it)
curling up in a ball under my desk and weeping until execution succeeds (that's where I'm typing this from)...

Unmarshaling SOS DescribeSensor response via JSONIX yields incomplete object

I am attempting to use jsonix to unmarshal the XML response from an SOS DescribeSensor request. In the broader scope I am going to be using jsonix to unmarshal all responses from SOS, particularly 2.0. I noticed that the response uses the SML (SensorML) namespace, so I added the extra module dependencies and sub-dependencies (namely GML_3_1_1, SWE_1_0_1, IC_2_0, SMIL_2_0, SMIL_2_0_Language, and of course SensorML_1_0_1). Before I added these I noticed the return was generic JSON (see first screenshot, particularly near sml:physicalsystem). After I added the dependencies I got an error in my console during part of the unmarshaling process which I do not understand (see second screenshot). Here is a link to the XML response from the server for reference: https://drive.google.com/file/d/0B8LdnPVJpHz7M3VGb0FZc2lQcjQ/view?usp=sharing. I would really like to understand whether this has anything to do with the ordering of the modules when I create the context, though I believe it is fine. Once the solution to this is discovered I have two follow-up questions.
Is it reasonable to expect (in general) that using the modules built from the ogc-schemas on the highsource github page should allow me to handle all responses via jsonix? i.e. every element will always be mapped to a defined type. I know these schemas/mappings are very complicated.
Are there any other tools I can use to verify the modules or validate them against schemas to make life easier rather than tracking down elements on an individual basis or tracing through various module files when jsonix seems to parse incorrectly?
Thanks in advance - Richard3d
var context = new Jsonix.Context([
    XLink_1_0, GML_3_2_1, IC_2_0, SMIL_2_0, SMIL_2_0_Language,
    GML_3_1_1, SWE_1_0_1, SensorML_1_0_1, OWS_1_1_0, SWE_2_0,
    SWES_2_0, WSN_T_1, WS_Addr_1_0_Core, OM_2_0,
    ISO19139_GMD_20070417, ISO19139_GCO_20070417, ISO19139_GSS_20070417,
    ISO19139_GTS_20070417, ISO19139_GSR_20070417, Filter_2_0, SOS_2_0
]);
Disclaimer: I am the author of jsonix and main dev of ogc-schemas.
First of all, you're on the right track, stay on it.
Yes, if you have all the required mappings then you should get a "nice" JSON with all the properties with specific types, cardinalities etc.
The goal of Jsonix is to provide bi-directional XML<->JSON conversion with deterministic structure, types and cardinalities.
The goal of OGC Schemas is to provide JAXB and Jsonix mappings for all of the OGC Schemas.
So together these two should allow you to transform any OGC XML from/to JSON.
"Generic JSON" was actually just DOM. If a property allows DOM and Jsonix does not have a mapping for a certain element, it is just taken as DOM. You were just missing the SensorML mappings.
You're right that the structure of schema dependencies is very complex. But this is something we should take to OGC. :) It's a bit crazy that you need, like, a dozen schemas to read sensor data. I was actually intending to build automatic loading of dependencies but have not yet implemented this feature.
The next GML_3_1_1.AbstractFeatureType problem is probably this issue. Try changing the order of mappings (move GML_3_1_1 to an earlier position). Actually the order of mappings should not be significant, but, well, there's a bug.
Tools to cross-check - no, probably not. My approach is to do roundtrip tests (unmarshal-marshal-unmarshal-check equality). From experience, there are normally a couple of caveats at the start, but then it works by design. Of course there are bugs in Jsonix and there may be problems with mappings, but this gets sorted out.
Also feel free to create a support project here:
https://github.com/highsource/jsonix-support
For instance https://github.com/highsource/jsonix-support/s/sos.
Here's an example of such a support project:
https://github.com/highsource/jsonix-support/tree/master/l/lightstalker89
I need this because just downloading XML from Google Drive (a) takes me effort to set up the support project and (b) is legally risky, as I have no idea where this XML comes from and whether I have the rights/license to add these files to my test suites.

Type reference forwarding in the MonoDroid project requiring it

Regarding the solution described in this post, a third assembly is required to forward the type resolution to the correct assembly.
When adding this reference to the Android class library project that uses the type, the forwarding does not seem to happen. The reference needs to be added to the Android application project, which is the end point of the build process.
Does any solution exist to add the reference embedding the forwarding in the project that requires it?
I mean, if in my solution architecture I use:
MyApp.Core - PCL
MyApp.Core.Droid - Android class library
MyApp.UI.Droid - Android Application
The System.Net namespace (System.Net.Sockets.AddressFamily for example) is used in my ViewModel, which is located in MyApp.Core.Droid (a redirection of MyApp.Core with some plugins). In this case, it is more logical (and readable) to have the reference in MyApp.Core.Droid. But in fact, the assembly resolution is done (from what I understand) when packaging the application, so in MyApp.UI.Droid. So in this case, the reference needs to be added to MyApp.UI.Droid in order to be found (it fails if added to MyApp.Core.Droid).
In this case the solution works, but it is not obvious for a new programmer joining the team, who has not faced the trouble, to understand why this reference needs to be added to the UI project...
I'm not sure my thought is easy to understand by the way I introduce it. Let me know if you need more explanation.
Thanks,
Guillaume.
I'm not entirely sure why this 'fails if added to MyApp.Core.Droid' - it feels like this should be added. However, I know that Xamarin have tweaked and changed the dependency resolution scripts a few times.
With that said, I think the best answer to your question is 'don't worry about it too much' - this is only a small inconvenience right now and it will be resolved by Xamarin's updates 'soon'.
The current PCL support is something that I and a number of others have worked on in order to make things work. This set of 'hacks' is a workaround for the lack of 'proper PCL' support - it simulates what the Microsoft PCL build platform does on WindowsPhone, WPF, etc, but it isn't a perfect implementation.
Xamarin have now committed to 'proper PCL' support. When that happens then these type-forwarding dependencies will automatically be added. The good news is that this support is perhaps now only days, weeks or at most months away.
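For background, the type forwarding being discussed is the standard .NET mechanism where a facade assembly contains no code of its own, only attributes redirecting a type to the assembly that really implements it. A minimal sketch (the forwarded type here is just an illustrative example):

using System.Runtime.CompilerServices;

// Facade assembly: requests for this type are forwarded to the assembly that
// actually defines it, so code compiled against the facade still resolves.
[assembly: TypeForwardedTo(typeof(System.Net.Sockets.AddressFamily))]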

svcutil.exe generates errors while wsdl.exe runs through without

I'm looking into generating a web-service conforming to the WSDL found at:
http://assets.cdn.gamigo.com/xml/connection-service/1.0.10/account.wsdl
When I run with svcutil.exe like this:
svcutil.exe /language:C# /out:GamigoServices.cs http://assets.cdn.gamigo.com/xml/connection-service/1.0.10/account.wsdl
I get these errors:
Error: Cannot import wsdl:binding
Detail: The given key was not present in the dictionary.
XPath to Error Source: //wsdl:definitions[@targetNamespace='http://connection.games.gamigo.com/v_1_0']/wsdl:binding[@name='DefaultAccountServiceServiceSoapBinding']
Error: Cannot import wsdl:port
Detail: There was an error importing a wsdl:binding that the wsdl:port is dependent on.
XPath to wsdl:binding: //wsdl:definitions[@targetNamespace='http://connection.games.gamigo.com/v_1_0']/wsdl:binding[@name='DefaultAccountServiceServiceSoapBinding']
XPath to Error Source: //wsdl:definitions[@targetNamespace='http://connection.games.gamigo.com/v_1_0']/wsdl:service[@name='AccountService']/wsdl:port[@name='AccountServicePort']
I also tried a tool, Wscf:Blue, which gives me the same errors (it's a WCF VS plugin which, supposedly, would do a lot more for me once I get past this step).
On the other hand, if I use wsdl.exe (which I don't want because I want to use WCF, and, as far as I understand, I need to use svcutil.exe for WCF, but I just tried wsdl.exe in my attempts to narrow down the source of the problems) like this:
wsdl.exe http://assets.cdn.gamigo.com/xml/connection-service/1.0.10/account.wsdl /serverInterface
there are no errors.
I've been trying all kinds of things with local copies of the WSDL (and the types.xsd which it references), commenting out sections etc. to narrow down the problem. However, it really boils down to exactly what the error message is referring to, the definition of that binding. I've also googled, but the few references to this kind of error are not helpful at all.
Besides, I'm particularly puzzled by the fact that wsdl.exe seems perfectly fine with that WSDL. I also used
http://xmethods.net/ve2/WSDLAnalyzer.po# to validate the WSDL, no errors were shown.
So, now I'm at the point where I really have no idea how to proceed. As the whole issue is somewhat time-critical - by next week I should really start with implementation - I might end up using the code generated by wsdl.exe and going for the older technology obsoleted by MS, but for several (obvious) reasons I'd rather not go that route. So if anyone has any idea what to do to make svcutil.exe work with that, I'd be grateful.
I might add that while I cannot modify the definition, I might be able to convince the publisher of that WSDL to perform certain edits or at least publish a second version for my purposes.
Many thanks,
Max
Vienna,
Austria
Step 1: Stare at your WSDL file.
Step 2: Ensure that the wsdl:portType "aligns with" the wsdl:binding (i.e. all operations are defined in a corresponding way under portType and binding).
Step 3: Thank me for the best advice ever when dealing with svcutil errors such as "the given key was not present in the dictionary" :-)
Svcutil.exe is used for WCF services. If it's a web service, wsdl.exe will work fine. I think you are using svcutil.exe for a web service, so it is giving an error.

Castle-ActiveRecord Tutorial with .NET 3.5 broken?

Has anyone tried the ActiveRecord Intro Sample with C# 3.5?
I somehow have the feeling that the sample is completely wrong or just out of date. The XML configuration is just plain wrong:
<add key="connection.connection_string" value="xxx" />
should be :
<add key="hibernate.connection.connection_string" value="xxx" />
(if I understand the nhibernate config syntax right..)
I am wondering what I'm doing wrong. I get a "Could not perform ExecuteQuery for User" Exception when calling Count() on the User Model.
No idea what this can be. The tutorial source differs strongly from the source on the page (most notably in the XML configuration), and it's a VS2003 sample with different syntax on most things (no generics etc).
Any suggestions? ActiveRecord looks awesome..
(This was too long for a comment post)
[@Tigraine] From your comments on my previous answer it looks like the error lies not with the configuration, but with one of your entities. Removing the "hibernate" corrected the configuration so that it gave you the real error, which appears to be that the entity "Post" is not properly attributed for ActiveRecord to create its mapping.
If you look further down in the error that it gives, it likely has some details as to what about "Post" failed.
Some common things include:
The class does not have the [ActiveRecord] attribute.
There is no property with the [PrimaryKey] attribute.
There is no matching table called "Post" (or "Posts" if PluralizeTableNames is "true").
There is no matching column(s) for attributed properties.
Your attributed properties and public methods are not virtual (this one kills me all the time).
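For comparison, a minimal sketch of a correctly attributed entity (hypothetical table and column names; adjust to your schema):

using Castle.ActiveRecord;

// Each mapped property must be virtual and backed by a real column;
// the class needs [ActiveRecord] and exactly one [PrimaryKey] property.
[ActiveRecord("Posts")]
public class Post : ActiveRecordBase<Post>
{
    [PrimaryKey(PrimaryKeyType.Identity, "Id")]
    public virtual int Id { get; set; }

    [Property("Title")]
    public virtual string Title { get; set; }

    [Property("Body")]
    public virtual string Body { get; set; }
}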
The 'hibernate' portion of the key was removed in NHibernate version 2.0.
This version is correct for NHibernate 2.0 onwards:
<add key="connection.connection_string" value="xxx" />
Edit:
I see that the quickstart doesn't come with the binaries for Castle and NHibernate. You must have downloaded the binaries from somewhere; it would be helpful if you could provide the version number of your NHibernate.dll file.
Confusingly, at least SOME of the quickstart has been updated to be current with NHibernate (NH) 2.0, but the latest 'proper' Castle release is still the 1.0 RC3 (almost a year old now), which does not include NH 2.0.
You can go two ways. You can continue using Castle RC3 and in this case you will indeed need to add the 'hibernate' prefix to your configuration entries. Or you can download a build of Castle from the trunk, which should be running against NH 2.0. The problem with the latter approach is that some of the other breaking changes introduced in NH 2.0 might not be fixed in the quick start.
Delete the "hibernate." part for all configuration entries. Your first example is the correct one.