I'm trying to find a way to monitor the actual XML being sent and received by some web services I am painfully trying to implement on Domino 8.5.3.
SoapLog, as mentioned by Julian Robichaux, does not seem to work anymore.
The particular case I have here is that I am accessing, server to server (because it's an HTTPS call), an external web service with a Java Web Service Consumer.
Any suggestions?
I have a RavenDB IIS instance that is working just fine via the Silverlight interface. I am trying to connect to it as an embedded client by targeting the web folder, but I keep getting an error saying that it cannot find a Lucene DLL. Is this even possible?
No, that is not possible. In embedded mode, the EmbeddableDocumentStore actually contains the database instance. Only one can be spun up at a time. You cannot have multiple embedded clients using the same set of files.
If you have an instance running in IIS, then don't connect with embedded mode. Connect using the regular client and point at the URL of your server.
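For example, something along these lines (the URL is a placeholder for your own IIS site, and this assumes the older Raven.Client.Document namespace of the 1.x/2.x client):

    using Raven.Client.Document;   // regular client API namespace in RavenDB 1.x/2.x

    class Example
    {
        static void Main()
        {
            // Connect with the regular client instead of EmbeddableDocumentStore.
            // The URL is a placeholder for the IIS site hosting RavenDB.
            using (var store = new DocumentStore { Url = "http://your-server/raven" })
            {
                store.Initialize();

                using (var session = store.OpenSession())
                {
                    // query and store documents as usual
                }
            }
        }
    }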
I have a chat application developed using a WCF callback contract. It uses the netTcp binding for client-server communication.
The client is a Windows Forms application that runs on the client machine (XP or Windows 8).
The WCF service is hosted as a Windows service on the server machine. I maintain a client session list in the service, stored in a static variable, which holds the details of each client connected to the server.
The workflow is: whenever a client connects to the server using the connect operation, the client's details are added to the session list, and the server then uses this list to send messages back to the client whenever needed (a simplified sketch is shown below).
Everything works fine in a single-server environment. Now I want to know how to handle this in a load-balancing scenario: I have two server machines, and only one server is active at a time; if Server 1 fails, Server 2 becomes active. In this scenario, how can I share my client sessions between the two servers and keep things working as usual without disturbing my clients?
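For reference, the connect operation and the static session list currently look roughly like this (type and member names are simplified):

    using System.Collections.Generic;
    using System.ServiceModel;

    [ServiceContract(CallbackContract = typeof(IChatCallback))]
    public interface IChatService
    {
        [OperationContract]
        void Connect(string userName);
    }

    public interface IChatCallback
    {
        [OperationContract(IsOneWay = true)]
        void ReceiveMessage(string from, string message);
    }

    public class ChatService : IChatService
    {
        // static session list shared by all service instances on this server
        private static readonly Dictionary<string, IChatCallback> Clients =
            new Dictionary<string, IChatCallback>();

        public void Connect(string userName)
        {
            var callback = OperationContext.Current.GetCallbackChannel<IChatCallback>();
            lock (Clients)
            {
                Clients[userName] = callback;   // used later to push messages to this client
            }
        }
    }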
One option is to use a Session State store provider, which will provide the session state to both instances of the server service.
As MSDN states (http://msdn.microsoft.com/en-us/library/z414bbk9(v=vs.100).aspx): "for Web farm configurations, it can be stored out of process using either the ASP.NET State service or a Microsoft SQL Server database."
The ASP.NET State Service is quite well documented: http://msdn.microsoft.com/en-us/library/ms178581(v=vs.100).aspx
As for the database solution... well... you have to analyse the added overhead due to database access.
Also, if you are hosting the service using IIS, you could consider using Out-of-Process session state (http://technet.microsoft.com/en-us/library/cc754032%28v=ws.10%29.aspx).
These are just some ideas. You can look into other web farm synchronization techniques made available for Microsoft technologies.
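As a rough sketch of the shared-store idea (the connection string and table/column names below are made up): only the client metadata can be written to a shared SQL Server database; the callback channels themselves cannot be serialized, so after a failover each client still has to reconnect to the newly active server.

    using System.Data.SqlClient;

    public static class SharedSessionStore
    {
        // Placeholder connection string; point this at a SQL Server both servers can reach.
        private const string ConnectionString =
            "Server=shared-sql;Database=ChatSessions;Integrated Security=true";

        // Record who is connected so the standby server can see the same list.
        public static void SaveClient(string userName, string clientMachine)
        {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(
                "INSERT INTO ClientSessions (UserName, ClientMachine, ConnectedAt) " +
                "VALUES (@user, @machine, GETUTCDATE())", connection))
            {
                command.Parameters.AddWithValue("@user", userName);
                command.Parameters.AddWithValue("@machine", clientMachine);
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }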
On any other operating system, if I browse to my WCF service in the browser, I can see information about that service. In Windows Server 2012, I get no information about the service, and it even acts as if there's nothing there at that address. I can still access the service from a client, but I had to add a server feature just to make it work (.NET Framework 4.5 Features -> WCF Services -> HTTP Activation).
Browsing to the service is normally a quick way to make sure that the service is running, but it's slightly annoying that I no longer have this luxury. Does anyone know how I can get Windows Server 2012 to show information about my service when I browse to it?
Make sure your service's configuration or code modifies the default behavior so that metadata is published:
How to: Publish Metadata for a Service Using a Configuration File
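For an IIS-hosted service this is normally done in web.config, as described in the linked article; if you self-host, the code equivalent is roughly the following (MyService and the base address are placeholders for your own service):

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Description;

    class Program
    {
        static void Main()
        {
            // MyService and the base address are placeholders for your own service.
            using (var host = new ServiceHost(typeof(MyService),
                                              new Uri("http://localhost:8000/MyService")))
            {
                var metadata = host.Description.Behaviors.Find<ServiceMetadataBehavior>();
                if (metadata == null)
                {
                    metadata = new ServiceMetadataBehavior();
                    host.Description.Behaviors.Add(metadata);
                }
                // HttpGetEnabled is what makes the help page and WSDL visible in a browser.
                metadata.HttpGetEnabled = true;

                host.Open();
                Console.WriteLine("Service running. Press Enter to stop.");
                Console.ReadLine();
            }
        }
    }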
I want to make an app that displays new data whenever it arrives in a folder as XML. I want to use HTML5 WebSockets, but I am confused about how it should be done. I am using XAMPP on my machine for development. Do I have to install another server to use WebSockets? Is Apache compatible as it is, and if so, how do I make the connection with the client? Thank you in advance.
Your options are:
Use something like mod_websocket, as pointed out by Phillip Kovalev, or pywebsocket. You could also try PHP WebSocket.
Use a dedicated self-hosted realtime web technology for realtime communication between server and client. If you do this, you'll also need to define a way for your application to communicate with the realtime web server, normally achieved through message queues.
Use a hosted realtime web solution and offload the realtime push aspect of your application.
There are concerns about using Apache with this type of technology, since it maintains long-running persistent connections between the server and client, and Apache isn't known to be too great at this. So, the best solution may be to:
Go with a 2nd dedicated realtime web server in conjunction with using Apache as your application server
Use a self-hosted realtime web server that has the ability to handle many concurrent connections
Use a hosted service along with your Apache application server.
If you don't expect many concurrent connections or if you are just trying out the technology then it's possible that Apache alone will be all you need.
Look at mod_websocket. It supports the latest protocol version, which is the one commonly implemented by browser vendors.
Summary:
Does anybody know if there are known issues or configuration gotchas with an IIS service connecting to an Azure-based service?
Scenario:
I currently have a scenario that requires me to host two web services, one in Azure and one on a server running IIS. The IIS-hosted service (a WCF service) connects to the Azure-hosted service (actually the Azure Storage API) in order to fetch certain information. This information is manipulated and returned to the client.
Client -> IIS Service -> Azure Storage Service
Issue:
I'm running into issues with the IIS service connecting to the Azure service: the hostname cannot be resolved. I'm using the Azure Storage client from my code, but I have also tried the raw Azure REST API calls, and they do not work from IIS either. I captured the requests using Fiddler (on a different machine); they match the Azure REST API calls, as expected. These requests execute properly when made outside of IIS on the host machine. It is only when they are issued by the IIS service that they fail.
In my research, other people have run into this issue when there's a firewall problem, but since I can hit the service properly from the machine itself, that doesn't seem to fit the bill. My hunch is that there's a configuration issue I need to sort out in IIS, but I've failed to find anything useful with my searches.
Does anyone have any information on why this might be occurring (known bugs, gotchas, etc.)? Any workarounds? From an SOA perspective, this seems fairly critical to understand.
Any assistance anyone can offer would be helpful. Thank you.
Sounds like a proxy configuration issue. Check how your IIS server connects to the Internet. If you are using some sort of proxy to get to the Internet, that connection has to be configured correctly.
Specifically, if your proxy servers are Microsoft ISA server, or Microsoft Forefront TMG, then you need to check two things:
The ISA Server client or Forefront TMG client software is installed on the server.
The account used by the IIS application pool is a domain user. ISA Server/TMG are designed to work only with user accounts, not service accounts. An alternative workaround for this limitation is to use the "defaultProxy" configuration in web.config; however, it only works for HTTP/HTTPS (see the sketch below).
If you use a different proxy server, then other issues might be involved; for example, the proxy might require authentication.
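If it does turn out to be a proxy issue, a rough code-based equivalent of that defaultProxy setting looks like this (the proxy address is a placeholder for your own):

    using System.Net;

    public static class ProxySetup
    {
        // Call this once at application startup (for example from Application_Start).
        public static void Configure()
        {
            // Route outgoing HTTP/HTTPS requests from the service through an explicit proxy.
            WebRequest.DefaultWebProxy = new WebProxy("http://your-proxy:8080")
            {
                // Authenticate against the proxy as the application pool account.
                Credentials = CredentialCache.DefaultNetworkCredentials
            };
        }
    }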