Web server being used as file storage - how to improve this? - backup

I am making a DR plan for a web application hosted on a production web server. That web server also acts as file storage for the feed upload files (used by the web application as input) and the report files (the output of the web application's processing). If the web server goes down, the file data is lost with it, so I need to design a solution and give recommendations that eliminate this single point of failure.
I have thought of some recommendations, as follows:
1) Use a separate file server; however, this requires new resources.
2) Attach a data volume mounted on the web server that is mapped to a network filer (network storage), which can be used to store the feeds and reports. If the web server goes down, the network filer can be mounted and attached to the contingency web server.
3) There is one more web server behind the load balancer, but it is not currently used for file storage. If we implement a feature that regularly backs the file data up to that second load-balanced web server, we can switch to it if the first web server goes down. The backup could be done through a backup script, a separate Windows service, or a job scheduled to run every night.
Please help me review the above, or suggest new recommendations that eliminate this single point of failure on the web server. It would be highly appreciated.
Regards
Kapil

I've successfully used Amazon's S3 to store the "output" data of web and non-web applications. Using a service like that is beneficial from the single-point-of-failure perspective, because any other instance of the web application, or a different type of client, on the same server or in a completely different datacenter, still has access to the same output files. Another similar option is Rackspace's Cloud Files.
Both of these services are highly redundant. You could use them as the backup and keep the primary storage on your server, or use them as the primary and keep a backup on your other web server. There are lots of options! Hope this info helps.
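To make this concrete, here is a minimal sketch of uploading a report file to S3 with the AWS SDK for .NET. The bucket name, key, and local path are placeholders, not anything from your setup:

    using System;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    class ReportUploader
    {
        static void Main()
        {
            // Credentials are assumed to come from the standard SDK chain
            // (environment variables, credentials file, or instance profile).
            using (var client = new AmazonS3Client(RegionEndpoint.USEast1))
            {
                var request = new PutObjectRequest
                {
                    BucketName = "my-app-reports",          // placeholder bucket
                    Key = "reports/2017-01-15/daily.csv",   // placeholder key
                    FilePath = @"C:\app\reports\daily.csv"  // placeholder local path
                };
                client.PutObject(request);
                Console.WriteLine("Report uploaded.");
            }
        }
    }

Because the object now lives in S3, the contingency web server can read the same reports without any restore step.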

Related

Verify load balancing in Azure Container Service

I am using the Azure Container Service with the Kubernetes orchestrator and have an app deployed on a cluster with 3 nodes. It has 5 replicas. How can I verify load balancing in action? E.g. I want to be able to see that every time I hit the external IP, I am routed to (perhaps) a different node. Thanks.
The simplest solution is to connect (over SSH, for example) to the 3 nodes and run WinDump there. If everything is working properly, you will be able to see what happens on every node.
Also here is Microsoft documentation for testing a load balancer:
https://learn.microsoft.com/en-us/azure/virtual-machines/windows/tutorial-load-balancer#test-load-balancer
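Another option is to probe the load balancer from the outside. Here is a small sketch that hits the external IP in a loop and prints who answered; it assumes your app exposes an endpoint (a hypothetical /whoami route here) that echoes the serving pod or instance:

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class LoadBalancerProbe
    {
        static async Task Main()
        {
            // Placeholder external IP and hypothetical echo endpoint.
            var url = "http://203.0.113.10/whoami";
            using (var client = new HttpClient())
            {
                for (int i = 0; i < 20; i++)
                {
                    // Each response body is expected to name the serving replica.
                    string body = await client.GetStringAsync(url);
                    Console.WriteLine($"Request {i + 1}: served by {body.Trim()}");
                    await Task.Delay(500); // small pause between requests
                }
            }
        }
    }

If the service is balancing correctly, you should see several different replica names across the 20 requests.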
The default load balancer available to your Windows Azure Web and Worker roles is a software load balancer and not very configurable; however, it does work in a round-robin fashion. If you want to test this behavior, this is what you need to do:
1) Create two (or more) instances of your service with RDP access enabled, so you can RDP into both instances.
2) RDP into both instances and run NETMON or any other network monitoring solution in them.
3) Now access your Windows Azure web application from your desktop. You need to understand that when a network connection is made from your desktop, the connection stays alive based on network settings (60 seconds by default), so you need to wait until the default timeout has passed before accessing your Windows Azure web application again.
4) When you access your Windows Azure web application again, you can verify that the second request went to the next instance. Be sure to wait out the connection timeout; otherwise your requests will keep being handled by the same instance.
Note: If you don't want to use RDP, you can also create a test ASP.NET page that writes out something specific to the instance serving it, showing you that the page came from a certain instance. The best way to do this is to read the instance ID, as below:
string instanceId = RoleEnvironment.CurrentRoleInstance.Id; // Id is a string in the Azure SDK
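Expanded into a minimal MVC action, that could look like the sketch below (controller and route names are illustrative; it assumes the Microsoft.WindowsAzure.ServiceRuntime assembly is referenced):

    using System.Web.Mvc;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class DiagnosticsController : Controller
    {
        // GET /Diagnostics/Instance
        // Returns the ID of the role instance that served this request, so
        // repeated requests reveal how the load balancer rotates instances.
        public ActionResult Instance()
        {
            string instanceId = RoleEnvironment.CurrentRoleInstance.Id;
            return Content("Served by instance: " + instanceId, "text/plain");
        }
    }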
If you want more control over Windows Azure load balancing, I would suggest using Windows Azure Traffic Manager, which lets you route traffic to your site via round-robin, performance, or failover-based policies. More info on using Traffic Manager is in this article.

Azure Virtual Machines not holding logged-in user session

I am developing an MVC4 application. We have hosted our application on the Windows Azure IaaS model. Right now we have configured 2 virtual machines and everything is working well, but we have an issue with maintaining the user login.
If I log in on virtual machine 1, the session is not carried over when the next request is served by virtual machine 2. The two virtual machines sit behind a load balancer.
Should I look into caching solutions? Any input will be greatly helpful.
Thanks,
Jaswanth
You're hitting two completely separate VMs (yes, load balanced, but separate). This means any session data must be stored external to the VMs (or you need to sync the session content and keep it identical on both VMs).
Azure doesn't do anything to sync session data for you; that's on you to build into your app's architecture. You mentioned caching, which is certainly a viable solution (which one you pick, though, is up to you). There are other solutions too, such as database-based session storage. Again, that's up to you.
But bottom line: if you're going to scale an app beyond a single server (a VM in this case) in a load-balanced way, you cannot store session data on a specific VM.
Use a durable session-state store (like Redis or SQL Server, etc.), or put your state in a cookie and read/write it on each request. If the cookie includes sensitive content, encrypt it.
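As an illustration of the cookie approach, here is a minimal sketch that protects the cookie payload with ASP.NET's MachineKey (the cookie name and purpose string are placeholders; both VMs must share the same machineKey in web.config so either one can decrypt):

    using System.Text;
    using System.Web;
    using System.Web.Security;

    public static class SessionCookie
    {
        private const string CookieName = "app-state";  // placeholder name
        private const string Purpose = "SessionState";  // MachineKey purpose string

        // Encrypts the state and writes it as an HTTP-only cookie.
        public static void Write(HttpResponseBase response, string state)
        {
            byte[] protectedBytes = MachineKey.Protect(
                Encoding.UTF8.GetBytes(state), Purpose);
            string value = HttpServerUtility.UrlTokenEncode(protectedBytes);
            response.Cookies.Add(new HttpCookie(CookieName, value) { HttpOnly = true });
        }

        // Reads the cookie back and decrypts it; returns null if absent.
        public static string Read(HttpRequestBase request)
        {
            HttpCookie cookie = request.Cookies[CookieName];
            if (cookie == null) return null;
            byte[] protectedBytes = HttpServerUtility.UrlTokenDecode(cookie.Value);
            byte[] plainBytes = MachineKey.Unprotect(protectedBytes, Purpose);
            return Encoding.UTF8.GetString(plainBytes);
        }
    }

Because the key is shared, the session survives no matter which VM the load balancer picks.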

Upload text logs to MVC 4 web server every second

I have a web server implemented in .NET MVC4. Clients connected to this web server perform some operations and upload live logs to it using the WebClient.UploadString method. The logs are sent from client to server in chunks of 2500 characters at a time.
Things work fine while 2-3 clients upload logs. However, when more than 3 clients try to upload logs simultaneously, they start receiving "HTTP 500 Internal Server Error".
I might have to scale up and add more slaves, but that will make the situation worse.
I want to implement Jenkins-like live logging, where logs from the slaves are updated live.
Please suggest a better, more scalable solution to this problem.
Have you considered looking into SignalR?
It can be used for anything from instant messaging to stocks! I have implemented both a chatbox and a custom system that sends off messages, does calculations, and then passes the results back down to the client. It is very reliable, there are some nice tutorials, and I think it's awesome.
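For live log streaming specifically, the server side can be as small as this sketch (hub and method names are illustrative; assumes the Microsoft.AspNet.SignalR package):

    using Microsoft.AspNet.SignalR;

    // Hub that receives a log line from a slave and broadcasts it
    // to every connected dashboard client.
    public class LogHub : Hub
    {
        public void PublishLine(string source, string line)
        {
            // "appendLine" is a client-side callback the dashboard registers.
            Clients.All.appendLine(source, line);
        }
    }

Slaves push lines into the hub, and every open browser receives them immediately over a single persistent connection, instead of thousands of HTTP POSTs hitting MVC.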

Handling connections from 3000 mobile clients

I have an iPad client application that is installed on around 3000-4000 iPads. They are used in remote areas and talk to a web service to submit the data they collect. The data submission calls from the iPads may happen at the same time. I have one single server where all the data is stored in SQL Server. The web services are written in .NET and hosted in IIS 7.
Currently the iPad application does not work as expected, because the web services are not able to handle that many requests simultaneously.
What is the best possible way to handle this scenario? Is the delay/scalability issue caused by DB access? Can in-memory caching on the web-service side solve the issue?
I am not in a position to invest in a separate server, so I would like to know the best solution for handling as many requests together as possible. The DB insertion can be done asynchronously; the most important task is to bring the data collected on the iPads to the server.
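Since the question already notes that DB insertion can be asynchronous, one common pattern is to accept each upload quickly, queue it in memory, and batch-insert from a single background writer. A minimal sketch (the queue capacity and batch size are arbitrary placeholder values, and a durable queue such as MSMQ would be needed if data must survive a restart):

    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Threading.Tasks;

    public class UploadBuffer
    {
        // Bounded queue so a flood of uploads applies back-pressure
        // instead of exhausting memory.
        private readonly BlockingCollection<string> _queue =
            new BlockingCollection<string>(boundedCapacity: 10000);

        public UploadBuffer()
        {
            // One long-running background writer drains the queue.
            Task.Factory.StartNew(DrainLoop, TaskCreationOptions.LongRunning);
        }

        // Called from the web service: cheap, no DB round-trip on the request path.
        public void Enqueue(string payload) => _queue.Add(payload);

        private void DrainLoop()
        {
            var batch = new List<string>(capacity: 100);
            foreach (string item in _queue.GetConsumingEnumerable())
            {
                batch.Add(item);
                // Flush when the batch is full or the queue is momentarily empty.
                if (batch.Count >= 100 || _queue.Count == 0)
                {
                    InsertBatch(batch); // e.g. one SqlBulkCopy call, not 100 INSERTs
                    batch.Clear();
                }
            }
        }

        private void InsertBatch(List<string> batch)
        {
            // Placeholder for a SqlBulkCopy or table-valued-parameter insert.
        }
    }

This keeps the request threads free, so IIS can keep accepting connections while the database work happens off the hot path.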

Which is the better solution in this WCF scenario?

I have a WCF service that monitors a particular drive and creates a new folder weekly, which I am using as document storage.
I have many drives configured for document storage, and I have to monitor which drive is active (only one drive can be active at a time). On a weekly basis I have to add a new folder to the active drive at a predefined path provided at configuration time.
The client can make any drive inactive, or a drive can become inactive if it is full, and I then need to make another drive active dynamically, using a service, based on priority. For example, I have the following drives:
drive A - priority 1 - active: yes
drive B - priority 2 - active: no
If A becomes full, I have to make drive B active.
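A sketch of that failover check using System.IO.DriveInfo (the free-space threshold is a placeholder; the drive list and priorities would come from the database, as described below):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;

    public class DriveConfig
    {
        public string Letter { get; set; }  // e.g. @"A:\"
        public int Priority { get; set; }   // 1 = highest
        public bool Active { get; set; }
    }

    public static class DriveFailover
    {
        // Placeholder threshold: treat a drive as "full" below 1 GB free.
        private const long MinFreeBytes = 1L * 1024 * 1024 * 1024;

        // Returns the drive that should be active: the highest-priority
        // drive that is ready and still has room.
        public static DriveConfig SelectActive(IEnumerable<DriveConfig> drives)
        {
            foreach (var drive in drives.OrderBy(d => d.Priority))
            {
                var info = new DriveInfo(drive.Letter);
                if (info.IsReady && info.AvailableFreeSpace >= MinFreeBytes)
                    return drive;
            }
            throw new InvalidOperationException("No drive with free space available.");
        }
    }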
Now, should I implement the WCF service in IIS or as a Windows service? The program has to perform many actions, like checking the drive size, making another drive active, and sending updates to the database.
Which is the better way, IIS or a Windows service?
I need a service that gets the information about drive paths from the database. I also have a configuration Windows application that needs to communicate with this service to check the drive path and its size: if the path is invalid, the application will not configure it, and if it is valid, it will keep the entry in the database. Any client can have multiple directories, but only one directory will be active, so that I can store documents in it.
What about performance? And can I configure WCF under IIS so that IIS does not recycle the application pool? I want my service to run periodically, say every 30 minutes. – Nitin Bourai
It seems to me a better architecture would be to have a service responsible for persisting your documents; it can then decide where (and how) to store each document, and where to read it from, based on who is requesting it, how much disk space is available, and so on. This way all your persistence implementation details are hidden from consumers: they only need to care about documents, not how they are persisted.
As for how to host it, there is lots of useful information out there documenting both:
IIS: here
Windows Service: here
Both would be more than capable of hosting such a service.
I would go with a Windows service in this case. Unless I misunderstand, you want this all to happen with no human intervention, correct? I don't see a contract here, which means it's not a good candidate for WCF.
As I see it, both a Windows service and an IIS-hosted service would work well in your scenario. Having said that, I would go with the Windows service. It is partly a matter of preference, but you get a little more configuration support out of the box: I believe it is easier to configure what should happen if the service fails to start, which user the service runs as, and so on.
But as I said, it is largely a matter of preference.
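For reference, hosting the WCF service inside a Windows service only takes a few lines; this is a minimal sketch with placeholder class names (endpoints and bindings are assumed to live in app.config):

    using System.ServiceModel;
    using System.ServiceProcess;

    // Windows service that owns the WCF ServiceHost lifetime.
    public class DriveMonitorWindowsService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            // DriveMonitorService is the placeholder WCF implementation class.
            _host = new ServiceHost(typeof(DriveMonitorService));
            _host.Open();
        }

        protected override void OnStop()
        {
            if (_host != null) _host.Close();
        }

        public static void Main()
        {
            ServiceBase.Run(new DriveMonitorWindowsService());
        }
    }

    // Placeholder contract and implementation referenced above.
    [ServiceContract]
    public interface IDriveMonitor
    {
        [OperationContract]
        string GetActiveDrive();
    }

    public class DriveMonitorService : IDriveMonitor
    {
        public string GetActiveDrive() { return @"A:\"; }  // stub
    }

Unlike IIS hosting, the Windows service is not subject to application-pool recycling, so the periodic 30-minute drive check can run on a plain timer started in OnStart.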