Accessing the C:\ drive on an Azure VPS with ASP.NET - asp.net-mvc-4

Basically I have an ASP.NET site hosted on Windows Azure, and I also have a virtual machine hosted there. Is it possible for me to read/write files on the virtual machine's disk from my website?
I'm using ASP.NET MVC 4, if that's any help at all. Sorry for the possibly vague question; if you need more info I'll happily try to provide it.
Thanks in advance!

Sure. Since you want to access the VM's drive, and you have complete control over the VM, it's easy.
Server
Expose your VM's drive using WebDAV. It's an HTTP API for file sharing.
Here's a WebDAV setup guide for Windows: http://mythoughtsonit.com/2013/05/deploy-a-file-server-in-the-cloud-webdav-on-windows-azure/
Client
If you are using Windows Azure Websites (very restrictive), your only option on your ASP.NET site is to add C#/VB code that reads/writes from the WebDAV share (see the sketch after this list). Here are some .NET WebDAV clients:
https://github.com/kvdb/WebDAVClient
http://webdavnet.codeplex.com/
http://www.independentsoft.de/webdav/
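For illustration, here is a minimal sketch that skips the libraries above and talks to the share with plain WebClient, since WebDAV is just HTTP verbs. The share URL and credentials are hypothetical:

    using System.Net;

    // Read/write against a WebDAV share using plain HTTP verbs (PUT/GET).
    // The URL and credentials below are placeholders for your VM's share.
    var webdav = new WebClient
    {
        Credentials = new NetworkCredential("webdavUser", "webdavPassword")
    };

    // Upload a local file to the share (WebDAV PUT).
    webdav.UploadFile("http://my-vm.cloudapp.net/webdav/reports/report.txt", "PUT", @"D:\temp\report.txt");

    // Download it back (plain GET).
    webdav.DownloadFile("http://my-vm.cloudapp.net/webdav/reports/report.txt", @"D:\temp\copy.txt");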
If you are using a Windows Azure Cloud Project and have a Web Role (more flexible), you could add a startup task on that web role which maps the WebDAV share as a network drive, and then use normal System.IO.File code (rough sketch below). I think this is easier than the WebDAV client approach, but it's up to you.
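A rough sketch of that approach, with a hypothetical drive letter, share URL and credentials:

    // The elevated startup task would map the share with something like:
    //   net use W: http://my-vm.cloudapp.net/webdav webdavPassword /user:webdavUser /persistent:yes
    // After that, the web role code can treat W: as an ordinary drive:
    using System.IO;

    File.WriteAllText(@"W:\reports\report.txt", "written from the web role");
    string contents = File.ReadAllText(@"W:\reports\report.txt");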

I came to a similar conclusion to #Yoshi. However, when trying to map the share to a drive on the instance hosting the web role via a startup task, I found that this requires the WebDAV client to be installed, which ships as part of the Desktop Experience feature. To date I have been unable to install that from a startup task, as it requires a reboot.
I have tried writing a startup task that includes the steps from this article, but cannot get it to work so far.
Has anyone else managed this?

Related

Where to begin with managing web servers / business document file management

I've inherited a couple of web servers - one Linux, one Windows - with a few sites on them, nothing too essential. I'd like to test out setting up back-ups for the servers to both a local machine and a cloud server, and then also use the cloud server to access business documents, with the local machine as a back-up for those business documents.
I'd like to be able to access all data wherever I am via an internet connection. I can imagine it running as follows:
My PC <--> Cloud server - access by desktop VPN or Web UI
My PC <--> Web Servers - via RDP, FTP, Web UI (control panels) or SSH
My PC <--> Local Back-up - via RDP, FTP, SSH or if I'm in the office, Local Network
Web servers --> Local Back-up - nightly via FTP or SSH
Cloud Server --> Local Back-up - nightly via FTP or SSH
Does that make sense? If so, what would everyone recommend for a cloud server, and how best to set up the back-up server?
I have a couple of spare PCs that could serve as local back-up machines - would that work? I'm thinking they'd have to be online 24/7.
Any help or advice given or pointed to would be really appreciated. I'm trying to understand this stuff to improve my skill set.
Thanks for reading!
Personally I think you should explore AWS's S3. The better (S)FTP clients (Cyberduck, Transmit, etc.) can all handle S3, the API is friendly if you want to write a script, there is a great CLI suite that you could use in a cron job, and there are quite a few custom solutions to assist with the workflow you describe, s3tools being one of the better-known ones. The web UI is fairly decent as well.
Automating the entire lifecycle like you described would be a fairly simple process. Here's one process for Windows, another general tutorial, another for Windows, and a quick review of some other S3 tools.
I personally use a similar workflow with S3/Glacier that's fully automated, versions the backups, and migrates them to Glacier after a certain timeframe for long-term archival.
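If you prefer writing the backup step in .NET rather than shelling out to the CLI, a minimal upload sketch with the AWS SDK for .NET could look like this (the bucket name, region and paths are hypothetical):

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Transfer;

    // Upload a nightly archive to S3; credentials come from the SDK's standard
    // credential chain (config file, environment variables, etc.).
    var s3 = new AmazonS3Client(RegionEndpoint.EUWest1);
    var transfer = new TransferUtility(s3);

    // Schedule this program via Task Scheduler or a cron job.
    transfer.Upload(@"D:\backups\sites-2013-10-01.zip", "my-backup-bucket");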

web application with local webserver on client machine

I have a simple web application built on HTML5 and RoR. It is a simple application where the user records his voice (HTML5 Web Audio) and it is then saved locally. The application won't be hosted on a server; instead it will be hosted on each individual user's machine. I need to produce some portable package, like an exe, that will run on Windows and Mac.
Is this possible? If yes, what are the ways to achieve it?
You might find some useful info here
Here are some more recent tools

Deploying an application server to a server

I am building a client-server application. It all runs locally on my computer while I am developing the system; however, eventually I would like to deploy the server-side part of the application to a server to run 24/7, enabling client applications to connect and consume the service at will. What I would like to know is: when I come to do this, would I simply install the server-side application on the server, hit run and that's it? That just seems... well, not right (to me). Is this the way it is done, or is there a lot more to it? I imagine there is, but I can't seem to find any content on this subject.
FYI - the server is a self hosted WCF application.
You'd want to take your program's executable, supporting DLLs and config files and drop them into a folder. Then create a Windows Service to run the program; if you don't use a Windows Service, the program will only run while you're logged on, which isn't good. As a Windows Service, a reboot of the server will bring the program back online even if you're not logged on.
Here's a knowledge base article from MS on how to make a windows service.
http://support.microsoft.com/kb/251192
If your program is compiled as a DLL, create a small .exe program to run it (a wrapper), then deploy the program as described in the article.
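For a self-hosted WCF application, the Windows Service wrapper is roughly this; the contract, implementation and address below are hypothetical, and in practice the endpoints would usually come from app.config:

    using System;
    using System.ServiceModel;
    using System.ServiceProcess;

    // Hypothetical contract and implementation, just to make the sketch complete.
    [ServiceContract]
    public interface IMyService
    {
        [OperationContract]
        string Ping();
    }

    public class MyService : IMyService
    {
        public string Ping() { return "pong"; }
    }

    // Windows Service that hosts the WCF service so it survives logoff and reboots.
    public class MyWcfWindowsService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            // .NET 4+ adds a default endpoint for the base address if none is configured.
            _host = new ServiceHost(typeof(MyService), new Uri("http://localhost:8080/MyService"));
            _host.Open();
        }

        protected override void OnStop()
        {
            if (_host != null)
            {
                _host.Close();
                _host = null;
            }
        }

        public static void Main()
        {
            ServiceBase.Run(new MyWcfWindowsService());
        }
    }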
Good luck.

How to deploy an intranet WCF service for a Windows 8 Store application

We are developing a desktop Windows 8 application that works with a WCF service. We want it to be able to work either with a WCF service in Azure or with a WCF service on the local network (the user chooses). The application will be published in the Windows Store.
What is the best practice for deploying the WCF service on a company's local server?
I understand what you want. Let's pretend the Azure part is not an option. How can a Windows Store App use a local service (WCF or not)? That's the fundamental question.
Here's the answer(s):
First, a Windows Store App cannot access intranet services unless it declares the Private Networks (Client & Server) capability in its package manifest.
Second, in order to use private networks in your manifest and get accepted into the Windows Store, you must be a company publisher and not an individual. More on this is discussed in this SO question: Which features are allowed for company store accounts and not individual?
Third, a local service should not be confused with localhost. Localhost is not available to Windows Store Apps unless they are side-loaded (which means they are manually installed and not delivered through the Windows Store at all). To access localhost you can enable loopback, but, as I stated, that disqualifies you from the Windows Store. There's more on this here: How does Windows 8 Loop Back work?
Fourth, because you are talking about a service, you might want to authenticate the user. This is accomplished using enterprise authentication, declared in the manifest just like private networks (only a few checkboxes higher), and it has the same restrictions as private networks.
Fifth, you are not asking this, but to be clear, local access does not mean you can talk to a local SQL Server. The reason is that the SQL client namespace is not part of WinRT or .NET for WinRT. Windows Store Apps are intended to be service-based apps.
And, that's about it. I think you are good if you follow that.
It doesn't sound like you are talking about a pure enterprise-play, but it might be interesting to you to read through some of the strategies for enterprise developers: http://blog.jerrynixon.com/2012/08/windows-8-apps-whats-enterprise-to-do.html
As for deploying WCF, there's nothing special just because a Windows Store App is accessing it. So for deployment, just use vanilla techniques you are already using. :) That's it.
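On the app side, once the right endpoint (Azure or intranet) has been chosen, the call itself is unremarkable. Here is a rough sketch using a proxy generated by Add Service Reference; the client type, operation and addresses are all hypothetical:

    using System.ServiceModel;
    using System.Threading.Tasks;

    public static class ServiceCaller
    {
        // Picks the intranet or Azure address at runtime, e.g. from a user setting.
        public static async Task<string> PingAsync(bool useIntranet)
        {
            var address = useIntranet
                ? "http://wcf-server.corp.local/MyService"   // needs the private networks capability
                : "http://myservice.cloudapp.net/MyService"; // public Azure endpoint

            var client = new MyServiceClient(new BasicHttpBinding(), new EndpointAddress(address));
            return await client.PingAsync();
        }
    }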
Best of luck!

Out Of Browser Silverlight app with local offline database and WCF-RIA

I have the following scenario:
We are developing a Silverlight 4 app for our customers that will be used as an out-of-browser app. The app works offline, i.e. the app and database are on the user's local machine. The app uses WCF RIA Services to connect to the local database. The database will be SQL Server Express, SQL Server CE or MySQL. We are using MVVM Light and MEF.
An external webserver is only used for updating the app from time to time or adding new modules to the app. To achieve this we do something similar to what is shown in Jeremy Likness' blog (http://www.wintellect.com/CS/blogs/jlikness/archive/2010/05/25/silverlight-out-of-browser-dynamic-modules-in-offline-mode.aspx).
The reasons why we are doing this are complex, but to keep a long story short, it is mainly for compatibility with a later online version, and we don't want to use WPF. So we need to get this working with Silverlight and WCF RIA Services.
Ok, that's the scenario and here's the question:
Do we need a local webserver in this scenario? The app is programmatically installed as out-of-browser, the database is local and connected via WCF-RIA.
If yes, which webserver would be sufficient? It should be installed and configured via an initial setup that is executed by the customer. The customer should not have to do anything with configuring the webserver.
Any other ideas or comments on this scenario? Any other possible solutions for this?
Thanks for your help
Dirk
Silverlight wasn't meant to be used this way, I think. It would be like developing the app in Visual Studio and using Cassini to see the result - everything runs locally, but you still need a web server. Maybe there's more info here - http://www.infoq.com/news/2010/06/WPF-vs-Silverlight
I'm not able to provide a full answer to your problem, as we are currently facing the same one (WPF not being cross-platform, very specific hardware on some clients).
But I can share some of our thoughts on our kind of thick Silverlight client:
To keep deployment etc. simple, we use a self-hosting process (installed as a background process).
We may not be able to use RIA, as the background process has to run on the Mono VM (but for an MS-only solution see: Can WCF RIA Services be self hosted?).
Architectural thoughts on standalone "Clients":
Depending on your requirements, implementing a server for each client that communicates with the "main" server via messages (NServiceBus) may be overkill. But if you want a client database when offline and Silverlight for the UI, you should consider using an event-driven architecture.
There is a slideshow on combining event-driven architecture and CQRS with Silverlight. I would not use it as a blueprint, though, more as an inspiration:
http://www.slideshare.net/dennisdoomen/cqrs-and-event-sourcing-an-alternative-architecture-for-ddd