Is there a way I can back up my Azure cloud service (i.e. take a backup of the .cspkg and .cscfg files)?
We have an existing Azure cloud service that was deployed a year ago. We no longer have the old version's source code or the setup files (the .cspkg and .cscfg files). We want to create a backup of the current cloud service. We created a new version of our cloud service and tried to do a VIP swap, which didn't work ("Windows Azure cannot perform a VIP swap between deployments that have a different number of endpoints.") because our new version has many changes that are not compatible with the old version.
I need to find a way to take a backup of the .cspkg and .cscfg files from the existing deployment in the cloud.
Any suggestions/workarounds for this situation?
It looks like there is a Get Package operation on the Service Management API that can retrieve the .cscfg and .cspkg files from a cloud service deployment. See http://msdn.microsoft.com/en-us/library/windowsazure/jj154121.aspx
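For reference, calling that operation from PowerShell looks roughly like this. It's only a sketch: it assumes a management certificate for the subscription is installed, and the subscription ID, service name, thumbprint, and container URI are all placeholders.

# Sketch: call the Get Package operation of the Service Management REST API.
# Assumes the subscription's management certificate is in Cert:\CurrentUser\My.
$subscriptionId = "<subscription-id>"
$serviceName    = "<cloudservice-name>"
$slot           = "production"
$containerUri   = "https://<storageaccount>.blob.core.windows.net/backups"  # where the files get written

$cert = Get-Item "Cert:\CurrentUser\My\<management-cert-thumbprint>"
$uri  = "https://management.core.windows.net/$subscriptionId/services/hostedservices/" +
        "$serviceName/deploymentslots/$slot/package" +
        "?containerUri=$containerUri&overwriteExisting=true"

# Get Package is an asynchronous POST with an empty body; when it completes,
# the .cspkg and .cscfg are written into the given blob container.
Invoke-WebRequest -Uri $uri -Method Post -Certificate $cert -Headers @{ "x-ms-version" = "2012-03-01" }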
You could also try Cerebrata's Azure Management Studio (AMS). AMS has a "Save Package" button on the cloud (hosted) service deployment. I'm assuming AMS uses the same API to download the .cscfg and .cspkg files. Done in a few button clicks. :)
We receive weekly FULL and daily DIFF backups from our hosted ERP quoting system provider. We are using PowerShell code and a scheduled task to download the most recent file from a blob container to a local server location we use for our backups.
When I download the files manually with Azure Storage Explorer and run the restore job, it works fine. When I run the restore job against the file downloaded via PowerShell, I get an error:
.bak' is incorrectly formed and can not be read.
I cannot figure out why this is happening. Anyone run into this and fix it?
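For reference, a binary-safe version of that download with the Az.Storage cmdlets looks roughly like this (the account, key, container, and destination path are placeholders); Get-AzStorageBlobContent streams the blob to disk without any text decoding, which is one thing worth ruling out when a PowerShell-downloaded .bak won't restore:

# Sketch: download the most recent .bak from a container with Az.Storage.
# Account name/key, container name, and destination are placeholders.
Import-Module Az.Storage

$ctx = New-AzStorageContext -StorageAccountName "<account>" -StorageAccountKey "<key>"

# Pick the most recently modified .bak blob in the container.
$latest = Get-AzStorageBlob -Container "erp-backups" -Context $ctx |
    Where-Object { $_.Name -like "*.bak" } |
    Sort-Object LastModified -Descending |
    Select-Object -First 1

# Streams the blob to disk byte-for-byte (no text decoding).
Get-AzStorageBlobContent -Container "erp-backups" -Blob $latest.Name -Destination "D:\Backups\$($latest.Name)" -Context $ctx -Force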
I have an on-premises server with the Microsoft Integration Runtime installed.
In Azure Data Factory V2 I created a pipeline that copies files from the on-premises server to blob storage.
After a successful transfer I need to delete the files on the on-premises server. I am not able to find a solution for this in the documentation. How can this be achieved?
Azure Data Factory recently introduced a Delete activity that can delete files or folders from on-premises storage stores or cloud storage stores.
You also have the option to call Azure Automation using webhooks, via the Web activity. In Azure Automation you can run a PowerShell or Python script on a Hybrid Runbook Worker to delete the file from the on-premises server. You can read more on this here: https://learn.microsoft.com/en-us/azure/automation/automation-hybrid-runbook-worker
Another, easier option is to schedule a script on the server with the Windows Task Scheduler that deletes the file. Make sure the script runs after Data Factory has copied the files to the blob, and that's it! Either way, the script itself can be tiny; see the sketch below.
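A minimal sketch of such a script (the folder, filter, and age threshold are just examples):

# Sketch: delete source files once they have been copied to blob storage.
# Runs equally well as a Hybrid Runbook Worker runbook or a Task Scheduler job.
$sourceFolder = "D:\Exports\Outbound"

# Only delete files older than 30 minutes, to skip anything still being written.
Get-ChildItem -Path $sourceFolder -Filter "*.csv" -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddMinutes(-30) } |
    ForEach-Object {
        Write-Output "Deleting $($_.FullName)"
        Remove-Item -Path $_.FullName -Force
    }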
Hope this helped!
If you are simply moving the file, you can use a Binary dataset in a Copy activity. This combination makes a checkbox setting visible that, when enabled, automatically deletes the file once the copy completes. This is a little nicer because you don't need the extra Delete activity, and the file is "moved" only if the copy operation succeeds.
I have a very simple application built in MVC4. It allows users to upload a file, and the application generates an output.
This app works great locally, but when I publish it to Azure (by right click -> Publish), I get a less than descriptive error. I've figured out that the error occurs because the code accesses a server-relative path, which is not possible in Azure. I found a way to solve that in this link, which says I should use LocalResource rather than Server.MapPath. That makes sense to me, but so far I'm struggling with the suggested line:
LocalResource localResource = RoleEnvironment.GetLocalResource("DownloadedTemplates");
I'm not able to get it working, and I also can't get a proper error. BTW, I'm not sure how to enable the error log in Azure :(
So, after going deeper into MSDN, I've seen that I should configure the Local Storage Resources, but since I created a plain local MVC4 project, I can't find where to configure this.
I need to be able to store a temporary file in the application (hosted in Azure).
Has anyone faced this problem?
Does anybody know how to enable the Local Storage Resource in a project like that?
TIA!
Milton Rodríguez
Well, after struggling for a while, I ended up using the Windows Azure Tools.
The steps:
1) Add a new project.
2) Under the Cloud category, select Windows Azure Cloud Service. Note that if you don't have this option, an option to install the needed SDK will be shown. Install it first.
3) Name it properly :)
4) The New Windows Azure Cloud Service window will appear; select the role that fits your needs. In my case, I chose ASP.NET MVC4 and then removed it. Note that you can edit the name of the created role on the right.
5) In the Roles folder of your new project, select Add, and then Web Role Project in solution. Your existing project will appear as an option to add.
6) You can remove the other role in the folder, the web project created in step 4, and also the folder ending in Content (i.e. WebRole1Content). Basically, you can remove all of the created assets except the Azure service, and link the service to your project.
7) You're almost done. Follow this link to configure your local storage; the snippet after these steps shows roughly what the entry looks like. :)
Now you're done!
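For reference, the local storage setting from step 7 ends up in the cloud service project's ServiceDefinition.csdef, inside your role element. A sketch (the service and role names and the size are illustrative; the resource name is the string you pass to RoleEnvironment.GetLocalResource):

<ServiceDefinition name="MyCloudService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole" vmsize="Small">
    <LocalResources>
      <!-- name must match the string passed to RoleEnvironment.GetLocalResource -->
      <LocalStorage name="DownloadedTemplates" sizeInMB="100" cleanOnRoleRecycle="false" />
    </LocalResources>
  </WebRole>
</ServiceDefinition>

At runtime, RoleEnvironment.GetLocalResource("DownloadedTemplates").RootPath then gives you the folder to write your temporary files to.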
Can anyone say if there is a complete/accessible API for the latest Azure bits that allows the complete creation (not just scale-out, as in "Scale Windows Azure roles programmatically") of a 'worker role' application? I'm not building a web site, and I don't need any SQL or table storage. I would like to build an EXE that will create the full container and allow the upload of DLL/config artifacts so the app will exist and start up.
Thanks.
I'm not sure this completely solves your challenge, but... Take a look at these PowerShell cmdlets:
New-AzureServiceProject
Add-AzureWebRole
Add-AzureWorkerRole
They will spin up the scaffolding code for your web and worker role, respectively, and optionally let you specify a template folder.
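A quick sketch of how they fit together, with placeholder names; as far as I remember, Publish-AzureServiceProject from the same classic module then pushes the result up:

# Sketch: scaffold a worker-role-only cloud service with the classic
# Windows Azure PowerShell cmdlets. Service and role names are placeholders.
New-AzureServiceProject MyWorkerService
Add-AzureWorkerRole MyWorker

# Drop your DLLs/config into the generated role folder, then publish.
Publish-AzureServiceProject -ServiceName "MyWorkerService" -Location "West US"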
The Operations on Hosted Services page should get you started. Specifically, you'll want Create Hosted Service if you don't have one already, then Create Deployment. You'll have to point to a deployment package in blob storage that you've uploaded in order to create, and then start, the deployed instances.
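If REST is more plumbing than you want in your EXE, the classic PowerShell cmdlets wrap those same two operations; a sketch with placeholder names, assuming you already have a built .cspkg/.cscfg:

# Sketch: wraps Create Hosted Service + Create Deployment (placeholder names/paths).
New-AzureService -ServiceName "MyWorkerService" -Location "West US"

# The package gets uploaded to your subscription's current storage account
# and the deployment is created from it.
New-AzureDeployment -ServiceName "MyWorkerService" -Package ".\MyWorkerService.cspkg" -Configuration ".\ServiceConfiguration.Cloud.cscfg" -Slot "Production" -Label "v1"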
I've successfully created a site using Umbraco, and now it's time to upload it to the hosting server.
I've searched and found one paid product for this, but I don't want to use it.
Has anybody tried developing an Umbraco site locally and then uploading it to a server?
If yes, then please help me do that.
First I run the Umbraco install from a local IIS website. Then I set up my Visual Studio solution for that website (and my source control). Then I work until I reach a beta version, and I go through this process for deploying (the database steps are sketched in script form below):
FTP over to the remote website and copy the whole site (I actually use Beyond Compare).
Connect to my local database with Management Studio and create a .bak file.
Upload the .bak file to the database server.
Restore that database.
Review the connection strings in web.config.
Then I'm pretty much done.
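If you want to script the database legs of those steps, the SqlServer PowerShell module can do the .bak round trip; a rough sketch, where instance names, the database name, and paths are all placeholders:

# Sketch: script the backup/upload/restore steps above (all names are placeholders).
Import-Module SqlServer

# Back up the local database to a .bak file.
Backup-SqlDatabase -ServerInstance "localhost" -Database "UmbracoDb" -BackupFile "C:\Temp\UmbracoDb.bak"

# Copy the .bak to the database server (a UNC share as an example).
Copy-Item "C:\Temp\UmbracoDb.bak" "\\dbserver\Backups\UmbracoDb.bak"

# Restore it on the remote instance (the path is as seen by that server).
Restore-SqlDatabase -ServerInstance "dbserver" -Database "UmbracoDb" -BackupFile "D:\Backups\UmbracoDb.bak" -ReplaceDatabase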
Once I'm "live" and have content I don't want to lose, when I want to work on the website I bring back the live database through a .bak file, and then I make my changes. They often include DB changes, since the schema basically lives in the database. I note all the operations I do. Once my changes are ready, I manually replicate them on the live site as I update the files.
This is very painful, but I've tried solutions like Courier and other things like that, and I find they are not reliable enough for production. Manual is the only risk-free way I see for the moment.
Hope this helps.
Yes, that happens all the time. Use FTP to copy your local installation to your web server, modify the web.config to point to the correct database, and your website should be up and running.
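For reference, the database setting you're pointing at looks something like this; depending on your Umbraco version it lives in connectionStrings (shown here, with placeholder values) or as an umbracoDbDSN key in appSettings:

<connectionStrings>
  <!-- Placeholder values; Umbraco reads the umbracoDbDSN entry -->
  <add name="umbracoDbDSN"
       connectionString="Server=<dbserver>;Database=<umbraco-db>;User Id=<user>;Password=<password>;"
       providerName="System.Data.SqlClient" />
</connectionStrings>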
I'm sure there are more elegant solutions with fewer clicks, but here's how I do it on Azure websites with SQL (not sure what hosting/db you're using):
1) Create an empty db on Azure with the same login and user as my local db.
2) Create an empty site on Azure connected to my db.
3) Download the publishing profile.
4) Upload the db the first time with the SQL Azure Migration Wizard.
5) Import the publishing profile into WebMatrix and upload the site with it.
6) Thereafter I deploy the site and db with WebMatrix.
WebMatrix uses Web Deploy or FTP; you can use Web Deploy through IIS if you like, as well as FTP.