My question is specific to a feature in our application. Our environment is currently dedicated hosting, and there is an initiative to move everything to the cloud.
There is a data replication scheduler job (it triggers copy scripts saved in .bat files) configured in SQL Server. This job runs every night and copies data from a few tables in one database instance (live) to another (duplicate) on the same server. The idea behind copying the data to a duplicate DB is to run complex application-specific reporting queries against a separate DB without interrupting the main (live) DB.
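To give a concrete picture, one of the nightly copy steps boils down to something like the following (the database, table, and column names here are placeholders, not our actual schema):
-- Placeholder example of one nightly copy step: refresh the reporting copy
-- of a table in the duplicate DB from the live DB on the same server.
TRUNCATE TABLE DuplicateDB.dbo.Orders;
INSERT INTO DuplicateDB.dbo.Orders (OrderId, CustomerId, OrderDate, Total)
SELECT OrderId, CustomerId, OrderDate, Total
FROM LiveDB.dbo.Orders;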
Since we are moving to the cloud, we won't have access to Windows servers with SQL Server installed, and we cannot configure Windows scheduler jobs that trigger the .bat scripts to do the DB copy.
Do you have any suggestion to handle this?
Appreciate your help!
I have a job set up to incrementally sync my analysis database in TFS 2015. This has been working without issue until recently; now it just runs and runs without completing.
The job never completes so there is no error message, but I am concerned that it will start to impact the operation of the system.
Can someone let me know:
Can I safely "kill" the job (What's the best way to do this?)
What I can do to stop this from happening repeatedly
If you are talking about the Analysis Services sync job, you can try following the article below to manually process the data warehouse and Analysis Services cube for Team Foundation Server.
http://msdn.microsoft.com/en-us/library/ff400237.aspx
If manual processing still hits the same issue, you can try deleting the Tfs_Analysis database and recreating it using the TFS Admin Console. To delete the Tfs_Analysis database, connect to Analysis Services using Microsoft SQL Server Management Studio, expand the Databases node, and delete the Tfs_Analysis database.
To recreate the Tfs_Analysis database, follow these steps:
Launch the TFS Admin Console, go to Application Tier >> Reporting, and in the right panel click Edit to open the Reporting window.
Under the Analysis Services tab, enter Tfs_Analysis in the Database textbox, and re-enter the required account under Account for accessing the data source.
Click OK to generate the Tfs_Analysis database in Analysis Services.
Click Start Jobs and Start Rebuild to rebuild the warehouse and the Analysis Services database.
We are developing a small application that needs a local database installed on each user's computer, which will then sync up to the main database via web services, etc.
Anyway, when we deploy the application to the user's computer we want to use ClickOnce deployment. I have used ClickOnce before, but never with an attached SQL Server database. I know you can go to Prerequisites in the ClickOnce properties and select SQL Server Express.
Now the question is: once you have created your .mdf database file, including stored procedures and all, how do you get it attached and set up automatically in the SQL Server Express instance that has just been installed through ClickOnce?
Also, once this is finished, in the future we may want to push database updates to the clients' machines, and we would like to use ClickOnce to publish them. Obviously we don't want to overwrite the database; we just want to publish the latest updates based on whether they already have the database and which version they have.
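For the update side, the direction I was thinking of (just a sketch; the table, column, and version numbers below are made up) is to keep a schema-version table in the local database and have the application apply incremental scripts on startup:
-- Hypothetical versioning table; names and version numbers are illustrative only.
CREATE TABLE dbo.SchemaVersion (VersionNumber int NOT NULL);
INSERT INTO dbo.SchemaVersion (VersionNumber) VALUES (1);
-- On startup the application reads the current version...
SELECT VersionNumber FROM dbo.SchemaVersion;
-- ...and runs any newer upgrade scripts shipped with the ClickOnce update,
-- for example the script that moves version 1 to version 2:
ALTER TABLE dbo.Customer ADD Email nvarchar(256) NULL;
UPDATE dbo.SchemaVersion SET VersionNumber = 2;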
How could this be achieved using ClickOnce? Thanks.
I have a large number of XML files that I transfer via FTP to an Azure website folder on a daily basis. I currently use C# to transfer the data to Azure SQL Database tables. However, it is extremely slow.
Is there a way I can run an Azure SQL job to bulk import these files, and if so, how do I access the files in the web app's folder?
I know how to do this on a standard SQL Server with the XML files residing on a share drive, but I am unsure how to do it in Azure.
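For reference, the on-premises import I mentioned is essentially this (the path, table, and column names are just examples):
-- On-premises approach, for reference: bulk-read an XML file from a share
-- into an xml column. Path, table, and column names are examples only.
INSERT INTO dbo.ImportedXml (XmlData)
SELECT CONVERT(xml, BulkColumn)
FROM OPENROWSET(BULK '\\fileshare\imports\file1.xml', SINGLE_BLOB) AS x;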
Currently, we do not support any T-SQL interface to read files from a blob store or container, so you have to push the data from outside of SQL Server.
One option is to use Azure Automation to run your code periodically or on a schedule. See the post below on how to use Azure Automation:
http://azure.microsoft.com/en-us/documentation/articles/automation-manage-sql-database/
I am a total newbie in SQL/SQL Server, and I am using SSRS to set up a new reporting server/service and upload some .rdl files to it.
I have a reporting server on one machine, which has a lot of reports and data sources uploaded to its database.
I created a new reporting server with a fresh database on another machine, and what I want to do is copy the old database's content (the reports, the data sources, etc.) to the fresh one.
I have no copies of the individual reports that I could upload to the new server using localhost/reports.
Is there a fast solution to my problem? Please explain it in detail, because I have never worked with SQL before.
Different ways to do this:
Report Server Databases
Use the detach/attach or backup/restore instructions here. Both of these methods require a backup of encryption keys on the existing instance, which are then restored to the new report server instance. Instructions on backup/restore of encryption keys here. Migrating the ReportServer and ReportServerTempdb databases is the easiest way to ensure all content is available on the new server.
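If you take the backup/restore route, the database part is plain BACKUP/RESTORE; the paths below are examples only, and the encryption key still has to be backed up and restored separately as noted above:
-- On the old server: back up both report server databases (paths are examples).
BACKUP DATABASE ReportServer TO DISK = 'D:\Backups\ReportServer.bak';
BACKUP DATABASE ReportServerTempDB TO DISK = 'D:\Backups\ReportServerTempDB.bak';
-- On the new server: restore them, then restore the encryption key
-- using Reporting Services Configuration Manager or rskeymgmt.
RESTORE DATABASE ReportServer FROM DISK = 'D:\Backups\ReportServer.bak';
RESTORE DATABASE ReportServerTempDB FROM DISK = 'D:\Backups\ReportServerTempDB.bak';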
Report Object Scripting
Reporting Services Scripter is an older tool (it still works with SSRS 2008 R2; not sure about 2012) that can be used to transfer objects (folders, shared data sources, shared data sets, reports, etc.) between report servers. It is a good choice if you want to pick and choose what is migrated.
If you are receiving an error regarding unsupported scale-out deployment, this means you are running Standard edition and need to remove the old report server entry from the database in the new location. It can be done using Reporting Services Configuration Manager, or by using rskeymgmt at command line.
Reporting Services Configuration Manager
Open Reporting Services Configuration Manager and connect to the new report server instance.
Click on Scale-out Deployment to view registered report servers.
Select the old report server instance and click the Remove Server button.
Command line and rskeymgmt
Browse to the Tools\Binn folder of your SQL Server client installation.
Run the following to list the registered report servers:
rskeymgmt -l -i <RS instance name>
Using the installation ID (GUID) of the old report server, remove it:
rskeymgmt -r <installation ID> -i <RS instance name>
More info on scale-out deployments and rskeymgmt here.
To migrate Reporting Services, use the migration manual from MSDN (https://msdn.microsoft.com/en-us/library/ms143724(v=sql.120).aspx). If you encounter the error "The feature: scale-out deployment is not supported in this edition of Reporting Services (rsOperationNotSupported)", go to the ReportServer database and remove the old encryption key from the dbo.Keys table.
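The delete itself is a single statement against the ReportServer database; 'OLDSERVER' below is a placeholder for the old report server's machine name, and you should back up the database and the encryption key before touching it:
-- Inspect the table first, then remove the old report server's row.
-- 'OLDSERVER' is a placeholder for the old machine name.
SELECT * FROM dbo.Keys;
DELETE FROM dbo.Keys WHERE MachineName = 'OLDSERVER';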
I have a large database in an AWS instance running SQL Server 2008 on Windows Server 2008 R2.
The database is constantly changing and being written to, and its size is about 100 GB.
I wish to migrate from our Amazon services to Microsoft Azure.
But I cannot afford any loss of information, or downtime of more than 20-30 minutes.
I don't mind using Azure SQL or running SQL Server in a VM in the Azure cloud, but I must keep the databases live and updated; there are a few main tables to which information is constantly being added.
What would be the best way to do so?
If you are using an AWS instance (not RDS) and you are going to an Azure instance (not "Azure SQL Database"), you can use log shipping or something similar to get the downtime down to a few seconds: http://msdn.microsoft.com/en-us/library/ms187103.aspx
The steps you need to take:
1. Take a full backup on AWS.
2. Restore the full backup without recovery on Azure.
3. Take a log backup on AWS.
4. Restore the log backup without recovery on Azure.
5. Repeat 3 and 4 until the time it takes is short enough (you probably want to script this out).
6. Take the app offline.
7. Take another log backup on AWS.
8. Restore that log backup WITH recovery on Azure.
9. Repoint the app to Azure.
10. Bring the app online again.
Steps 3, 4 and 5 are what log shipping would automate, but you could just write a PowerShell script too.
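For reference, a minimal sketch of the backup/restore pairs behind those steps (the database name and paths are placeholders; in practice you would write to a location both servers can reach, such as a file share or blob storage):
-- Step 1 (on AWS): full backup of the database (name and path are placeholders).
BACKUP DATABASE MyDb TO DISK = 'E:\Backups\MyDb_full.bak';
-- Step 2 (on Azure): restore it without recovery so log backups can follow.
RESTORE DATABASE MyDb FROM DISK = 'E:\Backups\MyDb_full.bak' WITH NORECOVERY;
-- Steps 3-5 (repeat as needed): log backup on AWS, restore without recovery on Azure.
BACKUP LOG MyDb TO DISK = 'E:\Backups\MyDb_log1.trn';
RESTORE LOG MyDb FROM DISK = 'E:\Backups\MyDb_log1.trn' WITH NORECOVERY;
-- Steps 7-8 (after the app is offline): final log backup, then restore WITH RECOVERY
-- to bring the Azure copy online.
BACKUP LOG MyDb TO DISK = 'E:\Backups\MyDb_log_final.trn';
RESTORE LOG MyDb FROM DISK = 'E:\Backups\MyDb_log_final.trn' WITH RECOVERY;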