SQL daily, weekly, monthly backup

I have daily backups set up for all databases. The servers tend to fill up with backups after a while, and I have to go through periodically and clean them up. I would like to retain the daily backups for 30 days, then retain one weekly backup for 6 months, and then retain one monthly backup for 12 months. How do I set this up, in SQL or outside of SQL? The retention needs will also vary between databases. Thanks in advance.

Create three Agent jobs: FULLBACK_DAILY, FULLBACK_WEEKLY, FULLBACK_MONTHLY.
Step 1: Take a full backup of the database.
Step 2: Clean up old backups using PowerShell.
Schedule them daily, weekly, and monthly respectively.
Question: The retention needs will also vary between databases.
You can handle that in the full-backup step, on a per-database basis.
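For example, the two steps of the daily job could be sketched in one PowerShell script; the server name, database name, backup folder, and 30-day window below are all placeholders to adapt per database:

```powershell
# Placeholders: MYSERVER, MyDb, D:\Backups\Daily, and the 30-day retention.
Import-Module SqlServer

$db   = 'MyDb'
$dir  = 'D:\Backups\Daily'
$file = Join-Path $dir ("{0}_{1:yyyyMMdd}.bak" -f $db, (Get-Date))

# Step 1: take a full backup of the database.
Backup-SqlDatabase -ServerInstance 'MYSERVER' -Database $db -BackupFile $file

# Step 2: delete daily backups older than 30 days.
Get-ChildItem $dir -Filter "$db*.bak" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item
```

The weekly and monthly jobs are the same sketch with a different folder and a 6- or 12-month window, which also makes per-database retention just a matter of the parameters each job uses.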
Good luck

Related

2 terabyte table to send from DB to DB in SQL Server

I have a table with 2 terabytes of data and I need to transfer it between two databases on the same server.
The data is in prod, so when I use ETL, production gets affected.
It's 40 days of data, and my manager suggested sending it a day at a time, starting from the last 40 days.
What is the best solution?
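The day-by-day approach the manager suggested can be sketched in T-SQL; the database, table, and column names below are placeholders:

```sql
-- Placeholders: SourceDb/TargetDb, dbo.BigTable, EventDate; adjust to your schema.
DECLARE @d date = DATEADD(DAY, -40, CAST(GETDATE() AS date));

WHILE @d < CAST(GETDATE() AS date)
BEGIN
    INSERT INTO TargetDb.dbo.BigTable WITH (TABLOCK)
    SELECT *
    FROM   SourceDb.dbo.BigTable
    WHERE  EventDate >= @d
      AND  EventDate <  DATEADD(DAY, 1, @d);

    SET @d = DATEADD(DAY, 1, @d);
END
```

Keeping each batch to a single day bounds the lock and log impact on prod, and the loop can be paused between days or scheduled off-peak.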

How can I write the results of a SQL query to azure cloud storage?

Our current data set is not friendly in terms of looking at historic records. I can see what a value for an account is at the time of execution but if I want to look up last month's counts and values that's often lost. To fix this I want to take a "snapshot" of our data by running it at specific times and storing the results in the cloud. We're looking at just over 30,000 records and I'd only run it at the end of the month keeping 12 separate months at a time so the count doesn't get too high.
I can't seem to find anything about how I could do this so I'm hoping someone has experience or knowledge and would like to share.
FYI we're using an on premise oracle DB.
Thanks!
You can use Azure Data Factory (ADF) to schedule a monthly run of a pipeline that executes a query/stored procedure against your Azure SQL database and writes the results to Azure Storage.

SQL server database log file increasing enormously

I have 5 SSIS jobs running in SQL Server Agent, and some of them pull transactional data into our database every 4 hours. The problem is that the log file of our database is growing rapidly: in a day it eats up 160 GB of disk space. Since we don't need point-in-time recovery, I set the recovery model to SIMPLE, but even then the log consumes more than 160 GB a day. Because the disk fills up, the scheduled jobs often fail. Temporarily, I am using a DETACH approach to clean up the log.
FYI: all the SSIS packages in the jobs use transactions on some tasks, e.g. a Sequence Container.
I want a permanent solution to keep the log file within a particular size limit, and as I said, I don't need the log for point-in-time recovery, so there is no need to take log backups at all.
One more problem: in our database the transactional table has 10 million records and some master tables have over 1,000 records, but our mdf file is now about 50 GB. I don't believe 10 million records should amount to 50 GB of disk consumption. What's the problem here?
Help me with these issues. Thanks in advance.
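As a safer alternative to the detach workaround: under the SIMPLE recovery model the log can usually be shrunk in place once the open transactions holding it commit; a minimal sketch (the logical log file name is a placeholder):

```sql
-- Check the logical name and current size of the log file.
SELECT name, size * 8 / 1024 AS size_mb
FROM   sys.database_files
WHERE  type_desc = 'LOG';

-- Shrink the log back to ~1 GB (logical name 'MyDb_log' is a placeholder).
DBCC SHRINKFILE (N'MyDb_log', 1024);
```

Note that long-running SSIS transactions (such as a transacted Sequence Container) keep log records active even under SIMPLE recovery, so the log cannot be truncated or shrunk past them until they commit; that is likely why it grows so large despite the SIMPLE setting.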

Copy Azure SQL Database and Change Scale

We are using the command CREATE DATABASE X AS COPY OF Y to copy an Azure SQL database so that we can take a transactionally consistent backup to our local network. The database is running as a P2 so the copy is also created as a P2 and we are therefore incurring double charges due to daily rate charging of the new database sizes.
Is there any way to copy a database with a different scale setting? Or, are there other ways to take the transactionally consistent backup?
As far as I know, currently the way to do transactionally consistent backups is either to use the COPY command, which you are doing, or to rely on the point-in-time backup/recovery provided by Microsoft. If your goal is simply to have the backup somewhere, you may look at the geo-replication options (standard and active), which get the data into another Azure region. If your requirement is definitely to get a local copy, COPY + Export is pretty much your option.
There is currently no way to perform a COPY from one database tier to another; however, you can change a database's tier in code, so in theory you could drop the copy to a lower tier immediately after the COPY (there is a sample of how to do this with PowerShell on MSDN using Set-AzureSqlDatabase). However, SQL Database is billed by the day, so even if you change the tier immediately, you'd still be charged for the P2 instance of the copy for that day. If you are doing these COPY + Export operations daily and deleting the copy as soon as you pull the export down, you won't save any money. Microsoft has announced that hourly billing is coming to SQL Database, along with pricing changes and some other things. It looks like the new pricing will go into effect Nov 1st, and while it's not explicit, I'm assuming that means hourly billing then as well. With hourly billing, once the copy completes you can reduce its tier and pay for only that one hour; then, after you pull down the export, you can delete the copy and save money.
You can set the service objective (pricing tier) of the copy during the copy itself, e.g. to a cheaper tier:
CREATE DATABASE db_copy
AS COPY OF ozabzw7545.db_original ( SERVICE_OBJECTIVE = 'S2' ) ;
https://learn.microsoft.com/en-us/sql/t-sql/statements/create-database-transact-sql?view=azuresqldb-current

How to create only one backup file for a SQL Server database

I've created a Back Up Database Task for a few selected databases on my server. What I want is to have only one backup file per database; the new one could overwrite the old one, or create a new one and delete the old one, it doesn't matter. I've checked Backup set will expire: 2 days, but evidently this doesn't do what I thought it would: it keeps creating new backup files every day. How can I accomplish this?
I wouldn't rely on "Backup set will expire" for this: that setting only controls when a backup set may be overwritten by a later backup; it doesn't delete files or stop new ones from being created.
In the same way you built a maintenance plan to back up the database, you can create a maintenance plan to clean up the system and delete backups over x days old; then just run it after your backup plan.
Use a Maintenance Cleanup Task and pick "remove everything older than 1 day", or whatever your desired time frame is.
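Alternatively, if the goal is literally one backup file per database, the backup step itself can overwrite the previous file by writing to a fixed file name with the INIT option (the path and database name below are placeholders):

```sql
-- INIT overwrites the existing backup sets in the file,
-- so only one backup file per database is ever kept.
BACKUP DATABASE MyDb
TO DISK = N'D:\Backups\MyDb.bak'
WITH INIT, CHECKSUM;
```

Be aware this leaves you with only the latest backup; if that file turns out to be bad, there is nothing older to fall back on.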