I have a big database (60 GB) and I want to generate multiple SQL scripts for database backup.
On Stack Overflow we discussed this here: Get .sql file from SQL Server 2012 database
Can we generate multiple sequential SQL files for the data? I want to create around 20 sequential SQL scripts, each with around 3 GB of data.
Is it possible, either with a T-SQL query or from SQL Server options?
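One option worth noting, if the real goal is several ~3 GB files rather than .sql text specifically: a native backup can be striped across multiple files. A minimal sketch, with a hypothetical database name and target folder:

-- Stripe one backup across multiple files; all stripes are needed together to restore
BACKUP DATABASE BigDb
TO  DISK = N'D:\Backups\BigDb_01.bak',
    DISK = N'D:\Backups\BigDb_02.bak',
    DISK = N'D:\Backups\BigDb_03.bak'
WITH COMPRESSION, STATS = 5;

SQL Server spreads the data roughly evenly across the listed files. If you need actual .sql scripts with INSERT statements, the Generate Scripts wizard produces them, but as far as I know it won't split its output into fixed-size chunks.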
Related
Is it possible to take a backup of a table's data as Excel in SQL Server 2012?
We can get a full database backup in .bak format automatically by using a maintenance plan, or take an Excel backup manually using the Import/Export Wizard.
But I need to take a backup of a table's data in Excel (.xlsx/.xls) automatically at 6-hour intervals. Is that possible in SQL Server 2012?
No, you won't be able to do that. A SQL Server backup creates a .BAK file, and that is the only option you have. However, what I might suggest is creating an SSIS package that takes the data in your database and creates an Excel file, and scheduling the SSIS package to run every 6 hours.
This might be a bit tedious to set up, especially if you have a lot of tables, but it can be done. And, as you probably know, if any of your tables has more than about a million rows, you're going to lose data sending it to Excel (an .xlsx worksheet holds at most 1,048,576 rows).
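For the scheduling half, the SSIS package can be attached to a SQL Server Agent job with a 6-hour recurrence. A minimal sketch, assuming the Agent job already exists (the job and schedule names here are hypothetical):

-- Attach a schedule that recurs daily, every 6 hours, starting at midnight
EXEC msdb.dbo.sp_add_jobschedule
     @job_name             = N'Export table to Excel',
     @name                 = N'Every 6 hours',
     @freq_type            = 4,  -- daily
     @freq_interval        = 1,  -- every 1 day
     @freq_subday_type     = 8,  -- sub-day unit: hours
     @freq_subday_interval = 6,  -- every 6 hours
     @active_start_time    = 0;  -- from 00:00:00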
I have multiple remote machines on a network, each of which has a CSV file, stored in the same location, that was created by a PowerShell script.
What's the best way to insert that data into a Microsoft SQL Server Express database?
One way of doing this is to collect all the CSV files from all the servers onto your script server, and then run the SQL INSERT query from the script server to put your data in a SQL table.
As described in one of my previous answers, you can use the Invoke-SQLCmd2 module to do this.
The other way, if you don't want to collect all the CSV files first, is to run the SQL INSERT query from every server that has a CSV file. To do this you have to connect to the other server and then import the module from the script server, so you can use it:
# Import the Invoke-SQLCmd2 module from the script server over a UNC path
Import-Module -Name '\\SCRIPTSERVER\C$\Windows\system32\WindowsPowerShell\v1.0\Modules\Invoke-SQLCmd2'
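As an alternative that stays in T-SQL, BULK INSERT can read the CSV straight from a UNC path, as long as the SQL Server service account can reach the share (this works on Express as well). A minimal sketch, with a hypothetical target table and share path, assuming the file has a header row:

BULK INSERT dbo.Results
FROM '\\REMOTESERVER\share\output.csv'
WITH (
    FIRSTROW        = 2,    -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);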
I am trying to run a process that rolls over several time periods using data that sits in a SQL Server 2008 R2 database. If it were smaller I'd pull it all into R and just subset based on date. However, the data is ~15 GB, and I've found the R-SQL connectors (RODBC, etc.) too slow for moving that much data, so I need to be able to generate CSV files directly from SQL Server and then read them into R.
It seems that sqlcmd or bcp are the only options, but I wanted to check before going in that direction.
Any suggestions would be appreciated.
I use the RODBC package a lot and it works like a charm, though a large query using sqlQuery can take some time. Once the query is done you will get a data frame that you can easily export to CSV via write.csv. It will work without any issues.
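If the connector route stays too slow, bcp (which the question already mentions) can write a query result straight to a CSV file from a command prompt on the server. A minimal sketch with a hypothetical database, table, and date filter; -c is character mode, -t, makes the output comma-separated, and -T uses Windows authentication:

bcp "SELECT * FROM MyDb.dbo.Readings WHERE PeriodStart >= '2011-01-01'" queryout "C:\export\period_2011.csv" -c -t, -T -S MYSERVER

The resulting file can then be read into R with read.csv, which avoids pushing 15 GB through the ODBC layer.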
I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) located in a shared folder on my network. The situation is like this:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that will contain the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio. I'm not sure how to solve this. First, I don't know how to export the data using jobs, because every approach I tried had issues with the OLE DB connection. The 'sp_makewebtask' procedure is also not available in SQL Server 2008. And I'm also unsure how to dynamically generate the file names.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads records from that temporary table and writes them to a file (see the sketch after these steps). You can use this link: clickhere
3) Create a SQL job that executes step 1 and step 2 sequentially.
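For the write-to-file and file-naming parts, here is one hedged sketch (the database, table, and share names are hypothetical, and xp_cmdshell must be enabled on the instance): it builds a timestamped file name and shells out to bcp.

-- Build a timestamp like 20120101_153000 for the file name
DECLARE @stamp nvarchar(15) =
    CONVERT(nvarchar(8), GETDATE(), 112) + N'_' +
    REPLACE(CONVERT(nvarchar(8), GETDATE(), 108), N':', N'');
DECLARE @cmd nvarchar(4000) =
    N'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout ' +
    N'"\\SHARE\exports\MyTable_' + @stamp + N'.csv" -c -t, -T -S .';
EXEC master.dbo.xp_cmdshell @cmd;

Note this writes a CSV, which Excel opens directly; producing a true .xls/.xlsx from T-SQL alone is much harder, which is one reason the SSIS approach below ended up cleaner.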
I found a better way. I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using SQL Server Agent jobs. This is the neater, cleaner solution in my experience.
Please let me know the best approach for these two SSIS activities:
1. Migrating Access data to a SQL Server table: multiple Access databases are used as the source to migrate the data into a SQL table.
2. SQL Server table to SQL Server table.
Is this a one-time job, or should it be done periodically?
If it is a one-time job, then you can use the SQL Server Import Data utility to accomplish both tasks easily.
If it is just a copy of data from source to destination without any modification to the data, then there is no need to use SSIS.
Use the SQL Server Import and Export Wizard; at the end, don't click Run Now, but save the package instead. You can then schedule the saved packages to do the same for you (a sketch follows below).
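Once the wizard has saved the package, scheduling it is a matter of adding it to a SQL Server Agent job. A minimal sketch, where the job name and package path are hypothetical names you'd substitute:

-- Add an SSIS step to an existing Agent job that runs the saved package
EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Nightly Access load',
     @step_name = N'Run saved package',
     @subsystem = N'SSIS',
     @command   = N'/FILE "C:\Packages\AccessToSql.dtsx"';

A recurrence can then be attached with sp_add_jobschedule, the same procedure shown earlier on this page.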
Yes, you can copy from Access to a SQL table, and from one SQL table to another.
http://msdn.microsoft.com/en-us/library/ms140052%28v=sql.105%29.aspx