SQL Server Script Database with Seed Data

I am creating an install script for a SQL Server 2008 database using the Tasks => Generate Scripts option. This works fine and creates a script including the database, schema, and seed data.
One problem I notice is that a stored procedure is created before the table it refers to, and this causes an error when the script runs.

I don't think there's any built-in functionality for ordering the scripting, but you could split the output into separate scripts and control the order in which they are executed yourself.
For example, select just the tables and generate a table-creation script, then select the stored procedures and generate another script; a small driver script like the one below can then run them in order.
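As a minimal sketch, assuming you saved the generated scripts under the placeholder names used here, a driver file run in SQLCMD mode can pull them in one after the other:
-- run with: sqlcmd -S .\MyInstance -i install.sql (or enable SQLCMD Mode in SSMS)
:r .\01_tables.sql
:r .\02_stored_procedures.sql
:r .\03_seed_data.sql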

Related

Move data between two Azure SQL databases without using elastic query

I need suggestions for moving data from a particular table in one Azure SQL database to another Azure SQL database that has the same table structure, without using elastic query.
Use SQL Server Management Studio to connect to the SQL Azure database, right-click the source database, and select Generate Scripts.
In the wizard, after you have selected the tables you want to output to a query window, click Advanced. About half way down the properties window there is an option called "Types of data to script"; change it to "Data only", then finish the wizard.
Then check the script, rearrange the inserts so the constraints are satisfied, and change the USE statement at the top so it runs against the target DB.
Then right-click the target database, select New Query, paste the script in, and run it.
This will migrate the data (a rough sketch of the edited script is shown below).
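For illustration only, assuming a hypothetical Customers/Orders pair linked by a foreign key and a target database called TargetDb, the edited script might end up looking roughly like this:
USE TargetDb;   -- changed from the source database name at the top of the generated script
GO
-- parent rows first so the foreign key on Orders is satisfied
SET IDENTITY_INSERT dbo.Customers ON;
INSERT INTO dbo.Customers (CustomerId, Name) VALUES (1, N'Acme');
SET IDENTITY_INSERT dbo.Customers OFF;
INSERT INTO dbo.Orders (OrderId, CustomerId, Total) VALUES (100, 1, 250.00);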
Please consider using the "Transfer SQL Server Objects Task" in SSIS. You can learn about all the advantages it provides in this article.
You can use PowerShell to query each database and move data between them as needed. Here's an example article on how to get this done.
Using PowerShell when working with Azure has a number of other benefits in what you can do and control as well; it's worth spending time learning.
In the source database I created stored procedures to select the data from the tables.
In the target database I created table types (available under Programmability) for those tables, with the same structure as in the source.
I used an Azure Function to move the data from the source into the table types.
In the target database I created stored procedures to insert data into the tables from their respective table types.
After verifying the transfer, I delete the moved records from the source database, again via stored procedures created for that purpose. A sketch of the target-side objects is below.
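A minimal sketch of the target-side pieces, with hypothetical table and column names standing in for your own:
-- table type mirroring the source table's structure (hypothetical columns)
CREATE TYPE dbo.CustomerTableType AS TABLE
(
    CustomerId INT,
    Name       NVARCHAR(100)
);
GO
-- procedure the Azure Function calls with the rows it pulled from the source
CREATE PROCEDURE dbo.InsertCustomers
    @Rows dbo.CustomerTableType READONLY
AS
BEGIN
    INSERT INTO dbo.Customers (CustomerId, Name)
    SELECT CustomerId, Name FROM @Rows;
END;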

How do I copy a SQL Server table with triggers?

I want to copy a SQL Server 2008 R2 database table A1 to create a clone table A1Clone (empty table). Is there any way to do this operation in one shot? Thanks.
Edited:
I tried SSMS at the database level with Generate Scripts, and with the advanced options I'm able to generate scripts that also create the triggers and indexes.
Wondering if there is any other shortcut to copy over the whole table object.
If you are asking how to copy table-related objects like indexes and triggers in one go, you can't do that by normal means like SELECT INTO (see the sketch after this list). Your options:
generate scripts of the existing object you want to copy, or
create everything from scratch the normal way.
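To illustrate the limitation, using the A1/A1Clone names from the question, SELECT ... INTO only copies the column definitions:
-- creates an empty dbo.A1Clone with A1's columns, but without triggers, indexes, or constraints
SELECT TOP (0) *
INTO dbo.A1Clone
FROM dbo.A1;
-- the triggers and indexes still have to be scripted from dbo.A1 and re-created on the clone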

How to backup script for subset of tables in SQL Express DB

I have developed a SQL Express database. I need to back up all but one table in that database in an automated way. I was thinking I could write a SQL script to do this and trigger it using sqlcmd from a batch file, but I'm not sure how to write that SQL script.
I was also thinking, if nothing else is possible, I could create a second DB containing the tables I want to back up, write a script that copies data into that second DB, and then do an automated backup of that entire DB. This has the disadvantage that restoring the backup later is a slow, awkward process - it's not a small install script.
Is this a possibility, is it the only option, or are there tools for SQL Express to do this?
There is no option to exclude just one table while backing up. A few things I could think of:
1. Right-click the database -> Tasks -> Generate Scripts -> exclude the table you don't need, choose to save the script, and run it every time.
2. You could also choose the Export option, but since you are using SQL Express, you won't have the option to save this package.
Keep the large table in a different database and just back up the original database. You can still use the large table even though it's in a different database, i.e.
SELECT *
FROM MyDb.dbo.SomeTable s
JOIN OtherDb.dbo.LargeTable l
ON (expression);
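And since the question mentions triggering things with sqlcmd from a batch file: the backup of the original database can then be a one-line script (the database name and path here are placeholders):
-- e.g. run from the batch file as: sqlcmd -S .\SQLEXPRESS -i backup.sql
BACKUP DATABASE MyDb TO DISK = N'C:\Backups\MyDb.bak' WITH INIT;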

How to Export data to Excel in SQL Server using SQL Jobs

I need to export the data from a particular table in my database to Excel files (.xls/.xlsx) located in a shared folder on my network. The situation is like this:
1. I need to use SQL Server Agent jobs.
2. I need to generate a new Excel file every 2 minutes that contains the refreshed data.
I am using SQL Server 2008, which doesn't include BI Development Studio, and I'm unsure how to handle this. First, I'm not sure how to export the data from a job, because every approach I tried had issues with the OLE DB connection. 'sp_makewebtask' is also not available in SQL Server 2008. And I'm also unsure how to dynamically generate the file names.
Any reference or solution will be helpful.
Follow the steps given below:
1) Make a stored procedure that creates a temporary table and inserts records into it.
2) Make a stored procedure that reads records from that temporary table and writes them to a file (a rough sketch of this step follows the list).
3) Create a SQL Agent job that executes step 1 and step 2 sequentially.
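As a rough, untested sketch of step 2: the staging table, share path, and server name below are placeholders, xp_cmdshell must be enabled, and bcp writes a delimited text file rather than a true .xlsx (Excel will open it, but a genuine workbook needs a different approach).
CREATE PROCEDURE dbo.ExportToFile
AS
BEGIN
    -- build a timestamped file name, e.g. \\myshare\exports\data_20240101_120500.csv
    DECLARE @file NVARCHAR(260) =
        N'\\myshare\exports\data_'
        + CONVERT(NVARCHAR(8), GETDATE(), 112) + N'_'
        + REPLACE(CONVERT(NVARCHAR(8), GETDATE(), 108), ':', '') + N'.csv';

    -- shell out to bcp to write the staging table's contents to that file
    DECLARE @cmd NVARCHAR(1000) =
        N'bcp "SELECT * FROM MyDb.dbo.ExportStaging" queryout "' + @file + N'" -c -t, -T -S MYSERVER';

    EXEC master.dbo.xp_cmdshell @cmd;
END;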
I found a better way out: I created an SSIS (SQL Server Integration Services) package to automate the whole export-to-Excel task, then deployed that package using a SQL Server Agent job. This is a neater and cleaner solution, I found.

Bteq Scripts to copy data between two Teradata servers

How do I copy data from multiple tables within one database to another database residing on a different server?
Is this possible through a BTEQ Script in Teradata?
If so, provide a sample.
If not, are there other options to do this other than using a flat-file?
This is not possible using BTEQ alone, since you have mentioned the two databases reside on different servers.
There are two solutions for this.
ARCMAIN - you need to run an ARCMAIN backup first, which creates files containing the data from your tables; then you run an ARCMAIN restore, which restores the data from those files.
TPT - Teradata Parallel Transporter. This is a very advanced tool. It does not create any files like ARCMAIN; it moves the data directly between two Teradata servers. (Wikipedia)
If I am understanding your question, you want to move a set of tables from one DB to another.
You can use the following syntax in a BTEQ Script to copy the tables and data:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH DATA AND STATS;
Or just the table structures:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH NO DATA AND NO STATS;
If you get really savvy, you can create a BTEQ script that dynamically builds the above statement in a SELECT, exports the results, and then runs the newly exported file, all within a single BTEQ script (roughly as sketched below).
There are a bunch of other options you can use with CREATE TABLE <...> AS <...>; you would be best served reviewing the Teradata manuals for the details.
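A rough sketch of that idea, with OldDB and NewDB as placeholders (the report width and quoting may need adjusting for your site):
.SET WIDTH 254;
.EXPORT REPORT FILE = build_tables.btq;
SELECT 'CREATE TABLE NewDB.' || TRIM(TableName) ||
       ' AS OldDB.' || TRIM(TableName) || ' WITH DATA AND STATS;' (TITLE '')
FROM DBC.TablesV
WHERE DatabaseName = 'OldDB'
  AND TableKind = 'T';
.EXPORT RESET;
.RUN FILE = build_tables.btq;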
There are a few more options which will allow you to copy from one table to another.
Possibly the simplest way would be to write a smallish program that uses one of their communication layers (ODBC, .NET Data Provider, JDBC, CLI, etc.) to run a SELECT against one system and an INSERT against the other. This would require some work, but it would have less overhead than trying to learn how to write TPT scripts, and you would not need any 'DBA' permissions to write your own.
Teradata also sells other applications which hide the complexity of some of these tools. Teradata Data Mover provides an abstraction layer over tools like ARCMAIN and TPT. Access to this tool is most likely restricted to DBA types.
If you want to move data from one server to another, you can do it with a flat file.
First, fetch the data from the source table into a flat file with a utility such as BTEQ or FastExport.
Then load that data into the target table with the help of MultiLoad, FastLoad, or BTEQ scripts, roughly as sketched below.
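A minimal BTEQ sketch of that flat-file round trip; the logons, file name, and the two-column layout are all placeholders for your own tables:
/* phase 1: run against the source system and export to a flat file */
.LOGON source_tdpid/your_user,your_password;
.EXPORT DATA FILE = mytable.dat;
SELECT id, name FROM SourceDB.MyTable;
.EXPORT RESET;
.LOGOFF;

/* phase 2: run against the target system and load from that file */
.LOGON target_tdpid/your_user,your_password;
.IMPORT DATA FILE = mytable.dat;
.QUIET ON;
.REPEAT *;
USING (id INTEGER, name VARCHAR(50))
INSERT INTO TargetDB.MyTable (id, name) VALUES (:id, :name);
.LOGOFF;
.QUIT;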