Converting data types from one database table to another - sql

I did a bulk insert on a large text file that was an update for an existing database that I had. I ran into all kinds of trouble with truncation errors, so I just set everything to varchar(max). Now that everything is in SQL Server, I'd like to convert the data types from database b to those of database a. Given that both databases have the same table and field names, what are some methods of getting this done? Or would it be best to have a pre-existing, 'hardcoded' script that you run after the import?

Try something like this, then review the generated script, modify it as needed, and execute it:
declare @sql varchar(max)
set @sql = ''

select @sql = @sql
    + 'ALTER TABLE ' + table_name + ' ALTER COLUMN ' + column_name + ' ' + data_type
    -- wrap the length in parentheses for character types; -1 means (max)
    + case when data_type like '%char%'
           then '(' + case when CHARACTER_MAXIMUM_LENGTH = -1 then 'max'
                           else cast(CHARACTER_MAXIMUM_LENGTH as varchar(100)) end + ')'
           else '' end
    + ';'
from information_schema.columns
where table_name = 'test'

print @sql

You could create a view of the table you're looking to pull from and just cast() the columns you need to change to the data types you need them to be. Then you can do all of your inserts off of that view in the appropriate data types. Hope that helps.
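For example, a minimal sketch of that approach (the view, table, and column names here are made up for illustration; the idea is just to cast the varchar(max) staging columns to the target types):
-- Hypothetical example: cast the imported varchar(max) columns
-- to the types expected by the destination table.
CREATE VIEW dbo.vw_ImportTyped
AS
SELECT CAST(CustomerID   AS int)          AS CustomerID,
       CAST(CustomerName AS varchar(100)) AS CustomerName,
       CAST(CreatedOn    AS datetime)     AS CreatedOn
FROM dbo.ImportStaging;   -- the bulk-inserted table with varchar(max) columns
GO

INSERT INTO dbo.Customers (CustomerID, CustomerName, CreatedOn)
SELECT CustomerID, CustomerName, CreatedOn
FROM dbo.vw_ImportTyped;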

If you are moving data from one SQL Server database to another you could use Atlantis Interactive Data Inspector. If you don't want to do that, you can script out the table in database a, then paste ALTER TABLE [table] ALTER COLUMN before every column definition and run it on database b.
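The pasted statements would end up looking roughly like this (the table and column definitions are placeholders taken from an imaginary scripted table):
-- Column definitions copied from the scripted CREATE TABLE in database a,
-- each one prefixed with ALTER TABLE ... ALTER COLUMN and run against database b.
ALTER TABLE [dbo].[Customers] ALTER COLUMN [CustomerName] varchar(100) NOT NULL;
ALTER TABLE [dbo].[Customers] ALTER COLUMN [CreatedOn]    datetime     NULL;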

Related

Alter table to add dynamic columns based off some previously selected query

Is it possible to alter a table and add columns with a dynamic name/data type based off some previously selected query?
The pseudocode equivalent of what I'm looking to do in SQL would be:
foreach row in tableA
{
alter tableB add row.name row.datatype
}
This is for SQL Server.
As mentioned, you can do this with dynamic sql. Something along these lines:
DECLARE @SQL1 nvarchar(4000)
DECLARE @my_new_column_name sysname = N'new_column'   -- e.g. taken from your source query

-- Note: SQL Server syntax is ALTER TABLE ... ADD (there is no COLUMN keyword)
SELECT @SQL1 = N'ALTER TABLE mytable' + NCHAR(13) + NCHAR(10)
             + N' ADD ' + QUOTENAME(@my_new_column_name) + N' varchar(25)' + NCHAR(13) + NCHAR(10)
-- SELECT LEN(@SQL1), @SQL1
EXECUTE (@SQL1)
Apart from the fact that this is messy, error-prone, a security risk, requires high authorization to execute, and needs multiple variables for batches bigger than 4000 characters, it is usually also a bad idea from a design point of view (depending on when/why you are doing this).
Sure, you can do this with dynamic sql.
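For example, a minimal sketch of that loop, assuming tableA holds the desired column names and data types as in the pseudocode above (a cursor plus sp_executesql is one way to do it, not the only one):
DECLARE @name sysname, @datatype nvarchar(128), @sql nvarchar(max);

DECLARE col_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name, datatype FROM dbo.tableA;      -- column names assumed from the pseudocode

OPEN col_cursor;
FETCH NEXT FROM col_cursor INTO @name, @datatype;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one ALTER TABLE ... ADD statement per row of tableA
    -- note: @datatype is concatenated as-is here; validate it in real code
    SET @sql = N'ALTER TABLE dbo.tableB ADD ' + QUOTENAME(@name) + N' ' + @datatype + N';';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM col_cursor INTO @name, @datatype;
END

CLOSE col_cursor;
DEALLOCATE col_cursor;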

Using dynamic Server and Database Name in SQL Server stored procedure over linked servers

I have a requirement.
Using SSIS I am importing data from a flat file/Excel file into my staging table. From the staging table I need to filter the data and transfer it to different databases over different linked servers, i.e. say for California I have dbCalifornia on Server A, for Taxes I have dbTaxes on Server B, etc.
I need to read a config table and redirect the data accordingly, i.e. if the column value = CALI, insert the data into dbCalifornia.tblA; if the column value = TAX, insert the data into dbTaxes.tblA. I am trying to use the server name and database name as variables (because I am reading these from the config table), i.e.
INSERT INTO [@server].[@database].[DBO].[BASIC]
But I am getting an error.
I am not an expert DBA, so please suggest how I can implement this scenario.
TIA
You can do it using dynamic sql like this:
declare @server varchar(100), @database varchar(100);   -- these would be read from your config table
DECLARE @sql varchar(8000) =
    'INSERT INTO [' + @server + '].[' + @database + '].[DBO].[BASIC] ' +
    '(EmpID, EmployeeID, ADDR1, ADDR2, ADDR3, ADDR4, TELNUM, MARRIED, LNAME, MNAME, FNAME, ' +
    'SEX, EMAIL, COUNTRYCODE, CITIZEN) ' +
    'select EmpID, EmployeeID, ADDR1, ADDR2, ADDR3, ADDR4, TELNUM, MARRIED, LNAME, MNAME, FNAME, ' +
    'SEX, EMAIL, COUNTRYCODE, CITIZEN ' +
    'from dbo.myExcelTable where state = ''' + @database + '''';
exec(@sql);
I don't understand what the 100 variables are that you use in your insert; didn't you say
"if column value = CALI insert data in dbCalifornia.tblA, for column value = TAX insert data in dbTaxes.tblA"?
So you just need to filter your table using the @database value and insert those rows into the corresponding table.
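As an illustration, a hedged sketch of driving that from a config table (the RoutingConfig and Staging table names and their columns are assumptions for the example, not from the question):
-- Hypothetical config table: maps a state code to its linked server and database,
-- e.g. ('CALI', 'ServerA', 'dbCalifornia'), ('TAX', 'ServerB', 'dbTaxes')
DECLARE @state varchar(10), @server sysname, @database sysname, @sql nvarchar(max);

DECLARE cfg CURSOR LOCAL FAST_FORWARD FOR
    SELECT StateCode, ServerName, DatabaseName FROM dbo.RoutingConfig;

OPEN cfg;
FETCH NEXT FROM cfg INTO @state, @server, @database;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Four-part name: [linked server].[database].[schema].[table]
    -- (assumes the staging and destination tables have matching columns)
    SET @sql = N'INSERT INTO ' + QUOTENAME(@server) + N'.' + QUOTENAME(@database) + N'.[dbo].[tblA] '
             + N'SELECT * FROM dbo.Staging WHERE state = @state';
    EXEC sp_executesql @sql, N'@state varchar(10)', @state = @state;
    FETCH NEXT FROM cfg INTO @state, @server, @database;
END

CLOSE cfg;
DEALLOCATE cfg;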
Normally, when reading a table from another server and DB, I use the linked server, user and table like this:
Select * from Link..User.Table;
After the linked server name I have to use 2 dots.
I don't know if this can help, since this is for reading from an Oracle database.

Alter table structure to match copy table

I have 2 tables, corporate and corporate_copy. Initially they were the same in structure, but people started adding new columns to corporate and forgot to do so for corporate_copy.
Somewhere in the application there is a rarely used piece of functionality that copies data from corporate to corporate_copy, and that kept failing without anyone noticing. Now I have to add 28 columns (of course with the same types, lengths, constraints, etc.).
I know it can be done in one ALTER TABLE statement, but it still feels like a lengthy task.
Is there any shortcut that will make the copy table the same as the main table, keeping the existing data and adding default values in the newly added columns?
I know I am asking a lot, but is there anything like that?
-- Generate a dynamic query containing all the missing columns and execute it.
-- For example, I tried something like this:
BEGIN TRAN
DECLARE @SqlSelect NVARCHAR(MAX), @ColumnDeclaration NVARCHAR(MAX)

-- Columns present in Corporate but missing from Corporate_Copy,
-- with their data type and (for character types) their length
SELECT DISTINCT COLUMN_NAME + ' ' + DATA_TYPE
       + ISNULL('(' + CONVERT(NVARCHAR(10), CHARACTER_MAXIMUM_LENGTH) + ')', '') AS Missing_Column
INTO #T
FROM INFORMATION_SCHEMA.COLUMNS a
WHERE a.COLUMN_NAME NOT IN (SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS b
                            WHERE b.TABLE_NAME = 'Corporate_Copy')
  AND a.TABLE_NAME = 'Corporate'

-- Build a comma-separated column list and wrap it in a single ALTER TABLE ... ADD
SELECT @ColumnDeclaration = STUFF((
    SELECT ', ' + Missing_Column FROM #T
    FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, '')

SET @SqlSelect = 'ALTER TABLE Corporate_Copy ADD ' + @ColumnDeclaration + ';'
PRINT @SqlSelect
ROLLBACK TRAN
You could use Schema Compare, found in SQL Server Data Tools (free), to generate a change script automatically.
But if this is just a copy, you could just run this:
DROP TABLE Corporate_Copy;
SELECT *
INTO Corporate_Copy
FROM Corporate;
It's not clear whether you really need to preserve the data in the copy. If so, it's not really a copy, is it?
In SQL Server, you can use the following query to list the columns that exist in one table but are missing from another:
select distinct a.* from INFORMATION_SCHEMA.COLUMNS a
where a.column_name not in (select column_name from INFORMATION_SCHEMA.COLUMNS b
where b.table_name in ('tbl_A'))
and a.table_name in ('tbl_B')
order by a.column_name
The output gives you enough information to create a simple script to add the columns which are missing.
For example, for each row of the result you would generate something like:
ALTER TABLE tbl_A ADD <Column_Name> <Data_Type> ...
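For instance, a sketch that turns those result rows directly into ALTER TABLE statements (lengths are handled in the same rough way as the query above, so review the output before running it):
-- Generate one ALTER TABLE ... ADD statement per missing column
select 'ALTER TABLE tbl_A ADD ' + a.column_name + ' ' + a.data_type
       + isnull('(' + convert(varchar(10), a.character_maximum_length) + ')', '') + ';'
from INFORMATION_SCHEMA.COLUMNS a
where a.column_name not in (select column_name from INFORMATION_SCHEMA.COLUMNS b
                            where b.table_name = 'tbl_A')
  and a.table_name = 'tbl_B'
order by a.column_name;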
Generate the CREATE script in SSMS (right-click on the table, then "Script Table as...").
Delete all the things that already exist. Usually they are at the beginning, and it's a simple
change from CREATE to ALTER ... ADD.
That should be possible using SELECT INTO; for example, the following SQL statement creates a backup copy of corporate:
SELECT * INTO corporate_copy
FROM corporate ;

How to get the "CREATE TABLE" query?

When I right-click on my view and click on "Script View As", I get the following error:
Property TextHeader is not available for View '[dbo].[TableName]'. This
property may not exist for this object, or may not be retrievable due
to insufficient access rights. The text is encrypted.
(Microsoft.SqlServer.Smo)
I was able to use bcp to get a dump of the table and also create a format file as given here. The table has about 60 columns and I do not want to manually write the CREATE TABLE query. Is there a way to do this automatically?
I was hoping that
BULK INSERT DB.dbo.TableName
FROM 'E:\Databases\TableName'
WITH (FORMATFILE = 'E:\Databases\TableName.Fmt');
GO
would do the trick but it looks like the table itself should be present in the database before I can execute the above query. Any suggestions?
You can construct the create table statement from INFORMATION_SCHEMA.Columns. Something like:
select column_name + ' ' + data_type
       + (case when character_maximum_length is not null
               then '(' + cast(character_maximum_length as varchar(10)) + ')'
               else ''
          end) + ','
       as ColumnDef
from Information_Schema.columns
where table_name = 'TableName'      -- the table you are rebuilding
order by ordinal_position
This is probably good enough. You can make it more complicated if you have to deal with numerics, for instance, or want "is null" to be accurate.
Copy the results into a new window, wrap them in the CREATE TABLE statement, remove the final comma and add the final closing paren.
You can do all of the last step in a more complex SQL statement, but it is easier to do manually for a one-time effort.
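For reference, a sketch of that more complex statement, stitching the column definitions together with FOR XML PATH (the table name comes from the question; NULL/NOT NULL handling and varchar(max) lengths, which show up as -1, are deliberately not handled):
-- Build the whole CREATE TABLE statement in one go; print it and review before running
declare @ddl nvarchar(max);

select @ddl = 'CREATE TABLE dbo.TableName (' + char(13) + char(10)
    + stuff((
        select ',' + column_name + ' ' + data_type
               + case when character_maximum_length is not null
                      then '(' + cast(character_maximum_length as varchar(10)) + ')'
                      else '' end + char(13) + char(10)
        from Information_Schema.columns
        where table_name = 'TableName'
        order by ordinal_position
        for xml path(''), type).value('.', 'nvarchar(max)'), 1, 1, ' ')
    + ');';

print @ddl;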

Create a stored procedure to iterate through a list of tables and truncate them in MySQL

I'm debating whether or not to try running through a list of tables and truncating them with a stored procedure. Would it be that easy with MySQL, and how would I do it?
The main piece of info you need is the list of tables. Most platforms support this:
select table_name from information_schema.tables
However, before you code the sproc, do a select * from information_schema.tables and examine the entries; there may be some you do not expect (system tables and such), so you may need to craft a filter to get the set you want.
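In MySQL, for instance, a filter along these lines (the schema name is a placeholder) usually narrows the list down to your own base tables:
-- Only base tables (no views) belonging to your own schema
select table_name
from information_schema.tables
where table_schema = 'your_database'    -- placeholder: your schema name
  and table_type = 'BASE TABLE';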
Since I don't do MySQL that much, I can't show you the MySQL code, but if you can translate this from MS SQL and fill in some blanks, you can make it work:
declare @table_name varchar(200), @prev varchar(200) = ''
declare @sql nvarchar(500)

while 1=1 begin
    set @table_name = null
    select top 1 @table_name = table_name
    from information_schema.tables
    where table_name > @prev        -- ...plus whatever filter you crafted above
    order by table_name

    if @table_name is null break
    set @prev = @table_name

    -- TRUNCATE TABLE will not accept a variable directly, so use dynamic sql here
    set @sql = N'truncate table ' + quotename(@table_name)
    exec (@sql)
end
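For completeness, a rough translation into a MySQL stored procedure, using a cursor over information_schema.tables and a prepared statement for each TRUNCATE (the schema name is a placeholder, and foreign keys may need separate handling):
DELIMITER $$
CREATE PROCEDURE truncate_all_tables()
BEGIN
    DECLARE done INT DEFAULT 0;
    DECLARE tname VARCHAR(200);
    -- Only base tables in the target schema; adjust the filter as needed
    DECLARE cur CURSOR FOR
        SELECT table_name FROM information_schema.tables
        WHERE table_schema = 'your_database' AND table_type = 'BASE TABLE';
    DECLARE CONTINUE HANDLER FOR NOT FOUND SET done = 1;

    OPEN cur;
    read_loop: LOOP
        FETCH cur INTO tname;
        IF done = 1 THEN
            LEAVE read_loop;
        END IF;
        -- TRUNCATE needs dynamic SQL because the table name is a variable
        SET @stmt = CONCAT('TRUNCATE TABLE `', tname, '`');
        PREPARE ps FROM @stmt;
        EXECUTE ps;
        DEALLOCATE PREPARE ps;
    END LOOP;
    CLOSE cur;
END$$
DELIMITER ;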