SQL Server temp table insert runs slow

I'm creating a temp table called #PILOTTERR. There are over 40,000 records that I need to insert into this table. This takes about 10 min to execute. Is there a way I can make this faster?
CREATE TABLE #PILOTTERR
(
Zip TEXT,
Office CHAR(4),
Branch NVARCHAR(33),
District NVARCHAR(37),
Region NVARCHAR(26),
SE_Territory NVARCHAR(42),
ISE_Territory NVARCHAR(43)
);
INSERT INTO #PILOTTERR
VALUES ('00544','NY04','Long Island','New York-Long Island','US Northeast','SE-New York/Long Island-A','ISE-New York/Long Island-D')
INSERT INTO #PILOTTERR
VALUES ('01001','MA01','North Boston','North Boston','US Northeast','SE-North Boston-C','ISE-North Boston-B')
etc....

Try this approach:
CREATE TABLE #PILOTTERR (
Zip NVARCHAR(10),
Office CHAR(4),
Branch NVARCHAR(33),
District NVARCHAR(37),
Region NVARCHAR(26),
SE_Territory NVARCHAR(42),
ISE_Territory NVARCHAR(43)
);
Rather than running multiple INSERT INTO statements, use only one:
INSERT INTO #PILOTTERR
VALUES (N'00544','NY04',N'Long Island',N'New York-Long Island',N'US Northeast',N'SE-New York/Long Island-A',N'ISE-New York/Long Island-D'),
(N'01001','MA01',N'North Boston',N'North Boston',N'US Northeast',N'SE-North Boston-C',N'ISE-North Boston-B'),
(etc..)
Also, if your data type is NVARCHAR, prefix the string literals with N (as in N'...') so the server does not have to convert from VARCHAR to NVARCHAR.
And replace TEXT with a much smaller, sensible data type such as NVARCHAR(10), which should be enough for a ZIP code.
Better yet, as I mentioned in my comment, if you are sourcing your data from an Excel or CSV file you can use SQL Server's Import Data Wizard.
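One caveat about the single-INSERT suggestion: a table value constructor (a single VALUES list) is limited to 1000 rows per INSERT in SQL Server, so 40,000 records still have to be split into batches. A rough sketch of that, wrapping the batches in one explicit transaction so each statement does not pay for its own commit (only the two rows from the question are shown; real batches would carry up to 1000 rows each):
BEGIN TRANSACTION;
INSERT INTO #PILOTTERR
VALUES (N'00544','NY04',N'Long Island',N'New York-Long Island',N'US Northeast',N'SE-New York/Long Island-A',N'ISE-New York/Long Island-D'),
(N'01001','MA01',N'North Boston',N'North Boston',N'US Northeast',N'SE-North Boston-C',N'ISE-North Boston-B');
-- ...further INSERT statements, each with up to 1000 rows...
COMMIT TRANSACTION;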

Related

Store more characters than the assigned column allows in SQL Server

Can I do something like this? The column is of type nchar(8), but the string I want to store in the table is longer than that.
The reason I am doing this is that I want to copy data from one table to another. Table A is nchar(8) and Table B is nvarchar(100). I want all characters in Table B transferred to Table A without losing a single character.
If the nvarchar(100) contains only Latin characters with a length of up to 16 chars, then you can squeeze the nvarchar(100) into the nchar(8): after the cast to varchar each character takes one byte, and an nchar(8) occupies 16 bytes, so up to 16 single-byte characters fit.
declare @t table
(
col100 nvarchar(100),
col8 nchar(8)
);
insert into @t(col100) values('1234567890123456');
update @t
set col8 = cast(cast(col100 as varchar(100)) as varbinary(100));
select *, cast(cast(cast(col8 as varbinary(100)) as varchar(100)) as nvarchar(100)) as from8to100_16charsmax
from @t;
If you cannot modify A, then you cannot use it to store the data. Create another table for the overflow . . . something like:
create table a_overflow (
a_pk int primary key references a(pk),
[column] nvarchar(max) -- why stop at 100?
);
Then, you can construct a view to bring in the data from this table when viewing a:
create view vw_a as
select . . . , -- all the other columns
coalesce(ao.[column], a.[column]) as [column]
from a left join
a_overflow ao
on ao.a_pk = a.pk;
And, if you really want to "hide" the view, you can create an insert trigger on vw_a, which inserts the appropriate values into the two tables.
This is a lot of work. Simply modifying the table is much simpler. That said, this approach is sometimes needed when you need to modify a large table and altering a column would incur too much locking overhead.
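A minimal sketch of such an INSTEAD OF INSERT trigger, assuming the view exposes just the pk and [column] columns from the schematic tables above (the trigger name and column handling are illustrative):
create trigger tr_vw_a_insert on vw_a
instead of insert
as
begin
-- the base table keeps at most the 8 characters that nchar(8) can hold
insert into a (pk, [column])
select pk, left([column], 8)
from inserted;
-- the full value is stored in the overflow table only when it is too long
insert into a_overflow (a_pk, [column])
select pk, [column]
from inserted
where len([column]) > 8;
end;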

How to insert into a column of type text

I wish to insert into a SQL table field whose data type is text. However, I am informed of an error saying 'check datatype'. My Name field is of type nvarchar and my Job field is of type text.
INSERT INTO Table1 (Name, Job) VALUES ('John', 'Clerk')
In MS SQL Server, you won't be able to insert string values with more than one character if the column was declared as nvarchar without a length, because that defaults to nvarchar(1), so only a single character fits.
If you wish to insert longer text, specify a size with nvarchar.
For example in your case:
Create table Table1(Name nvarchar(5), Job Text)
Insert into Table1(Name, Job) values ('John','Clerk')
This will work.
Hope it will help you out.

Bulk inserting data gives error

I am attempting to bulk insert into a table and I am getting the error:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 31, column 4 (Birthday).
Below is the code I am trying to use to insert the data:
Bulk Insert Dzt.dbo.Player
From 'A:\New Folder\Seed_Files\Player.csv'
With
(
FieldTerminator=',',
RowTerminator='\n',
FirstRow=2
)
Here is the code I used when making the table:
Use Dzt
Create Table Player
(
Player_ID int,
FirstName varchar(255),
LastName varchar(255),
Birthday date,
Email varchar(255),
L_Flag varchar(255)
);
This is my first attempt at making a table and inserting data, so I am thinking it is likely a datatype error for the Birthday field, but I have been unable to find anything online that I can wrap my head around at this time. I have also tried using the datatype datetime instead of date, but I received the same error.
I am using SSMS 2012 to create and insert the data onto a 2012 SQL Server.
Let me know if there is anything else I can provide that might help.
As you suspect it could be a date format error, I would suggest importing the CSV into a staging table with the Birthday column set to a varchar type. Then use this query to filter out the erroneous records.
select birthday from temptable where isdate(birthday) = 0
You could then correct those records and insert them into your original table.
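A sketch of that approach, assuming a staging table named Player_Staging (the table name and column sizes are illustrative):
Create Table Dzt.dbo.Player_Staging
(
Player_ID varchar(255),
FirstName varchar(255),
LastName varchar(255),
Birthday varchar(255),
Email varchar(255),
L_Flag varchar(255)
);
Bulk Insert Dzt.dbo.Player_Staging
From 'A:\New Folder\Seed_Files\Player.csv'
With
(
FieldTerminator=',',
RowTerminator='\n',
FirstRow=2
);
-- rows whose Birthday value cannot be read as a date
Select * From Dzt.dbo.Player_Staging Where IsDate(Birthday) = 0;
-- once those rows are corrected, load the clean data into the real table
Insert Into Dzt.dbo.Player (Player_ID, FirstName, LastName, Birthday, Email, L_Flag)
Select Cast(Player_ID as int), FirstName, LastName, Cast(Birthday as date), Email, L_Flag
From Dzt.dbo.Player_Staging
Where IsDate(Birthday) = 1;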

Finding character values outside ASCII range in an NVARCHAR column

Is there a simple way of finding rows in an Oracle table where a specific NVARCHAR2 column has one or more characters which wouldn't fit into the standard ASCII range?
(I'm building a warehousing and data extraction process which takes the Oracle data, drags it into SQL Server -- UCS-2 NVARCHAR -- and then exports it to a UTF-8 XML file. I'm pretty sure I'm doing all the translation properly, but I'd like to find a bunch of real data to test with that's more likely to cause problems.)
Not sure how to tackle this in Oracle, but here is something I've done in MS-SQL to deal with the same issue...
create table #temp (id int, descr nvarchar(200))
insert into #temp values(1,'Now is a good time')
insert into #temp values(2,'So is yesterday')
insert into #temp values(2,'But not '+NCHAR(2012))
select *
from #temp
where CAST(descr as varchar(200)) <> descr
drop table #temp
Sparky's example for SQL Server was enough to lead me to a pretty simple Oracle solution, once I'd found the handy ASCIISTR() function.
SELECT
*
FROM
test_table
WHERE
test_column != ASCIISTR(test_column)
...seems to find any data outside the standard 7-bit ASCII range, and appears to work for NVARCHAR2 and VARCHAR2.

Activity displayed as one character only

Good Morning
I created a little procedure where I add activities to a database table.
This is the code I used.
USE dbActivities
GO
CREATE PROCEDURE addActivity
@ActivityDate VARCHAR,
@description VARCHAR,
@maxPeople INT,
@Cost MONEY
AS
BEGIN
INSERT INTO Activity(ActivityDate,Description, MaxPeople, Cost)
VALUES(@ActivityDate, @description, @maxPeople, @Cost)
END
GO
I then select the table to view it.
USE dbActivities
GO
CREATE PROCEDURE viewActivities
AS
SELECT * FROM Activity
GO
The strange thing, however, is that the Description is displayed as only one character in the data table. So if I add a description of, say, "Swimming", when I view the table only the single character 'S' is shown.
Why is that?
regards
Arian
A VARCHAR parameter declared without a length defaults to VARCHAR(1), which is why only the first character is stored. Use e.g. VARCHAR(60) instead.
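A corrected version of the procedure might look like this (the lengths shown are illustrative; size them to fit your data):
USE dbActivities
GO
ALTER PROCEDURE addActivity
@ActivityDate VARCHAR(20), -- explicit lengths so the values are not truncated to one character
@description VARCHAR(200),
@maxPeople INT,
@Cost MONEY
AS
BEGIN
INSERT INTO Activity(ActivityDate, Description, MaxPeople, Cost)
VALUES(@ActivityDate, @description, @maxPeople, @Cost)
END
GO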