SQL Server 2005 - Bulk Insert failing - sql

I have a txt file that contains 1600 rows and 82 columns of comma-delimited data that I am trying to import into a table. I get the following error on every row, always on the very last field:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 81 (DB252D20C8).
The import statement is
BULK
INSERT [ENERGY].[dbo].[READINGS1]
from 'c:\readings2.txt'
with
(
DATAFILETYPE='widechar',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
The table structure is as follows, the top and bottom of the script:
USE [ENERGY]
GO
/****** Object: Table [dbo].[READINGS1] Script Date: 05/13/2013 20:00:30 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[READINGS1](
[DateAndTime] [datetime] NOT NULL,
[DB240D4C7] [float] NULL,
[DB240D8C7] [float] NULL,
[DB240D12C7] [float] NULL,
[DB240D16C7] [float] NULL,
[DB252D12C8] [float] NULL,
[DB252D16C8] [float] NULL,
[DB252D20C8] [float] NULL,
CONSTRAINT [READINGS1DataTimeStamp] PRIMARY KEY CLUSTERED
(
[DateAndTime] ASC
)WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
The text file is as follows:
2013-02-19 00:00:00.000,6,945,1886,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,22,2040,6299,0,0,6,567,1248,0,0,251,8859,8655,0,0,10,316,1786,0,0,7,180,1206,0,0,1,16,56,0,0,368,18953,36949,0,0,NULL,NULL
2013-02-19 01:00:00.000,6,147,1886,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,22,1516,6299,0,0,3,115,1248,0,0,250,5077,8655,0,0,9,219,1786,0,0,5,147,1206,0,0,1,15,56,0,0,362,8907,36949,0,0,NULL,NULL

Alright, so what you need to do is alter your statement so that at the end you add KEEPNULLS. This informs SQL Server that you wish to keep your null values. Currently it is trying to convert the string NULL into your FLOAT column. Alter your statement to look like this:
BULK
INSERT [ENERGY].[dbo].[READINGS1]
from 'c:\readings2.txt'
with
(
DATAFILETYPE='widechar',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
KEEPNULLS
)
GO
There is an article on Books Online (BOL) about this.
Otherwise you can always build an Integration Services (SSIS) package to handle this. That is an easy, fast way to import data from flat-file sources.

It turns out that there were too many fields in the input text file for the table.
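For next time, a mismatch like this can be caught before running BULK INSERT by counting the delimited fields in each line and comparing against the table's column count. A minimal Python sketch (the file name and expected count below are placeholders, not values from the question):

```python
def check_field_counts(path, expected_fields, delimiter=",", encoding="utf-16"):
    """Return (line_number, field_count) for every line whose field count
    differs from expected_fields. DATAFILETYPE='widechar' files are UTF-16."""
    bad_rows = []
    with open(path, encoding=encoding) as f:
        for lineno, line in enumerate(f, start=1):
            count = len(line.rstrip("\r\n").split(delimiter))
            if count != expected_fields:
                bad_rows.append((lineno, count))
    return bad_rows
```

Running this against readings2.txt with the table's real column count would have flagged every row, pointing at extra trailing fields rather than a type-conversion problem.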

Related

How to retrieve German characters from a large CSV File into SQL Server 2017 script

I have a CSV file containing a list of employees, some of whom have German characters like 'ö' in their names. I need to create a temp table in my SQL Server 2017 script and fill it with the content of the CSV file. My script is:
CREATE TABLE #AllAdUsers(
[PhysicalDeliveryOfficeName] [NVARCHAR](255) NULL,
[Name] [NVARCHAR](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[DisplayName] [NVARCHAR](255) NULL,
[Company] [NVARCHAR](255) NULL,
[SAMAccountName] [NVARCHAR](255) NULL
)
--import AD users
BULK INSERT #AllAdUsers
FROM 'C:\Employees.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
However, even though I use the NVARCHAR type with the SQL_Latin1_General_CP1_CI_AS collation, the German characters do not come out right; for instance, "Kösker" appears as:
"K├╢sker"
I've tried many other collations but couldn't find a fix for it. Any help would be very much appreciated.
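For what it's worth, "K├╢sker" is the classic signature of UTF-8 bytes being decoded under a DOS/OEM code page, not a collation problem. A quick Python check (outside SQL Server, assuming cp437 as the OEM code page) reproduces the garbling exactly:

```python
# The two UTF-8 bytes of "ö" (0xC3 0xB6) map to the box-drawing
# characters "├" and "╢" in the OEM code page cp437.
original = "Kösker"
garbled = original.encode("utf-8").decode("cp437")
print(garbled)  # -> K├╢sker
```

If that matches, the file is UTF-8 and the collation is a red herring; adding CODEPAGE = '65001' to the BULK INSERT options (supported on SQL Server 2017), or saving the file as UTF-16 and using DATAFILETYPE = 'widechar', would be the usual fixes.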

Create a database schema-script in ssms

I have a fully functional database in sql server. Around 40 tables. I have to install this schema (only the schema, not the data) on multiple other sql server instances. SSMS offers a nice way to auto generate schemas using Tasks --> Generate Scripts. It kinda works, but I am not sure if I understand it correctly:
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[TableName]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[TableName](
[id] [uniqueidentifier] NOT NULL,
[history] [varchar](max) NOT NULL,
[isdeleted] [bit] NOT NULL,
CONSTRAINT [PK_RecGroupData] PRIMARY KEY CLUSTERED
(
[rid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
END
GO
IF NOT EXISTS (SELECT * FROM dbo.sysobjects WHERE id = OBJECT_ID(N'[dbo].[DF_TableName_id]') AND type = 'D')
BEGIN
ALTER TABLE [dbo].[TableName] ADD CONSTRAINT [DF_TableName_id] DEFAULT (newid()) FOR [id]
END
--Just showing one ALTER TABLE and IF NOT EXISTS. The others are generated in the same way.
What happens if I execute the script, then create a new script with the exact same content but with a new column added (id, history, isdeleted and timestamp)? Does it automatically add the new column? I think yes, of course, but I don't get how it would know whether the column should be NOT NULL, VARCHAR, BIT, or something similar. It would just execute
ALTER TABLE [dbo].[TableName] ADD CONSTRAINT [DF_TableName_id] DEFAULT (newid()) FOR [id]
(id => new sample column)
But there isn't any information about the data type or any other modifiers.
Also, if I execute my script like this one a second time, it'll throw some errors:
Meldung 1781, Ebene 16, Status 1, Zeile 3
An die Spalte ist bereits ein DEFAULT-Wert gebunden.
Which translates to this:
Message 1781, level 16, status 1, line 3
A DEFAULT value is already bound to the column.
Why does this happen?
The error message is saying that there was a default value assigned to that column before.
Also:
ALTER TABLE [dbo].[TableName] ADD CONSTRAINT [DF_TableName_id] DEFAULT (newid()) FOR [id]
is not the syntax for adding a new column - it binds a default value of NEWID() to the existing column [id].
To add a column you should follow these steps (with an example inside).
Also, how would SQL Server know the settings for the new columns from your manually added lines? It simply lets you define them as you want, and either accepts the script if the syntax is right or throws an error during the parse step (which can be run with [Ctrl]+[F5] in SSMS).

cannot write to newly created table in SQL Azure

In our Azure SQL database we have a table App_Tracking that is/was used to track user actions. We needed to increase the size of the log buffer, so I first copied all the records over to an archive table defined with this SQL statement:
CREATE TABLE [dbo].[App_Tracking_Nov20_2015](
[ID] [int] IDENTITY(1,1) NOT NULL,
[UserID] [nvarchar](50) NOT NULL,
[App_Usage] [nvarchar](1024) NOT NULL,
[Timestamp] [datetime] NOT NULL )
Then, using SQL Server Management Studio 2012, I recreated the original table using the generated Drop/Create script:
USE [tblAdmin]
GO
/****** Object: Table [dbo].[App_Tracking] Script Date: 11/21/2015 11:42:01 AM ******/
DROP TABLE [dbo].[App_Tracking]
GO
/****** Object: Table [dbo].[App_Tracking] Script Date: 11/21/2015 11:42:01 AM ******/
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[App_Tracking](
[ID] [int] IDENTITY(1,1) NOT NULL,
[UserID] [nvarchar](50) NOT NULL,
[App_Usage] [nvarchar](4000) NOT NULL,
[Timestamp] [datetime] NOT NULL,
CONSTRAINT [PrimaryKey_7c88841f-aaaa-bbbb-cccc-c26fe6a5720e] PRIMARY KEY CLUSTERED (
[ID] ASC )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) )
GO
This is the automated drop/create script that SSMS 2012 generates for you.
I then updated statistics on App_Admin using EXEC sp_updatestats
The gotcha is that I can no longer programmatically add records to this table.
If I open App_Admin from manage.windowsazure.net and choose "open in Visual Studio", I can manually add a record to it. But if in SSMS 2012 I run the code
USE [tblAdmin]
GO
UPDATE [dbo].[App_Tracking] SET
[UserID] = 'e146ba22-930c-4b22-ac3c-15da47722e75' ,
[App_Usage] = 'search search: Bad Keyword: asdfadsfs' ,
[Timestamp] = '2015-11-20 20:00:18.700'
GO
nothing gets updated but no error is thrown.
If programmatically I use
var adminContext = new App_AdminEntities();
string prunedAction = action.Length <= 4000 ? action : action.Trim().Substring(0, 4000); // ensure we don't fault on overflow of too long a keyword list
var appTracking = new App_Tracking
{
UserID = userId,
App_Usage = prunedAction,
Timestamp = DateTime.Now
};
try {
adminContext.App_Tracking.Add(appTracking);
adminContext.SaveChanges();
adminContext.Dispose();
}
I get an error thrown on SaveChanges (which is the .NET Entity Framework call). What did I do wrong?
OK, so I found the problem. It turns out I had not updated the associated EDMX file, so the error was being thrown by internal entity validation - which is kind of hidden under the covers.

Processing ProcessParameters as XML in SQL Server

I am trying to extract values from an XML column. Unfortunately, whatever combination I try, I can't get any meaningful result out of it.
A test script with data can be found here
Related questions that did not turn the light on for me
Getting values from XML type field
XML query() works, value() requires singleton
Getting rowsets from XQuery and SQL Server 2005
Example of the contents of one item
<Dictionary xmlns="clr-namespace:System.Collections.Generic;assembly=mscorlib" xmlns:mtbwa="clr-namespace:Microsoft.TeamFoundation.Build.Workflow.Activities;assembly=Microsoft.TeamFoundation.Build.Workflow" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" x:TypeArguments="x:String, x:Object">
<mtbwa:BuildSettings x:Key="BuildSettings" ProjectsToBuild="$/Projects/BpABA/Dev/V6/DUnit/FrameworkTests.dproj">
<mtbwa:BuildSettings.PlatformConfigurations>
<mtbwa:PlatformConfigurationList Capacity="1">
<mtbwa:PlatformConfiguration Configuration="Debug" Platform="Win32" />
</mtbwa:PlatformConfigurationList>
</mtbwa:BuildSettings.PlatformConfigurations>
</mtbwa:BuildSettings>
<mtbwa:SourceAndSymbolServerSettings SymbolStorePath="{x:Null}" x:Key="SourceAndSymbolServerSettings" />
<mtbwa:AgentSettings x:Key="AgentSettings" MaxExecutionTime="01:00:00" MaxWaitTime="04:00:00" Tags="Delphi 5" />
<x:Boolean x:Key="CreateWorkItem">False</x:Boolean>
<x:Boolean x:Key="PerformTestImpactAnalysis">False</x:Boolean>
</Dictionary>
Latest attempt
;WITH XMLNAMESPACES('http://schemas.microsoft.com/winfx/2006/xaml' AS mtbwa)
, q AS (
SELECT CAST(bd.ProcessParameters AS XML) p
FROM dbo.tbl_BuildDefinition bd
)
SELECT X.Doc.value('mtbwa:BuildSettings[0]', 'VARCHAR(50)') AS 'Test'
FROM q CROSS APPLY p.nodes('/mtbwa:Dictionary') AS X(Doc)
Background
The column ProcessParameters is part of the TFS build system in the tbl_BuildDefinition table.
The complete DDL is as follows
USE [Tfs_ProjectCollection]
GO
/****** Object: Table [dbo].[tbl_BuildDefinition] Script Date: 06/19/2012 16:28:56 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tbl_BuildDefinition](
[DefinitionId] [int] IDENTITY(1,1) NOT NULL,
[GroupId] [int] NOT NULL,
[DefinitionName] [nvarchar](260) NOT NULL,
[ControllerId] [int] NOT NULL,
[DropLocation] [nvarchar](260) NULL,
[ContinuousIntegrationType] [tinyint] NOT NULL,
[ContinuousIntegrationQuietPeriod] [int] NOT NULL,
[LastBuildUri] [nvarchar](64) NULL,
[LastGoodBuildUri] [nvarchar](64) NULL,
[LastGoodBuildLabel] [nvarchar](326) NULL,
[Enabled] [bit] NOT NULL,
[Description] [nvarchar](2048) NULL,
[LastSystemQueueId] [int] NULL,
[LastSystemBuildStartTime] [datetime] NULL,
[ProcessTemplateId] [int] NOT NULL,
[ProcessParameters] [nvarchar](max) NULL,
[ScheduleJobId] [uniqueidentifier] NOT NULL,
CONSTRAINT [PK_tbl_BuildDefinition] PRIMARY KEY CLUSTERED
(
[GroupId] ASC,
[DefinitionName] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[tbl_BuildDefinition] ADD DEFAULT (newid()) FOR [ScheduleJobId]
GO
I think you have the wrong namespace defined for your mtbwa prefix in your XML/XQuery text, and you need to use 1-based indexing (not 0-based, as is common elsewhere) when picking the node for the .value() function.
So try this:
;WITH XMLNAMESPACES('clr-namespace:Microsoft.TeamFoundation.Build.Workflow.Activities;assembly=Microsoft.TeamFoundation.Build.Workflow' AS mtbwa,
DEFAULT 'clr-namespace:System.Collections.Generic;assembly=mscorlib')
, q AS (
SELECT CAST(bd.ProcessParameters AS XML) p
FROM dbo.tbl_BuildDefinition bd
WHERE DefinitionId = 1
)
SELECT
X.Doc.query('mtbwa:BuildSettings') AS 'Node',
X.Doc.value('(mtbwa:BuildSettings/@ProjectsToBuild)[1]', 'VARCHAR(50)') AS 'ProjectsToBuild'
FROM
q
CROSS APPLY
p.nodes('/Dictionary') AS X(Doc)
This should give you the whole <mtbwa:BuildSettings> node as XML (using the .query() function), as well as the value of the single attribute ProjectsToBuild ($/Projects/BpABA/Dev/V6/DUnit/FrameworkTests.dproj) of that node.
If you want a whole node (as XML), then you need to use .query('xpath') - the .value() function can get you the inner text of a node (if present), or the value of a single attribute.
Does that help at all?
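As a cross-check outside SQL Server, the same two fixes (the correct clr-namespace URI for the mtbwa prefix, plus an attribute lookup) can be verified with Python's ElementTree against a trimmed copy of the document:

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the ProcessParameters document from the question.
xml_text = (
    '<Dictionary xmlns="clr-namespace:System.Collections.Generic;assembly=mscorlib" '
    'xmlns:mtbwa="clr-namespace:Microsoft.TeamFoundation.Build.Workflow.Activities;'
    'assembly=Microsoft.TeamFoundation.Build.Workflow" '
    'xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" '
    'x:TypeArguments="x:String, x:Object">'
    '<mtbwa:BuildSettings x:Key="BuildSettings" '
    'ProjectsToBuild="$/Projects/BpABA/Dev/V6/DUnit/FrameworkTests.dproj" />'
    '</Dictionary>'
)

# The prefix must be bound to the full clr-namespace URI, exactly as in
# the xmlns:mtbwa declaration - a wrong URI finds nothing, just like in T-SQL.
ns = {"mtbwa": ("clr-namespace:Microsoft.TeamFoundation.Build.Workflow.Activities;"
                "assembly=Microsoft.TeamFoundation.Build.Workflow")}
root = ET.fromstring(xml_text)
settings = root.find("mtbwa:BuildSettings", ns)
print(settings.get("ProjectsToBuild"))  # the plain (un-namespaced) attribute
```

In the T-SQL version, this corresponds to declaring the clr-namespace URI in WITH XMLNAMESPACES and reading the attribute with .value('(mtbwa:BuildSettings/@ProjectsToBuild)[1]', ...).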

How can I import data to SQL from CSV or XLS automatically incrementing a string field based on current records in DB?

I need to import data from Excel into a SQL 2000 db.
I need to import 6 fields from the worksheet and increment a string field containing an integer padded to 5 characters with leading zeros. This field is not the primary key and the db does not populate it automatically. The DB will also allow this field to be entered as NULL and changed afterwards, if that helps.
I can get the data into the table I need using a combination of rookie DTS and insert statements, manually updating the string field for the 20 records I have to do today, but next week I need to import around 1000 records.
Should I write a C#/ADO.net app to do this, [bearing in mind I'm a newbie so that'll take me a couple of days :-) ] or is there a way I can increment a string field using DTS directly or some sort of loop in an insert statement?
Thanks in advance
G
EDIT: The table I'm inserting into is constructed as below, and I need to update "cedeviceid", "vanwarehouse", "username", "devicesimnumber", "UserGroup" and "ServiceMgr" from the Excel sheet. "sendercode" is the string I need to increment.
CREATE TABLE [dbo].[mw_gsmprofile](
[cedeviceid] varchar NOT NULL,
[mainwarehouse] varchar NULL,
[vanwarehouse] varchar NULL,
[username] varchar NULL,
[sendercode] varchar NULL,
[devicesimnumber] varchar NULL,
[usersupportgsm] [int] NULL,
[userisonline] [int] NULL,
[onlinedate] varchar NULL,
[lastsentsequenceno] [int] NULL,
[lastsentdate] varchar NULL,
[lastreceivedsequenceno] [int] NULL,
[lastreceiveddate] varchar NULL,
[EnableAutoDownloading] [int] NULL,
[EnableCompressFile] [int] NULL,
[LogonUserName] varchar NULL,
[LogonPassword] varchar NULL,
[LogonDomain] varchar NULL,
[UserGroup] varchar NULL,
[UseStorageCard] [int] NULL,
[SMSMapProfile] varchar NULL,
[SMPPClientFlag] [int] NULL,
[LASTUPDATE] varchar NULL,
[ServiceMgr] varchar NULL,
[VanLocation] varchar NULL,
[OnHireWarehouse] varchar NULL,
[OnHireWhsRepType] [int] NULL,
[HireDepotWarehouse] varchar NULL,
[HireDepotWhsRepType] [int] NULL,
CONSTRAINT [PK_mw_gsmprofile] PRIMARY KEY CLUSTERED
(
[cedeviceid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
SAMPLE DATA
cedeviceid,vanwarehouse, username, devicesimnumber, UserGroup, ServiceMgr
3431, 999, INSTALL TEAM 1,,INSTAL, AHOA
3441, 999, INSTALL TEAM 2,,INSTAL, AHOA
3451, 999, INSTALL TEAM 3,,INSTAL, AHOA
3461, 999, INSTALL TEAM 4,,INSTAL, AHOA
3471, 999, INSTALL TEAM 5,,INSTAL, AHOA
3472, 999, INSTALL TEAM 6,,INSTAL, AHOA
Some slight own-trumpet blowing here, but my own FOSS tool CSVfix can do this without writing any code, using the (inexplicably) undocumented sequence command. For example, given a CSV file:
foo,bar
one,two
three,four
then:
csvfix sequence -n 42 -p 5 afile.csv
would produce the output:
00042, foo, bar
00043, one, two
00044, three, four
The -p option specifies the padding and the -n option the starting number.
Now to find out how it got omitted from the help file...
I would do this with a small app. You have to get the first item of the table sorted in reverse order (this gives you the maximum value of the id). Once you know the maximum, you can increment it very easily with the expressiveness of a programming language.
Get the starting value from the database
Iterate over the imported records, inserting them one at a time into the DB. Use a counter variable to increment the field, and use the ToString("00000") overload to pad the number.
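As a sketch of that loop (in Python for brevity; the starting value is assumed to come from the current MAX(sendercode) in the table, and the row fields are illustrative):

```python
def assign_sender_codes(rows, start):
    """Attach an incrementing 5-character, zero-padded string to each row -
    the same effect as C#'s counter.ToString("00000")."""
    coded = []
    for offset, row in enumerate(rows):
        # format(n, "05d") pads the integer to 5 digits with leading zeros
        coded.append({**row, "sendercode": format(start + offset, "05d")})
    return coded

rows = [{"cedeviceid": "3431"}, {"cedeviceid": "3441"}]
print(assign_sender_codes(rows, 42))
# first row gets sendercode "00042", second gets "00043"
```

Each resulting row would then be inserted with a parameterised INSERT, keeping the counter in step with the rows actually written.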