Read .csv table and print it as SQL query output - sql

I am using a bastardised version of T-SQL to generate reports about information within a database-driven CAD package (SolidWorks Electrical). I am trying to generate a Table of Contents. Due to limitations within the software, I have to generate this table using SQL.
What I would like to do is create the Table of Contents in Excel, save it as a .csv, and have my SQL query read this file and spit it out as an output.
Example Table:
Sheet,System
1,Radios
2,Processors
3,Navigation
After some searching I've been unable to find a solution myself. My problems are:
1) Read a .csv file stored on my hard drive
2) Turn this .csv file into a table (it can't be stored in the database; it's just temporary while we run the query)
3) Output the data in this table as the results of the query
I have tried the following to read my .csv table, but receive the error "Syntax error, permission violation, or other nonspecific error", so it's possible my software just won't allow me to read external files. (NB: my software uses ]] [[ instead of quotes.)
select
]]col1[[,
]]col2[[,
]]col3[[
from openrowset('MSDASQL'
,'Driver={Microsoft Access Text Driver (*.txt, *.csv)}'
,'select * from D:\SQL Queries\input.CSV')
Any assistance would be much appreciated! Thanks

This SQL works for me:
select * from openrowset (bulk N'C:\Temp\source.csv', formatfile = N'C:\Temp\format.xml', firstrow=2) SourceFile
Content of source.csv is this:
Sheet,System
1,Radios
2,Processors
3,Navigation
Content of format.xml is this:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR=","/>
    <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="128" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ID" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="Name" xsi:type="SQLNVARCHAR"/>
  </ROW>
</BCPFORMAT>

Related

Converting Complex XML to CSV

I have some (complex to me) XML that I need to convert into CSV. I need absolutely every value added to the CSV for every submission. I have tried a few basic things, however I can't get past the deep nesting and the different structures of this file.
Could someone please help me with a PowerShell script that would do this? I have started one, but I cannot get all of the data out; I only get Canvas Results.
Submissions.xml is too large to post here (102 KB)
$d = ([xml](gc submissions.xml)).CANVASRESULTS | % {
    foreach ($i in $_.CANVASRESULTS) {
        $o = New-Object Object
        Add-Member -InputObject $o -MemberType NoteProperty -Name Submissions -Value $_.Submission
        Add-Member -InputObject $o -MemberType NoteProperty -Name Submission -Value $i
        $o
    }
}
$d | ConvertTo-Csv -NoTypeInformation -Delimiter ","
Any time a complex XML document has deeply nested structures and you need to migrate it into a flat file format (i.e., txt, csv, xlsx, sql), consider using XSLT to simplify the XML first. For background, XSLT is a declarative, special-purpose programming language used to style, re-format, and re-structure XML/HTML and other SGML markup documents for various end-use purposes. (Aside: SQL is also a declarative, special-purpose programming language.)
For most software to import XML into a flat format of rows and columns, the XML must consist of repeating elements (rows/records) with one level of children for columns/fields:
<data>
  <row>
    <column1>value</column1>
    <column2>value</column2>
    <column3>value</column3>
    ...
  </row>
  <row>
    ...
  </row>
</data>
Nearly every programming language ships with an XSLT processor, including PowerShell, Java, C#, Perl, PHP, Python, SAS, and even VBA in your everyday MS Excel. For your complex XML, below is an example XSLT stylesheet with its output. Do note that I manually create nodes based on values from the original XML:
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:strip-space elements="*"/>
  <xsl:output method="xml" indent="yes"/>
  <xsl:template match="CanvasResult">
    <Data>
      <xsl:for-each select="//Responses">
        <Submission>
          <Fitter><xsl:value-of select="Response[contains(Label, 'Fitter Name')]/Value"/></Fitter>
          <Date><xsl:value-of select="Response[Label='Date']/Value"/></Date>
          <Time><xsl:value-of select="Response[Label='Time']/Value"/></Time>
          <Client><xsl:value-of select="Response[Label='Client']/Value"/></Client>
          <Machine><xsl:value-of select="Response[Label='Machine']/Value"/></Machine>
          <Hours><xsl:value-of select="Response[Label='Hours']/Value"/></Hours>
          <Signature><xsl:value-of select="Response[Label='Signature']/Value"/></Signature>
          <SubmissionDate><xsl:value-of select="Response[Label='Submission Date:']/Value"/></SubmissionDate>
          <SubmissionTime><xsl:value-of select="Response[Label='Submission Time:']/Value"/></SubmissionTime>
          <Customer><xsl:value-of select="Response[Label='Customer:']/Value"/></Customer>
          <PlantLocation><xsl:value-of select="Response[Label='Plant Location']/Value"/></PlantLocation>
          <PlantType><xsl:value-of select="Response[Label='Plant Type:']/Value"/></PlantType>
          <PlantID><xsl:value-of select="Response[Label='Plant ID:']/Value"/></PlantID>
          <PlantHours><xsl:value-of select="Response[Label='Plant Hours:']/Value"/></PlantHours>
          <RegoExpiryDate><xsl:value-of select="Response[Label='Rego Expiry Date:']/Value"/></RegoExpiryDate>
          <Comments><xsl:value-of select="Response[Label='Comments:']/Value"/></Comments>
        </Submission>
      </xsl:for-each>
    </Data>
  </xsl:template>
</xsl:stylesheet>
Output
<?xml version='1.0' encoding='UTF-8'?>
<Data>
  ...
  <Submission>
    <Fitter>Damian Stewart</Fitter>
    <Date/>
    <Time/>
    <Client/>
    <Machine/>
    <Hours/>
    <Signature/>
    <SubmissionDate>28/09/2015</SubmissionDate>
    <SubmissionTime>16:30</SubmissionTime>
    <Customer>Dicks Diesels</Customer>
    <PlantLocation/>
    <PlantType>Dozer</PlantType>
    <PlantID>DZ09</PlantID>
    <PlantHours>2213.6</PlantHours>
    <RegoExpiryDate>05/03/2016</RegoExpiryDate>
    <Comments>Moving tomorrow from Daracon BOP to KCE BOP S6A Dam
Cabbie to operate</Comments>
  </Submission>
  ...
</Data>
From there, you can import the two-dimensional XML into a usable rows/columns format. Below is the same import into an MS Access database and an MS Excel spreadsheet. You will notice gaps in the data where the original XML did not populate the created nodes (handled in the XSLT). A simple SQL cleanup can render the final dataset.
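As an illustration of that final import step, here is a minimal sketch (Python standard library only; the sample XML is a trimmed, hypothetical version of the transformed output above) that flattens the `<Submission>` elements into CSV rows, which most tools can then load directly:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical sample, trimmed from the transformed XML shown above.
xml_text = """<Data>
  <Submission>
    <Fitter>Damian Stewart</Fitter>
    <SubmissionDate>28/09/2015</SubmissionDate>
    <Customer>Dicks Diesels</Customer>
  </Submission>
</Data>"""

def submissions_to_csv(xml_string):
    """Flatten each <Submission> into one CSV row; header comes from tag names."""
    root = ET.fromstring(xml_string)
    rows = [{child.tag: (child.text or "") for child in sub}
            for sub in root.findall("Submission")]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(submissions_to_csv(xml_text))
```

Empty elements such as `<Date/>` simply become empty CSV cells, which matches the gaps you see in the Access/Excel imports.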
Database Import

Error using SSIS Bids in Visual Studio 2012

I have to load several tables into SQL Server 2012 from SQL Server 2000. I heard BIDS could do this; I'm pretty new to it and wanted some help. I would really appreciate whatever help I get with it.
I have already installed BIDS Helper and used the code below, but it gives me errors stating:
Error 1187 Illegal syntax. Expecting valid start name character.
Error 1188 Character '#', hexadecimal value 0x23 is illegal in an XML name.
Error 1189 The character '@', hexadecimal value 0x40 is illegal at the beginning of an XML name.
<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#@ import namespace="System.IO" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<!--
<#
string connectionStringSource = @"Server=xxxxx;Initial Catalog=xxxx;Integrated Security=SSPI;Provider=sqloledb";
string connectionStringDestination = @"Server=xxxxxx;Initial Catalog=xxxxxxx;Integrated Security=SSPI;Provider=SQLNCLI11.1";
string SrcTableQuery = @"
SELECT
SCHEMA_NAME(t.schema_id) AS schemaName
, T.name AS tableName
FROM
sys.tables AS T
WHERE
T.is_ms_shipped = 0
AND T.name <> 'sysdiagrams';
";
DataTable dt = null;
dt = ExternalDataAccess.GetDataTable(connectionStringSource, SrcTableQuery);
#>
-->
<Connections>
<OleDbConnection
Name="SRC"
CreateInProject="false"
ConnectionString="<#=connectionStringSource#>"
RetainSameConnection="false">
</OleDbConnection>
<OleDbConnection
Name="DST"
CreateInProject="false"
ConnectionString="<#=connectionStringDestination#>"
RetainSameConnection="false">
</OleDbConnection>
</Connections>
<Packages>
<# foreach (DataRow dr in dt.Rows) { #>
<Package ConstraintMode="Linear"
Name="<#=dr[1].ToString()#>"
>
<Variables>
<Variable Name="SchemaName" DataType="String"><#=dr[0].ToString()#></Variable>
<Variable Name="TableName" DataType="String"><#=dr[1].ToString()#></Variable>
<Variable Name="QualifiedTableSchema"
DataType="String"
EvaluateAsExpression="true">"[" + @[User::SchemaName] + "].[" + @[User::TableName] + "]"</Variable>
</Variables>
<Tasks>
<Dataflow
Name="DFT"
>
<Transformations>
<OleDbSource
Name="OLE_SRC <#=dr[0].ToString()#>_<#=dr[1].ToString()#>"
ConnectionName="SRC"
>
<TableFromVariableInput VariableName="User.QualifiedTableSchema"/>
</OleDbSource>
<OleDbDestination
Name="OLE_DST <#=dr[0].ToString()#>_<#=dr[1].ToString()#>"
ConnectionName="DST"
KeepIdentity="true"
TableLock="true"
UseFastLoadIfAvailable="true"
KeepNulls="true"
>
<TableFromVariableOutput VariableName="User.QualifiedTableSchema" />
</OleDbDestination>
</Transformations>
</Dataflow>
</Tasks>
</Package>
<# } #>
</Packages>
</Biml>
This is the maddening thing about trying to do much BimlScript in Visual Studio. The editor "knows" it's doing XML markup, so all of the enhancements that make up BimlScript are "wrong": it's going to highlight them, put angry red squigglies on them, and have you question whether you really have valid code here.
In my Error List, I see the same things you're seeing, but this is one of the few times you can ignore Visual Studio's built-in error checker.
Instead, the true test of whether the code is good is to right-click the .biml file(s) and select "Check Biml for errors"
You should get a dialogue like
If so, click Generate SSIS Packages and then get some tape to attach your mind back into your head as it's just been blown ;)
Operational note
Note that the supplied code is going to copy all of the data from the source to the target. But, you've also specified that this is going to be a monthly operation so you'd either want to add a truncate step via an Execute SQL Task, or factor in a Lookup Transformation (or two) to determine new versus existing data and change detection.

creating bcp format file for sql server 2005

I am stuck creating a bcp format file. My client is using MSSQL 2005 and I am connecting remotely through RDC. I am creating the format file on the target client's MSSQL Server using the following command,
bcp myDatabase.TableName format nul -c -x -f someFile..xml -T
but it prompts me with the error
An error has occurred while establishing a connection to the server.
My view points to a table with the following structure:
Column Name       DataType
SKU               Varchar(20)
ASIN              Varchar(20)
Price             Float
Quantity          Int
MarketplaceTitle  Varchar(50)
Note: I have been through many similar questions but had no luck.
Anyone can please provide me with a format file for the above View?
Thanks in advance.
It sounds like your issue is actually connecting to your database, not the format file.
I've created a quick test:
-- DROP TABLE dbo.Test
CREATE TABLE dbo.Test(
SKU Varchar(20),
[ASIN] Varchar(20),
Price Float,
Quantity Int,
MarketplaceTitle Varchar(50)
);
and here's a slightly more verbose syntax:
bcp YourDatabaseName.dbo.Test format nul -c -x -f c:\YourDir\format.xml -T -S YourServerName
so for me, this looks like this:
bcp sandbox.dbo.Test format nul -c -x -f c:\Test\format.xml -T -S "(local)"
Here's the file generated:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="\t" MAX_LENGTH="20" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
    <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="\t" MAX_LENGTH="20" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\t" MAX_LENGTH="30"/>
    <FIELD ID="4" xsi:type="CharTerm" TERMINATOR="\t" MAX_LENGTH="12"/>
    <FIELD ID="5" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="50" COLLATION="SQL_Latin1_General_CP1_CI_AS"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="SKU" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="2" NAME="ASIN" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="Price" xsi:type="SQLFLT8"/>
    <COLUMN SOURCE="4" NAME="Quantity" xsi:type="SQLINT"/>
    <COLUMN SOURCE="5" NAME="MarketplaceTitle" xsi:type="SQLVARYCHAR"/>
  </ROW>
</BCPFORMAT>
HTH.

SQL SERVER bulk insert ignore deformed lines

I have to import SAP unconverted lists. These reports look quite ugly and are not well suited for automated processing; however, there is no other option. The data is bordered by minus and pipe symbols, similar to the following example:
02.07.2012
--------------------
Report name
--------------------
|Header1 |Header2 |
|Value 11|Value1 2 |
|Value 21|Value2 2 |
--------------------
I use a format file and a statement like the following:
SELECT Header1, Header2
FROM OPENROWSET(BULK 'report.txt',
     FORMATFILE = 'formatfile_report.xml',
     ERRORFILE = 'rejects.txt',
     FIRSTROW = 2,
     MAXERRORS = 100) AS report
Unfortunately I receive the following error:
Msg 4832, Level 16, State 1, Line 1
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The rejects.txt file contains the last row from the file, the one with just minuses in it. rejects.txt.Error.Txt documents:
Row 21550 File Offset 3383848 ErrorFile Offset 0 - HRESULT 0x80004005
The culprit that raises the error is obviously the very last row, which does not conform to the format declared in the format file. The ugly header, however, does not cause many problems (at least the one at the very top).
Although I set the maxerrors attribute, that one deformed line kills the whole operation. If I manually delete the last line containing all those minuses (-), everything works fine. Since this import shall run frequently and unattended, that extra post-treatment is not a serious solution.
Can anyone help me get SQL Server to be less picky? It is good that it documents the lines that couldn't be loaded, but why does it abort the whole operation? Furthermore, after one execution of a statement that caused the creation of rejects.txt, no other (or the same) statement can be executed before the txt files get deleted manually:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "rejects.txt" could not be opened. Operating system error code 80(The file exists.).
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "rejects.txt.Error.Txt" could not be opened. Operating system error code 80(The file exists.).
I think that is weird behavior. Please help me to suppress it.
EDIT - FOLLOWUP:
Here is the format file I use:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="EMPTY" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="100"/>
    <FIELD ID="HEADER1" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="100"/>
    <FIELD ID="HEADER2" xsi:type="CharTerm" TERMINATOR="|\r\n" MAX_LENGTH="100"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="HEADER1" NAME="HEADER1" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="HEADER2" NAME="HEADER2" xsi:type="SQLNVARCHAR"/>
  </ROW>
</BCPFORMAT>
BULK INSERT is notoriously fiddly and unhelpful when it comes to handling data that doesn't meet the specifications provided.
I haven't done a lot of work with format files, but one thing you might want to consider as a replacement is using BULK INSERT to drop each line of the file into a temporary staging table with a single nvarchar(max) column.
This lets you get your data into SQL for further examination, and then you can use the various string manipulation functions to break it down into the data you want to finally insert.
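The split-and-filter logic you would apply to that staging table can also be run as a pre-processing pass before the bulk load. Here is a minimal Python sketch of that idea (the file layout is taken from the sample report above; this is an alternative to doing it in T-SQL, not the staging-table approach itself), which keeps only well-formed pipe-delimited rows and drops the dashed borders that break the format file:

```python
def parse_sap_report(lines):
    """Keep only pipe-delimited rows; drop dashed separators and loose text."""
    rows = []
    for line in lines:
        line = line.rstrip("\r\n")
        # Header and data rows start and end with a pipe; everything else
        # (dates, dashed borders, report titles) is skipped.
        if not (line.startswith("|") and line.endswith("|")):
            continue
        # Split on pipes, drop the empty edge fields, trim padding.
        cells = [c.strip() for c in line.split("|")[1:-1]]
        rows.append(cells)
    return rows

# Sample report, as shown in the question.
report = [
    "02.07.2012",
    "--------------------",
    "Report name",
    "--------------------",
    "|Header1 |Header2 |",
    "|Value 11|Value1 2 |",
    "|Value 21|Value2 2 |",
    "--------------------",
]
print(parse_sap_report(report))
```

Writing the surviving rows back out as a clean delimited file would let the existing OPENROWSET/format-file statement run without tripping over the trailing separator line.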
I was in the same trouble, but using the bcp command line solved the problem; it simply doesn't take the last row.
I had the same problem. I had a file with 115 billion rows, so manually deleting the last row was not an option; I couldn't even open the file manually, as it was too big.
Instead of using BULK INSERT command, I used the bcp command, which looks like this:
(Open a DOS cmd as administrator, then run:)
bcp DatabaseName.dbo.TableNameToInsertIn in C:\Documents\FileNameToImport.dat -S ServerName -U UserName -P PassWord
It's about the same speed as BULK INSERT as far as I can tell (it took me only 12 minutes to import my data). Looking at Activity Monitor, I can see a bulk insert, so I guess it logs the same way when the database is in the bulk-logged recovery model.

How to get first element by XPath in Oracle

In my Oracle db I have records like this one:
<ROOT>
<Event>
<Type>sldkfvjhkljh</Type>
<ID>591252</ID>
</Event>
<Data>
<File>
<Name>1418688.pdf</Name>
<URL>/591252/1418688.pdf</URL>
</File>
<File>
<Name>1418688.xml</Name>
<URL>/591252/1418688.xml</URL>
</File>
</Data>
</ROOT>
I need to extract a value from the first <Name> tag. If I try:
Select xmltype(xml_data).extract('//Name[1]/text()').getStringVal() from MY_TABLE
I get:
1418688.pdf1418688.xml
Why is that and how can I get just 1418688.pdf?
Oracle version:
Oracle Database 10g Enterprise Edition
Release 10.2.0.4.0 - 64bit
I think both Name elements count as #1 in this doc, because each is the first Name inside its own File node, so //Name[1] matches them both and extract() concatenates the results. Try //File[1]/Name/text() instead.
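The distinction is between "all Name elements that are first within their parent" and "the first Name in document order". A quick stdlib Python check on the same sample document illustrates it; this isn't Oracle, just a way to see that a descendant search for Name genuinely finds two elements, while first-in-document-order is a separate, single result:

```python
import xml.etree.ElementTree as ET

# The sample record from the question.
doc = """<ROOT>
  <Event><Type>sldkfvjhkljh</Type><ID>591252</ID></Event>
  <Data>
    <File><Name>1418688.pdf</Name><URL>/591252/1418688.pdf</URL></File>
    <File><Name>1418688.xml</Name><URL>/591252/1418688.xml</URL></File>
  </Data>
</ROOT>"""

root = ET.fromstring(doc)

# Every Name descendant: both match, because each Name is the first (and
# only) Name child of its own File parent -- the same reason Oracle's
# //Name[1] returns both values concatenated.
all_names = [n.text for n in root.findall(".//Name")]

# First match in document order: a single element.
first_in_doc = root.find(".//Name").text

print(all_names, first_in_doc)
```

In full XPath the document-order version is written `(//Name)[1]` (parentheses first, then the positional predicate), which is why the unparenthesized `//Name[1]` behaves differently.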