How to remove NULLs from Save As in SQL Server Management Studio

I have created a variable of type table inside a stored procedure. At the end of the procedure I select all the rows in the table and display them. When I right-click on the headers and select "Save As", it allows me to change the type to All Files and save the file as a text file. This works fine except that the columns that contain NULLs are saved as the literal text NULL. I want the NULLs filled in with spaces.
I've been trying to find a way to create a file from a stored procedure, but most answers point to SSIS, and I can't figure out how to use SSIS with a table variable instead of an actual table.
If I could either replace the NULLs with spaces or use a stored procedure to do the same thing, that would be great. I cannot use a tab- or comma-delimited format, as the final product has to be a fixed-width flat file in which each column uses the same number of characters as declared in the column headers, padded with spaces.
Thanks for any help you are able to offer.
Cheers
P.S. I am using SQL Server 2012 Management Studio

The easy way to do this would be to convert the NULLs to empty strings (or spaces) in your SELECT statement.
SELECT COALESCE(yourcolumn, '')
Put COALESCE around every column that can contain NULLs.
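Since the final product has to be fixed width, you can do the padding in the same SELECT. A minimal sketch, with made-up column names and widths (use whatever your table variable actually declares):

-- NULLs become empty strings, then every value is right-padded to its column width
SELECT
    LEFT(COALESCE(Name, '') + REPLICATE(' ', 30), 30) AS Name,
    LEFT(COALESCE(Address, '') + REPLICATE(' ', 50), 50) AS Address
FROM @YourTableVariable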

If the last thing your stored procedure does is SELECT * FROM TempTable, then you can use that SP in an OLE DB Source component. Change the data access mode from Table or View to SQL Command and use the EXEC (sp_SomeName) syntax. This will create a pipeline that you can connect to a destination component, such as a flat file.
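For example, the SQL command text would be nothing more than the procedure call (sp_SomeName being the placeholder name from above):

EXEC sp_SomeName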
I have seen many issues over the years doing Save Results As..., so I will only use it for informal 'quick check' files and never for anything considered 'live' or 'production' data.
Here is a good blog that also shows how to use parameters.
http://geekswithblogs.net/stun/archive/2009/03/05/mapping-stored-procedure-parameters-in-ssis-ole-db-source-editor.aspx

Related

How to run a SELECT SQL statement stored in a field in Pentaho?

I have a table with a 'query' field containing a SELECT statement and another 'parameters' field containing the SQL parameters. I have merged these two fields into a new field containing a correct SELECT statement. Now I need to execute this new field's SELECT statement, get the returned output fields, and generate an Excel file.
Use Table-Input if you are interested in a query result set. Table-Input supports SQL parameters, so there's no need to build the statement yourself using e.g. Replace-In-String and trip over escapes on your way. Also, there's variable substitution, just in case you can't live with a single template.
Update 21:14 GMT
I'm not very fond of the way you try to prepare the SELECT statement, but here we go, assuming we have a single statement:
Create a job with a Start entry and 2 Transformation entries (T1, T2). Let T1 produce the field containing your SELECT statement and use a Set-Variables step to make the statement available to T2 as variable SELECT. In T2 use a Table-Input step referencing ${SELECT} in the SQL statement text area. Don't forget to enable option "Replace variables in script".
From now on it's a matter of taste. I would prefer to create a CSV file using Text-File-Output. With the right field separator, Excel will open the file after double-clicking it. The advantage of Text-File-Output is that you don't have to specify the fields you don't know at design time anyway. An empty field list will just handle all fields coming in, comparable to the total projection in a Table-Input, which will create the necessary fields from the retrieved columns downstream.
If you must produce an Excel workbook, you'll have to learn about metadata injection. That would be a separate project for a beginner, though. There are samples in your Kettle installation folder. And there is a very active community if you find yourself in trouble.

T-SQL: outputting each record to its own text file

Is there a simple way to output each record in a SELECT statement to its own file?
For example, if you have this T-SQL query in SQL Server 2005,
select top 10 items, names + ':' + address from book
you would end up with 10 text files with the individual name and address in each file.
Is there a way to do this without writing an extensive spWriteStringToFile procedure? I'm hoping there is some kind of output setting or something in the SELECT statement.
Thanks in advance
SQL returns the result set first; there's no opportunity to write records to specific files until afterwards.
Being SQL Server 2005, it's possible you could use a SQLCLR (.NET 2.0 code) function in a SQL statement without having to write a separate application.
In SSMS, you can do results to file, but that wouldn't split each record out into its own file. I'm pretty sure you cannot do this out of the box, so it sounds like you will be rolling your own solution.
You'd do this in some client, be it Java, VBA or SSIS typically.
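If you want to stay inside SQL Server, one rough sketch is to loop over the rows and shell out to bcp, writing one file per row. This is only an illustration: it assumes xp_cmdshell is enabled, the service account can write to C:\out\, and it borrows the question's table and column names (the database name MyDb is made up):

-- Rough sketch: one bcp call per row, each row written to its own file
DECLARE @name VARCHAR(100), @cmd VARCHAR(1000)
DECLARE c CURSOR FOR SELECT TOP 10 names FROM book
OPEN c
FETCH NEXT FROM c INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'bcp "SELECT names + '':'' + address FROM MyDb.dbo.book WHERE names = '''
             + @name + '''" queryout "C:\out\' + @name + '.txt" -c -T'
    EXEC master..xp_cmdshell @cmd
    FETCH NEXT FROM c INTO @name
END
CLOSE c
DEALLOCATE c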

Create delimited string from a row in stored procedure with unknown number of elements

Using SQL Server 2000 and Microsoft SQL Server Management Studio, is there a way to create a delimited string based upon an unknown number of columns per row?
I'm pulling one row at a time from different tables and am going to store them in a column in another table.
A simple SQL query can't do anything like that. You need to specify the fields you are concatenating.
The only method that I'm aware of is to dynamically build a query for each table.
I don't recall the structure of MSSQL 2000, so I won't try to give an exact example, maybe someone else can. But there are system tables that contain table definitions. By parsing the contents of those system tables you can dynamically build the necessary query for each source data table.
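As a rough illustration of the idea only (SourceTable and the pipe delimiter are hypothetical; SQL 2000's syscolumns/sysobjects system tables are assumed, and this sketch is untested):

-- Sketch: build one delimited string per row from whatever columns the table has
DECLARE @sql VARCHAR(8000)
SELECT @sql = COALESCE(@sql + ' + ''|'' + ', '')
            + 'ISNULL(CONVERT(VARCHAR(100), [' + c.name + ']), '''')'
FROM syscolumns c
JOIN sysobjects o ON o.id = c.id
WHERE o.name = 'SourceTable'
ORDER BY c.colid

SET @sql = 'SELECT ' + @sql + ' FROM SourceTable'
EXEC (@sql)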
T-SQL that writes T-SQL, however, can be a bit tricky to debug and maintain :) So be careful how you structure everything...
Dems.
EDIT:
Or just do it in your client application.

Import Package Error - Cannot Convert between Unicode and Non Unicode String Data Type

I have made a .dtsx package on my computer using SQL Server 2008. It imports data from a semicolon-delimited CSV file into a table where all of the field types are NVARCHAR(MAX).
It works on my computer, but it needs to run on the client's server. Whenever they create the same package with the same CSV file and destination table, they receive the error above.
We have gone through the creation of the package step by step, and everything seems OK. The mappings are all correct, but when they run the package in the last step, they receive this error. They are using SQL Server 2005.
Can anyone advise where to begin looking for this problem?
The problem of converting from any non-Unicode source to a Unicode SQL Server table can be solved by:
add a Data Conversion transformation step to your Data Flow
open the Data Conversion and select Unicode for each data type that applies
take note of the Output Alias of each applicable column (they are named Copy Of [original column name] by default)
now, in the Destination step, click on Mappings
change all of your input mappings to come from the aliased columns in the previous step (this is the step that is easily overlooked and will leave you wondering why you are still getting the same errors)
At some point, you're trying to convert an nvarchar column to a varchar column (or vice versa).
Moreover, why is everything (supposedly) NVARCHAR(MAX)? That's a code smell if I ever saw one. Are you aware of how SQL Server stores those columns? The rows store pointers to where the column values are actually kept, since they don't fit within the 8 KB pages.
Non-Unicode string data types:
Use DT_STR for the text file and VARCHAR for SQL Server columns.
Unicode string data types:
Use DT_WSTR for the text file and NVARCHAR for SQL Server columns.
The problem is that your data types do not match, so there could be a loss of data during the conversion.
Two solutions:
1- If the type of the target column is [nvarchar], it should be changed to [varchar].
2- Add a "Derived Column" component to the SSIS package and add a new column with the following expression:
(DT_WSTR, «length») [ColumnName]
Length is the length of the column in the target table and ColumnName is the name of the column in the target table.
Finally, in the mapping step, you should use this newly added column instead of the original column.
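For instance, for a hypothetical NVARCHAR(50) target column named CustomerName, the expression would be:
(DT_WSTR, 50)[CustomerName]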
Not sure if this is a best practice with SSIS, but sometimes I find its tools are a bit clunky when you want to do this type of activity.
Instead of using those components, you can convert the data within your query.
Instead of doing
SELECT myField = myNvarchar20Field
FROM myTable
You could do
SELECT myField = CONVERT(VARCHAR(20),myNvarchar20Field)
FROM myTable
This is a solution that uses the IDE to fix it:
Add a Data Conversion item to your dataflow.
Double-click on the Data Conversion item and set the applicable columns to a Unicode string type.
Now double-click on the DB Destination item, click on Mapping, and ensure that your input column is actually the one coming from Copy of [your column name], which is in fact the Data Conversion output, NOT the DB Source output (be careful here).
And that's it: save and run.
Mike, I had the same problem with SSIS in SQL Server 2005...
Apparently, the DataFlowDestination object will always attempt to validate the data coming in against Unicode. Go to that object, open the Advanced Editor, and in the Component Properties pane change the "ValidateExternalMetadata" property to False. Now, go to the Input and Output Properties pane, Destination Input, External Columns, and set each column's Data Type and Length to match the database table it's going to. When you close that editor, those column changes will be saved rather than validated over, and it will work.
Follow the steps below to avoid the "cannot convert between unicode and non-unicode string data types" error:
i) Add the Data Conversion transformation tool to your Data Flow.
ii) Open the Data Conversion and select the string [DT_STR] data type.
iii) Then go to the Destination flow and select Mapping.
iv) Change your input mappings to use the Copy of [column name] outputs.
Go to the registry configuration of the client and change the language.
For Oracle, go to HKLM\SOFTWARE\ORACLE\KEY_ORACLIENT...HOME\NLS_LANG and change it to the appropriate language.
The Data Conversion task is time-consuming if there are 50-plus columns! I found a fix for this at the link below:
http://rdc.codeplex.com/releases/view/48420
However, it does not seem to work for versions above 2008, so this is how I had to work around the problem:
* Open the .DTSX file in Notepad++ and set the language to XML.
* Go to the <DTS:FlatFileColumns> tag and select all items within this tag.
* Find the string DTS:DataType="129" and replace it with DTS:DataType="130" (129 is DT_STR, the non-Unicode string type, and 130 is DT_WSTR, the Unicode one).
* Save the .DTSX file.
* Open the project again in Visual Studio BIDS.
* Double-click on the Source task. You will get the message:
the metadata of the following output columns does not match the metadata of the external columns with which the output columns are associated:
...
Do you want to replace the metadata of the output columns with the metadata of the external columns?
* Now click Yes. We are done!
Resolved - to the original ask:
I've seen this before. The easiest way to fix it (you don't need all those data conversion steps, as ALL of the metadata is available from the source connection):
Delete the OLE DB Source & OLE DB Destinations
Make sure Delayed Validation is FALSE (you can set it to True later)
Recreate the OLE DB Source with your query, etc.
Verify in the Advanced Editor that all of the output data column types are correct
Recreate your OLE DB Destination, map, create new table (or remap to existing) and you'll see that SSIS got all the data types correct (same as source).
So much easier than the stuff above.
Not sure if this is still a problem but I found this simple solution:
Right-Click Ole DB Source
Select 'Edit'
Select Input and Output Properties Tab
Under "Inputs and Outputs", Expand "Ole DB Source Output" External Columns and Output Columns
In Output columns, select offending field, on the right-hand panel ensure Data Type Property matches that of the field in External Columns properties
Hope this was clear and easy to follow
Sometimes we get this error when we select a static character string as a field in the source query/view/procedure and the destination field's data type is Unicode.
Below is the issue I faced:
I used the script below at the source and got the error message Column "CATEGORY" cannot convert between Unicode and non-Unicode string data types.
Resolution:
I tried multiple options but none worked for me. Then I prefixed the static value with N to make it Unicode, as below:
SELECT N'STUDENT DETAIL' CATEGORY, NAME, DATEOFBIRTH FROM STUDENTS
UNION
SELECT N'FACULTY DETAIL' CATEGORY, NAME, DATEOFBIRTH FROM FACULTY
If anyone is still experiencing this issue, I found that it was related to a difference in Oracle Client versions.
I have posted my full experience and solution here: https://stackoverflow.com/a/43806765/923177
1. Add a Data Conversion tool from the toolbox.
2. Open it; it shows all columns from the Excel source. Convert them to the desired output, and take note of the Output Alias of each applicable column (they are named Copy of [original column name] by default).
3. Now, in the Destination step, click on Mappings.
I changed ValidateExternalMetadata=False for each transformation task. It worked for me.

Manually inserting varbinary data into SQL Server

We have a SQL Server table for user settings. Originally the settings were domain objects that had been serialized as XML into the table, but we have recently begun serializing them as binary.
However, as part of our deployment process we statically pre-populate the table with predefined settings for our users. Originally, this was as simple as copying the XML from a customized database and pasting it into an INSERT statement that was run after the database was built. However, since we've moved to storing the settings as binary data, we can't get this to work.
How can we extract binary data from a varbinary column in SQL Server and paste it into a static INSERT script? We only want to use SQL for this; we don't want to use any utilities.
Thanks in advance,
Jeremy
You may find it easier to store a template value in a config table somewhere, then read it into a variable and use that variable to fill your inserts:
DECLARE @v VARBINARY(1000)
SELECT @v = templatesettings FROM configtable
INSERT INTO usertable VALUES (name, @v, ....)
From SQL Server 2008 onwards you can use Tasks > Generate Scripts and choose to include data. That gives you INSERT statements for all rows in a table which you can modify as needed.
Here are the steps for SQL 2008. Note that the "Script Data" option in SQL 2008 R2 is called "Types of data to script" instead of "Script Data".
I presume you're OK with utilities like Query Analyzer/Management Studio?
You can just copy and paste the binary value returned by your select statement (make sure that you are returning sufficient data), and prefix it with "0x" in your script.
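For example (the column list is hypothetical and the 0x value is made up; a real one will be much longer):

-- Illustrative only: the 0x literal is pasted in as-is, with no quotes around it
INSERT INTO usertable (name, settings) VALUES ('someuser', 0x0053006F006D0065)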
If I understand you correctly, you want to generate a static script from your data. If so, consider performing a query on the old data that concatenates strings to form the SQL statements you'll want in the script.
First, figure out what you want the scripted result to look like. Note that you'll need to think of the values you're inserting as constants. For example:
INSERT INTO NewTable VALUES ('value1', 'value2')
Now, create a query for the old data that just gets the values you'll want to move, like this:
SELECT value1, value2
FROM OldTable
Finally, update your query's SELECT statement to produce a single concatenated string in the form of the output you previous defined:
SELECT 'INSERT INTO NewTable VALUES (''' + value1 + ''', ''' + value2 + ''')'
FROM OldTable
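Each row of the result set is then itself a complete statement; with made-up data, an output row would look like:
INSERT INTO NewTable VALUES ('Smith', '123 Main St')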
It's a convoluted way to do business, but it gets the job done. You'll need close attention to detail, but it allows a small (if confusing) query to quickly output very large numbers of static DML statements.
David M's suggestion of using the 0x prefix works, but I had to add an extra 0 at the end of the varbinary data that I was trying to insert.
See the Stack Overflow entry below for the issue with the additional 0 that gets added when converting to varbinary or saving to a varbinary column:
Insert hex string value to sql server image field is appending extra 0