I have a Redshift COPY command which is executed as SQL:
COPY some_schema.some_table FROM 's3://a-bucket/home/a_file.csv' CREDENTIALS 'aws_access_key_id=SOMEKEY;aws_secret_access_key=SOMESECRETKEY' IGNOREHEADER 1 CSV DATEFORMAT 'YYYY-MM-DD' NULL 'NOT-CAPTURED'
The data I need to import has a date column with occasional occurrences of 'NOT-CAPTURED'. The addition of the NULL option allows these to be treated as null and prevents a load error. This apparently worked.
Can this statement be extended to treat multiple types of occurrence as null? I have 'N.A' in a date column in a similar file and would like to use a common statement.
I have tried obvious variations to provide more than one value to replace as null, such as NULL 'NOT-CAPTURED','N.A', but couldn't find any documentation covering it.
Thanks!
I have created a variable of type table inside a stored procedure. At the end of the procedure I select all the rows in the table and display them. When I right-click on the headers and select "Save As", it allows me to change the type to All Files and save the file as a text file. This works fine except that the columns that contain NULLs are saved as the text NULL. I want NULLs to be filled in with spaces.
I've been trying to find a way to create a file using a stored procedure, but most things point to using SSIS, and I can't figure out how to use SSIS with a table variable instead of an actual table.
If I could either replace NULLs with spaces or use a stored procedure to do the same thing, it would be great. I cannot use tab- or comma-delimited output, as the final product has to be a flat file in which each column uses the same number of characters as is declared in the column headers, padded with spaces.
Thanks for any help you are able to offer.
Cheers
P.S. I am using SQL Server 2012 Management Studio
The easy way to do this would be to convert the NULLs to spaces in your SELECT statement.
SELECT COALESCE(yourcolumn, '')
Wrap COALESCE around every column that has NULLs in it.
Using COALESCE article link
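Since the final file needs fixed-width columns padded with spaces, you could combine COALESCE with padding in the same SELECT. A rough sketch, with made-up column names, widths, and table variable name:

SELECT
    LEFT(COALESCE(CustomerName, '') + REPLICATE(' ', 30), 30) AS CustomerName,  -- pad or trim to 30 characters
    LEFT(COALESCE(City, '') + REPLICATE(' ', 20), 20) AS City                   -- pad or trim to 20 characters
FROM @YourTableVariable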
If the last thing you do in the stored procedure is SELECT * FROM TempTable, then you can use that SP in an OLE DB Source component. Change the data access mode from Table or View to SQL Command and use the EXEC sp_SomeName syntax. This creates a pipe that you can connect to a destination component, such as a Flat File destination.
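A rough sketch of what that could look like, with an illustrative procedure body and made-up columns (SET NOCOUNT ON often helps keep extra row-count messages out of the SSIS pipeline):

CREATE PROCEDURE dbo.sp_SomeName
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @Results TABLE (Col1 char(10), Col2 char(20));
    -- ... populate @Results here ...
    SELECT Col1, Col2 FROM @Results;  -- the final SELECT is the result set the OLE DB Source maps
END

In the OLE DB Source editor, set the data access mode to SQL Command and enter EXEC dbo.sp_SomeName as the command text.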
I have seen many issues over the years doing Save Results As... I will only use this for informal 'quick check' files and not for anything considered 'live' or 'production' data.
Here is a good blog that also shows how to use parameters.
http://geekswithblogs.net/stun/archive/2009/03/05/mapping-stored-procedure-parameters-in-ssis-ole-db-source-editor.aspx
I am new to dealing with languages in SQL Server and this forum...
The following query:
SELECT [CultureCode], [Target]
FROM [Str].[dbo].[LatestReversal]
where [Target] = N''
returns rows like:

CultureCode | Target
am-ET | ማዕከላዊ የብራዚል የቀን ብርሃን ጊዜ
am-ET | ፓስፊክ የቀን ብርሃን ጊዜ
...
Expected:
The query in the code snippet is against a table that forms the first step of an ETL process. Using this query, I would expect only rows whose Target is an empty string to be returned.
Result:
However, I am returned 900+ rows that have a value in the Target field. All these strings are from Unicode-only cultures, i.e. Windows does not have a specific code page for them. Can someone explain why this is happening?
The CSV file was UTF-8 and I have also tried Unicode format.
The SSIS job uses NText for the Source and Destination connection data types (no conversion)
The Target field in the DB is nvarchar(Max)
Later in the process I try to process a cube allowing for failures. The exact same strings fail the load process.
Any help appreciated. Even with pointers on other special handling I would need to apply for these cultures.
Cheers,
Seamus
You can simply use IS NULL to retrieve only the rows with no value in the Target field:
SELECT [CultureCode], [Target]
FROM [Str].[dbo].[LatestReversal]
where [Target] IS NULL;
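If you also need to exclude the non-empty Unicode rows that compare equal to N'' (which can happen when the collation has no sort weights defined for those characters), one option is to check the byte length instead of relying on the string comparison. A rough sketch:

SELECT [CultureCode], [Target]
FROM [Str].[dbo].[LatestReversal]
WHERE [Target] IS NULL
   OR DATALENGTH([Target]) = 0;  -- zero bytes means a genuinely empty string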
I am trying to import data into my table using
INPUT INTO
The problem is that my decimals use ',' as the decimal separator, while INPUT INTO expects '.', so it won't work!
How can I change this? Search and replace in the input file is not an option!
I am using SQL Anywhere 10
I don't believe that it's possible to change the decimal delimiter. You could either preprocess the file (which I know you said is not an option) or load it into a temporary table with the decimal column defined as a string, and then insert from the temporary table into your real table, performing the necessary conversion at that point.
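A rough sketch of that staging-table approach, assuming a hypothetical target table prices(id, amount), an illustrative file path, and a comma used only as the decimal separator (adjust the INPUT options to match your actual file):

CREATE TABLE staging_prices (id INTEGER, amount_raw VARCHAR(20));

INPUT INTO staging_prices FROM 'c:\data\prices.csv' FORMAT ASCII;

INSERT INTO prices (id, amount)
SELECT id, CAST(REPLACE(amount_raw, ',', '.') AS DECIMAL(10,2))
FROM staging_prices;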
We have a SQL Server table for user settings. Originally the settings were domain objects which had been serialized as XML into the table, but we have recently begun serializing them as binary.
However, as part of our deployment process we statically pre-populate the table with predefined settings for our users. Originally this was as simple as copying the XML from a customized database and pasting it into an INSERT statement that was run after the database was built. However, since we've moved to storing the settings as binary data, we can't get this to work.
How can we extract binary data from a varbinary column in SQL Server and paste it into a static INSERT script? We only want to use SQL for this; we don't want to use any utilities.
Thanks in advance,
Jeremy
You may find it easier to store a template value in a config table somewhere, then read it into a variable and use that variable to fill your inserts:
DECLARE @v varbinary(1000)
SELECT @v = templatesettings from configtable
INSERT INTO usertable VALUES(name, @v, ....)
From SQL Server 2008 onwards you can use Tasks > Generate Scripts and choose to include data. That gives you INSERT statements for all rows in a table which you can modify as needed.
Here are the steps for SQL 2008. Note that in SQL 2008 R2 the "Script Data" option is called "Types of data to script" instead.
I presume you're OK with utilities like Query Analyzer/Management Studio?
You can just copy and paste the binary value returned by your select statement (make sure that you are returning sufficient data), and prefix it with "0x" in your script.
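For example, assuming hypothetical column names on the usertable from the earlier answer, and 0x1F8B0800 as a made-up placeholder value:

INSERT INTO usertable (name, settings)
VALUES ('some user', 0x1F8B0800)  -- paste the full value copied from the varbinary column in place of the placeholder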
If I understand you correctly, you want to generate a static script from your data. If so, consider performing a query on the old data that concatenates strings to form the SQL statements you'll want in the script.
First, figure out what you want the scripted result to look like. Note that you'll need to think of the values you're inserting as constants. For example:
INSERT INTO NewTable VALUES ('value1', 'value2')
Now, create a query for the old data that just gets the values you'll want to move, like this:
SELECT value1, value2
FROM OldTable
Finally, update your query's SELECT statement to produce a single concatenated string in the form of the output you previously defined:
SELECT 'INSERT INTO NewTable VALUES (''' + value1 + ''', ''' + value2 + ''')'
FROM OldTable
It's a convoluted way to do business, but it gets the job done. You'll need close attention to detail. It will allow a small (but confusing) query to quickly output very large numbers of static DML statements.
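If the column being scripted is the varbinary settings column, the concatenated value needs to be written as a 0x literal rather than a quoted string. A rough sketch, assuming SQL Server 2008 or later (where CONVERT style 1 produces a 0x-prefixed hex string) and made-up column names:

SELECT 'INSERT INTO NewTable (UserName, Settings) VALUES (''' + UserName + ''', '
       + CONVERT(varchar(max), Settings, 1) + ')'
FROM OldTable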
David M's suggestion of using the 0x prefix works, but I had to add an extra 0 at the end of the varbinary data that I was trying to insert.
See the Stack Overflow entry below for the issue with the additional 0 that gets added when converting to varbinary or saving to a varbinary column:
Insert hex string value to sql server image field is appending extra 0