I had a table definition with
SSN varbyte(100)
I just want to test my MLOAD script, so I tried
CHAR2HEXINT(:SSN) -- SSN is defined as VARCHAR in the layout
but it fails with:
UTY0805 RDBMS failure, 3532: Conversion between BYTE data and other types is illegal.
Is there any way I can convert VARCHAR to VARBYTE without using a UDF?
NOTE:
I can't change my table definition; I just want to test my script.
I just got access to the ENCRYPT function and used that to load test data.
As suggested by dnoeth, using TO_BYTES and FROM_BYTES to cast between BYTE and CHAR seems to be working.
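For anyone hitting the same error, here is a minimal sketch of the DML insert, assuming a hypothetical target table ssn_test, ASCII-encoded test values, and a Teradata release where TO_BYTES/FROM_BYTES are available:

-- in the MLOAD DML: convert the VARCHAR layout field to VARBYTE on insert
INSERT INTO ssn_test (SSN)
VALUES (TO_BYTES(:SSN, 'ASCII'));

-- to spot-check the loaded rows, convert back the other way
SELECT FROM_BYTES(SSN, 'ASCII') FROM ssn_test;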
Related
I am attempting to load a database from a CSV file using AsterixDB. Currently, it works using only string, int, and double fields. However, I have a column in the CSV file that is in DateTime format. Currently I am importing them as strings, which works fine, but I would like to import them as the SQL DateTime data type. When I try changing my schema and reimporting I get the following error:
ERROR: Code: 1 "org.apache.hyracks.algebricks.common.exceptions.NotImplementedException: No value parser factory for fields of type datetime"
All entries are in this format 02/20/2010 12:00:00 AM.
I know this isn't exactly in line with the format specified by the Asterix Data Model; however, I tried a test line with the proper format and the error persisted.
Does this mean AsterixDB can't parse DateTime when doing mass imports? And if so, how can I get around this issue?
Any help is much appreciated.
Alright, after discussing with some colleagues, we believe that AsterixDB does not currently support DateTime parsing when mass importing. Our solution was to upsert every entry in the dataset with the parsing built into the query.
We used the following query:
upsert into csv_set (
SELECT parse_datetime(c.Date_Rptd, "M/D/Y h:m:s a") as Datetime_Rptd,
parse_datetime(c.Date_OCC, "M/D/Y h:m:s a") as Datetime_OCC,
c.*
FROM csv_set c
);
As you can see we parse the strings using the parse_datetime function from the AsterixDB Temporal Functions library. This query intentionally doesn't erase the column with the DateTimes in string format, although that would be very simple to do if your application requires it. If anyone has a better or more elegant solution please feel free to add to this thread!
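For context, this assumes a dataset whose type keeps the CSV date columns as strings, something like the sketch below (csv_type and the id key are illustrative names); because types are open by default, the parsed datetime fields added by the upsert don't have to be declared:

// the raw CSV columns stay as strings; csv_type and id are assumed names
CREATE TYPE csv_type AS {
    id: int,
    Date_Rptd: string,
    Date_OCC: string
};

// open type: the Datetime_Rptd / Datetime_OCC fields produced by the upsert
// can be added to each record without being declared here
CREATE DATASET csv_set(csv_type) PRIMARY KEY id;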
I'm working on an SSIS package which exports data from SQL Server to Excel. I had a problem converting non-Unicode to Unicode string data types, so I created a Derived Column task and converted four columns of type VARCHAR(40) in the SQL Server table to Unicode string [DT_WSTR]. That worked for those columns. But I also have a Description column of type VARCHAR(MAX), and when I tried to convert it to Unicode text stream [DT_NTEXT] it did not work.
If your source is SQL Server (as you said), you can convert it directly in your SQL query:
SELECT
CONVERT(NVARCHAR(40), 'att1')
,CONVERT(NTEXT, 'att2')
Convert your VARCHAR into NVARCHAR.
Convert your TEXT into NTEXT.
It's faster.
P.S. To test it, don't forget to delete or reset your previous OLE DB source component so it is forced to re-evaluate your data types.
Does that help?
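Applied to the table in the question, the source query might look like the sketch below (table and column names are assumptions); converting to NVARCHAR(40) should come into SSIS as DT_WSTR, and converting the MAX column to NVARCHAR(MAX) (or NTEXT) should arrive as DT_NTEXT, so no Derived Column conversions are needed:

SELECT
    CONVERT(NVARCHAR(40), Col1) AS Col1,   -- the four VARCHAR(40) columns
    CONVERT(NVARCHAR(40), Col2) AS Col2,
    CONVERT(NVARCHAR(40), Col3) AS Col3,
    CONVERT(NVARCHAR(40), Col4) AS Col4,
    CONVERT(NVARCHAR(MAX), [Description]) AS [Description]  -- arrives as DT_NTEXT
FROM dbo.SourceTable;   -- hypothetical table name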
The only thing that worked was to cast the Description column in the stored procedure as VARCHAR(1000). I checked the max length of this field and it was about 300 characters, so I made it VARCHAR(1000) and used Unicode string [DT_WSTR] in the Derived Column. This was a workaround, but I still want to know how to do it in the SSIS package without converting the data type in the stored procedure.
By mistake I did not prefix a Unicode string with N, and I have inserted data that now contains ? instead of the original Unicode characters. Using this as an example:
SELECT T.A FROM ( SELECT '男孩 SQL' A) T
It is returning ?? SQL instead of 男孩 SQL.
So how can I get the actual value back; what can I use in the outer SELECT statement?
If you are declaring a hard-coded NVARCHAR string value, it is important to use this format:
DECLARE @Variable NVARCHAR(10) = N'YourVariableHere'
Instead of:
DECLARE @Variable NVARCHAR(10) = 'YourVariableHere'
The second method will cause an implicit column conversion, which is at best bad for performance and at worst will incorrectly interpret the results. I just found a cool little test for this. Run this script in SQL Server:
SELECT N'௰', '௰';
You will get ௰ for the N-prefixed literal and ? for the plain one as a result.
If you're interested in more information on Implicit Column Conversion look here.
However, since you've already inserted this data into your database, you are out of luck. There is no way to recover it unless you have the scripts saved somewhere else.
If you have the permissions and powers to insert data into a production system I suggest you exercise more caution next time.
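Going back to the example in the question, the N prefix on the literal is what preserves the characters; the rows already stored as ? cannot be repaired this way, only re-inserted correctly:

-- without the N prefix the literal is VARCHAR and the characters are lost
SELECT T.A FROM (SELECT '男孩 SQL' A) T;   -- returns ?? SQL

-- with the N prefix the literal is NVARCHAR and the characters survive
SELECT T.A FROM (SELECT N'男孩 SQL' A) T;  -- returns 男孩 SQL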
We have a stored procedure that gets called from an application. One of the inputs to the SP is an SSN. We are using SQL Server 2008 R2 Enterprise Edition.
The SSN is passed in as VARCHAR(9). We take that SSN and join it to a table that has the SSN stored as DECIMAL(9,0). This causes an implicit conversion at execution time and is affecting performance.
My original plan was to have the input come in as VARCHAR and just copy it over to another variable with the DECIMAL data type; however, there is a kink in the process. A user can also enter the text 'new' when there is a new client. When this happens, we get a "failed to convert data type varchar to decimal" error.
Is there a way to conditionally change the data type of a variable based on the input it receives?
Here's an example of what I am trying to use now, to no avail, as the instances where 'new' is entered cause a conversion error:
CREATE PROCEDURE [dbo].[StoredProcedureFromApplication](
    @SubscriberSSN VARCHAR(9))
AS
DECLARE @SSN_DEC DECIMAL(9,0)
SET @SSN_DEC = @SubscriberSSN
Any ideas on how to have the datatype able to be changed?
First, you should store SSNs as strings and not numbers, because they can start with 0.
I think the best you can do is to have the resulting value be NULL:
SET @SSN_DEC = (case when isnumeric(@SubscriberSSN) = 1
                     then cast(@SubscriberSSN as decimal(9, 0))
                end);
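Putting that into the procedure from the question, a fuller sketch might look like this (the dbo.Clients table and its SSN column are assumed names for the join mentioned in the question); an input of 'new' simply leaves @SSN_DEC as NULL instead of raising a conversion error:

CREATE PROCEDURE [dbo].[StoredProcedureFromApplication]
    @SubscriberSSN VARCHAR(9)
AS
BEGIN
    DECLARE @SSN_DEC DECIMAL(9,0);

    -- 'new' (or any other non-numeric input) leaves @SSN_DEC as NULL
    SET @SSN_DEC = (CASE WHEN ISNUMERIC(@SubscriberSSN) = 1
                         THEN CAST(@SubscriberSSN AS DECIMAL(9, 0))
                    END);

    -- join against the DECIMAL(9,0) column without an implicit conversion
    SELECT c.*
    FROM dbo.Clients c           -- hypothetical table name
    WHERE c.SSN = @SSN_DEC;
END;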
I got the following error trying to convert from a varchar to varbinary:
Implicit conversion from data type varchar to varbinary(max) is not allowed. Use the CONVERT function to run this query.
And this is the ALTER command that I tried:
ALTER TABLE foo ALTER COLUMN bar VARBINARY(MAX)
So how do I use the CONVERT function for something like this? I did a bit of searching via Google and had no luck.
Thanks.
You have to rebuild the table.
Use the SSMS table designer to change the type; it will generate a script that uses CONVERT.
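For reference, the generated script is roughly of this shape (sketched here assuming, for simplicity, that foo has only the one column bar); the key point is the explicit CONVERT while copying the rows into the rebuilt table:

BEGIN TRANSACTION;

-- build a replacement table with the new column type
CREATE TABLE Tmp_foo
(
    bar VARBINARY(MAX) NULL
);

-- copy the existing rows across with an explicit conversion
INSERT INTO Tmp_foo (bar)
SELECT CONVERT(VARBINARY(MAX), bar)
FROM foo;

-- swap the new table in under the old name
DROP TABLE foo;
EXECUTE sp_rename 'Tmp_foo', 'foo';

COMMIT;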