I have a linked server to an Oracle database in SQL Server and retrieve data into a local SQL Server database every day on a schedule. The problem is: one of the Oracle columns holds numbers with 18 fixed digits, of type NUMBER(18). When I try converting that column to numeric(18,0) or numeric(38,0) and so on, the data converts, but for many rows the last digit differs from the source data, for example:
data in Oracle database (source): 100002345678912345
data in SQL database (destination): 100002345678912348
Thanks to @Jeroen Mostert.
I used DBCC TRACEON (7314) before the INSERT INTO, and my data came through as a DOUBLE type; after that, to solve the problem, I used SELECT CAST(COLUMN_NAME AS numeric(18,0))
for example:
My Real Data: 100002345678912345
My Data (wrong data): 100002345678912348
My Data after using DBCC TRACEON (7314):
100002345678912345.0000000000
My Data after using SELECT CAST(COLUMN_NAME AS NUMERIC(18,0)):
100002345678912345
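Putting it together, a minimal sketch of the scheduled load (the linked server, schema, table, and column names below are placeholders for your own):
-- enable trace flag 7314 for this session, as described above, so precision is preserved
DBCC TRACEON (7314);
INSERT INTO dbo.LocalTable (BigId)
SELECT CAST(COLUMN_NAME AS numeric(18,0))
FROM ORACLE_LINKED_SERVER..SCHEMA_NAME.TABLE_NAME;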
Should the data type be declared with its length limit, or is indicating the data type alone enough?
I tried using varchar(100) as the data type, and it raised an error indicating that the length of a value exceeds 100.
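For illustration, a sketch assuming the destination is a SQL Server table (the names are hypothetical): a declared length caps the value size, while varchar(max) lifts the cap to 2^31-1 bytes.
CREATE TABLE dbo.Example (
    short_value varchar(100),  -- errors for values longer than 100 characters
    long_value  varchar(max)   -- holds values up to 2^31-1 bytes
);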
I have a column stored as a blob data type in SQL Server 2014. I want to extract the blob column into a string. I tried the following SQL statement, but the long string is getting truncated by varchar(max) (whose maximum storage size is 2^31-1 bytes, i.e. 2 GB):
SELECT CONVERT(VarChar(max), blobfield)
Is there a way in SQL Server to view complete string on blob to text conversion?
Thanks in advance.
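One way to tell whether the stored data or only the displayed result is truncated is to compare lengths (blobfield is the column from the question; table_name is a placeholder; a sketch, assuming the column converts cleanly to varchar(max)):
-- stored byte count vs. character count of the converted string
SELECT DATALENGTH(blobfield) AS stored_bytes,
       LEN(CONVERT(varchar(max), blobfield)) AS converted_chars
FROM dbo.table_name;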
I have a table with a field datatype of NVarchar2(4000). I am moving data from a SQL Server to an Oracle server. The SQL Server datatype is also nvarchar(4000). I have checked the MAX size of this field on the SQL Server side, and the MAX is 3996, which is 4 characters short of the 4000 limit.
When I try to insert this data into Oracle, I get an error mentioning "LONG" due to the size.
What is going on here, will the Oracle NVarchar2(4000) not allow 4000 characters? If not, what is the limit, or how can I get around this?
The limit is 4000 bytes, not 4000 characters. With the AL16UTF16 national character set, every character occupies 2 bytes, so an NVARCHAR2(4000) column reaches the 4000-byte maximum at just 2000 characters.
From the Oracle docs on MAX_STRING_SIZE:
Tables with virtual columns will be updated with new data type
metadata for virtual columns of VARCHAR2(4000), 4000-byte NVARCHAR2,
or RAW(2000) type.
Solution:
Also, if you want to store 4000 characters, then I would recommend using a CLOB.
A CLOB (Character Large Object) is an Oracle data type that can hold
up to 4 GB of data. CLOBs are handy for storing text.
You may try something like this to change the column data type to CLOB:
ALTER TABLE table_name
ADD (tmpcolumn CLOB);
UPDATE table_name SET tmpcolumn = currentnvarcharcolumn;
COMMIT;
ALTER TABLE table_name DROP COLUMN currentnvarcharcolumn;
ALTER TABLE table_name
RENAME COLUMN tmpcolumn TO whatevernameyouwant;
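As a sanity check after the copy (a sketch; whatevernameyouwant is the renamed CLOB column from above, and DBMS_LOB.GETLENGTH returns the length of the stored CLOB):
-- confirm the longest value survived the migration intact
SELECT MAX(DBMS_LOB.GETLENGTH(whatevernameyouwant)) AS max_length
FROM table_name;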
First, as others have pointed out, unless you're using 12.1, both varchar2 and nvarchar2 data types are limited in SQL to 4000 bytes. In PL/SQL, they're limited to 32767. In 12.1, you can increase the SQL limit to 32767 using the MAX_STRING_SIZE parameter.
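To check which mode a 12.1+ database is running in, you can query the parameter (a sketch; requires access to v$parameter):
-- STANDARD = 4000-byte SQL limit, EXTENDED = 32767 bytes
SELECT value FROM v$parameter WHERE name = 'max_string_size';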
Second, unless you are working with a legacy database that uses a non-Unicode character set that cannot be upgraded to use a Unicode character set, you would want to avoid nvarchar2 and nchar data types in Oracle. In SQL Server, you use nvarchar when you want to store Unicode data. In Oracle, the preference is to use varchar2 in a database whose character set supports Unicode (generally AL32UTF8) when you want to store Unicode data.
If you store Unicode data in an Oracle NVARCHAR2 column, the national character set will be used-- this is almost certainly AL16UTF16 which means that every character requires at least 2 bytes of storage. A NVARCHAR2(4000), therefore, probably can't store more than 2000 characters. If you use a VARCHAR2 column, on the other hand, you can use a variable width Unicode character set (AL32UTF8) in which case English characters generally require just 1 byte, most European characters require 2 bytes, and most Asian characters require 3 bytes (this is, of course, just a generalization). That is generally going to allow you to store substantially more data in a VARCHAR2 column.
If you do need to store more than 4000 bytes of data and you're using Oracle 11.2 or later, you'd have to use a LOB data type (CLOB or NCLOB).
As per the documentation, although the width refers to the number of characters, there's still a 4,000-byte limit:
Width specifications of character data type NVARCHAR2 refer to the number of characters. The maximum column size allowed is 4000 bytes.
You probably have 4 multi-byte characters.
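You can verify that from the Oracle side by comparing byte and character lengths (LENGTHB counts bytes, LENGTH counts characters; the table and column names are placeholders):
-- rows whose byte length exceeds their character count contain multi-byte characters
SELECT column_name, LENGTH(column_name) AS chars, LENGTHB(column_name) AS bytes
FROM table_name
WHERE LENGTHB(column_name) > LENGTH(column_name);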
Can anyone tell me whether there will be any impact from changing the data type of a column from char to varchar2?
The issue I am facing: when I run a select query, i.e.
select * from table_name where column_name in ('X','Y','Z');
the above query returns only a few rows. The column_name data type was recently changed from char to varchar2, and the rows returned are only those inserted after the data type was changed.
A varchar2 datatype, when stored in a database table, uses only the space allocated to it. If you have a varchar2(100) and put 50 bytes in the table, it will use 52 bytes (leading length byte).
A char datatype, when stored in a database table, always uses the maximum length and is blank-padded. If you have char(100) and put 50 bytes into it, it will consume 102 bytes.
So in your case, the rows stored while the column was char are still blank-padded (e.g. 'X' is stored as 'X' followed by trailing spaces), and with varchar2 comparison semantics those padded values no longer equal the unpadded literals in your IN list. That is why only the rows inserted after the change are returned, I believe.
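A minimal sketch reproducing the effect (the table and column are hypothetical):
CREATE TABLE t (c CHAR(5));
INSERT INTO t VALUES ('X');            -- stored blank-padded as 'X    '
ALTER TABLE t MODIFY (c VARCHAR2(5));  -- the trailing blanks remain in the stored value
SELECT * FROM t WHERE c = 'X';         -- no rows: non-padded comparison, 'X    ' <> 'X'
One fix is to trim the legacy rows once, e.g. UPDATE t SET c = RTRIM(c);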
Referred from: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1542606219593