LONG column value - SQL

I am having an issue when attempting to import data from an Excel CSV file. I am using SQL Developer 4.1. My problem is that I keep getting the error:
ORA-01461: can bind a LONG value only for insert into a LONG column
despite the fact that I am not using a LONG column; that is not even an option for me to use. I am only using VARCHAR2, NUMBER, and DATE. Can somebody explain what the LONG thing is, and how I can get around it? And yes, I am aware of how stupid this sounds.

I think the problem is the other way around: it is not that your column is a LONG, but that the value being bound is. One common cause is a character value longer than the VARCHAR2 limit of 4000 bytes, which the client then tries to bind as a LONG. The datatypes are defined here: https://docs.oracle.com/cd/B28359_01/server.111/b28318/datatype.htm#CNCPT1832
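For what it's worth, here is a minimal sketch of how this can happen during an import (the table, column, and bind names are made up): if a CSV cell is longer than 4000 bytes, the insert can fail with ORA-01461 even though no LONG column exists, and moving oversized text into a CLOB is one way around it.
CREATE TABLE import_test (notes VARCHAR2(4000));
-- Can fail with ORA-01461 if :big_text exceeds 4000 bytes,
-- because the client binds the oversized value as a LONG.
INSERT INTO import_test (notes) VALUES (:big_text);
-- One workaround: hold oversized text in a CLOB instead.
ALTER TABLE import_test ADD (notes_clob CLOB);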

Casting a string to float/decimal - BigQuery

Hoping someone can advise me on this. I have two tables in BigQuery: the first is called master, the second is called daily_transfer.
In the master table there is a column named impression_share; the data type is FLOAT and all is working correctly.
However, my problem is with the daily_transfer table. The idea is that on a daily basis I'll transfer this data into master. The schema and column names are exactly the same in both tables. The problem, however, is that in the daily_transfer table my float column (impression_share) contains the literal string value '< 0.1'.
This string isn't flagged as an issue initially, as the table is loaded from a Google Sheet, so the error only surfaces when I try to query the data.
In summary, the column type is FLOAT, but a recurring value is a string. I've tried a couple of things: firstly, replacing '< 0.1' with '0.1', but I get an error that REPLACE can only be used with arguments of string, string, string, which makes sense to me.
So I've tried instead to cast the column from float to string, and then replace the value. When I try to cast, though, I get an error right away:
"Error while reading table: data-studio-reporting.analytics.daily_transfer, error message: Could not convert value to float. Row 3; Col 6."
Column 6 is "impression_share"; the row 3 value is '< 0.1'.
The query I was trying is:
SELECT
SAFE_CAST(mydata.impression_share AS STRING)
FROM `data-studio-reporting.analytics.daily_transfer` mydata
I just don't know if what I'm trying to do is possible, or if I would be better off recreating the daily_transfer table and setting column 6 (impression_share) as STRING, to make it easier to replace and then cast before I transfer to the main table?
Any help greatly appreciated!
Thanks,
Mark
Thanks for the help on this; changing the column type in my daily_transfer table from FLOAT to STRING, then replacing and casting, has worked.
SELECT
mydata.Date,
CAST(REPLACE(mydata.Impression_share, '<', '') AS FLOAT64) AS impression_share_final,
mydata.Available_impressions
FROM `data-studio-reporting.google_analytics.daily_transfer_temp_test` mydata
It's been great for my knowledge to learn this one. Thanks!
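For anyone hitting the same mix of clean numbers and '<'-prefixed strings, a slightly more defensive variant of the same idea is sketched below (against the same table): TRIM removes any space that REPLACE leaves behind, and SAFE_CAST returns NULL rather than erroring on anything else unexpected.
SELECT
mydata.Date,
SAFE_CAST(TRIM(REPLACE(mydata.Impression_share, '<', '')) AS FLOAT64) AS impression_share_final,
mydata.Available_impressions
FROM `data-studio-reporting.google_analytics.daily_transfer_temp_test` mydata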

Excel data type issues

I am using MS Query to pull data from SQL Server and all is good.
The problem starts when the data comes from the server: I am stuck with the General data type for everything, and there is no way to change the data type in Excel.
The main issue is numbers: in the database the datatype is decimal, yet I can do no calculations on them in Excel. Any help would be appreciated.
I am using Excel to execute a stored procedure on the server.
This pulls the data into a table in the worksheet.
Even though the price column in SQL Server is formatted as decimal, it becomes the General data type after getting to Excel.
Changing it to Number/Currency etc. does not change anything.
Also, no errors appear. The data simply comes down, and no matter what changes I apply in Excel, nothing changes; it is all treated as text.
You can do these things.
Select Column
Click Data-> Text to Columns
Follow the wizard
Set the format
Use this official support ticket from Microsoft
The problem in this case was created by myself.
But I suppose it could easily happen to others who are just starting on their path with SQL and Excel.
Here is what happened, as I established after a few days of going in circles.
As there were loads of trailing spaces in the data coming down from the server, I decided to tidy things up.
Without considering the implications, I stuck an RTRIM() on everything.
This caused Excel to treat everything as strings, since RTRIM is a built-in string function and returns a string.
What made things worse is that, when using Power Query, I was able to transform the data to the desired formats.
Unfortunately, MS Query does not seem to be quite as clever as Power Query, hence the issues.
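The lesson generalizes: in SQL Server, string functions such as RTRIM() implicitly convert their argument and return a string, so they should only be applied to character columns. A sketch (the column and table names here are stand-ins):
SELECT
RTRIM(customer_name) AS customer_name, -- trim character columns only
price -- leave decimal columns untouched so Excel receives a number
FROM dbo.my_table;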

SQL query that prevents Excel from converting long integer to scientific notation

It's been a long time since I've done anything fancy with SQL, so I'm going to do my best to explain. Please be nice; I'm trying my best here.
Basically, I'm pulling information from a database in Snowflake and putting it into a new XML file, and that data is input exactly as-written into a form email.
One of the values is an ID number that's 14 characters long (example: 12345678912345), which is stored in the database as an integer (or so I'm told), but Excel keeps automatically converting it into scientific notation. Since it's an ID number, it needs to look like an ID number, not scientific notation.
Right now, my query just selects & inputs the regular ol' value, and then we manually change it in the Excel sheet. Like literally just SELECT ID_Number from TheThing
One thing I thought might work is:
SELECT CAST(ID_Number as bigint) as ID_Number
... But it doesn't work. Most other solutions I've found don't seem to address my specific scenario of Excel reformatting a long integer, and I'm distraught.
I'm just an intern and this might have a very obvious answer, but my fellow interns have given up on it and I need to find the answer for my own sanity. It's been a minute since I did anything fancy with SQL so please be nice to me and sorry if this is a dumb question.
In Snowflake, BIGINT and INT(EGER) are the same thing; what you want is VARCHAR. As Ross mentioned in his comment, this is likely just a formatting issue within Excel. In Excel, any value can be treated as text by typing a single quote ' at the beginning of the value, or by using the Text to Columns feature.
If you want to format it out of Snowflake as a string, casting alone might not do the trick unless you include some kind of additional string character.
To get this type of formatting out of Snowflake, you can try:
SELECT '\'' || CAST(ID_Number AS VARCHAR) as ID_Number;
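If the output ends up in a CSV that Excel opens, another commonly used workaround is to wrap the value as an Excel text formula so Excel displays it verbatim. A sketch, reusing TheThing from the question (TO_VARCHAR is Snowflake's string conversion; whether the quoting survives depends on how the file is generated):
-- Excel evaluates ="12345678912345" as literal text, not a number
SELECT '="' || TO_VARCHAR(ID_Number) || '"' AS ID_Number
FROM TheThing;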

Null check on decimal in Crystal Reports using CDBL({value})

I am using a decimal value in a formula, which gives an error when there is no data.
I tried using CDBL({value}), i.e. I created a formula field: value = CDBL({value}).
Then I use {#value} in the formula. This used to take care of null values, but now I keep getting an error on IF NOT ISNULL({#Value}) THEN: 'A number, or currency amount is required here. Details: errorKind'.
Any suggestions on how to fix this, please?
I will try to answer this and see if I get any sort of indication that it worked... maybe even a correct-answer indication :)
You can't have mixed return types in a Crystal formula. If one branch of the IF statement returns a numeric type, then the rest have to return a numeric type too. If you post your entire formula, I (or someone else who is willing to give up valuable time) can show you how it needs to look.
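As an illustration of keeping every branch numeric, a null-safe formula might look like this (a sketch in Crystal syntax; {table.value} stands in for the actual database field):
// Return 0 for NULL so both branches yield a number
If IsNull({table.value}) Then 0 Else CDbl({table.value})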

Import PostgreSQL dump into SQL Server - data type errors

I have some data which was dumped from a PostgreSQL database (allegedly, using pg_dump) and needs to be imported into SQL Server.
While the data types are OK, I am running into an issue where there seems to be a placeholder for a NULL: I see a backslash followed by an uppercase N (\N) in many fields. In the data, as viewed from within Excel, the left column has a Boolean data type and the right one an integer.
Some of these are supposed to be of the Boolean datatype, and having two characters in there is most certainly not going to fly.
Here's what I tried so far:
Import via a dirty read, keeping whatever datatypes SSIS decided each field had; to no avail. There were error messages about truncation on all of the Boolean fields.
Creating a table for the data based on the correct data types, though this was more fun... I needed to do the same as in the dirty read, as the source would otherwise not load properly. There was also a need to transform the data into the correct data type for insertion into the destination; yet I am getting truncation issues when there most certainly shouldn't be any.
Here is a sample expression in my derived column transformation editor:
(DT_BOOL)REPLACE(observation,"\\N","")
The data type should be Boolean.
Any suggestion would be really helpful!
Thanks!
Since I was unable to circumvent the SSIS rules in order to get my data into my tables without an error, I took the quick-and-dirty approach.
The solution which worked for me was to have the source read each column as if it were a string, and to give every field in the destination table the datatype VARCHAR. This destination table is used as a staging table; once the data is in SQL Server, I can manipulate it as needed.
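Once everything is staged as VARCHAR, the cleanup can happen in plain T-SQL. A sketch (the staging table and column names are hypothetical, and it assumes pg_dump's default text output: \N for NULL and t/f for booleans):
SELECT
-- Map t/f to BIT; the CASE yields NULL for '\N' or anything unexpected
CASE NULLIF(bool_col, '\N') WHEN 't' THEN CAST(1 AS BIT) WHEN 'f' THEN CAST(0 AS BIT) END AS bool_col,
-- TRY_CAST returns NULL instead of failing on bad values
TRY_CAST(NULLIF(int_col, '\N') AS INT) AS int_col
FROM dbo.stg_import;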
Thank you @cha for your input.