I am trying to run this pivot query to show the dates (in "MM/DD/YYYY" format) as columns, with the number of occurrences of certain IDs on each date:
The column containing the dates is "DATE_POSTED" -- DATA TYPE date
The column containing the IDs is "ID_INST" -- DATA TYPE varchar2
Query:
SELECT *
FROM (SELECT ID_INST, DATE_POSTED
      FROM total.table1)
PIVOT XML (COUNT(DATE_POSTED)
           FOR (DATE_POSTED)
           IN (SELECT DISTINCT DATE_POSTED
               FROM total.table1));
The error I'm receiving is ORA-00918: column ambiguously defined. I did some searching but I keep getting this error, and I'm not sure my approach is correct at all. P.S. I'm using the XML keyword because without it Oracle prompted: missing keyword
Current table:
Expected result:
Try the following:
SELECT *
FROM (SELECT ID_INST, TO_CHAR(DATE_POSTED, 'DD-Mon') DATE_POSTED
FROM TOTAL.TABLE1)
PIVOT XML (COUNT(DATE_POSTED)
FOR DATE_POSTED IN (ANY))
The problem might be caused by the fact that the DATE type also stores time information in addition to the date.
So you get different values for DATE_POSTED, but the conversion to char produces the same column name, because the date format mask cuts off the time information.
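If you would rather keep DATE_POSTED as a real date instead of a formatted string, here is a minimal sketch of the same idea, assuming you want one pivot column per calendar day: TRUNC() removes the time portion, so rows from the same day collapse into a single pivot value.
-- Sketch only: mirrors the query above, with TRUNC instead of TO_CHAR.
SELECT *
FROM (SELECT ID_INST, TRUNC(DATE_POSTED) AS DATE_POSTED
      FROM TOTAL.TABLE1)
PIVOT XML (COUNT(DATE_POSTED)
           FOR DATE_POSTED IN (ANY))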
Related
I'm fairly new to SQL and am trying to write a query in SQL Server to get sums of documents per week. My query looks like this:
SELECT
    [table].[week],
    SUM(ISNULL([table].[documents], 0))
FROM
    [table]
JOIN
    (VALUES (15,187293), ...other pairs..., (127,120918)) AS Outside ([ID], [Organization])
        ON Outside.[ID] = [table].[ID]
        AND Outside.[Organization] = [table].[Organization]
GROUP BY
    week
This same query worked on a different table (formatted exactly the same) outputting documents per week, but when I run it on this table, I get the error
Msg 245, Level 16, State 1, Line 2
Conversion failed when converting the varchar value '#N/A' to data type int.
There is nothing in the documents column that looks like #N/A. How can I make this sum appropriately?
To find the offending rows, run the query:
select t.*
from table t
where id = '#N/A' or organization = '#N/A';
Or better yet:
select t.*
from table t
where try_convert(int, id) is null or try_convert(int, organization) is null;
You can fix the problem by using appropriate types in the outside set of values. That is, one or both of the values should be in single quotes so they are strings rather than numbers.
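For example, a hedged sketch of the joined VALUES list with quoted literals, assuming [ID] and [Organization] are the varchar columns that contain '#N/A': the comparisons then stay string-to-string, and SQL Server no longer tries to implicitly convert '#N/A' to int.
-- Sketch: only two value pairs shown; quote every pair the same way.
SELECT
    [table].[week],
    SUM(ISNULL([table].[documents], 0))
FROM
    [table]
JOIN
    (VALUES ('15', '187293'), ('127', '120918')) AS Outside ([ID], [Organization])
        ON Outside.[ID] = [table].[ID]
        AND Outside.[Organization] = [table].[Organization]
GROUP BY
    [table].[week];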
I have a query in a validation stored procedure. It goes something like this:
SELECT *
FROM Table1
WHERE batchID IN (SELECT id FROM #tempIds)
AND CAST(field1 AS DATE) > CAST(field2 AS DATE)
Both field1 and field2 hold valid dates, i.e. ISDATE() on field1/field2 returns 1.
The #tempIds table only has one column ID and contains only one row.
When I run the above query, I get this error:
Unable to convert varchar to date
But if, instead of selecting the batch IDs from the temp table, I hard-code the same ID from the temp table, it works.
Any ideas what could be the issue?
The problem is that you are using varchar to store date (or datetime) values.
Choosing the correct data type for your columns would save you from a lot of problems, this one included. For detailed information, read Aaron Bertrand's Bad habits to kick : choosing the wrong data type.
Now, to address our conversation in the comments: SQL Server does not guarantee the order in which the conditions in the WHERE clause are evaluated. This means that even if all the "date" strings in both your columns are convertible to date values for the specific batchID, you only need one wrong value in one of the columns to raise the "Unable to convert varchar to date" error.
This also means that even if you were to write your query the way Larry B suggested in his answer (now deleted; so for the sake of future readers, this was his suggestion):
SELECT *
FROM Table1
WHERE batchID IN (SELECT id FROM #tempIds)
AND ISDATE(field1) = 1
AND ISDATE(field2) = 1
AND CAST(field1 AS DATE) > CAST(field2 AS DATE)
There is no guarantee that the last condition (cast(field1 as date) > cast(field2 as date)) will be evaluated after the isdate(field1)=1 condition.
The correct thing to do is to fix the problem - change the data types of the columns to the correct data type.
Assuming that can't be done (if you have no control over the structure of the database, for instance) - you can do a couple of things:
Find all the places where the values in field1 and field2 can't be converted to dates and fix them. Consider adding a check constraint to validate that the values of these columns can be converted to date (assuming you can); a sketch of both follows the CTE example below.
Separate your query into 2 parts:
;WITH cte AS
(
    SELECT *
    FROM Table1
    WHERE batchID IN (SELECT id FROM #tempIds)
        AND ISDATE(field1) = 1
        AND ISDATE(field2) = 1
)
SELECT *
FROM cte
WHERE CAST(field1 AS DATE) > CAST(field2 AS DATE)
This will eliminate the error, since you are only casting values where the ISDATE function already returned 1, but it might not return some rows you want back, if the value of either field1 or field2 is wrong in these rows.
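For the first suggestion, a hedged sketch, assuming you are allowed to alter Table1: first locate the offending rows, then keep new bad values out with check constraints.
-- Find rows whose values are not convertible to dates.
SELECT *
FROM Table1
WHERE ISDATE(field1) = 0 OR ISDATE(field2) = 0;
-- Once the data is fixed, prevent new invalid values (constraint names are just examples).
ALTER TABLE Table1 ADD CONSTRAINT CK_Table1_field1_IsDate CHECK (ISDATE(field1) = 1);
ALTER TABLE Table1 ADD CONSTRAINT CK_Table1_field2_IsDate CHECK (ISDATE(field2) = 1);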
I have a table with a string column. I convert this column to a number using the function TO_INTEGER(); that works fine. But if I aggregate the converted column with SUM, I get this error:
SAP DBTech JDBC: [339]: invalid number: not a valid number string ' ' at function to_int()
This is my sample SQL query:
select SUM(PARTICIPANT)
from (
select TO_INTEGER(STUDENT) as PARTICIPANT
from MyTable)
Column STUDENT is a varchar(50) in MyTable
What did I do wrong?
Thanks in advance
Without seeing your column values, it looks like you're trying to convert a numeric sequence containing delimiting spaces to a number, and those spaces are throwing this error. But based on the information you've given us, it could be happening on any field.
Eg:
Create table Table1 (tel_number number);
Insert into Table1 Values ('0419 853 694');
The above gives you an "invalid number" error.
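If the failing values really are blanks or space-padded numbers, here is a minimal sketch of a defensive version of the query from the question (HANA SQL; adjust the cleanup to whatever junk your data actually contains):
-- Strip spaces and skip empty strings before converting.
SELECT SUM(TO_INTEGER(REPLACE(STUDENT, ' ', ''))) AS PARTICIPANT
FROM MyTable
WHERE TRIM(STUDENT) <> '';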
Also check the filter/WHERE clause: if you compare a string column to an integer literal, you get this error message. I wrote a HANA calculation view (SQL Script) with Bukrs (Company Code) = 1000 in the WHERE clause; after I changed it to Bukrs = '1000', the issue was resolved.
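A minimal illustration of that change (MyView is a hypothetical view name; the point is only the quoting of the literal):
-- Fails when Bukrs is a varchar column: its values get implicitly converted to integers.
SELECT * FROM MyView WHERE Bukrs = 1000;
-- Works: string compared to string.
SELECT * FROM MyView WHERE Bukrs = '1000';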
I just need to make a simple query of a table on a MapR cluster in that I want to know what the date is of the most recent record in the table. Dates are in a 'report_date' column in string format. I tried the following query without success:
select max(report_date) from partition.table_name
I know the second part of the statement works. Is there something wrong with the first part?
Thanks,
A
Your date column's datatype is string, hence the MAX function doesn't produce the output you expect.
For example, for a string column with values 1, 2, 3, 10, MAX(column) returns 3 rather than 10, since strings are compared character by character rather than numerically.
Try changing your datatype to DATE or TIMESTAMP, which should work.
OR
If changing the datatype is not possible, then try this:
If there is an auto-incrementing ID column (or any column like it) in the table, then
select report_date from table_name order by ID desc limit 1;
This should give you the most recent date string.
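Alternatively, if the strings are stored in an unambiguous format such as 'YYYY-MM-DD', a hedged sketch (Hive-style syntax; adjust for your engine) that converts them so MAX compares real dates:
-- CAST returns NULL for strings that are not valid 'YYYY-MM-DD' dates.
SELECT MAX(CAST(report_date AS DATE)) AS latest_report_date
FROM partition.table_name;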
I'm a newbie with Oracle. In SQLite, PostgreSQL or MSSQL I can do the following query:
SELECT * FROM users ORDER BY id, email
Here is the definition of USERS:
CREATE TABLE "USERS" (
"ID" NUMBER(38,0) NOT NULL,
"EMAIL" VARCHAR2(255)
)
Id is NUMBER type and email is VARCHAR type.
When I run the above SELECT query in Oracle it will raise the error:
ORA-00932: inconsistent datatypes: expected - got CLOB
Is there anyway to do that in Oracle?
Thanks for your interest.
Seems like the email field is a CLOB and not a VARCHAR. You cannot ORDER BY a CLOB.
If your column is a CLOB then you can CAST() the field to order by:
select *
from users
order by id, cast(email as varchar2(50)) -- or whatever length you want.
See SQL Fiddle With Demo
Try it with column positions as well.
(This means you should not use *, which I would recommend anyway...)
select id, email from users order by 1, 2
It should also be possible to order by the first portion of the CLOB by applying a substring function (such as DBMS_LOB.SUBSTR) in the ORDER BY clause.
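For instance, a hedged sketch of that idea (the 255-character prefix length is an arbitrary choice):
-- Order by the first 255 characters of the CLOB, starting at offset 1.
SELECT *
FROM users
ORDER BY id, DBMS_LOB.SUBSTR(email, 255, 1)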
Double check the real type of the email column. It just seems to be a CLOB, which cannot be used in ORDER BY.
Obviously, there is no need to store emails that are longer than 4000 chars.
So the CLOB should be replaced with VARCHAR2(255).
See What is the maximum length of a valid email address?.
There may be some problem with Oracle type mapping in your ORM.
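If you do control the schema, note that Oracle cannot convert a CLOB column to VARCHAR2 in place; a hedged sketch of the usual workaround is to add a new column, copy the data, then swap:
-- Sketch only: take the first 255 characters of each email into a new VARCHAR2 column.
ALTER TABLE users ADD (email_new VARCHAR2(255));
UPDATE users SET email_new = DBMS_LOB.SUBSTR(email, 255, 1);
ALTER TABLE users DROP COLUMN email;
ALTER TABLE users RENAME COLUMN email_new TO email;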