I am executing simple SQL queries with Toad for Oracle version 13.1.1.5.
I have been reading the Oracle documentation and still struggle to understand the difference between GLDATE and GL_DATE.
At the moment I know how to use GLDATE and control the date format explicitly by converting GLDATE to text with TO_CHAR (without tampering with the computer's datetime settings), for example:
SELECT * FROM table WHERE to_char(GLDATE,'YYYYMMDD') = '20190105'
But what is GL_DATE (with the "_" character) used for, and why does the above code not work with GL_DATE (it throws ORA-00904: "GL_DATE": invalid identifier)?
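I suppose I could check the data dictionary to confirm which of the two column names actually exists on my table; something like this (the table name is just a placeholder):
SELECT column_name, data_type
FROM all_tab_columns
WHERE table_name = 'MY_GL_TABLE'   -- placeholder, substitute the real table
AND column_name IN ('GLDATE', 'GL_DATE');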
Moreover, how do environment variables relate to the GLDATE/GL_DATE columns?
I am having trouble with an Oracle query that uses an SSIS variable holding a date pulled from SQL Server.
I am using an Execute SQL Task that simply gets a max date from a SQL Server table and stores it in a variable, e.g.
SELECT MAX(t.Date) FROM table t;
I then want to use that variable in my Oracle query, which runs through an ADO.NET source connection. I noticed you can't parameterize those connections and found the workaround where you build the SQL as an expression with your user variable in it. So now my Oracle source query looks something like this:
"SELECT DISTINCT t.* FROM table t WHERE TO_CHAR(t.LastUpdateDate, 'YYYY-MM-DD') > " + "'#[User::LastUpdateDate]'"
The query syntax itself is fine, but when I run it, it pulls all rows and seems to completely ignore the date filter in the WHERE clause.
I've tried removing the TO_CHAR from LastUpdateDate.
I've tried adding a TO_CHAR to my user variable #[User::LastUpdateDate].
I've tried using the CONVERSION() function from SQL Server on #[User::LastUpdateDate].
Nothing seems to work, and the query just runs and pulls in all the data as if the WHERE clause weren't there.
Does anyone know how to rectify this issue or point out what I might be doing wrong?
Thank you for any and all help!
EDIT:
My date being pulled from SQL Server is in this format: 2022-09-01 20:17:58.0000000
This is not an answer, just troubleshooting advice
You do not say what data type #[User::LastUpdateDate] is; I'll assume it's a datetime.
Ideally all datetime data should be kept in datetime data types; then format becomes completely irrelevant. However, since it's difficult to parameterise Oracle queries in SSIS, you have to concoct a string to be submitted, and now date format does become important.
On to something a little different: it is a very good habit, performance-wise, not to put functions around columns that you are searching on. This is called sargability - look it up.
Given these things, I suggest that you concoct your required SQL query bit by bit and troubleshoot.
First, format your date parameter so it can go into an Oracle date literal. Remember this is normally a bad and unnecessary thing; we are only doing it because we have to concoct a SQL string.
So create another SSIS variable called strLastUpdateDate and put this hideous expression in it:
RIGHT("0" + (DT_STR,2,1252)DATEPART( "dd" , #[User::LastUpdateDate] ), 2) + '-' +
(DT_STR,3,1252)DATEPART( "mmm" , #[User::LastUpdateDate] ) + '-' +
(DT_STR,4,1252)DATEPART("yyyy" , #[User::LastUpdateDate] )
Yes, this is ludicrously long code, but it will turn your date variable into a YYYY-MM-DD string that can be dropped into an Oracle date literal. You could simplify this by doing the formatting in your original max query, but let's not go there. Use whatever debugging technique you have to confirm that it works as expected.
Now you should be able to use this:
"SELECT t.*, '"+#[User::LastUpdateDate]+"' As MyStrDate FROM table t WHERE
t.LastUpdateDate > '" #[User::strLastUpdateDate] + "'"
You can try running that and see if it makes any difference. Make sure you use this https://dba.stackexchange.com/questions/8828/how-do-you-show-sql-executing-on-an-oracle-database to monitor what is actually being submitted to Oracle.
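If you have the privileges to query v$sql, something along these lines should show the text Oracle actually received (the LIKE pattern is only an example - adjust it to match your statement):
-- shows recently parsed statements whose text starts like yours
SELECT sql_id, sql_text, last_active_time
FROM v$sql
WHERE sql_text LIKE 'SELECT t.*%'
ORDER BY last_active_time DESC;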
This is all from memory and googling - I haven't done SSIS for many years now
I suspect that after all this you may still have the same problem, because I recall having the same mysterious issue many years ago.
I have migrated a Sybase database to SQL Server 2008.
The main application that uses the database tries to set some datetime2 columns with values like 1986-12-24 16:56:57:81000, which gives this error:
Conversion failed when converting date and/or time from character string.
Running the same query using a dot (.) instead of a colon (:) as the millisecond separator, like 1986-12-24 16:56:57.81000, or limiting the milliseconds to 3 digits, like 1986-12-24 16:56:57:810, solves the problem.
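For example, run on their own, the two cases look like this (illustrative only):
SELECT CONVERT(datetime2, '1986-12-24 16:56:57:81000');  -- fails with the error above
SELECT CONVERT(datetime2, '1986-12-24 16:56:57.81000');  -- works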
NOTE:
1. I don't have access to the application's source code to fix this issue, and there are lots of tables with the same problem.
2. The application connects to the database using an ODBC connection.
Is there any quick, straightforward solution, or should I write lots of triggers on all the tables to fix it using the above workarounds?
Thanks in advance
As Gordon Linoff said:
A trigger on the current table is not going to help because the type conversion happens before the trigger is called. Think of how the trigger works: the data is available in a "protorow".
But there is a simple answer!
Using a SQL Server Native Client connection instead of the basic SQL Server ODBC connection handles everything.
Note:
1. As I use SQL Server 2008, version 10 of SQL Server Native Client works fine, but not version 11 (that one is for SQL Server 2012).
2. The "Use regional settings" option causes some other conversion problems, so don't enable it if you don't need it.
SELECT REPLACE(GETDATE(), ':', '.')
But this produces a string representation of the datetime, not something that converts back into a DateTime.
Why would you need triggers? You can use update to change the last ':' to '.':
update t
set col = stuff(col, 20, 1, '.');
You also mistakenly describe the column as datetime2. That uses an internal date/time format. Your column is clearly a string.
EDIT:
I think I misinterpreted the question (I assumed the data was already in a table). Bring the data into staging tables and do the conversion in a separate step.
A trigger on the current table is not going to help because the type conversion happens before the trigger is called. Think of how the trigger works: the data is available in a "protorow".
You could get a trigger to work by creating views and building a trigger on a view, but that is even worse. Perhaps the simplest solution would be:
Change the name and data type of the column so it contains a string.
Add a computed column that converts the value to datetime2.
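A rough sketch of that idea, reading it as "let the application keep writing its raw text, and expose a converted copy alongside it"; the table and column names are made up, and it assumes the raw values look like the one in the question:
ALTER TABLE dbo.MyTable ALTER COLUMN LastUpdate varchar(30);  -- the app's inserts now land as plain text
ALTER TABLE dbo.MyTable
ADD LastUpdate_dt2 AS CONVERT(datetime2, STUFF(LastUpdate, 20, 1, '.'));  -- swaps the last ':' for '.' and converts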
I need to test SQL queries in Oracle SQL Developer.
These queries contain timestamp literals in the format
{ts 'yyyy-mm-dd hh:mm:ss.fff'}
Oracle SQL Developer does not seem to accept this syntax; the symbol { causes the error ORA-00911: invalid character.
Is there anything I can do?
EDIT
The SQL editor warns me that { is not allowed.
I tried two other tools (DbVisualizer and DBeaver), both using the Oracle Thin driver, and everything works fine.
However, I still want to use that syntax in Oracle SQL Developer because it has interesting features.
The queries I have to test are not written by me, so changing the syntax is not an option.
Use an actual SQL timestamp literal:
TIMESTAMP 'yyyy-mm-dd hh:mm:ss.fff'
What you were using is the JDBC escape syntax, which is supported by JDBC drivers, but not by the Oracle database itself.
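For example, a query that ships with the JDBC escape could be rewritten like this (table and column names are placeholders):
-- was: SELECT * FROM orders WHERE created_at > {ts '2019-01-05 13:45:30.123'}
SELECT *
FROM orders
WHERE created_at > TIMESTAMP '2019-01-05 13:45:30.123';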
You can use CAST
select to_char(cast(sysdate as timestamp),'DD-MON-YYYY HH24:MI:SS.FF') from dual
See the answer to: How to convert date to timestamp (DD-MON-YYYY HH24:MI:SS.FF format) in Oracle?
The "{ts xxx}" syntax is specific to ODBC or OLEDB driver...
It seems this question is asked a lot, but none of the answers have given me results. I'm pulling my hair out here ... so hopefully someone has an answer.
I have a production server running SQL Server 2005. I backed up the DB and restored it on my laptop's SQL Server Express instance. Now date queries are seriously affected. On the production server dates are all displayed in "4/13/2011 12:00:00 AM" format, but on my laptop they show as "2011-04-14 00:00:00.000". When I run a query trying to find entries on "4/14/2011", my laptop gives me the error "The conversion of a varchar data type to a datetime data type resulted in an out-of-range value". Edit: this exact query runs fine on the production SQL Server. (I'm using SSMS to run queries ... not an application/code.)
I made sure my laptop's Windows regional settings are the same as the server (English(United States)) and everything on the Region and Language control panel is the exact same.
Finally I ran the following two queries:
select name, alias, dateformat
from syslanguages
where langid =
  (select value from master..sysconfigures
   where comment = 'default language')

select @@language
These gave the results "us_english, English, mdy" and "British" respectively. Where is this British coming from?! Now when I run this command before my query (in Management Studio):
SET DATEFORMAT mdy
Then everything works perfectly! But according to the syslanguages query it's already in mdy format. I'm not about to rewrite my application with "SET DATEFORMAT" all over the place - so hopefully someone has a clue. Maybe my SQL Express installation is buggered and I have to reinstall it?
I'm going to keep tinkering to hopefully get this to work.
It is the language settings of the login that you need to change.
You can do it through SSMS -> Security -> Logins -> YourLogin -> Properties -> Default Language
Or through TSQL
ALTER LOGIN [YourLogin] WITH DEFAULT_LANGUAGE=[us_english]
If you specify dates in dd-mmm-yyyy format or, better still, use DateTime parameters (rather than varchar input), you won't have a problem.
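A quick way to see the effect, using the value from the question (illustrative only - run the statements one at a time so the failing conversion doesn't hide the rest):
SET LANGUAGE British;
SELECT CONVERT(datetime, '4/14/2011');    -- fails: British implies dmy, so 14 is read as a month
SELECT CONVERT(datetime, '14-Apr-2011');  -- works: dd-mmm-yyyy is unambiguous
SET LANGUAGE us_english;
SELECT CONVERT(datetime, '4/14/2011');    -- works: us_english implies mdy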
Try changing the default language to us_english, using the following:
EXEC sp_configure 'default language', 1033
RECONFIGURE
If that doesn't work, you could try language code 0.
Are you passing dates as string literals, or concatenating strings to build up the SQL in your application?
If you are doing this, consider using parameters for the dateTime data types. Or at least escape them using the ODBC escape clause and format them as in { ts'yyyy-mm-ddhh:mm:ss[.fff] '} such as: { ts'1998-09-24 10:02:20' }.
It is most probably your application not passing the datetime properly.
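If the application is building up SQL strings, a parameterised call avoids the whole format question; a minimal sketch (table and column names are placeholders):
-- pass the date as a real datetime parameter instead of embedding a string
DECLARE @d datetime;
SET @d = '20110414';  -- unseparated yyyymmdd is read the same way under every language setting
EXEC sp_executesql
  N'SELECT * FROM dbo.Orders WHERE OrderDate >= @d AND OrderDate < DATEADD(day, 1, @d)',
  N'@d datetime',
  @d = @d;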
I built a small query tool for Oracle using OracleCommand and OracleDataAdapter. Users just enter a full query (no parameters), execute it, and the results are shown in a DataGridView. So far so good, but then I tried what I thought was an invalid query, e.g.:
SELECT * FROM mytable WHERE dateColumn = '1-JAN-10'
This query is not valid SQL for Oracle - you have to use the to_date() function to compare with date literals. SQL Developer also rejects it, but somehow my query tool just works. Does that mean my OracleCommand is a bit of a wizard here, or am I doing something wrong? Also, is there a way to suppress this behaviour, because the purpose of the tool is testing queries, which should always work...
Thanks
The query may be valid for Oracle. You don't have to use to_date() if you give the date string in your session's date format, though it's generally better to do that anyway to avoid issues like this.
It sounds like you have a different NLS_DATE_FORMAT in your tool's environment to that in SQL Developer, or the session date format is being set implicitly by OracleCommand.
You can run select value from nls_session_parameters where parameter = 'NLS_DATE_FORMAT' to see what it is from SQL*Plus, SQL Developer and your tool; and query nls_database_parameters to see the database default and work out which session is overriding it.
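Spelled out, those checks look like this (run the first one from each client and compare the results):
-- what the current session is using
SELECT value FROM nls_session_parameters WHERE parameter = 'NLS_DATE_FORMAT';
-- the database-wide default
SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_DATE_FORMAT';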
It looks like your tool may have DD-MON-RR while you're expecting some other format elsewhere, but without checking those views it's hard to say where you're using the database default and where you're overriding it at session level. I'd guess DD-MON-RR is the DB default, though, and you have an override in your other environments.
From SQL Developer, try alter session set nls_date_format='DD-MON-RR'; and then re-run your 'invalid' query - it should work there too.