I am getting this error and don't understand why:
Error Executing Database Query. [Macromedia][SQLServer JDBC
Driver][SQLServer]Invalid column name 'buildno'. The error occurred
in C:/data/wwwroot/webappsdev/cfeis/redbook/redbook_bio_load.cfm: line
10
8 : select *
9 : from redbook_bio
10 : where build_num = '#session.build_num#'
11 : </cfquery>
12 :
VENDORERRORCODE: 207 SQLSTATE: 42S22 SQL: select * from
redbook_bio where buildno = '4700' DATASOURCE: xxxx
******"
It is saying buildno is an invalid column name, but I do not have that name in my query. I used to, but changed both the column in the database and the column name in the query to build_num. You can see my exact code with line numbers, and that there is no 'buildno' in there. But looking at the SQL statement below that, it is still trying to use 'buildno'.
I had my editor search the directory for any occurrence of buildno and no results came back. I have restarted the CF service and cleared the cache. Why would it still try to run the query with buildno instead of build_num as the code says?
There was a cfquery cache setting in the ColdFusion Administrator; we had it set to 100. Apparently clearing the template and component caches does not clear the cfquery cache. I changed the query name and that fixed the problem. It most likely could also have been fixed by setting the cfquery cache value to 0.
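For reference, the query cache can also be flushed programmatically; this is a minimal sketch (it clears all queries cached with cachedwithin/cachedafter, not just the one that changed):
<cfobjectcache action="clear">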
Related
I've got a Hive SQL script/action as part of an Oozie workflow. I'm doing a CREATE TABLE AS SELECT to output the results. I want to name the table using the username plus an appended string (e.g. "User123456_output_table"), but can't seem to get the correct syntax.
set tablename=${hivevar:current_user()};
CREATE TABLE `${hiveconf:tablename}_output_table` AS SELECT ...
That doesn't work and gives:
Error while compiling statement: FAILED: IllegalArgumentException java.net.URISyntaxException: Relative path in absolute URI: ${hivevar:current_user()%7D_output_table
Or changing the first line to set tablename=${current_user()}; starts running the SELECT query but eventually stops with:
Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hive.ql.metadata.HiveException: [${current_user()}_output_table]: is not a valid table name
Or changing the first line to set tablename=current_user(); starts running the SELECT query but eventually stops with:
Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. org.apache.hadoop.hive.ql.metadata.HiveException: [current_user()_output_table]: is not a valid table name
Alternatively, is there a way to pass the username from the Oozie workflow via a parameter?
I'm using Hue to do all this rather than the command line.
Thanks
This is wrong: set tablename=${hivevar:current_user()}; - the function will not be resolved; the variable is substituted as-is.
Hive does not evaluate variables before substitution; it substitutes them as-is, so functions inside variables are NOT calculated. Variables are just text replacement.
This:
set tablename=current_user();
CREATE TABLE `${hiveconf:tablename}_output_table` ...
gets resolved as
CREATE TABLE `current_user()_output_table` ...
And functions are not supported in table names, so it will not work this way.
The solution is to calculate functions outside the script and pass them as parameters.
See this blog: https://prodlife.wordpress.com/2013/12/06/parameterizing-hive-actions-in-oozie-workflows/
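A minimal sketch of that approach (the shell step, flag usage and file names are illustrative, not from the post): compute the user outside Hive, then hand it in as a substitution variable.
# in a shell action (or any step that runs before the Hive action)
CURRENT_USER=$(whoami)
hive --hivevar tablename="${CURRENT_USER}" -f create_output_table.hql

-- create_output_table.hql
CREATE TABLE `${hivevar:tablename}_output_table` AS
SELECT ...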
More info from stms :
Short Text
SQL error "SQL code: -10108" occurred while accessing table "CDHDR".
What happened?
Database error text: "SQL message: Session has been reconnected."
Return value of the database layer: "SQL dbsl rc: 99"
The problem is that the client runs this code through a job, so I am not able to debug it. All I know is that it works for some of the selected countries but fails for certain others, so the question is:
Was it a temporary connection error between the two servers, or is it a data overload issue caused by the SELECT statement?
At the moment I am not sure what to look at on the system. The same program once produced an error for exceeding the database query time limit (> 10 minutes), so might this be related to system configuration?
Thanks in advance
I am unable to run a query from a .sql file in DataProcHiveOperator,
though the documentation says we can query using a file. Link to the documentation Here
It works fine when I give the query directly.
Here is my sample code, which works fine when the query is written directly:
HiveInsertingTable = DataProcHiveOperator(task_id='HiveInsertingTable',
gcp_conn_id='google_cloud_default',
query='CREATE TABLE TABLE_NAME(NAME STRING);',
cluster_name='cluster-name',
region='us-central1',
dag=dag)
Querying with a file:
HiveInsertingTable = DataProcHiveOperator(task_id='HiveInsertingTable',
gcp_conn_id='google_cloud_default',
query='gs://us-central1-bucket/data/sample_hql.sql',
query_uri="gs://us-central1-bucket/data/sample_hql.sql
cluster_name='cluster-name',
region='us-central1',
dag=dag)
There is no error in the sample_hql.sql script itself.
It is reading the file location as the query text and throwing this error:
Query: 'gs://bucketpath/filename.q'
Error occurring - cannot recognize input near 'gs' ':' '/'
A similar issue has also been raised Here
The issue is that you have passed query='gs://us-central1-bucket/data/sample_hql.sql' as well.
You should pass exactly one of query or query_uri.
The code in your question has both of them, so remove query or use the following code:
HiveInsertingTable = DataProcHiveOperator(task_id='HiveInsertingTable',
gcp_conn_id='google_cloud_default',
query_uri="gs://us-central1-bucket/data/sample_hql.sql",
cluster_name='cluster-name',
region='us-central1',
dag=dag)
I am running a very simple query and trying to extract the results to a text file. The entire query is essentially what is below: I am selecting everything from one table, with a single piece of WHERE criteria limiting the data to one month. After it has extracted around 1.2 GB, this error shows up. Is there any way to work around this other than extracting smaller date ranges? I am trying to pull a couple of years' worth of data, so if I can only get it a few days at a time it will take a lot of manual work.
I am currently using the free trial of a DB2 query tool (RazorSQL), if that makes a difference; I can probably purchase different software if it would help. I am trying to get IBM's tool, but for some reason it freezes during the download, so I am still working on that. I have searched for this error, but everything I find seems much more complex than what I am doing and I can't tell whether it applies. Thanks in advance.
select *
from MyTable
where date_col between date '2014-01-01' and date '2014-01-31'
I stumbled on this error too and found out it is related to the db2jcc.jar (type 4) driver.
Excerpt: if there are no items left in the result set (or none to begin with), the result set is closed automatically, hence the exception. The suggestion is to handle it in the application; in my case I started checking if(rs.next()), but otherwise there is a workaround. Check the source link below for how to set some properties on the data source and avoid the exception.
Source:
"Invalid operation: result set is closed" error with Data Server Driver for JDBC
In my case, I was missing some properties in WAS; after adding allowNextOnExhaustedResultSet the issue was fixed.
1. Log in to the WebSphere Application Server administration console.
2. Select Resources > JDBC > Data sources > Application Center DataSource name > Custom properties and click New.
3. In the Name field, enter allowNextOnExhaustedResultSet.
4. In the Value field, type 1.
5. Change the type to java.lang.Integer.
6. Click OK.
Sometimes you also need to check whether the resultSetHoldability property exists. For details, refer to here.
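If you connect with the JCC driver directly rather than through WAS, the same properties can, as far as I know, be appended to the JDBC URL (host, port and database below are placeholders):
jdbc:db2://dbhost:50000/MYDB:allowNextOnExhaustedResultSet=1;resultSetHoldability=1;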
I also encountered this failure when upgrading from the JDBC type 2 driver (db2java.zip) to the JDBC type 4 driver (db2jcc4.jar).
Statement statement = results.getStatement();
if (statement != null)
{
connection = statement.getConnection(); // ** failed here
statement.close();
}
The solution was to check whether the statement is closed before using it, as follows.
Changed to:
Statement statement = results.getStatement();
if (statement != null && !statement.isClosed())
{
    connection = statement.getConnection();
    statement.close();
}
Creating the property below with type Integer worked for me:
allowNextOnExhaustedResultSet
I had the same issue on WAS 7, so I had to add and change a few things in the Admin Console.
This TeamWorksRuntimeException should be fixed by applying APAR JR50863, which is available on top of BPM V8.5.5 or included in BPM V8.5 refresh pack 6.
If the APAR does not solve the problem, try the following workaround:
Log in to the WebSphere Application Server admin console
Select Resources > JDBC > Data sources > DataSource name (TeamWorksDB) > Custom properties and click New
In the Name field, enter downgradeHoldCursorsUnderXa
In the Value field, type true
Change the type to java.lang.Boolean
Click OK to save your changes
Select custom property resultSetHoldability
In the Value field, type 1
Click OK to save your changes
Source of the Answer : https://developer.ibm.com/answers/questions/194821/invalid-operation-result-set-is-closed-errorcode-4/
Restarting the app may fix the problem if the connection pool has lost its session to Db2. If using Tomcat, the connection pool property 'testOnBorrow' may re-establish the connection to Db2.
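A minimal context.xml sketch of that Tomcat setting (resource name, URL and credentials are placeholders, not from the post); connections are validated when borrowed so stale Db2 sessions get replaced:
<Resource name="jdbc/MyDb2DS" auth="Container" type="javax.sql.DataSource"
          driverClassName="com.ibm.db2.jcc.DB2Driver"
          url="jdbc:db2://dbhost:50000/MYDB"
          username="dbuser" password="dbpass"
          testOnBorrow="true"
          validationQuery="SELECT 1 FROM SYSIBM.SYSDUMMY1"/>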
Good day!
I get this error:
SQL STATE 37000 [Microsoft][ODBC Microsoft Access Driver] Syntax Error
or Access Violation, when trying to run an embedded SQL statement on
Powerscript.
I am using MS SQL Server 2008 and PowerBuilder 10.5; the OS is Windows 7. I was able to determine one of the queries that is causing the problem:
SELECT top 1 CONVERT(DATETIME,:ls_datetime)
into :ldtme_datetime
from employee_information
USING SQLCA;
if SQLCA.SQLCODE = -1 then
Messagebox('SQL ERROR',SQLCA.SQLERRTEXT)
return -1
end if
I was able to work around this by just using PowerBuilder's datetime() function. But other parts of the program also cause this, and I am having a hard time identifying which ones. I find it very weird because the same scripts run here on my dev PC with no problems at all, but when I run the program on my client's workstation I get this error. I haven't found any differences between the workstation and my dev PC. I also tried following the instructions here, but the problem still occurs.
UPDATE: I was able to identify the other script that is causing the problem:
/////////////////////////////////////////////////////////////////////////////
// f_datediff
// Computes the time difference (in number of minutes) between adtme_datefrom and adtme_dateto
////////////////////////////
decimal ld_time_diff
SELECT top 1 DATEDIFF(MINUTE,:adtme_datefrom,:adtme_dateto)
into :ld_time_diff
FROM EMPLOYEE_INFORMATION
USING SQLCA;
if SQLCA.SQLCODE = -1 then
Messagebox('SQL ERROR',SQLCA.SQLERRTEXT)
return -1
end if
return ld_time_diff
Seems like passing datetime variables causes the error above. Other scripts are working fine.
Create a transaction user object inherited from transaction.
Put logic in the sqlpreview of your object to capture and log the sql statement being sent to the db.
Instantiate it, connect to the db, and use it in your embedded sql.
Assuming the user gets the error, you can then check what was being sent to the db and go from there.
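A minimal sketch of that SQLPreview logging (the log path is a placeholder, and the argument name may differ slightly by PowerBuilder version):
// SQLPreview event of the transaction user object (e.g. n_tr inherited from transaction)
integer li_file
li_file = FileOpen("c:\temp\sql_trace.log", LineMode!, Write!, LockWrite!, Append!)
if li_file > 0 then
	FileWrite(li_file, sqlsyntax)  // sqlsyntax holds the statement being sent to the db
	FileClose(li_file)
end if
return 0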
The error in your first statement probably comes from the second parameter to the CONVERT function. Its type is not a string; it has to be a valid expression:
https://learn.microsoft.com/en-us/sql/t-sql/functions/cast-and-convert-transact-sql
So I would expect that your
CONVERT(DATETIME,:ls_datetime)
would evaluate to
CONVERT(DATETIME, 'ls_datetime')
but it should be
CONVERT(DATETIME, DateTimeColumn)
The error in your second statement could be that you're providing a wrong datetime format.
So please check whether the error still occurs when you use SET DATEFORMAT
https://learn.microsoft.com/en-us/sql/t-sql/statements/set-dateformat-transact-sql
with the correct datetime format for the values you're passing.
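For example (illustrative values only), you can either set the session date format to match the strings you pass, or bypass the setting with an explicit, format-independent CONVERT style:
-- make the session parse day/month/year strings, or use style 126 (ISO 8601)
SET DATEFORMAT dmy;
SELECT CONVERT(DATETIME, '31/01/2014 13:45');
SELECT CONVERT(DATETIME, '2014-01-31T13:45:00', 126);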