SSRS Execution Log updating problem - sql-server-2005

Hi All,
When I try to execute RSExecutionLog_Update.dtsx to update my execution log data, it throws this error. Can anyone help me with this issue?
Thanks

The problem is this line: Invalid object name "ExecutionLog"
It could be that the table (or the database) hasn't been created, or that there is a permissions problem.
You haven't posted any information about the tables you are using or the actual step that failed. You will be able to see in the SSIS package exactly which step failed, and you can look there for more detail.
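A quick way to narrow it down is to confirm that the table the package expects actually exists on the server it connects to. A minimal sketch, assuming the default SSRS 2005 catalog name ReportServer (adjust the names if your installation differs):

-- Source side: SSRS 2005 logs executions in the ReportServer catalog.
SELECT TOP 10 *
FROM ReportServer.dbo.ExecutionLog;

-- Generic existence check, runnable in whichever database the package targets.
SELECT name
FROM sys.objects
WHERE name = 'ExecutionLog' AND type = 'U';

If the first query fails with the same "Invalid object name" error, the package is pointing at the wrong server or database; if the second returns no rows in the target database, the script that creates the logging tables was never run there.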

Related

Optimizer internal error while loading data from U-SQL table

Is there a way to get around this error?
"CQO: Internal Error - Optimizer internal error. Assert:
a_drgcidChild->CLength() == UlSafeCLength(popMS->Pdrgcid()) in
rlstreamset.cpp:499"
Facing this issue while loading data from partitioned U-SQL table.
#myData =
SELECT *
FROM dbo.MyTable;
If you encounter any system error message (or anything that says Internal Error), please open a support ticket with us and/or send me your job link (if it happens on the cluster) or a self-contained smallest repro (if it is happening with a local run) to usql at microsoft dot com.
Thanks
Michael
UPDATE: This issue has been fixed and will be made available in the next refresh. If you are blocked, please contact me for a private runtime.
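For anyone who needs to put together the self-contained repro Michael asks for, the fragment above only lacks an output sink to be a submittable script. A minimal sketch, assuming dbo.MyTable already exists in the catalog you run against; the output path is a placeholder:

#myData =
    SELECT *
    FROM dbo.MyTable;

// Any sink works; the error surfaces at compile/optimization time.
OUTPUT #myData
TO "/output/mydata.csv"
USING Outputters.Csv();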

SAP BODS error Correlation name '' not found

I am working with SAP Data Services at the moment. I need to run a load procedure from an SQL file, and I'm getting this error:
SQL submitted to ODBC data source resulted in error <[Sybase][ODBC Driver][Sybase IQ]Correlation name 't_table' not found>. The SQL submitted is .
Has anyone had the same error? I have spent two days on this already and can't find any solution!
Thanks
P.S.: all aliases are OK
The function call wasn't correct. It is running correctly now.
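For reference, a correlation name is simply the alias a table is given in the FROM clause, and Sybase IQ raises this error when the submitted SQL references an alias that was never declared. An illustrative sketch with placeholder names (not taken from the failing procedure):

-- 't_table' must be introduced in the FROM clause before anything can reference it.
SELECT t_table.order_id,
       t_table.amount
FROM my_schema.orders AS t_table   -- the correlation name is declared here
WHERE t_table.amount > 0;

If a generated function call or view refers to t_table without such a declaration, the driver reports exactly this "Correlation name not found" error even though every alias in your own file looks fine.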

Pentaho Execute SQL Statements variable conversion to null

I am using PDI to delete and insert some data in a DB, and I have the following issue. I create two variables called START_DATE and END_DATE that are used to select the data that will be deleted from my DB. I am able to get them and run my transformation with no errors in the log file, but when I checked whether data was deleted, I found that it wasn't. I then checked my "DeleteProcedure" step, and it says "Conversion error: null". I have tried different approaches to take the variables and pass them as Strings, but I haven't been able to solve this issue. It can't be a SQL mistake, as I tested it with a constant and it works.
Any ideas? I have attached some pics. Thanks!
As the documentation of the Execute SQL script step says:
Note: When you have an issue, that the SQL is started at the initialization phase of the transformation and not for each row, make sure to check the option "Execute for each row" (see description below).
In your case the statement executes during the initialization phase of the transformation, which is why it gets null values instead of the ones from the previous step.
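For illustration, with "Execute for each row" checked, the step binds positional ? markers to the fields listed in its Parameters grid, so the dates from the previous step arrive row by row. The table and column names here are placeholders:

-- START_DATE and END_DATE are listed, in this order, in the step's Parameters grid.
DELETE FROM my_table
WHERE event_date BETWEEN ? AND ?;

Note that with this option the markers are plain ? placeholders, not ${START_DATE}-style variable substitutions.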

Error "Arithmetic operation resulted in an overflow."

I'm tasked with creating a program that will run an extremely long query. The query executes fine in Oracle, but whenever I try to run it from VB.Net, it results in the error mentioned in the title. I have also noticed that when I copy my query into an SqlDataSource, only certain parts of it are copied, not the whole query. Is there any way around this? Thank you!
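One common cause of this particular error with Oracle (an assumption here, since the query itself isn't shown) is the data rather than the query length: Oracle's NUMBER type holds up to 38 digits of precision, while .NET's Decimal holds roughly 28-29, so reading an oversized numeric column overflows on the client. A hedged workaround is to convert the column in the query itself; the column name and target type below are illustrative:

-- Read the value as a floating-point double instead of a decimal;
-- this avoids the overflow at the cost of some precision.
SELECT CAST(big_value AS BINARY_DOUBLE) AS big_value
FROM my_table;

If the values must stay exact, TO_CHAR on the Oracle side and parsing the string on the VB.Net side is an alternative.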

Strange result when running query over a table created from a result of another query

Since yesterday, 1-09-2012, I can't run any queries over a table that has been created from the result of another query.
example query:
SELECT region FROM [project.table] LIMIT 1000
result:
Query Failed
Error: Field 'region' is incompatible with the table schema.
49077933619
These kinds of queries have passed successfully every day for the last couple of weeks. Has anybody else encountered a similar problem?
We added some additional schema checking on Friday. I was unable to reproduce the problem, but I'll look into your examples (I was able to find your failed job in the logs). In the meantime, I'm in the process of turning off the additional schema checking. Please try again and let us know if the problem continues.