I have created an SSRS report. It works fine when I view it on the report server.
Now I am trying to use the Subscriptions feature of Report Server to export reports in PDF format on a scheduled basis.
I created new subscriptions for some of the reports and scheduled them using a shared schedule. The scheduling itself is working fine.
The issue is with the status of the report subscriptions under the Subscriptions tab; it shows:
"Failure writing file \\viaserver\Shared\Test.pdf: An impersonation error occurred using the security context of the current user"
Have you checked the data source settings of the report, and whether the SSRS execution account has permission to write files to \\viaserver\Shared\?
Hi Raju
The problem could be a bit complicated, as we do not have the log for the report, so here are some general suggestions; hope they help:
To check whether the report ran successfully within the subscription, you can query ExecutionLog2 in the report server database. By default the query looks like this:
select * from ReportServer..ExecutionLog2 where RequestType = 'Subscription' -- check the Status column for each run
Instead of generating the PDF via a subscription, can you run the report directly? And is this report very large? When rendering a report to PDF, pagination can consume a lot of resources. If you reduce the report size, can the subscription run successfully?
An RS subscription relies on SQL Server Agent, so check whether the associated SQL Agent job ran successfully. To find the job related to the subscription, you can use the query below, where the ScheduleID column is what you need to locate the job:
use ReportServer
go
SELECT S.ScheduleID AS SQLAgent_Job_Name
, SUB.Description AS Sub_Desc
, SUB.DeliveryExtension AS Sub_Del_Extension
, C.Name AS ReportName
, C.Path AS ReportPath
FROM ReportSchedule RS JOIN Schedule S ON RS.ScheduleID = S.ScheduleID
JOIN Subscriptions SUB ON RS.SubscriptionID = SUB.SubscriptionID
JOIN Catalog C ON RS.ReportID = C.ItemID AND SUB.Report_OID = C.ItemID
WHERE C.Name LIKE '%%' -- put the report name between the wildcards to filter
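Once you have the ScheduleID (it doubles as the SQL Agent job name), you can check the job's recent outcomes in msdb. A hedged sketch against the standard msdb tables, where run_status 1 means succeeded and 0 means failed; the job name below is a placeholder:
SELECT j.name AS Job_Name
, h.run_date
, h.run_time
, h.run_status
, h.message
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE j.name = '<ScheduleID from the query above>'
ORDER BY h.run_date DESC, h.run_time DESC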
Here are some additional points to help you debug the issue; I ran into many of these issues today, using SQL Server 2016 on Windows Server 2012 R2. I'm writing this up mainly for my own reference, but in hopes that (a) it will help someone else, and (b) someone can give additional info as to what is happening.
We have a domain set up, and I was trying to use a domain account as the account to impersonate; that failed. Use a local account, and make sure it has logon rights.
Set up a share. Only UNC paths are allowed, so be sure that the user set up in the previous bullet point has (a) NTFS folder permissions and (b) share permissions. These are two separate things. Do this even if the share is on the local machine.
I like to stop Reporting Services, rename the log file in <drive>\Program Files\Microsoft SQL Server\MSRS(SSRS version).MSSQLSERVER\Reporting Services\LogFiles\, then start it again, so there's a fresh file and you're not scrolling to the bottom of a 5 MB file.
Sacrifice something - an animal, your first born - to the SQL Server Reporting Services gods. We're still trying to determine what happened, but at this stage the PDFs do generate to a local folder, yet SSRS still logs the same error for each one that it logged back when it was failing to write them. The files are written; the errors remain. I am thoroughly confused, and so is SSRS, apparently. I'll be sure to update this with additional info as we figure it out, but it's definitely not the easiest MS product we've had the pleasure of dealing with.
Related
I am trying to build a report to display cube usage: executed queries, user name, query start time, query duration, and some other information. This report will be helpful for performance optimization.
I followed the process mentioned in this link,
https://technet.microsoft.com/en-us/library/cc917676.aspx
The table 'OLAPQueryLog' was created successfully, but no rows are being added to it. Rows should be populated whenever any user accesses the cube (i.e., when a query is executed on the cube).
I would really appreciate any immediate help. Please reach out to me if you have any concerns.
Most probably the problem is that SSAS does not have permission to write to the log table. Make sure that the account the SSAS service runs under has INSERT permission on the SQL Server database that hosts the query log table. You can check the Event Viewer on the machine that runs SSAS to see the error message SSAS is producing.
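A minimal sketch of that grant, assuming the SSAS service runs as the hypothetical Windows account DOMAIN\ssas_svc (with a login already created for it) and the log table is dbo.OLAPQueryLog in a database named QueryLogDB; adjust all names to your environment:
USE QueryLogDB
GO
CREATE USER [DOMAIN\ssas_svc] FOR LOGIN [DOMAIN\ssas_svc]
GO
GRANT INSERT ON dbo.OLAPQueryLog TO [DOMAIN\ssas_svc]
GO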
I was meant to update a single column of one record in the database, but I forgot to specify the id, and every single record has now been updated! Is there a way I can roll back the data? Please help!
I meant to run the following statement:
UPDATE Table SET Cusname = 'some name' WHERE id = 2;
but i actually ran the following:
UPDATE Table SET Cusname = 'some name'
Now every single Cusname value has the same name. Please help!
There's not much you can do... this is why you have a good backup strategy in place (and, ideally, why you don't execute any "ad hoc" T-SQL in a production database before testing it in a test database and then copy/pasting the statements, to help avoid these types of errors in the future).
Pulling info from the comments, you can start off by doing something along these lines: Can I rollback a transaction I've already committed? (data loss)
(this is for PostgreSQL, but the idea is the same: stop the server, back up all relevant files, etc.).
If you have transaction logging and log backups, you can attempt a point-in-time restore, but this must have been set up prior to your error. See here: Reverse changes from transaction log in SQL Server 2008 R2?
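For reference, a hedged sketch of what a point-in-time restore looks like, assuming the database is in FULL recovery and you have a full backup plus a log backup taken after the bad UPDATE; every file name and timestamp here is hypothetical, and you should restore to a copy rather than over the live database:
RESTORE DATABASE MyDb_Copy
FROM DISK = N'C:\Backups\MyDb_Full.bak'
WITH MOVE 'MyDb' TO N'C:\Data\MyDb_Copy.mdf',
MOVE 'MyDb_log' TO N'C:\Data\MyDb_Copy_log.ldf',
NORECOVERY;
RESTORE LOG MyDb_Copy
FROM DISK = N'C:\Backups\MyDb_Log.trn'
WITH STOPAT = N'2013-10-14 09:55:00', -- a moment just before the bad UPDATE
RECOVERY;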
Your best bet in this case may be to spend some time working on resolving without restoring. It looks like you updated your customer names. Do you have another source for customer information? Can you compile an external list of customers and, say, addresses, so you can do a match on those to reset your db's customer names? If so, that might be a much easier route, getting you most of the way to a full recovery of your bonked field. If you don't have a ton of data, you can do this and fill the rest in manually. It's a process, but without a good backup of the db to revert to, it may very well be worth looking at...
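If you can assemble such a list, a repair along these lines is one possibility (a sketch only; the table and column names are hypothetical, and Staging_Customers is a table you'd load from your external source):
UPDATE t
SET t.Cusname = s.Cusname
FROM MyTable AS t
JOIN Staging_Customers AS s ON s.id = t.id;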
Also note that if the database is hosted on a cloud-based SaaS/PaaS/IaaS offering, you may have deeper-level backups, or, in the case of "SQL Database" (e.g., SQL Azure), point-in-time backups are set up out of the box even for the lowest service plans.
I need to find out which of our Crystal Reports are actually being run by users, so we can get rid of those that are no longer in use. I can do some kind of query of the latest jobs run on the SQL server, as described here:
How to get the last run job details in SQL
However, I'm not sure how I could tie that back to the actual reports. I've tried opening a Crystal Report in Crystal Reports 2008 while running a trace on the SQL server using SQL Profiler; however, I don't see any database calls in the trace that would allow me to determine the name of the report being run.
How could I find out which Crystal Reports are actually in use?
I usually embed a unique identifier (the report's name or ID) in either a SQL-expression field or the command's query.
If the DBA finds an issue with a given query (e.g., one that performs poorly), this approach makes it easy to identify the source report.
SQL-expression:
//{%report_name}
(
'Daily Obstetrics Review [OB003]'
)
In the command:
SELECT 'Daily Obstetrics Review [OB003]' AS REPORT_NAME
...
FROM ...
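With the identifier embedded, captured statements can be attributed back to the report. For example, if you save a server-side SQL Server trace to a file, something like this finds every execution of the report above (the trace file path is hypothetical):
SELECT TextData, LoginName, StartTime
FROM fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT)
WHERE TextData LIKE '%OB003%'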
I'm currently unable to email out time-based subscription reports from SSRS on a new SQL Server 2012 installation on Windows Server 2012.
I receive the following error in the SSRS LogFiles:
schedule!WindowsService_5!dc4!10/14/2013-10:01:09:: i INFO: Handling Event TimedSubscription with data 1a762da1-75ab-4c46-b989-471185553304.
library!WindowsService_5!dc4!10/14/2013-10:01:09:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.ReportServerStorageException: , An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database.;
library!WindowsService_5!dc4!10/14/2013-10:01:09:: w WARN: Transaction rollback was not executed connection is invalid
schedule!WindowsService_5!dc4!10/14/2013-10:01:09:: i INFO: Error processing event 'TimedSubscription', data = 1a762da1-75ab-4c46-b989-471185553304, error = Microsoft.ReportingServices.Diagnostics.Utilities.ReportServerStorageException: An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. ---> System.Data.SqlClient.SqlException: Invalid object name 'ReportServerTempDB.dbo.ExecutionCache'.
The databases were migrated from SQL Server 2008; this was done by a third party, and I'm unsure if something was overlooked.
Any assistance would be greatly appreciated.
Thank you.
Dane
This thread seems to address your issue.
http://www.sqlservercentral.com/Forums/Topic553765-147-1.aspx
Please do a modicum of research before posting error messages.
From the link:
"After much consternation, I have found a trigger referencing the invalid object. Trigger [Schedule_UpdateExpiration] on ReportServer table Schedule has the offending reference in it. In test, I altered this trigger to reference the correct report server tempdb, and now subscriptions appear to be working properly. So far I have found nothing else broken."
AND
"If anyone is looking for a quick answer, then here is what I did to solve my problem:
Updated the trigger on dbo.Schedule to reference the correct tempdb.
Scripted all stored procedures with their permissions into a new query, then "find and replaced" all instances of the old tempdb with the new one."
After searching for a while for a solution to this issue, I found that it is caused by the SQL Server Agent job definitions not being fully migrated to the new server. For every subscription created in SSRS, there is an associated job defined in SQL Server Agent. For services that rely heavily on report delivery via subscriptions, it's best to export those job definitions and import them into the new server.
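As a hedged aid for that migration: SSRS names each subscription job after the subscription's Schedule GUID, so a query along these lines (assuming SQL Server 2012+ for TRY_CAST and the default ReportServer database name) lists the Agent jobs you would need to script out:
SELECT j.name AS Job_Name, j.date_created, j.enabled
FROM msdb.dbo.sysjobs AS j
JOIN ReportServer.dbo.Schedule AS s
ON TRY_CAST(j.name AS uniqueidentifier) = s.ScheduleID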
Daniel E.'s answer is correct.
I spent a lot of time finding this. The error I was getting while updating the existing subscriptions was:
"An error occurred within the report server database. This may be due to a connection failure, timeout or low disk condition within the database. (rsReportServerDatabaseError)"
When I searched online for the above error, I couldn't manage to fix it. After a long time I found that the trigger was pointing to the old database.
The temp database referenced below is [ReportServerTempDB]; after updating the trigger to point to the correct tempDB, everything started working fine.
-- Point [ReportServerTempDB] below at the actual name of your report server temp database
ALTER TRIGGER [dbo].[Schedule_UpdateExpiration] ON [dbo].[Schedule]
AFTER UPDATE
AS
UPDATE
EC
SET
AbsoluteExpiration = I.NextRunTime
FROM
[ReportServerTempDB].dbo.ExecutionCache AS EC
INNER JOIN ReportSchedule AS RS ON EC.ReportID = RS.ReportID
INNER JOIN inserted AS I ON RS.ScheduleID = I.ScheduleID AND RS.ReportAction = 3
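To verify what the trigger references before and after the change, you can dump its definition (run in the ReportServer database; OBJECT_DEFINITION is a standard SQL Server function):
SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.Schedule_UpdateExpiration')) AS trigger_body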
We have a system with an Oracle back end to which we have access (though possibly not administrative access) and a front end to which we do not have the source code. The database is quite large and not easily understood; we have no documentation. I'm also not particularly knowledgeable about Oracle in general.
One aspect of the front end queries the database for a particular set of data and displays it. We have a need to determine what query is being made so that we can replicate and automate it without the front end (e.g. by generating a csv file periodically).
What methods would you use to determine the SQL required to retrieve this set of data?
Currently I'm leaning towards the use of an EeePC, Wireshark and a hub (installing Wireshark on the client machines may not be possible), but I'm curious to hear any other ideas and whether anyone can think of any pitfalls with this particular approach.
Clearly there are many methods. The one that I find easiest is:
(1) Connect to the database as SYS or SYSTEM
(2) Query V$SESSION to identify the database session you are interested in, and record its SID and SERIAL# values (a sample query is shown after this list).
(3) Execute the following commands to activate tracing for the session:
exec sys.dbms_system.set_bool_param_in_session(<sid>, <serial#>, 'timed_statistics', true)
exec sys.dbms_system.set_int_param_in_session(<sid>, <serial#>, 'max_dump_file_size', 2000000000)
exec sys.dbms_system.set_ev(<sid>, <serial#>, 10046, 5, '')
(4) Perform some actions in the client app
(5) Either terminate the database session (e.g. by closing the client) or deactivate tracing: exec sys.dbms_system.set_ev(<sid>, <serial#>, 10046, 0, '')
(6) Locate the udump folder on the database server. There will be a trace file for the database session showing the statements executed and the bind values used in each execution.
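For step (2), a hedged example of the identification query; the USERNAME filter is an assumption, so match it to whatever account your front end connects as:
select sid, serial#, username, osuser, machine, program
from v$session
where username = 'APP_USER' -- hypothetical application account
order by logon_time desc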
This method does not require any access to the client machine, which could be a benefit. It does require access to the database server, which may be problematic if you're not the DBA and they don't let you onto the machine. Also, identifying the proper session to trace can be difficult if you have many clients or if the client application opens more than one session.
Start by querying Oracle system views like V$SQL, V$SQLAREA and V$SQLTEXT.
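For example, on 10g and later a hedged starting point might look like this; the schema filter is an assumption, so adjust it to the account your application uses:
select sql_id, executions, last_active_time, sql_text
from v$sql
where parsing_schema_name = 'APP_USER' -- hypothetical application account
order by last_active_time desc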
Which version of Oracle? If it is 10 or later and you have administrative access (SYSDBA), then you can relatively easily find executed queries through Oracle Enterprise Manager.
For older versions, you'll need access to the views that tuinstoel mentioned in his answer.
You can get the same data through TOAD for Oracle, which is quite a capable piece of software, but expensive.
Wireshark is indeed a good idea, it has Oracle support and nicely displays the whole conversation.
A packet sniffer like Wireshark is especially interesting if you don't have admin access to the database server but do have access to the network (for instance, because there is port mirroring on the Ethernet switch).
I have used these instructions successfully several times:
http://www.orafaq.com/wiki/SQL_Trace#Tracing_a_SQL_session
"though possibly not administrative access". Someone should have administrative access, probably whoever is responsible for backups. At the very least, I expect you'd have a user with root/Administrator access to the machine on which the oracle database is running. Administrator should be able to login with a
"SQLPLUS / AS SYSDBA" syntax which will give full access (which can be quite dangerous). root could 'su' to the oracle user and do the same.
If you really can't get admin access, then, as an alternative to Wireshark: if your front end connects to the database through an Oracle client, look for the file sqlnet.ora. You can set trace_level_client, trace_file_client and trace_directory_client to get it to log the Oracle network traffic between the client and the database server.
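A minimal sketch of those sqlnet.ora settings on the client; the file name and directory are illustrative, and SUPPORT is the most verbose trace level:
TRACE_LEVEL_CLIENT = SUPPORT
TRACE_FILE_CLIENT = client_sql
TRACE_DIRECTORY_CLIENT = /tmp/oratrace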
However, it is possible that the client calls a stored procedure and retrieves the data as output parameters or a ref cursor, which means you may not see the query being executed through that mechanism. If so, you will need admin access to the DB server, and you can trace as per Dave Costa's answer.
A quick and dirty way to do this, if you can catch the SQL statement(s) in the act, is to run this in SQL*Plus:-
set verify off lines 140 head on pagesize 300
column sql_text format a65
column username format a12
column osuser format a15
break on username on sid on osuser
select s.username, s.sid, s.osuser, t.sql_text
from v$sqltext_with_newlines t, v$session s
where t.address = s.sql_address
and t.hash_value = s.sql_hash_value
order by s.sid, t.piece
/
You need access to those v$ views for this to work. Generally that means connecting as SYSTEM.