This has probably been asked before, but I'm looking for a utility which can:
Identify a particular session and record all its activity.
Identify the SQL that was executed under that session.
Identify any stored procedures/functions/packages that were executed.
Show what was passed as parameters into the procs/funcs.
I'm looking for an IDE that's lightweight, fast, readily available, and won't take two days to install, i.e. something I can download, install, and use in the next hour.
Bob.
If you have a license for the Oracle Diagnostic/Tuning Packs, you may use the Oracle Active Session History (ASH) feature.
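For example, a minimal sketch of pulling one session's recent activity out of ASH (assuming you know the SID; verify the columns against your version):

SELECT sample_time, sql_id, event, module, action
FROM v$active_session_history
WHERE session_id = :sid
ORDER BY sample_time;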
The easiest way I can think of to do this is probably already installed in your database - it's the DBMS_MONITOR package, which writes trace files to the location identified by user_dump_dest. As such, you'd need help from someone with access to the database server to access the trace files.
Once you've identified the SID and SERIAL# of the session you want to trace, you can just call:
EXEC dbms_monitor.session_trace_enable (:sid, :serial#, FALSE, TRUE);
This will capture all the SQL statements being run, including the values passed in as binds.
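For completeness, a sketch of the surrounding steps ('APP_USER' is a placeholder username):

-- Find the SID and SERIAL# of the session you want to trace:
SELECT sid, serial# FROM v$session WHERE username = 'APP_USER';

-- ...let the session do its work, then stop tracing:
EXEC dbms_monitor.session_trace_disable(:sid, :serial#);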
We have an ERP Program used to create and manage stock / orders. Somehow an order has vanished - this should not be possible. It should be possible to cancel an unwanted order, but never delete it completely.
The order in question was created, printed and sent to a customer - and then disappeared. I know the primary key and table info, and want to search the log to see if the order was somehow deleted, or perhaps there was a rollback.
How can I translate/search the log in this way?
Please note: I did not write this program, and it's not my job to fix it.
I just need to diagnose the issue and contact the SW Vendor, if required, and have them fix it. As such I cannot post any code.
With so little information it is hard to give a definitive answer.
I'd start by searching the regular logs. If you have some kind of audit trail mechanism, that would be a great help!
If a search through the regular logs doesn't find you the answer then I would:
Get a copy of the database
Go through the REDO logs using the appropriate DBA tools. Since I'm not a SQL Anywhere DBA, I would get help from one.
Once I found the point in time where the order was deleted, I would gather any other information I could get: the user that did the commit, or the users that were logged on at the time (I don't know exactly what kind of information you can get here). Also, go back to the other logs you may have and check around that timestamp.
To learn exactly how to go through the redo logs of a SQL Anywhere database, you should first try your luck with Google and then ask on Database Administrators.
Solved!!!!
The Sybase Central tool has an option (which I couldn't find in the manual and missed the first time I looked) that can translate a log file into a series of statements and create a *.SQL file.
Tools -> SQL Anywhere -> Translate Log File -> follow the wizard (which hopefully is in a language you speak; for me it was not).
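For the record, the same translation can be done from the command line with SQL Anywhere's dbtran Log Translation utility; the invocation below is a sketch, so verify the options against your version's documentation:

dbtran mydb.log mydb.sql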
I need to recompile a package in Oracle 9i, but the session hangs forever. When I checked V$SESSION_WAIT, I found that it is waiting on the event 'library cache pin'. I couldn't find a solution for the 9i version. Is there any way to find the session that is executing my package and kill it?
Sure.
To find which sessions are running code that contains a given name:
SELECT s.*, sa.*
FROM v$session s
LEFT JOIN v$sqlarea sa
  ON s.sql_address = sa.address
 AND s.sql_hash_value = sa.hash_value
WHERE sa.sql_text LIKE '%your_package_name_here%';
After this, you have the SID and SERIAL# so you can kill the session you need to kill. (The query above may return sessions that you do not need to kill; for example, it finds itself :) )
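For example (the SID and SERIAL# below are placeholders; substitute the values returned by the query above):

ALTER SYSTEM KILL SESSION '123,4567';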
Oracle offers no built-in easy path to do this.
If your application uses the DBMS_APPLICATION_INFO routines to register module usage you're in luck: you can simply query V$SESSION filtering on MODULE and/or ACTION. Alternatively, perhaps you have trace messages in your PL/SQL which you use?
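If the module route applies, the lookup is a simple query like this ('MY_MODULE' is a placeholder for whatever your application registers):

SELECT sid, serial#, username, module, action
FROM v$session
WHERE module = 'MY_MODULE';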
Otherwise, you need to start trawling V$SQLTEXT (or another of the several views which show SQL) for calls containing the package name. Remember to make the search case-insensitive. This will give you a SQL_ID you can link to records in V$SESSION.
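A sketch of that search (on 9i, V$SQLTEXT has no SQL_ID column, so you join back to V$SESSION on address and hash value instead; 'MY_PACKAGE' is a placeholder):

SELECT s.sid, s.serial#, t.piece, t.sql_text
FROM v$session s
JOIN v$sqltext t
  ON t.address = s.sql_address
 AND t.hash_value = s.sql_hash_value
WHERE UPPER(t.sql_text) LIKE '%MY_PACKAGE%'
ORDER BY s.sid, t.piece;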
This will only work if your package is the primary object; that is, if it is at the top of the call stack. That is one explanation for why the package is locked for so long. But perhaps your package is called from some other package: in that case you might not get a hit in V$SQLTEXT. So you will need to find the programs which call it, through ALL_DEPENDENCIES, and sift V$SQLTEXT for them.
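Something like this finds the callers ('MY_PACKAGE' is again a placeholder; remember that dictionary names are stored in uppercase):

SELECT owner, name, type
FROM all_dependencies
WHERE referenced_name = 'MY_PACKAGE'
  AND referenced_type IN ('PACKAGE', 'PACKAGE BODY');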
Yes, that does sound like a tedious job. And that is why it is a good idea to include some form of trace in long-running PL/SQL calls.
This command is not deleting backups:
EXEC xp_delete_file 0,N'F:\path\cms',N'*.bak',N'2014-01-30T21:08:04'
Also tried
EXEC xp_delete_file 0,N'F:\path\cms',N'bak',N'2014-01-30T21:08:04'
and
EXEC xp_delete_file 0,N'F:\path\cms',N'.bak',N'2014-01-30T21:08:04'
SQL Server Agent has permissions on the folder.
Did you try:
EXEC xp_delete_file 0,N'F:\path\cms\',N'bak',N'2014-01-30T21:08:04';
--- this slash may be important ---^
That said, you should simply not be using this stored procedure to clean up your backup folders. It is undocumented and unsupported. Take a look at this Connect item, which complains about exactly the same problem, seven years ago to the day. Note that it is closed as "won't fix" and of particular interest should be this official statement from Terrence Nevins of Microsoft:
The stored procedure was never intended to be called by an end user and the chances that you could ever be successful in having it do what you intend are almost zero. I took a quick peek at the implementation we are using and it takes some very specialized arguments for a very specific task.
If you determine that you do need to access the file system directly from a stored proc, then I would imagine you would need to write that yourself in .NET. Or perhaps there is already a vendor that provides this.
We don't document this XP or promote its usage by anyone for very good reasons. Most of all, they can 'go away' from release to release.
Solved: Both the SQL Server Agent and SQL Server service accounts need read/write/delete permissions on the backup folder.
The path name must end with a \ and the extension must not contain the dot. Then it will work.
Make sure that you have "Full Control" on the directory where your backups are. Unfortunately, running xp_delete_file doesn't return an error if you don't have the correct privileges, nor do you see anything in your SQL Server Agent log files.
I have a SQL Server 2005 database named mydb, which is being accessed by 7 applications from different locations.
I have created a copy of it named mydbNew and tuned it by applying primary keys and indexes and changing queries in stored procedures.
Now I want to replace the old db "mydb" with the new db "mydbnew".
Please tell me the best approach to do it. I thought of making changes in web.config, but not all the applications accessing the database are accessible to me, so I can't go that route.
Please give me an expert opinion, so that I can replace the database in minimum time without affecting the other db and all its applications.
By "replace the old db with the new db" I mean that I want to rename the old db "mydb" to "mydbold" and then rename my new db "mydbnew" to "mydb".
Thanks
Your plan will work, but it does carry a high risk, especially since I'm assuming this is a system that has users actively changing data, which means your copy won't have the same level of updated content in it unless you do a cut right before go-live. Your best bet is to migrate your changes carefully into the live system during a low-traffic / maintenance period and extensively test it once you're done. Prior to doing this, or the method you mentioned previously, back up everything.
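If you do go the rename route anyway, a minimal sketch of the swap (run it during the maintenance window; the database names are taken from your question):

USE master;
-- Kick active connections off the old database, then swap the names:
ALTER DATABASE mydb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
ALTER DATABASE mydb MODIFY NAME = mydbold;
ALTER DATABASE mydbnew MODIFY NAME = mydb;
-- The renamed old copy is still in single-user mode; reopen it if you need it:
ALTER DATABASE mydbold SET MULTI_USER;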
All of the changes you described above can be made to an online database without the need to actually bring it down. However, some of those activities will change the way in which the data is affected by certain actions (changes to stored procs), which means that during the transition the behaviour of the system or systems may be unpredictable. You should therefore either complete this update at a low point in day-to-day operations or take it down for a maintenance window.
SQL Server comes with a feature to generate a script file from your database; you can also do this manually by clicking on the object you want to script and selecting the Script -> CREATE option. Depending on the amount of changes you have to make, it may be worthwhile to script your whole new database (by clicking on the new database and selecting Tasks -> Generate Scripts... and selecting the items needed).
If you want to just script out the new things you need to add individually, then you simply click on the object you want to script, select Script <object> as ->, then select DROP and CREATE To if you want to replace the original version (like replacing a stored proc), or CREATE To if you're adding new stuff.
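For example, a drop-and-create script for a stored procedure typically looks like this (usp_GetOrders and its body are hypothetical):

IF OBJECT_ID(N'dbo.usp_GetOrders', N'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_GetOrders;
GO
CREATE PROCEDURE dbo.usp_GetOrders
AS
BEGIN
    SELECT OrderID, OrderDate FROM dbo.Orders; -- hypothetical body
END
GO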
Once you have all the things you want to add/update as a script, you're then ready to execute it against the live database. This would be the part where you back up everything. Once you're happy everything is backed up and the system is in a maintenance or low-traffic period, you execute the script. There may be a few problems when you do this, and you will need to fix them as quickly as possible (usually mostly just 'already exists' errors; that's why drop-and-create scripts are good). If anything goes really wrong, restore your backups and try again (after figuring out what happened and how to fix it).
Make no mistake, if you have a lot of changes to make this could be a long process, or it could take mere minutes; you just need to adapt if things go wrong and be sure to cover yourself with backups/extensive prayer. Good luck!
Part of the setup routine for the product I'm working on installs a database update utility. The utility checks the current version of the user's database and (if necessary) executes a series of SQL statements that upgrade the database to the current version.
Two key features of this routine:
Once initiated, it runs without user interaction
SQL operations preserve the integrity of the user's data
The goal is to keep the setup/database routine as simple as possible for the end user (the target audience is non-technical). However, I find that in some cases, these two features are at odds. For example, I want to add a unique index to one of my tables - yet it's possible that existing data already breaks this rule. I could:
Silently choose what's "right" for the user and discard (or archive) data; or
Ask the user to understand what a unique index is and get them to choose what data goes where
Neither option sounds appealing to me. I could compromise and not create a unique index at all, but that would suck. I wonder what others do in this situation?
Check out SQL Packager from Red-Gate. I have not personally used it, but these guys make good tools overall, and this seems to do what you're looking for. It lets you modify the script to customize the install:
http://www.red-gate.com/products/SQL_Packager/index.htm
You never throw a user's data out. One possible option is to try to create the unique index. If the index creation fails, let them know it failed, tell them what they need to research, and provide them a script they can run if they find they have a data error that they choose to fix up.
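For example, the script you hand them could simply list the rows that block the index (table and column names are hypothetical):

SELECT SomeColumn, COUNT(*) AS duplicate_count
FROM dbo.SomeTable
GROUP BY SomeColumn
HAVING COUNT(*) > 1;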