Find which application is connected with which login, to which database, table, and columns - SQL

Is there a script that reports the current activity
at the application -> login -> database -> table -> column level?
I have tried:
sp_who2, sp_who2 'Active', sysprocesses
Activity Monitor
Audit
Profiler
Triggers
Extended Events
None of these gave me column-level connection data. I was able to get the SQL statements, table name, database, instance, application, and login name, but I could not get the column names.
The reason I am trying to find this is to track all usage and re-architect the database.
Any help is appreciated.

sp_who2 and sp_who are the ones I have also used to get this kind of information. You can also check sys.sysprocesses to see which processes are running on an instance of SQL Server.
If you want the columns involved in the queries, then consider using SQL Server tracing.
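Column-level usage is not exposed directly, but as a starting point, here is a minimal sketch (SQL Server 2005+, using the DMVs rather than sysprocesses; not from the answer above) that ties application, login, database, and the currently running statement together. Extracting column names from the statement text, or capturing them with a server-side trace, is still a separate step.
-- Current requests with application, login, host, database, and statement text
SELECT
    s.session_id,
    s.login_name,
    s.program_name          AS application,
    s.host_name,
    DB_NAME(r.database_id)  AS database_name,
    t.text                  AS running_sql
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_requests AS r
    ON r.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE s.is_user_process = 1;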

Related

In PostgreSQL, how to get all users who are logged in, along with their IP address and query, whenever they access the database?

I am using PostgreSQL. About five people will be using the same database. I want to see who is running which query, both from the HeidiSQL tool and from the web application.
I tried the pg_stat_activity view to get the details, but it returns only one row per IP, containing that machine's query details.
To log who connected, use
https://www.postgresql.org/docs/current/static/runtime-config-logging.html#GUC-LOG-CONNECTIONS
To log which statements were executed, use
https://www.postgresql.org/docs/current/static/runtime-config-logging.html#GUC-LOG-STATEMENT
pg_stat_activity shows currently connected sessions only:
https://www.postgresql.org/docs/current/static/monitoring-stats.html
One row per server process, showing information related to the current
activity of that process, such as state and current query.
https://www.postgresql.org/docs/current/static/monitoring-stats.html#PG-STAT-ACTIVITY-VIEW
You might also be interested in https://github.com/pgaudit/pgaudit and https://www.postgresql.org/docs/current/static/pgstatstatements.html
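The logging settings referenced above go into postgresql.conf (for example log_connections = on and log_statement = 'all'). As a rough illustration of the pg_stat_activity view, a minimal sketch; column names are for PostgreSQL 9.2 and later, older releases use procpid and current_query instead of pid and query:
-- One row per connected backend: who, from where, doing what
SELECT pid,
       usename          AS login,
       datname          AS database,
       client_addr      AS ip_address,
       application_name,
       state,
       query,
       query_start
FROM pg_stat_activity
ORDER BY query_start;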

Updating data in multiple tables across different databases that have the same column name

I have 11 databases, each containing a table with user details (i.e. all employee details). That table has a column "Status" (1 for active, 0 for inactive). I regularly have to update the "Status" value to 0 or 1 for particular employees, and to do that I have to go into each database, find the user table, and run the update. Repeating the same task for every database consumes a lot of time.
If there is a short query or procedure that I could run once to do all the updates at the same time, it would be a great help.
I see a couple of possible options.
You could build an SSIS package to connect to each database and do the necessary updates, provided the criteria for which employees to update, and what to update them to, can be found within the database or in some external source such as a text file.
Alternatively, you could use SQLCMD mode in SQL Server Management Studio, and within your SQL script use the :CONNECT command to switch to each server and database, something like this...
:CONNECT Server1
USE Database1
--put your update SQL script
:CONNECT Server2
USE Database2
--put your update SQL script
...
These links provide some further information on using SQLCMD mode...
Connecting to multiple servers in a Query Window using SQLCMD
SQL Server SQLCMD Basics
Noel
As you mentioned, you have 11 databases.
Problem: first, this is a poor approach to database design.
What really happens: when you use multiple databases and need to check every one of them, the server has to connect to a different database again and again, which takes much more time than switching between tables, because of connection handling.
Solution: in your case, the only real option is to loop over the different databases and run the query against each one (a sketch follows below).
Suggestion: you should keep all the data in one database; you can use an extra column in the tables to associate the rows with the different entities.
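A minimal sketch of that loop, using dynamic SQL, assuming all 11 databases live on the same server. The database list, table name (UserDetails), and column names (EmployeeID, Status) are placeholders, not taken from the question:
-- Run the same UPDATE in every target database on one server
DECLARE @db  sysname,
        @sql nvarchar(max);

DECLARE db_cursor CURSOR FOR
    SELECT name
    FROM sys.databases
    WHERE name IN ('HR_DB01', 'HR_DB02', 'HR_DB03');   -- list all 11 databases here

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'UPDATE ' + QUOTENAME(@db) + N'.dbo.UserDetails
                 SET Status = 0
                 WHERE EmployeeID IN (101, 102);';     -- employees to deactivate
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM db_cursor INTO @db;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;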

The transaction log for database 'tempdb' is full due to 'ACTIVE_TRANSACTION'

I'm running the same query in two different windows on the same server.
The only difference is that the query throwing the above error has indexes on its temporary tables.
The query without indexes on the temporary tables works fine. Please explain how an index could be the reason for this error.
This depends on your query. SQL Server has to maintain indexes during data changes, and that can lead you into different wait events.
Try this: check what exactly is happening to each of your two running sessions during query execution.
You can do this by monitoring wait events, creating a monitoring session for a single SPID.
This is my complete procedure for it: http://zaboilab.com/sql-server-toolbox/monitoring-wait-events-of-a-single-session-or-query
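As a quick alternative to a full monitoring session, a minimal sketch (SQL Server 2005+, not taken from the linked procedure) that shows what a single SPID is currently waiting on; replace 53 with the session ID of the query in question:
-- What is session 53 waiting for right now?
SELECT r.session_id,
       r.status,
       r.wait_type,
       r.wait_time,
       r.last_wait_type,
       r.blocking_session_id,
       wt.resource_description
FROM sys.dm_exec_requests AS r
LEFT JOIN sys.dm_os_waiting_tasks AS wt
    ON wt.session_id = r.session_id
WHERE r.session_id = 53;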

How to determine an Oracle query without access to source code?

We have a system with an Oracle backend to which we have access (though possibly not administrative access) and a front end to which we do not have the source code. The database is quite large and not easily understood - we have no documentation. I'm also not particularly knowledgeable about Oracle in general.
One aspect of the front end queries the database for a particular set of data and displays it. We have a need to determine what query is being made so that we can replicate and automate it without the front end (e.g. by generating a csv file periodically).
What methods would you use to determine the SQL required to retrieve this set of data?
Currently I'm leaning towards the use of an EeePC, Wireshark and a hub (installing Wireshark on the client machines may not be possible), but I'm curious to hear any other ideas and whether anyone can think of any pitfalls with this particular approach.
Clearly there are many methods. The one that I find easiest is:
(1) Connect to the database as SYS or SYSTEM
(2) Query V$SESSION to identify the database session you are interested in.
Record the SID and SERIAL# values (a lookup sketch follows after these steps).
(3) Execute the following commands to activate tracing for the session:
exec sys.dbms_system.set_bool_param_in_session( *sid*, *serial#*, 'timed_statistics', true )
exec sys.dbms_system.set_int_param_in_session( *sid*, *serial#*, 'max_dump_file_size', 2000000000 )
exec sys.dbms_system.set_ev( *sid*, *serial#*, 10046, 5, '' )
(4) Perform some actions in the client app
(5) Either terminate the database session (e.g. by closing the client) or deactivate tracing ( exec sys.dbms_system.set_ev( sid, serial#, 10046, 0, '' ) )
(6) Locate the udump folder on the database server. There will be a trace file for the database session showing the statements executed and the bind values used in each execution.
This method does not require any access to the client machine, which could be a benefit. It does require access to the database server, which may be problematic if you're not the DBA and they don't let you onto the machine. Also, identifying the proper session to trace can be difficult if you have many clients or if the client application opens more than one session.
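For step (2), a minimal V$SESSION lookup might look like the following; the USERNAME and PROGRAM filters are assumptions, so adjust them to whatever identifies your client application:
-- Find the SID and SERIAL# of the session to trace
SELECT sid, serial#, username, osuser, machine, program, status
FROM   v$session
WHERE  username = 'APP_USER'          -- hypothetical application schema
AND    program LIKE '%frontend%';     -- hypothetical client program name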
Start with querying Oracle system views like V$SQL, v$sqlarea and v$sqltext.
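A small sketch of that approach (column names as in Oracle 10g and later); the PARSING_SCHEMA_NAME filter is an assumption, so use whatever identifies your application's schema or module:
-- Recently executed statements, most recent first
SELECT sql_id, parsing_schema_name, module, executions, sql_text, last_active_time
FROM   v$sql
WHERE  parsing_schema_name = 'APP_USER'   -- hypothetical application schema
ORDER  BY last_active_time DESC;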
Which version of Oracle? If it is 10g or later and you have administrative access (SYSDBA), then you can fairly easily find executed queries through Oracle Enterprise Manager.
For older versions, you'll need access to the views that tuinstoel mentioned in his answer.
You can get the same data through TOAD for Oracle, which is quite a capable piece of software, but expensive.
Wireshark is indeed a good idea; it has Oracle support and nicely displays the whole conversation.
A packet sniffer like Wireshark is especially interesting if you don't have admin access to the database server but you do have access to the network (for instance because there is port mirroring on the Ethernet switch).
I have used these instructions successfully several times:
http://www.orafaq.com/wiki/SQL_Trace#Tracing_a_SQL_session
"though possibly not administrative access". Someone should have administrative access, probably whoever is responsible for backups. At the very least, I expect you'd have a user with root/Administrator access to the machine on which the oracle database is running. Administrator should be able to login with a
"SQLPLUS / AS SYSDBA" syntax which will give full access (which can be quite dangerous). root could 'su' to the oracle user and do the same.
If you really can't get admin access then as an alternative to wireshark, if your front-end connects to the database through an Oracle client, look for the file sqlnet.ora. You can set trace_level_client, trace_file_client and trace_directory_client and get it to log the Oracle network traffic between the client and database server.
However it is possible that the client will call a stored procedure and retrieve the data as output parameters or a ref cursor, which means you may not see the query being executed through that mechanism. If so, you will need admin access to the db server, and trace as per Dave Costa's answer.
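A minimal sqlnet.ora sketch for the client-side tracing described above; the trace directory is an example path, and the SUPPORT level is very verbose:
# Client-side Oracle Net tracing (place in the client's sqlnet.ora)
TRACE_LEVEL_CLIENT = SUPPORT
TRACE_FILE_CLIENT = client_trace
TRACE_DIRECTORY_CLIENT = C:\oracle\network\trace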
A quick and dirty way to do this, if you can catch the SQL statement(s) in the act, is to run this in SQL*Plus:
set verify off lines 140 head on pagesize 300
column sql_text format a65
column username format a12
column osuser format a15
break on username on sid on osuser
select s.username, s.sid, s.osuser, t.sql_text
from v$sqltext_with_newlines t, v$session s
where t.address = s.sql_address
and t.hash_value = s.sql_hash_value
order by s.sid, t.piece
/
You need access to those v$ views for this to work. Generally that means connecting as SYSTEM.

What is your FIRST SQL command to run to troubleshoot SQL Server performance?

When SQL Server (2000/2005/2008) is running sluggishly, what is the first command that you run to see where the problem is?
The purpose of this question is that, when all the answers are compiled, other users can benefit by running your command of choice to narrow down where the problem might be.
There are other troubleshooting posts regarding SQL Server performance but they can be useful only for specific cases.
If you roll out and run your own custom SQL script, would you let others know:
what the purpose of the script is
what it returns (return value)
what to do with the output to figure out where the problem is
If you can provide the source for the script, please post it.
In my case:
sp_lock
I run it to figure out whether there are any locks (purpose); it returns SQL Server lock information. Since the result set displays object IDs (and is therefore not very human readable), I usually skim through the result to see whether there are abnormally many locks.
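A small sketch of the same idea with readable names (SQL Server 2005+, using sys.dm_tran_locks instead of sp_lock; note that OBJECT_NAME only resolves IDs in the current database):
-- Object-level locks with database and object names
SELECT request_session_id                         AS spid,
       DB_NAME(resource_database_id)              AS database_name,
       OBJECT_NAME(resource_associated_entity_id) AS object_name,
       resource_type,
       request_mode,
       request_status
FROM sys.dm_tran_locks
WHERE resource_type = 'OBJECT';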
Why run a single query when a picture is worth a thousand words!
I prefer to run the freely available Performance Dashboard Reports.
They provide a complete snapshot overview of your server's performance in seconds. You can then choose a specific area to investigate (locking, currently running queries, wait requests, etc.) simply by clicking the appropriate area on the dashboard.
http://www.microsoft.com/downloads/details.aspx?FamilyId=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en
One slight caveat: I believe these are only available in SQL 2005 and above.
sp_who
http://msdn.microsoft.com/en-us/library/aa260384(SQL.80).aspx
I want to see "who", what machines/users are running what queries, length of time, etc. I can also easily scan for blocks.
If something is blocking a bunch of other transactions I can use the spid to issue a kill command if necessary.
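A quick sketch of that block scan, reading from the same data sp_who uses; the KILL at the end is only an illustration and should be used with care:
-- Sessions that are currently blocked, and who is blocking them
SELECT spid, blocked AS blocking_spid, waittime, lastwaittype, cmd, loginame, hostname
FROM master..sysprocesses
WHERE blocked <> 0;
-- KILL 62;   -- hypothetical blocking spid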
sp_who_3 - Provides a lot of information available elsewhere but in one nice output. Also has several parameters to allow customized output.
A custom query which combines what you would expect in sp_who with DBCC INPUTBUFFER(spid) to get the last query text on each spid ordered by the blocked/blocking graph.
Process data is available via master..sysprocesses.
sp_who3 returns standard sp_who2 output until you specify a specific spid, then gives 6 different recordsets about that spid, including locks, blocks, what it's currently doing, the T-SQL it's running, and the statement within the T-SQL that is currently running.
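A rough sketch of the sp_who-plus-DBCC INPUTBUFFER idea described above. The cursor, temp table names, and the spid > 50 filter for skipping system sessions are illustrative, not the actual sp_who3 source:
-- Capture the last batch for each user spid, then join it back to sysprocesses
CREATE TABLE #InputBuffer (EventType nvarchar(30), Parameters int, EventInfo nvarchar(4000));
CREATE TABLE #LastBatch   (spid int, EventInfo nvarchar(4000));

DECLARE @spid int, @spid_str varchar(10);
DECLARE spid_cursor CURSOR FOR
    SELECT spid FROM master..sysprocesses WHERE spid > 50;

OPEN spid_cursor;
FETCH NEXT FROM spid_cursor INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @spid_str = CAST(@spid AS varchar(10));
    INSERT INTO #InputBuffer
        EXEC ('DBCC INPUTBUFFER(' + @spid_str + ') WITH NO_INFOMSGS');
    INSERT INTO #LastBatch (spid, EventInfo)
        SELECT @spid, EventInfo FROM #InputBuffer;
    TRUNCATE TABLE #InputBuffer;
    FETCH NEXT FROM spid_cursor INTO @spid;
END
CLOSE spid_cursor;
DEALLOCATE spid_cursor;

-- Blocked sessions first, with the last statement each spid ran
SELECT p.spid, p.blocked, p.loginame, p.hostname, b.EventInfo
FROM master..sysprocesses AS p
JOIN #LastBatch AS b ON b.spid = p.spid
ORDER BY p.blocked DESC, p.spid;

DROP TABLE #InputBuffer;
DROP TABLE #LastBatch;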
Ian Stirk has a great script I like to use as detailed in this article: http://msdn2.microsoft.com/en-ca/magazine/cc135978.aspx
In particular, I like the missing indexes one:
SELECT
DatabaseName = DB_NAME(database_id)
,[Number Indexes Missing] = count(*)
FROM sys.dm_db_missing_index_details
GROUP BY DB_NAME(database_id)
ORDER BY 2 DESC;
DBCC OPENTRAN to see what the oldest active transaction is
Displays information about the oldest active transaction and the oldest distributed and nondistributed replicated transactions, if any, within the specified database. Results are displayed only if there is an active transaction or if the database contains replication information. An informational message is displayed if there are no active transactions.
followed by sp_who2
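For example (the database name here is a placeholder):
-- Report the oldest open transaction in a given database, then inspect the sessions
DBCC OPENTRAN ('MyDatabase');
EXEC sp_who2;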
I use queries like this one:
Number of open/active connections in MS SQL Server 2005
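In that spirit, a small sketch that counts open connections per database and login (SQL Server 2000/2005 era, via sysprocesses):
-- Open connections grouped by database and login
SELECT DB_NAME(dbid) AS database_name,
       loginame,
       COUNT(*)      AS open_connections
FROM master..sysprocesses
WHERE dbid > 0
GROUP BY dbid, loginame
ORDER BY open_connections DESC;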