How to query results from another query? - sql

I am writing a Transact-SQL script against an MSSQL 2005 server that is intended to query the file path of each database present. I am able to list the databases present on the system, but how do I run a separate query based on those results?
The following is the output of listing the databases with the command (SELECT name from sys.databases):
name
----
master
tempdb
model
msdb
Now I would like to take these database names (e.g. master, tempdb) and feed them into another query, namely (exec sp_helpdb <database_name>).
Any ideas?

Not answering your question directly, but if you want to run a query for each db, you can use sp_msforeachdb.
sp_msforeachdb 'EXEC sp_helpdb [?]'
Otherwise, you're going to need to use the results to generate your SQL.
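For example, one rough sketch of generating and running the statements from sys.databases (a sketch only, untested):
DECLARE @sql nvarchar(max);
SET @sql = N'';

-- Concatenate one EXEC per database name
SELECT @sql = @sql + N'EXEC sp_helpdb ' + QUOTENAME(name) + N';' + CHAR(10)
FROM sys.databases;

EXEC (@sql);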

You can build a cursor based on that query, then loop through the results, stuffing each name into a SQL variable and using that variable to exec your sproc. That is the way I would approach it.
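A minimal sketch of that approach (using sp_helpdb from the question as the proc to run):
DECLARE @dbname sysname;

DECLARE db_cursor CURSOR FOR
    SELECT name FROM sys.databases;

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @dbname;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_helpdb @dbname;   -- or exec your own sproc with @dbname
    FETCH NEXT FROM db_cursor INTO @dbname;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;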

In general, the answer to your question would be "use a subquery".
But in this case, you're using a SQL Server stored procedure. So the best approach is to write your own stored procedure to:
1) Call sp_helpdb (or select from master..sysdatabases)
2) Iterate through the results
Here's an example:
http://www.mssqltips.com/sqlservertip/1070/simple-script-to-backup-all-sql-server-databases/

If I am understanding correctly, you can use a derived table here:
select d.name   -- your outer query goes here
from (SELECT name from sys.databases) as d


Loop Through All SSMS Databases without Recreating Stored Procedure

Background Information:
In Python, I might write something like this if I want to apply the same logic to different values in a list.
database_list = ["db_1", "db_2", "db_3"]
for x in range(0, len(database_list), 1):
    print("the database name is " + database_list[x])
What I am trying to do:
What I am trying to do in SSMS is pull a list of DB objects for each database. I created a stored procedure that pulls exactly what I want, but I have to run it against each database, so 10 databases means running it 10 times.
My goal is to do this with a T-SQL query instead of Python.
I tried doing something like this:
exec sp_MSforeachdb 'USE ?; EXEC [dbo].[my_stored_procedure]';
The problem with this is, [dbo].[my_stored_procedure] has to exist in every database I want to do this in.
How can I create the stored procedure in 1 database, but execute it for all databases or a list of databases that I choose?
I know what you are trying to do, and if it's what I think (you seem reluctant to actually say!), you can do the following:
In the master database, create your procedure. Normally you wouldn't do this, but in this case you must prefix it with sp_:
use master
go
create procedure sp_testproc as
select top 10 * from sys.tables
go
Now if you run this, it will return tables from the master database.
If you switch context to another database and exec master.dbo.sp_testproc, it will still return tables from the master database.
In master, run
sys.sp_MS_marksystemobject sp_testproc
Now switch context to a different database and exec master.dbo.sp_testproc
It will return tables from the database you are using.
Try creating your sproc in master and naming it with an sp_ prefix:
USE master
GO
CREATE PROCEDURE sp_sproc_name
AS
BEGIN
...
END
GO
-- You *may* need to mark it as a system object
EXEC sys.sp_MS_marksystemobject sp_sproc_name
See: https://nickstips.wordpress.com/2010/10/18/sql-making-a-stored-procedure-available-to-all-databases/
It should then be available in all databases.
Create the stored procedure in the Master database with the sp_ prefix, and use dynamic SQL in the stored procedure so it resolves object names relative to the current database, rather than the database which contains the stored procedure.
EG
use master
go
CREATE OR ALTER PROCEDURE [dbo].[sp_getobjects]
AS
exec ('
select *
from [sys].[objects]
where is_ms_shipped = 0
order by type, name
')
go
use AdventureWorks2017
exec sp_getobjects
#LunchBox - it's your single stored procedure (that you create in one database) that is actually going to need to contain the "exec sp_MSforeachdb ...." command, and instead of the command to be executed being "EXEC [dbo].[my_stored_procedure]", it will need to be the actual SQL that you were going to put into the stored proc.
Eg. (inside your single stored procedure)
EXEC sp_MSforeachdb 'USE ?; SELECT * FROM <table>; UPDATE <another table> SET ...';
Think of the stored procedure (that you put into one database) as being no different than your Python code file - if you had actually wanted to achieve the same thing in Python, you would have either needed to create the stored proc in each database, or build the SQL statement string in Python and execute it against each database.
I understand what you thought you might be able to achieve with SQL, but stored procedures really don't work the way you were expecting. Even when you're in the context of a different database, if you run EXEC <different_db>..stored_proc, that stored proc ends up running in the context of the database in which it exists (not the database you're currently in).
Now, the one issue you may come up against is that the standard sp_MSforeachdb stored proc has a limit of 2000 characters for the command that can be executed (it does have multiple "command" parameters, but that may not be practical if you're planning to run a very large code block, perhaps with variables that carry all the way through). If this might impact what you're intending to do, you could search online for "sp_MSforeachdb alternatives"; there seem to be a handful that people have created where the command parameter can contain a larger string.
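Putting those pieces together, a sketch of what the single utility procedure might look like (all object names here are illustrative, not from the question):
USE master;  -- or whichever single database you keep utility procs in
GO
CREATE PROCEDURE dbo.usp_all_db_objects
AS
BEGIN
    SET NOCOUNT ON;

    CREATE TABLE #results (database_name sysname, object_name sysname, type_desc nvarchar(60));

    -- The temp table above is visible inside the command, because
    -- sp_MSforeachdb executes it in the same session.
    EXEC sp_MSforeachdb N'USE [?];
        INSERT #results (database_name, object_name, type_desc)
        SELECT DB_NAME(), name, type_desc
        FROM sys.objects
        WHERE is_ms_shipped = 0;';

    SELECT * FROM #results ORDER BY database_name, object_name;
END
GO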

In DB2, how to find all the stored procedures containing a given text

I want to find out whether a table is being used anywhere in all the stored procedures in a system.
Is there a query to fetch all the details of the SPs?
You can use SYSCAT.TABDEP and SYSCAT.ROUTINEDEP system catalog views.
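For example, something along these lines should list the routines that depend on a given table (a sketch; verify the catalog column names for your DB2 version):
SELECT r.ROUTINESCHEMA, r.ROUTINENAME
FROM SYSCAT.ROUTINEDEP d
JOIN SYSCAT.ROUTINES r
    ON  r.ROUTINESCHEMA = d.ROUTINESCHEMA
    AND r.SPECIFICNAME  = d.SPECIFICNAME
WHERE d.BTYPE   = 'T'           -- dependency on a table
  AND d.BSCHEMA = 'MYSCHEMA'    -- your table's schema
  AND d.BNAME   = 'MYTABLE';    -- your table's name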
For tables in dynamic SQL statements that are built and executed on the fly, you can use:
select routinename,text from syscat.routines where language='SQL' and locate('<table-name>',text)>0
HTH
Sathyaram
The accepted answer didn't work for me for our particular flavor of DB2, but it set me in the right direction. Here is the query I wrote which allowed me to search sprocs in a given schema:
SELECT ROUTINE_NAME, ROUTINE_DEFINITION FROM sysibm.routines
WHERE SPECIFIC_SCHEMA='<YourSchemaName>'
AND ROUTINE_DEFINITION LIKE '%<YourSearchText>%'
Replace YourSchemaName and YourSearchText with appropriate values.

Do I have to write the "GO" word in order to execute a SQL Server statement?

I have little to no experience with T-SQL and SQL Server - so in MySQL when I want to execute a statement I simply write:
Select * from users
...and then hit ENTER.
However, now I see in many SQL Server tutorials that the GO keyword appears immediately after each statement. Do I have to write this? For example:
Select * from users; GO
Or I can simply write:
Select * from users; <enter key pressed...>
In SQL Server, go separates query batches. It's optional in most situations.
In earlier versions of SQL Server, you had to do a go after altering a table, like:
alter table MyTable add MyColumn int
go
select MyColumn from MyTable
If you didn't, SQL Server would parse the query batch, and complain that MyColumn didn't exist. See MSDN:
SQL Server utilities interpret GO as a signal that they should send the current batch of Transact-SQL statements to an instance of SQL Server. The current batch of statements is composed of all statements entered since the last GO, or since the start of the ad hoc session or script if this is the first GO.
GO separates batches, as Andomar wrote.
Some SQL statements (e.g. CREATE SCHEMA) need to be the first or only statements within a batch. For example, MSDN states
The CREATE PROCEDURE statement cannot be combined with other Transact-SQL statements in a single batch.
Local variables are also limited to a batch, and therefore are not accessible after a GO.
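A quick illustration:
DECLARE @i int;
SET @i = 1;
GO
PRINT @i;   -- fails: @i is no longer in scope after the GO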
GO is optional; there is no need to write it in your SQL statements.
You don't have to. What GO does is send the statements entered so far to the server as one batch (at least in SQL Server).
As the other answerers said before me, you don't really NEED GO.
There is only one case when you have to use it, and that's when you want to create a table or view and then select from it.
For example:
create view MyView as select * from MyTable
go
select * from MyView
Without GO, SQL Server won't execute this, because the SELECT statement is not valid: the view doesn't exist at that point.

Cast Stored Procedure Result as a Table? [duplicate]

This question already has answers here: SQL: how to predicate over stored procedure's result sets? (3 answers)
Closed 6 years ago.
I currently have a stored procedure that runs a complex query and returns a data set. I'd like to cast this data set to a table (on which I can perform further queries) if at all possible. I know I can do this using a table-valued UDF but I'd prefer to avoid that at this point. Is there any way I can accomplish this task?
EDIT: OK... so the SProc I'm using (written by third party and I'm not supposed to change it) runs a fairly complex select statement to return a bunch of line item data about purchase orders. I can recreate it as a UDF but then I'd have to support the UDF and ensure it gets changed as and when our vendor changes their SProc. I'd like to further refine this line item info by a number of criteria such as (but not limited to) item numbers, vendor codes, cost centers, etc. All of this information is brought back by the original SProc and I just need to be able to manipulate it further. My thought process was that if I can somehow treat the results of the SProc as a table (or get them into a table format of some type) then I can run further queries against the original result set to limit by the criteria mentioned above. Please let me know if any further details are needed.
There are various means of sharing data between stored procedures; this link is pretty exhaustive.
But I'm curious why you want a table-valued stored procedure (which doesn't exist in SQL Server) when there are table-valued functions...
Cast Stored Procedure Result as a Table?
Yes and this is used quite often. It simply needs one or more select statements:
Create Procedure #Foo
As
Select object_id, name
From sys.columns
That said, you cannot join to this resultset nor can you easily consume it from another stored proc (although there is a way). Given your edit, it appears the question is whether you can consume the results of a stored proc by another stored proc. Technically, yes. You can populate a temp table with the results of a proc. However, you must declare your temp variable or temp table with the same column structure as is returned by the first resultset of the stored proc.
Declare #Data Table ( object_id int, name nvarchar(128) )
Insert #Data
Exec #Foo
Select *
From #Data
(Or use the far more clever OPENROWSET solution as mentioned by Cade Roux and OMG Ponies)
Have you considered using table-valued parameters? They are new in SQL 2008.
-- Edit --
Nope, never mind, they're only good for passing data into stored procedures.
You could try using a View instead of a Stored Procedure. Store your complex query as part of the view, and you have the functionality to perform more queries on the view.
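For instance, a sketch only (the view, table, and column names below are made up, since the real query belongs to the vendor's proc):
CREATE VIEW dbo.vw_PurchaseOrderLines
AS
    SELECT poh.PONumber,
           pol.ItemNumber,
           pol.VendorCode,
           pol.CostCenter,
           pol.Amount
    FROM dbo.PurchaseOrderHeader AS poh
    JOIN dbo.PurchaseOrderLine AS pol
        ON pol.POID = poh.POID;
GO

-- Then filter it like any other table:
SELECT *
FROM dbo.vw_PurchaseOrderLines
WHERE VendorCode = 'ACME'
  AND CostCenter = '1001';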

SQL - How to insert results of Stored_Proc into a new table without specifying columns of new table?

Using SQL Server 2005, I'd like to run a stored procedure and insert all of the results into a new table.
I'd like the new table to have its columns automatically configured based upon the data returned by the stored procedure.
I am familiar with using the SELECT ... INTO syntax:
SELECT * INTO newtable FROM oldtable
Is this possible?
Edit for clarification: I'm hoping to accomplish something like:
Select * INTO newtable FROM exec My_SP
The only way to do this is with OPENROWSET against the local server:
SELECT * INTO #temp
FROM OPENROWSET (
'SQLOLEDB'
, 'Server=(local);TRUSTED_CONNECTION=YES;'
, 'SET FMTONLY OFF EXEC database.schema.procname'
) a
But this is kind of a last-ditch-gotta-do-it-damn-the-consequences kind of method. It requires elevated permissions, won't work for all procedures, and is generally inefficient.
More info and some alternatives here: http://www.sommarskog.se/share_data.html
This seems like a horrible design. You're really going to create a new table to store the results of a stored procedure, every time the stored procedure is called? And you really can't create the table in advance because you have absolutely no idea what kind of output the stored procedure has? What if the stored procedure returns multiple resultsets? What if it has side effects?
Okay, well, if that's what you really want to do...
One way to accomplish this is to use your local server as a linked server and utilize OPENQUERY. First you need to make sure your local server is configured for data access:
EXEC sp_serveroption 'local server name', 'DATA ACCESS', true;
Then you can do something like this:
SELECT * INTO dbo.newtable
FROM OPENQUERY('local server name', 'EXEC yourdb.dbo.yourproc;');
PS How are you going to write code that is going to perform SELECT INTO into a new table name every time (because you can only do SELECT INTO once)? Dynamic SQL? What happens if two users run this code at the same time? Does one of them win, and the other one just gets an error message?
A variation of the same is:
create table somename
select * from wherever;
(Note that this CREATE TABLE ... SELECT form is MySQL syntax; SQL Server's equivalent for copying a query's results into a new table is SELECT ... INTO, and neither form accepts an EXEC as its source.)
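In plain T-SQL, the nearest equivalent for a procedure's output (with the column list declared up front, which is what the question was hoping to avoid) is a sketch like this, where the table and proc names are just placeholders:
CREATE TABLE dbo.proc_output (col1 int, col2 nvarchar(128));  -- must match the proc's result set
INSERT INTO dbo.proc_output
EXEC dbo.My_SP;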