Exporting query results to a file on the fly - sql

I need to export the results of a query to a csv file and put the file on a network shared folder.
Is it possible to achieve this within a stored procedure?
If yes, there is a further constraint: can I achieve this without sysadmin privileges, i.e. without using xp_cmdshell + the BCP utility?
If the answer to 2 is no, does the caller have to have sysadmin privileges, or would it suffice if the SP owner has sysadmin privileges?
Here are some more details to the problem: the SP must export and transfer the file on the fly and raise an error if something goes wrong. The caller must get a response immediately, i.e. if there is no error, he can assume the results were successfully transferred to the folder. Therefore, a DTS/SSIS job that runs every N minutes is not an option. I know the problem smells like I will have to do this at the application level, but I would be more than happy if all of that could be done from T-SQL.

It seems to me that you are not really looking for SQL code as an answer to your question. The main aspect of your question is security: what should you do to implement your requirement without sysadmin privileges and without opening a new security hole? That, I think, is your real question.
I see at least 3 ways to solve your problem. But first of all, a short explanation of why sysadmin privileges appear in all solutions based on extended stored procedures. Extended stored procedures like xp_cmdshell are very old. They existed at least as far back as SQL Server 4.2, the first Microsoft SQL Server running on the first Windows NT (NT 3.1). Old versions of SQL Server had no security restriction on executing such procedures; the restriction was added later. It is important to understand that any general-purpose procedure which allows starting an arbitrary process under the SQL Server account, like xp_cmdshell and sp_OACreate, must carry the sysadmin restriction. Only a task-oriented procedure with a clear, narrow area of usage and role-based permissions can solve the problem without a security hole. So here are the 3 solution ways which I promised before:
You create a new SQL account on your SQL Server with sysadmin privileges. Then you create a stored procedure which uses one of the extended stored procedures like xp_cmdshell or sp_OACreate and technically implements your requirement (exporting some information into a CSV file). Using the EXECUTE AS clause (see http://msdn.microsoft.com/en-us/library/ms188354.aspx), you configure the created stored procedure so that it runs under the account with sysadmin privileges. You delegate execution of this procedure to users through a SQL role, to stay flexible when granting the permission.
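A minimal sketch of that first option, with made-up object names (and note that for the impersonated database user to carry server-level rights such as xp_cmdshell, the database generally has to be TRUSTWORTHY or the procedure must be signed with a certificate):

USE ExportDb;
GO
-- Hypothetical export procedure; 'ExportUser' is a database user mapped to a sysadmin login
CREATE PROCEDURE dbo.usp_ExportToCsv
    @Query  nvarchar(4000),
    @Target nvarchar(260)
WITH EXECUTE AS 'ExportUser'
AS
BEGIN
    DECLARE @cmd varchar(8000);
    -- Build a bcp command line and shell out to it; just a sketch, no error handling
    SET @cmd = 'bcp "' + @Query + '" queryout "' + @Target + '" -c -t, -T';
    EXEC master..xp_cmdshell @cmd;
END
GO
-- Delegate through a role rather than to individual users
CREATE ROLE ExportUsers;
GRANT EXECUTE ON dbo.usp_ExportToCsv TO ExportUsers;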
You can use a CLR stored procedure instead of xp_cmdshell and sp_OACreate. You should also use role-based permissions on the procedure you create.
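The T-SQL side of the CLR option could look roughly like this (assembly, class, and method names are invented; the C# assembly itself would do the file I/O and must be granted EXTERNAL_ACCESS, which in turn requires a TRUSTWORTHY database or a signed assembly):

-- CLR integration must be enabled once per instance
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO
-- Register a hypothetical assembly that writes CSV files to a share
CREATE ASSEMBLY ExportHelpers
FROM 'C:\deploy\ExportHelpers.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO
CREATE PROCEDURE dbo.usp_ExportToCsvClr
    @Query  nvarchar(4000),
    @Target nvarchar(260)
AS EXTERNAL NAME ExportHelpers.[ExportHelpers.Procedures].ExportToCsv;
GO
GRANT EXECUTE ON dbo.usp_ExportToCsvClr TO ExportUsers;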
The end user doesn't call any stored procedure you create directly. Instead, a piece of software (like a WCF service or a web site) calls your stored procedure. You can implement the export to a CSV file inside that software rather than inside any stored procedure.
In every implementation you should decide exactly where you will hold the password of the account with which you access the file system. There are different options, all with corresponding advantages and disadvantages. It is also possible to use impersonation to access the file system under the end user's account. The best way depends on the situation in your environment.

You can build a SQL Agent job and kick it off via system SPs from a trigger or SP. The job can call SSIS or bulk dump scripts... returning an instant error message may be an issue though.
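For instance, a trigger or procedure can start a pre-built job like this (the job name is hypothetical); note that sp_start_job returns as soon as the job is queued, so it cannot report the job's eventual success or failure back to the caller:

EXEC msdb.dbo.sp_start_job @job_name = N'ExportOrdersToCsv';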
In general, it's quite an unusual requirement - what are you trying to accomplish?
UPDATE:
After some more thinking - this is a design issue and I have not been able to find a solution simply by using SQL Server SP's.
In the past, this is what I did:
on the app level - implement an async process where the user pushes a button requesting a file download; the app accepts the request and lets the user go
the user can check the status via a status page, or will get an email when it's done or an error occurred
in the meantime the application layer kicks off either an SSIS package or a SQL Agent job
if parameters are needed - design and implement a special table, JOB_PARAMETERS, where you would put the parameters (a rough sketch follows below)
you would also need to create more tables to manage the jobs, store job status, and communicate with the application layer
you may want to use SQL Server Service Broker on the DB level
you may want to use MSMQ on the app level
This is not easy, but this is the most efficient way to export data, where it goes straight from the DB to a file, without traveling to the app server and the user's PC via the browser.
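A rough sketch of the hand-off tables mentioned above (all table and column names are made up):

-- Hypothetical job queue shared between the app layer and the export job
CREATE TABLE dbo.JOB_REQUEST (
    JobId       int IDENTITY(1,1) PRIMARY KEY,
    RequestedBy nvarchar(128)  NOT NULL,
    Status      varchar(20)    NOT NULL DEFAULT 'QUEUED',  -- QUEUED / RUNNING / DONE / ERROR
    Message     nvarchar(4000) NULL,
    RequestedAt datetime       NOT NULL DEFAULT GETDATE()
);

CREATE TABLE dbo.JOB_PARAMETERS (
    JobId      int            NOT NULL REFERENCES dbo.JOB_REQUEST(JobId),
    ParamName  nvarchar(128)  NOT NULL,
    ParamValue nvarchar(4000) NULL
);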

Can you use OLE Automation? It's ugly, and you could probably use some set-based string-building techniques instead of the cursor, but here goes...
Declare @Dir varchar(4000)
Set @Dir = 'c:\some\path\accessible\to\SQLServer'
If @Dir IS NULL
Begin
    print 'dir is null.'
    Return 1
End

declare
    @FilePath as varchar(255),
    @DataToWrite as varchar(8000)

If right(@Dir, 1) <> '\'
    Set @Dir = @Dir + '\'
Set @FilePath = @Dir + 'filename.csv'

DECLARE @RetCode int, @FileSystem int, @FileHandle int

-- Create a Scripting.FileSystemObject instance via OLE Automation
EXECUTE @RetCode = sp_OACreate 'Scripting.FileSystemObject', @FileSystem OUTPUT
IF (@@ERROR | @RetCode > 0 Or @FileSystem < 0)
Begin
    RAISERROR ('could not create FileSystemObject', 16, 1)
End

declare @FileExists int
Execute @RetCode = sp_OAMethod @FileSystem, 'FileExists', @FileExists OUTPUT, @FilePath
--print '@FileExists = ' + cast(@FileExists as varchar)
If @FileExists = 0
Begin
    RAISERROR ('file does not exist', 16, 1)
    /*return 1*/
End

-- OpenTextFile mode: 1 = for reading, 2 = for writing (overwrites contents), 8 = for appending
EXECUTE @RetCode = sp_OAMethod @FileSystem, 'OpenTextFile', @FileHandle OUTPUT, @FilePath, 8, 1
IF (@@ERROR | @RetCode > 0 Or @FileHandle < 0)
Begin
    RAISERROR ('could not open text file', 16, 1)
End

DECLARE CSV CURSOR
READ_ONLY
FOR
-- placeholder query; it must return two columns to match the FETCH below
Select fld1, fld2 From MyDataTable
order by fld1

DECLARE @fld1 nvarchar(50),
        @fld2 nvarchar(50)

OPEN CSV
FETCH NEXT FROM CSV INTO @fld1, @fld2
WHILE (@@fetch_status <> -1)
BEGIN
    IF (@@fetch_status <> -2)
    BEGIN
        -- Write one comma-separated line followed by CR/LF
        Set @DataToWrite = @fld1 + ',' + @fld2 + char(13) + char(10)
        EXECUTE @RetCode = sp_OAMethod @FileHandle, 'Write', NULL, @DataToWrite
        IF (@@ERROR | @RetCode > 0)
        Begin
            RAISERROR ('could not write to file', 16, 1)
        End
    END
    FETCH NEXT FROM CSV INTO @fld1, @fld2
END
CLOSE CSV
DEALLOCATE CSV

EXECUTE @RetCode = sp_OAMethod @FileHandle, 'Close', NULL
IF (@@ERROR | @RetCode > 0)
    RAISERROR ('Could not close file', 16, 1)

EXEC sp_OADestroy @FileSystem
return 0

Generally, no, this kind of work can't be done without a lot of fuss and effort and sysadmin rights.
SQL Server is a database engine, focused on database problems, and so, quite rightly, it has very poor file manipulation tools. Work-arounds include:
xp_cmdshell is the tool of choice for file manipulations.
I like the sp_OA* solution myself, 'cause it gives me flashbacks to SQL 7.0. But using those functions always made me nervous.
You might be able to do something with OPENROWSET, where the target of an insert is a file defined with this function. Sounds unlikely, might be worth a look.
Similarly, a linked server definition might be used as a target for inserts or select...into... statements.
Security seems to be your showstopper. By and large, when SQL Server shells out to the OS, it has all the rights of the NT account under which the SQL Server service started up; if you want to limit network access, configure that account carefully (and never make it a domain admin!)
It is possible to call xp_cmdshell as a user without sysadmin rights, and to configure these calls to not have the same access rights as the SQL Service NT account. As per BOL (SQL 2005 and up):
xp_cmdshell Proxy Account
When it is called by a user that is not a member of the sysadmin fixed server role, xp_cmdshell connects to Windows by using the account name and password stored in the credential named ##xp_cmdshell_proxy_account##. If this proxy credential does not exist, xp_cmdshell will fail.
The proxy account credential can be created by executing sp_xp_cmdshell_proxy_account. As arguments, this stored procedure takes a Windows user name and password. For example, the following command creates a proxy credential for Windows domain user SHIPPING\KobeR that has the Windows password sdfh%dkc93vcMt0.
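EXEC sp_xp_cmdshell_proxy_account 'SHIPPING\KobeR', 'sdfh%dkc93vcMt0';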
So your user logs in with whatever user rights (not sysadmin!) and executes the stored procedure, which calls xp_cmdshell, which will "pick up" whatever proxy rights have been configured. Again, awkward, but it sounds like it'd do what you'd want it to do. (A possible limiting factor is that you only get the one proxy account, so it has to fit all possible needs.)
Honestly, it sounds to me like the best solution would be to:
Identify the source of the call to the stored procedure,
Have the procedure return the data to be written to the file (you can do all your formatting and layout in the procedure if need be), and
Have the calling routine manage all the file preparation steps (which could be as simple as redirecting data returned from SQL into an opened file)
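For example, if the calling routine is a batch script or scheduled task, something along these lines (server, database, query, and share are all placeholders) keeps the file handling entirely on the client side:

bcp "SELECT col1, col2 FROM MyDb.dbo.MyView" queryout "\\server\share\results.csv" -c -t, -S <servername> -T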
So, what does launch the call to the stored procedure?

Related

Calling xp_cmdshell from a Stored Procedure

As a proof of concept we're trying to insert an xp_cmdshell command into an existing solution. Currently an application invokes a stored procedure on our database server which when profiled looks like:
declare @P1 int
set @P1=1
exec Name_Of_The_SP @param1 = 3, @param2 = 'blah', @parametc = 'blahetc', @ID = @P1 output
select @P1
The SP essentially opens a transaction, inserts a row, and then commits. Inside this we added:
exec master..xp_cmdshell 'dir > c:\test.txt'
When we then run the first block of code in a SSMS query window the file is generated on the server as expected. But when we use the application to invoke it then the rows are inserted as normal but the file isn't generated?
The SQL Server and SQL Agent users are local admins and sysadmins, so I can't see any issues there. I tried making the application user a local admin as well, to no avail; it was already a sysadmin.
This is SQL Server 2000
We managed to figure this out - we (I) had overlooked in Profiler that the exec was coming in under a different login. Granting execute permission on master.dbo.xp_cmdshell specifically got it working. Apologies to anyone who spent any time/effort on this!
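For anyone who hits the same thing, the grant looks roughly like this (the user name is hypothetical; the login needs a corresponding user in the master database to receive the grant):

USE master;
GO
GRANT EXECUTE ON dbo.xp_cmdshell TO [AppUser];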

Alter or Create multiple stored procedures at once from multiple files in SQL Server 2008

I have a large number of stored procedures that I update often and then transfer to a duplicate database on another server. I have been opening each "storedproc.sql" file from within SQL Server Management Studio 2008 and then selecting Execute in the toolbar, which will either create or alter an existing stored procedure. I have been doing this for each stored procedure.
I am looking for a script (or another way) that will allow me to alter all of the stored procedures on the database at one time with the ones that are located in a folder. I am basically looking for a script that will do something similar to the pseudo-code-like text below.
USE [DatabaseName]
UPDATE [StoredProcName]
    USING [directory\file\path\fileName.sql]
UPDATE [StoredProcNameN]
    USING [directory\file\path\fileNameN.sql]
…
Not the cleanest pseudo-code but hopefully you understand the idea. I would even be willing to drop all of the stored procedures (based on name) and then create the same stored procedures again on the database. If you need more clarity don’t hesitate to comment, I thank you in advance.
To further explain:
I am changing every reporting stored procedure for an SSRS conversion project. Once the report is developed, I move the report and the stored procedure to a server. I then have to manually run (ALTER or CREATE) each stored procedure against the duplicated database so the database will now be able to support the report on the server. So far this has not been too much trouble, but I will eventually have 65 to 85 stored procedures; and if I have to add one dataset field to each one, then I will have to run each one manually to update the duplicate database.
What I want to be able to do is have a SQL script that says: For this database, ALTER/CREATE this named stored procedure and you can find that .sql text file with the details in this folder.
Here is some code that I use to move all stored procedures from one database to another:
DECLARE @SPBody nvarchar(max);
DECLARE @SPName nvarchar(4000);
DECLARE @query nvarchar(max);
DECLARE @ErrMsg nvarchar(max);
DECLARE @SPCursor CURSOR;

SET @SPCursor = CURSOR FOR
    SELECT ao.name, sm.definition
    FROM <SOURCE DATABASE>.sys.all_objects ao JOIN
         <SOURCE DATABASE>.sys.sql_modules sm
         ON sm.object_id = ao.object_id
    WHERE ao.type = 'P' and SCHEMA_NAME(ao.schema_id) = 'dbo'
    ORDER BY 1;

OPEN @SPCursor;
FETCH NEXT FROM @SPCursor INTO @SPName, @SPBody;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Drop the procedure in the destination database if it already exists
    IF EXISTS (SELECT * FROM <DESTINATION DATABASE>.INFORMATION_SCHEMA.ROUTINES r WHERE r.ROUTINE_NAME = @SPName)
    BEGIN
        SET @query = N'DROP PROCEDURE ' + @SPName;
        EXEC <DESTINATION DATABASE>..sp_executesql @query;
    END;

    -- Re-create the procedure from its source definition
    BEGIN TRY
        EXEC <DESTINATION DATABASE>..sp_executesql @SPBody;
    END TRY
    BEGIN CATCH
        SELECT @ErrMsg = 'Error creating ' + @SPName + ': "' + Error_Message() + '" (' + @SPBody + ')';
        --exec sp__LogInfo @ProcName, @ErrMsg, @ProductionRunId;
    END CATCH;

    FETCH NEXT FROM @SPCursor INTO @SPName, @SPBody;
END;
You need to put in <SOURCE DATABASE> and <DESTINATION DATABASE> as appropriate.
For Reference
c:\>for %f in (*.sql) do sqlcmd /S <servername> /d <dbname> /E /i "%f"
I recommend saving all your stored procedure script files starting with if exists(...) drop procedure followed by the create procedure section. Optionally include a go statement at the end depending on your needs.
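For instance, each script file might start with something along these lines (the procedure name is just an example):

IF EXISTS (SELECT * FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.usp_MyReport') AND type = 'P')
    DROP PROCEDURE dbo.usp_MyReport;
GO
CREATE PROCEDURE dbo.usp_MyReport
    @FiscalYear nvarchar(4)
AS
BEGIN
    SELECT 1;  -- real body goes here
END
GO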
You can then use a tool to concatenate all the files together into a single script file.
I use a custom tool for this that allows me to define dependency order, specify batch separators, script types, include folders, etc. Some text editors, such as UltraEdit have this capability.
You can also use the Microsoft Database Project to select batches of script files, and execute them against one or more database connections stored in the project. This is a good starting place that doesn't require any extra software, but can be a bit of a pain regarding adding and managing folders and files within the project.
Using a schema comparison tool such as RedGate's SQL Compare can be useful to synchronize the schema and/or objects of two databases. I don't recommend using this as a best practice deployment or "promote to production" tool though.

Stored procedure will not display in object explorer?

USE [MASTER]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
USE [MASTER]
GO
CREATE PROCEDURE [dbo].[TOTALLY_NEW] @FISCAL_YEAR NVARCHAR(4) AS
BEGIN
PRINT 'HERE'
END
GO
select * from master..sysobjects
where name like 'tot%'   -- <-- returns one row!!!!!!
I've refreshed this list a dozen times..!!
I've tried disconnecting and reconnecting..
I've created all those other SP's listed in the image before.
Here is a picture with more.
Ensure that the user you are using has permissions to view stored procedures. I am not 100% sure which permission this is on SQL Server, but I have seen this problem on a few other databases where one user creates an SP but another user does not have permission to view or list the SPs.
Per request, converting comment to answer:
Yes, you shouldn't be creating user objects in master. The only time I ever do it is when I explicitly want to create a utility procedure that I can call from any database using that database's context, which you have to do on purpose and doesn't happen by accident - so I suspect you inadvertently marked your object as a system procedure. You do this using EXEC sp_MS_marksystemobject (or in older versions by first running EXEC sp_MS_upd_sysobj_category 1 - the latter might work in 2005 with 80 compatibility, not sure).
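You can check whether the procedure ended up flagged that way with something like this (the object name is taken from the question):

-- 1 means the object is marked as a system (Microsoft-shipped) object
SELECT OBJECTPROPERTY(OBJECT_ID('dbo.TOTALLY_NEW'), 'IsMSShipped') AS IsMSShipped;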

Export view data programmatically in Access/SQL Server

We have an Access application front-end connected to a SQL Server 2000 database. We would like to be able to programmatically export the results of some views to whatever format we can (ideally Excel, but CSV / tab delimited is fine). Up until now we've just hit F11, opened up the view, and hit File->Save As, but we're starting to get results with more than 16,000 results, which can't be exported.
I'd like some sort of server side stored procedure we can trigger that will do this. I'm aware of the sp_makewebtask procedure that does this, however it requires administrative rights on the server, and for obvious reasons we can't give that to everyone.
Any ideas?
You might not be able to give everyone administrative rights, but maybe you could:
Create a special user, e.g. 'WebTaskUser', with permissions to read from the desired views and to execute the sp_makewebtask stored procedure. You would be giving permissions to one user - not to everyone.
Then create a wrapper stored procedure (below) that your users are allowed to execute; internally it makes specific, predefined calls to sp_makewebtask, one per view. Execute permission on sp_makewebtask is granted only to that one account - not all administrative permissions, only execute, only to one account :-).
Test and refine the stored procedure to your liking from SSMS, Access VBA, or whatever suits you best.
When you're happy with the proc, grant any further necessary permissions to your users or user roles, so they can execute it as well.
--Example code to create user, and add permissions I might be able to add later
USE [some_database_x];
GO
CREATE PROCEDURE EXPORT_VIEWS_TO_EXCEL
    @TARGET_FOLDER NVARCHAR(100) = 'C:\temp',
    @FILE_TAG NVARCHAR(20) = '',
    @VIEWNAME NVARCHAR(100) = NULL
WITH EXECUTE AS 'WebTaskUser'
AS
BEGIN
    IF @VIEWNAME IS NOT NULL
    BEGIN
        DECLARE @myOUTPUTFILE NVARCHAR(100);
        SET @myOUTPUTFILE = @TARGET_FOLDER + '\' + @VIEWNAME + COALESCE(@FILE_TAG, '');

        DECLARE @myQUERY NVARCHAR(150);
        IF @VIEWNAME = 'mydb.dbo.firstview'
        BEGIN
            SET @myQUERY = 'Select * from mydb.dbo.firstview';
        END
        IF @VIEWNAME = 'mydb.dbo.secondview'
        BEGIN
            SET @myQUERY = 'Select * from mydb.dbo.secondview';
        END

        EXECUTE sp_makewebtask
            @outputfile = @myOUTPUTFILE,
            @query = @myQUERY,
            @colheaders = 1, @FixedFont = 0, @lastupdated = 0, @resultstitle = 'My Title';
    END
    RETURN 0;
END
GO
If you wanted to do everything in Access, you could link the view as a linked table and then use the TransferSpreadsheet method to export that "table" as a CSV file.
EDIT:
As you want to do it server side check this out
http://www.mssqltips.com/tip.asp?tip=1633
I have used this before and it worked just fine
You may want to look at SSIS - it allows creating server side packages to export data on server side.
Another option is to right-click your database and run through data export wizard (which is using SSIS underneath).
Yet another option is to create a command-line call (SQLCMD) to export the data into a flat file.
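A rough sqlcmd example of that last option (server, database, query, and output path are all placeholders):

sqlcmd -S <servername> -d <dbname> -E -W -s"," -Q "SET NOCOUNT ON; SELECT * FROM dbo.MyView" -o "C:\exports\MyView.csv"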
Have you used VB or Macros?
Create a local table that looks like the view structure
create a query that deletes the contents of the table
create a query that inserts the contents of the view to the local table
use the "analyze with excel" feature or one of the built in export features
Create a macro (or VBA) that runs the first two queries and the export with a single click
I just tried it with 26k rows and it worked without a problem
HTH

User Granted Access to Stored Procedure but Can't Run Query

I am working on a product that runs a SQL Server instance which allows some applications to log in, and their logins are granted permission to run a stored procedure - AND NOTHING ELSE. The stored procedure is owned by an admin; the stored procedure takes a query and executes it, then the results are returned to the application.
Unfortunately I can't figure out why the application can call the stored procedure to which it's granted access, but the stored procedure cannot execute the SQL statement which was passed into it.
The stored procedure executes the passed in query when I'm logged in as an admin, but when I log in as the limited user it throws an exception in the execute statement.
For example:
EXEC [Admin].[STORED_PROC] @SQL_STATEMENT = 'SELECT * FROM table_x'
the STORED_PROC looks something like this:
BEGIN TRY
EXEC (@SQL_STATEMENT)
END TRY
BEGIN CATCH
-- some logging when an exception is caught, and the exception is caught here!!!
END CATCH
There is nothing inside the try/catch statement except that EXEC... and the SQL_STATEMENT works when I'm logged in as the Admin, but not when I'm logged in as the User.
Can anybody help me figure out what permissions I need to set in order to allow the User to run queries through the stored proc only?
So there have been some comments about allowing raw SQL statements to be executed via stored proc defeats the purpose of using a stored proc... but in reality what we're actually doing is we're passing an encrypted SQL statement into the stored proc and the stored proc gets the statement decrypted and THEN it executes it.
So yes, in reality raw SQL statements are not secure and they defeat the purpose of stored procs, but I don't know how to encrypt SQL queries that are passed through ODBC and run against a pre-2005 SQL Server.
In any case, I tried to put up some minimal safeguards to at least have some basic security.
Since you are using dynamic sql, SQL server can't tell which tables you are using, so you have to grant SELECT rights to all the tables as well
Users also need to have SELECT grant on the tables
Allowing raw SQL to be passed into a stored procedure and then executing is the very essence of data insecurity.
SQL Server security is structured so that arbitrary bits of SQL execute in their own security context. If you don't have the permission to run the query ad hoc, you also don't have the permission to run it through a stored procedure. In this, SQL Server is saving you from yourself.
Since your system allows access to stored procs and nothing else (which is good for security purposes and should not be changed) then you simply cannot under any circumstances use dynamic SQL because the rights are not at the table level and your dbas are unlikely to change that. This is not only to prevent SQL Injection attacks but to prevent possible internal fraud so any workplace which has considered this important will not be willing to compromise to make life easier for you. You simply need to redesign to never do anything dynamically. You have no other choice. If you write the procs to do what you want it to do in the first place, there is no need to send encypted sql.
When dynamic SQL is used through EXEC or sp_executesql within an SP, the EXEC permissions on the SP do not allow you to run arbitrary code in the dynamic sql. You either need to grant SELECT (yuck), or you might be able to impersonate another user using EXECUTE AS or SETUSER.
When normal SQL is used, EXEC permissions work fine, overriding ungranted SELECT permissions. If you have DENY, though, I believe that trumps it.
Having said that, I'm still not sure you should use EXECUTE AS when the source of the SQL is outside the SP (or outside the database). For code-generation or dynamic sql which is safe from outside influence, EXECUTE AS can be a useful tool
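A hedged sketch of that EXECUTE AS idea, reusing the names from the question (the 'ReportReader' user is invented and would need SELECT rights on the queried tables; EXECUTE AS needs SQL Server 2005 or later, and on older versions SETUSER is the rough equivalent):

CREATE PROCEDURE [Admin].[STORED_PROC]
    @SQL_STATEMENT nvarchar(4000)
WITH EXECUTE AS 'ReportReader'
AS
BEGIN
    BEGIN TRY
        -- The dynamic statement now runs in ReportReader's security context
        EXEC (@SQL_STATEMENT)
    END TRY
    BEGIN CATCH
        -- logging goes here
    END CATCH
END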
This is most likely because of different schemas i.e. the user who logs in is not part of the Admin schema, or at least I would hope not.
The security technique that permits the type of access you are looking to achieve, i.e. to permit access to objects that are owned by the same schema, is called Ownership Chaining.
This principle is not best explained in a post.
Here is a link from Microsoft that explains the concept.
http://msdn.microsoft.com/en-us/library/ms188676(SQL.90).aspx
Here is an outstanding article on security that provides examples and walkthroughs for ownership chaining, amongst other techniques.
http://www.sommarskog.se/grantperm.html
I hope this is clear and assists you but please feel free to pose further questions.
Cheers, John