We have an Access application front-end connected to a SQL Server 2000 database. We would like to be able to programmatically export the results of some views to whatever format we can (ideally Excel, but CSV / tab-delimited is fine). Up until now we've just hit F11, opened up the view, and hit File -> Save As, but we're starting to get result sets with more than 16,000 rows, which can't be exported that way.
I'd like some sort of server side stored procedure we can trigger that will do this. I'm aware of the sp_makewebtask procedure that does this, however it requires administrative rights on the server, and for obvious reasons we can't give that to everyone.
Any ideas?
You might not be able to give everyone administrative rights, but maybe you could:
1. Create a special user, e.g. 'WebTaskUser', with permissions to read from the desired views and to execute the sp_makewebtask stored procedure. You would be giving permissions to one user, not to everyone.
2. Create a wrapper stored procedure (below) that your users are allowed to execute. It contains specific predefined calls to sp_makewebtask, one per view, so only execute permission on that single sp_makewebtask procedure is granted, and only to that one user account. No administrative permissions are granted at all, only execute, and only to one account :-).
3. Test and refine the stored procedure to your liking from SSMS, Access VBA, or whatever suits you best.
4. When you're happy with the proc, grant any further necessary permissions to your users or user roles so they can execute it as well.
--Example code to create the user and grant permissions could be added later
USE [some_database_x];
GO

CREATE PROCEDURE EXPORT_VIEWS_TO_EXCEL
    @TARGET_FOLDER NVARCHAR(100) = 'C:\temp',
    @FILE_TAG      NVARCHAR(20)  = '',
    @VIEWNAME      NVARCHAR(100) = NULL
WITH EXECUTE AS 'WebTaskUser'
AS
BEGIN
    IF @VIEWNAME IS NOT NULL
    BEGIN
        DECLARE @myOUTPUTFILE NVARCHAR(100);
        SET @myOUTPUTFILE = @TARGET_FOLDER + '\' + @VIEWNAME + COALESCE(@FILE_TAG, '');

        DECLARE @myQUERY NVARCHAR(150);
        IF @VIEWNAME = 'mydb.dbo.firstview'
        BEGIN
            SET @myQUERY = 'SELECT * FROM mydb.dbo.firstview';
        END
        IF @VIEWNAME = 'mydb.dbo.secondview'
        BEGIN
            SET @myQUERY = 'SELECT * FROM mydb.dbo.secondview';
        END

        EXECUTE sp_makewebtask
            @outputfile    = @myOUTPUTFILE,
            @query         = @myQUERY,
            @colheaders    = 1,
            @FixedFont     = 0,
            @lastupdated   = 0,
            @resultstitle  = 'My Title';
    END
    RETURN 0;
END
GO
If you wanted to do everything in Access, you could link the view as a linked table and then use the TransferSpreadsheet method to export that "table" (or the TransferText method if you specifically need a CSV file).
EDIT:
As you want to do it server-side, check this out:
http://www.mssqltips.com/tip.asp?tip=1633
I have used this before and it worked just fine
You may want to look at SSIS - it allows creating packages to export data on the server side.
Another option is to right-click your database and run through the data export wizard (which uses SSIS underneath).
Yet another option is to use the command-line utility sqlcmd to export the data into a flat file.
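For example, a rough sqlcmd one-liner along these lines writes a comma-separated file (server, database, view name, and output path are all placeholders to fill in):
sqlcmd -S <servername> -d <dbname> -E -s"," -W -Q "SET NOCOUNT ON; SELECT * FROM dbo.SomeView" -o "C:\temp\SomeView.csv"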
Have you used VBA or macros?
1. Create a local table that looks like the view structure.
2. Create a query that deletes the contents of the table.
3. Create a query that inserts the contents of the view into the local table (see the sketch below).
4. Use the "analyze with Excel" feature or one of the built-in export features.
5. Create a macro (or VBA) that runs the first two queries and the export with a single click.
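A minimal sketch of the two queries from steps 2 and 3, assuming a local table named LocalCopy and a linked view named dbo_MyView (both names are placeholders):

DELETE FROM LocalCopy;
INSERT INTO LocalCopy SELECT * FROM dbo_MyView;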
I just tried it with 26k rows and it worked without a problem
HTH
I have a central management database which collates some information and runs some dynamic SQL for various other tasks when a new database is restored into the environment. One of those tasks is going to be a bit complex to achieve through dynamic SQL so I had the idea of creating a master copy stored procedure in the central DB and copying that over to the new databases after they are restored.
I've seen a few examples of people trying to do that on here but I can't get anything to play ball.
Here's what I am trying to achieve conceptually; note that I'm trying to cater for potentially multiple stored procedures being created this way, just for future-proofing.
declare @sql nvarchar(max), @DatabaseName nvarchar(200)
set @DatabaseName = 'TargetDatabase'
set @sql =
(
    SELECT definition + char(13) + 'GO'
    FROM sys.sql_modules s
    INNER JOIN sys.procedures p
        ON [s].[object_id] = [p].[object_id]
    WHERE p.name LIKE '%mastercopy%'
)
exec @sql
Thanks
Instead of creating a dynamic script, you could use one script containing all the procedures that you want to create (you can script all the procs you want with a couple of clicks in SSMS). You then run this script manually in the context of the database where you want to create these procedures, or pass the file with this script to sqlcmd with -i and the correct database name with -d.
Here, in Use the sqlcmd Utility, you can see the examples.
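For instance, something along these lines (server name and file path are placeholders):
sqlcmd -S <servername> -d TargetDatabase -E -i "C:\scripts\create_procs.sql"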
I am trying to build an SSIS package that dynamically rebuilds the indexes for all the tables in my database. The general idea is that the package will make sure that the table is not being updated and then execute a stored procedure that drops the old index, if it exists, and then recreates it. The logic behind the package seems to be sound. The problem that I am having is that when I execute the package I keep getting the error:
Cannot find object...because it does not exist or you do not have permission...
The index existing should be irrelevant due to the IF EXISTS part.
The procedure looks like this:
    @REFERENCE_NAME AS VARCHAR(50),
    @COLUMN_NAME AS VARCHAR(50),
    @INDEX_NAME AS VARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(MAX)
    SET @sql = 'IF EXISTS (SELECT name FROM sysindexes WHERE name = '+CHAR(39)+@INDEX_NAME+CHAR(39)+') '+
               'DROP INDEX '+@INDEX_NAME+' ON '+@REFERENCE_NAME+' '+
               'CREATE INDEX '+@INDEX_NAME+' ON '+@REFERENCE_NAME+'('+@COLUMN_NAME+') ON [INDEX]'
    EXEC sp_executesql @sql
END
GO
I am able to execute the procedure through SSMS just fine, no error and it builds the index. When I execute the package in SSIS it errors out the minute it gets to the task that executes the stored procedure. I have made sure that SSIS is passing the variables to the execute SQL task and I have verified that I have db_ddladmin rights. Outside of that I am at a loss and have been beating my head against the wall for a day and a half on this.
Is there something I am missing, some permissions I need to request, or some work around for the issue?
Any information would be much appreciated.
Bartover, it's definitely not looking at the wrong database. I have checked that the proc is there, and the only connection on the package is to that specific database. Yes, I am executing the package manually with the Visual Studio 2010 Shell / Data Tools.
Sorrel, I tried your idea of a sanity check on the @sql statement on the drop, on both the drop and create, and on the whole @sql statement; no joy.
Gnackenson, I had that same thought, but the connection authentication method is set to Windows Authentication, same as SSMS. Do you have any ideas as to why it might use different permissions?
It looks like IF EXISTS is being ignored by SSIS SQL Task. To fix my problem, I altered my SQL tasks from DROP - CREATE to DISABLE - ENABLE.
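For reference, a sketch of that approach (index and table names here are hypothetical); note that ALTER INDEX has no ENABLE option, so a disabled index is brought back by rebuilding it:

-- Take the index out of play, then bring it back by rebuilding it
ALTER INDEX IX_MyIndex ON dbo.MyTable DISABLE;
-- ...do whatever work requires the index out of the way...
ALTER INDEX IX_MyIndex ON dbo.MyTable REBUILD;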
I have a large number of stored procedures that I am updating often and then transferring to a duplicate database on another server. I have been opening each "storedproc.sql" file from within SQL Server Management Studio 2008 and then selecting Execute in the toolbar, which will either create or alter an existing stored procedure. I have been doing this for each stored procedure.
I am looking for a script (or another way) that will allow me to alter all of the stored procedures on the databases with ones that are located in a folder at one time. I am basically looking for a script that will do something similar to the pseudo-code like text below.
USE [DatabaseName]
UPDATE [StoredProcName]
USING [directory\file\path\fileName.sql]
UPDATE [StoredProcNameN]
USING [directory\file\path\fileNameN.sql]
…
Not the cleanest pseudo-code but hopefully you understand the idea. I would even be willing to drop all of the stored procedures (based on name) and then create the same stored procedures again on the database. If you need more clarity don’t hesitate to comment, I thank you in advance.
To further explain:
I am changing every reporting stored procedure for an SSRS conversion project. Once the report is developed, I move the report and the stored procedure to a server. I then have to manually run (ALTER or CREATE) each stored procedure against the duplicated database so the database will now be able to support the report on the server. So far this has not been too much trouble, but I will eventually have 65 to 85 stored procedures; and if I have to add one dataset field to each one, then I will have to run each one manually to update the duplicate database.
What I want to be able to do is have a SQL script that says: For this database, ALTER/CREATE this named stored procedure and you can find that .sql text file with the details in this folder.
Here is some code that I use to move all stored procedures from one database to another:
DECLARE #SPBody nvarchar(max);
DECLARE #SPName nvarchar(4000);
DECLARE #SPCursor CURSOR;
SET #SPCursor = CURSOR FOR
SELECT ao.name, sm.definition
FROM <SOURCE DATABASE>.sys.all_objects ao JOIN
<SOURCE DATABASE>.sys.sql_modules sm
ON sm.object_id = ao.object_id
WHERE ao.type = 'P' and SCHEMA_NAME(ao.schema_id) = 'dbo'
order by 1;
OPEN #SPCursor;
FETCH NEXT FROM #SPCursor INTO #SPName, #SPBody;
WHILE ##FETCH_STATUS = 0
BEGIN
if exists(select * from <DESTINATION DATABASE>.INFORMATION_SCHEMA.Routines r where r.ROUTINE_NAME = #SPName)
BEGIN
SET #query = N'DROP PROCEDURE '+#SPName;
exec <DESTINATION DATABASE>..sp_executesql #query;
END;
BEGIN TRY
exec <DESTINATION DATABASE>..sp_executesql #SPBody;
END TRY
BEGIN CATCH
select #ErrMsg = 'Error creating '+#SPName+': "'+Error_Message()+'" ('+#SPBody+')';
--exec sp__LogInfo #ProcName, #ErrMsg, #ProductionRunId;
END CATCH;
FETCH NEXT FROM #SPCursor INTO #SPName, #SPBody;
END;
You need to put in <SOURCE DATABASE> and <DESTINATION DATABASE> as appropriate.
For Reference
c:\>for %f in (*.sql) do sqlcmd /S <servername> /d <dbname> /E /i "%f"
I recommend saving all your stored procedure script files starting with if exists(...) drop procedure followed by the create procedure section. Optionally include a go statement at the end depending on your needs.
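A sketch of that file layout, using a hypothetical procedure name:

IF EXISTS (SELECT * FROM sys.objects
           WHERE object_id = OBJECT_ID(N'dbo.usp_MyReport') AND type = N'P')
    DROP PROCEDURE dbo.usp_MyReport;
GO

CREATE PROCEDURE dbo.usp_MyReport
AS
BEGIN
    SET NOCOUNT ON;
    SELECT 1 AS Placeholder;   -- the real report query goes here
END
GO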
You can then use a tool to concatenate all the files together into a single script file.
I use a custom tool for this that allows me to define dependency order, specify batch separators, script types, include folders, etc. Some text editors, such as UltraEdit have this capability.
You can also use the Microsoft Database Project to select batches of script files, and execute them against one or more database connections stored in the project. This is a good starting place that doesn't require any extra software, but can be a bit of a pain regarding adding and managing folders and files within the project.
Using a schema comparison tool such as RedGate's SQL Compare can be useful to synchronize the schema and/or objects of two databases. I don't recommend using this as a best practice deployment or "promote to production" tool though.
There's a SQL function that I'd like to remove from a SQL Server 2005 database, but first I'd like to make sure that there's no one calling it. I've used the "View Dependencies" feature to remove any reference to it from the database. However, there may be web applications or SSIS packages using it.
My idea was to have the function insert a record in an audit table every time it was called. However, this will be of limited value unless I also know the caller. Is there any way to determine who called the function?
You can call extended stored procedures from a function.
Some examples are:
xp_cmdshell
xp_regwrite
xp_logevent
If you had the correct permissions, theoretically you could call an extended stored procedure from your function and store information like APP_NAME() and ORIGINAL_LOGIN() in a flat file or a registry key.
Another option is to build an extended stored procedure from scratch.
If all this is too much trouble, I'd follow the early recommendation of SQL Profiler or server side tracing.
An example of using an extended stored procedure is below. This uses xp_logevent to log every instance of the function call in the Windows application log.
One caveat of this method is that if the function is applied to a column in a SELECT query, it will be called for every row that is returned. That means there is a possibility you could quickly fill up the log.
Code:
USE [master]
GO
/* A security risk but will get the job done easily */
GRANT EXECUTE ON xp_logevent TO PUBLIC
GO
/* Test database */
USE [Sandbox]
GO
/* Test function which always returns 1 */
CREATE FUNCTION ufx_Function() RETURNS INT
AS
BEGIN
DECLARE
    @msg VARCHAR(4000),
    @login SYSNAME,
    @app SYSNAME

/* Gather critical information */
SET @login = ORIGINAL_LOGIN()
SET @app = APP_NAME()
SET @msg = 'The function ufx_Function was executed by '
    + @login + ' using the application ' + @app

/* Log this event */
EXEC master.dbo.xp_logevent 60000, @msg, warning

/* Resume normal function */
RETURN 1
END
GO
/* Test */
SELECT dbo.ufx_Function()
This depends on your current security model. We use connection pooling with one SQL account per application: each application has its own account to connect to the database. If that is your setup, you could run a SQL Profiler session to find the caller of that function; whichever account is calling the function will relate directly to one application.
This works for us in the way we handle SQL traffic; I hope it does the same for you.
try this to search the code:
--declare and set a value of @SearchValue to be your function name
SELECT DISTINCT
    s.name+'.'+o.name AS Object_Name, o.type_desc
    FROM sys.sql_modules m
    INNER JOIN sys.objects o ON m.object_id=o.object_id
    INNER JOIN sys.schemas s ON o.schema_id=s.schema_id
    WHERE m.definition LIKE '%'+@SearchValue+'%'
    ORDER BY 1
to find the caller at run time, you might try using CONTEXT_INFO
--in the code chain doing the suspected function call:
DECLARE @CONTEXT_INFO varbinary(128)
       ,@Info         varchar(128)
SET @Info = '????'
SET @CONTEXT_INFO = CONVERT(varbinary(128),'InfoForFunction='+ISNULL(@Info,'')+REPLICATE(' ',128))
SET CONTEXT_INFO @CONTEXT_INFO

--after the suspected function call
SET CONTEXT_INFO 0x0 --reset CONTEXT_INFO

--here is the portion to put in the function:
DECLARE @Info          varchar(128)
       ,@sCONTEXT_INFO varchar(128)
SET @sCONTEXT_INFO = CONVERT(varchar(128),CONTEXT_INFO())
IF LEFT(@sCONTEXT_INFO,16) = 'InfoForFunction='   --'InfoForFunction=' is 16 characters
BEGIN
    SET @Info = RIGHT(RTRIM(@sCONTEXT_INFO),LEN(RTRIM(@sCONTEXT_INFO))-16)
END

--use the @Info
SELECT @Info, @sCONTEXT_INFO
If you put different values in @CONTEXT_INFO in various places, you can narrow down who is calling the function, and refine the value until you find it.
You can try using APP_NAME() and USER_NAME(). It won't give you specifics (like an SSIS package name), but it might help.
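For example, a quick look at what those functions return for the current connection:

SELECT APP_NAME() AS CallingApplication, USER_NAME() AS DatabaseUser;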
This will help you find if this is being called anywhere in your database.
select object_name(id) from sys.syscomments where text like '%<FunctionName>%'
Another, far less elegant, way is to grep -R [functionname] * through your source code. This may or may not be workable depending on the amount of code.
This has the advantage of working even if that part of the code only gets used very infrequently, which would be a big problem with your audit table idea.
You could run a trace in the profiler to see if that function is called for a week (or whatever you consider a safe window).
I think that you might also be able to use OPENROWSET to call an SP which logs to a table if you enable ad-hoc queries.
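If you go down that road, the shape of the call inside the function would be roughly the following. This is only a sketch (the provider, connection string, and logging procedure name are assumptions), and the logging procedure has to return a result set for OPENROWSET to accept it:

-- Loopback call from inside a function; requires 'Ad Hoc Distributed Queries' enabled
SELECT *
FROM OPENROWSET('SQLNCLI',
                'Server=(local);Trusted_Connection=yes;',
                'EXEC AuditDb.dbo.usp_LogFunctionCall');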
I need to export the results of a query to a csv file and put the file on a network shared folder.
Is it possible to achieve this within a stored procedure?
If yes, then comes yet another constraint: can I achieve this without sysadmin privileges, i.e. without using xp_cmdshell + the BCP utility?
If the answer to 2 is no, does the caller have to have sysadmin privileges, or would it suffice if the SP owner has sysadmin privileges?
Here are some more details to the problem: The SP must export and transfer the file on the fly and raise error if something went wrong. The caller must get a response immediately, i.e. in case of no error, he can assume that the results are successfully transferred to the folder. Therefore, a DTS/SSIS job that runs every N minutes is not an option. I know the problem smells like I will have to do this at application level, but I would be more than happy if all those stuff could be done from T-SQL.
It seems to me that you are not really looking for SQL code as the answer to your question. The main aspect of your question is security: what should you do to implement your requirement without sysadmin privileges and without opening a new security hole? That, I think, is your real question.
I see at least three ways to solve your problem. But first, a short explanation of why the sysadmin requirement exists in all solutions based on extended stored procedures. Extended stored procedures like xp_cmdshell are very old; they existed at least as far back as SQL Server 4.2, the first Microsoft SQL Server running under the first Windows NT (NT 3.1). In the old versions of SQL Server there was no security restriction on executing such procedures, but restrictions were added later. It is important to understand that any general-purpose procedure which allows starting an arbitrary process under the SQL Server account, like xp_cmdshell and sp_OACreate, must carry the sysadmin restriction. Only a task-oriented procedure with a clear, narrow area of usage and role-based permissions can solve the problem without a security hole. So here are the three solutions I promised:
1. Create a new SQL account on your SQL Server with sysadmin privileges. Then create a stored procedure which uses one of the extended stored procedures, such as xp_cmdshell or sp_OACreate, and technically implements your requirement (exporting some information into a CSV file). Using the EXECUTE AS clause (see http://msdn.microsoft.com/en-us/library/ms188354.aspx), configure the stored procedure so that it runs under the account with sysadmin privileges. Then delegate execution of this procedure to users via a SQL role, to stay flexible when granting the permission (a sketch of the role-based grant follows below).
2. Use CLR stored procedures instead of xp_cmdshell and sp_OACreate. You should also use role-based permissions on the procedure created.
3. The end user doesn't directly call any SQL stored procedure that you create. Instead, a piece of software (like a WCF service or a web site) calls your stored procedure, and you implement the export to the CSV file inside that software rather than inside any SQL stored procedure.
Whichever way you implement it, you should define exactly where you will hold the password of the account used to access the file system. There are different options, each with advantages and disadvantages. It's also possible to use impersonation to access the file system with the end user's account. The best way depends on your environment.
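As a minimal sketch of the permission delegation in option 1 (the role and procedure names here are hypothetical, not from the question):

-- Create a role, grant it execute on the wrapper procedure only,
-- and add the users who are allowed to trigger the export.
CREATE ROLE CsvExportRole;
GRANT EXECUTE ON dbo.usp_ExportToCsv TO CsvExportRole;
EXEC sp_addrolemember 'CsvExportRole', 'DOMAIN\SomeUser';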
You can build a SQL Agent job and kick it off via system SPs from a trigger or SP. The job may call SSIS or bulk dump scripts... returning an instant error message may be an issue, though.
In general, it's quite an unusual requirement - what are you trying to accomplish?
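If you go the SQL Agent route, the system procedure to kick the job off looks roughly like this (the job name is hypothetical); note that sp_start_job returns as soon as the job is started, which is exactly why getting an instant success/failure message back is awkward:

EXEC msdb.dbo.sp_start_job @job_name = N'ExportQueryToCsv';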
UPDATE:
After some more thinking - this is a design issue and I have not been able to find a solution simply by using SQL Server SP's.
In the past, this is what I did:
on the app level, implement an async process: the user pushes a button requesting a file download; the app accepts the request and lets the user go
the user can check the status via a status page, or will get an email when it's done or an error occurred
in the meantime, the application layer kicks off either an SSIS package or a SQL Agent job
if parameters are needed, design and implement a special table, JOB_PARAMETERS, where you would put the parameters
you would also need to create more tables to manage the jobs, store job status, and communicate with the application layer
you may want to use SQL Server Service Broker on the DB level
you may want to use MSMQ on the app level
This is not easy, but this is the most efficient way to export data: it goes straight from the DB to a file, without traveling to the app server and the user's PC via the browser.
Can you use OLE Automation? It's ugly, and you could probably use some set based string building techniques instead of the cursor but here goes...
DECLARE @DataDir varchar(4000)
SET @DataDir = 'c:\some\path\accessible\to\SQLServer'

IF @DataDir IS NULL
BEGIN
    PRINT 'dir is null.'
    RETURN 1
END

DECLARE
    @FilePath    as varchar(255),
    @DataToWrite as varchar(8000)

IF RIGHT(@DataDir,1) <> '\'
    SET @DataDir = @DataDir + '\'

SET @FilePath = @DataDir + 'filename.csv'

DECLARE @RetCode int, @FileSystem int, @FileHandle int

EXECUTE @RetCode = sp_OACreate 'Scripting.FileSystemObject', @FileSystem OUTPUT
IF (@@ERROR | @RetCode > 0 OR @FileSystem < 0)
BEGIN
    RAISERROR ('could not create FileSystemObject',16,1)
END

DECLARE @FileExists int
EXECUTE @RetCode = sp_OAMethod @FileSystem, 'FileExists', @FileExists OUTPUT, @FilePath
--print '@FileExists = ' + cast(@FileExists as varchar)
IF @FileExists = 1
BEGIN
    RAISERROR ('file already exists',16,1)
    /*return 1*/
END

--1 = for reading, 2 = for writing (will overwrite contents), 8 = for appending
EXECUTE @RetCode = sp_OAMethod @FileSystem, 'OpenTextFile', @FileHandle OUTPUT, @FilePath, 8, 1
IF (@@ERROR | @RetCode > 0 OR @FileHandle < 0)
BEGIN
    RAISERROR ('could not open text file',16,1)
END

DECLARE CSV CURSOR
READ_ONLY
FOR
    SELECT Field1, Field2 FROM MyDataTable
    ORDER BY whatever

DECLARE @fld1 nvarchar(50)
       ,@fld2 nvarchar(50)

OPEN CSV

FETCH NEXT FROM CSV INTO @fld1, @fld2
WHILE (@@FETCH_STATUS <> -1)
BEGIN
    IF (@@FETCH_STATUS <> -2)
    BEGIN
        SET @DataToWrite = @fld1 + ',' + @fld2 + char(13) + char(10)
        EXECUTE @RetCode = sp_OAMethod @FileHandle, 'Write', NULL, @DataToWrite
        IF (@@ERROR | @RetCode > 0)
        BEGIN
            RAISERROR ('could not write to file',16,1)
        END
    END
    FETCH NEXT FROM CSV INTO @fld1, @fld2
END

CLOSE CSV
DEALLOCATE CSV

EXECUTE @RetCode = sp_OAMethod @FileHandle, 'Close', NULL
IF (@@ERROR | @RetCode > 0)
    RAISERROR ('Could not close file',16,1)

EXEC sp_OADestroy @FileSystem
RETURN 0
Generally, no, this kind of work can't be done without a lot of fuss and effort and sysadmin rights.
SQL Server is a database engine, focused on database problems, and so, quite rightly, has very poor file-manipulation tools. Work-arounds include:
xp_cmdshell is the tool of choice for file manipulations.
I like the sp_OA* solution myself, 'cause it gives me flashbacks to SQL 7.0. But using those functions always made me nervous.
You might be able to do something with OPENROWSET, where the target of an insert is a file defined with this function. Sounds unlikely, might be worth a look.
Similarly, a linked server definition might be used as a target for inserts or select...into... statements.
Security seems to be your showstopper. By and large, when SQL Server shells out to the OS, it has all the rights of the NT account under which the SQL Server service started; if you want to limit network access, configure that account carefully (and never make it a domain admin!).
It is possible to call xp_cmdshell as a user without sysadmin rights, and to configure these calls to not have the same access rights as the SQL Service NT account. As per BOL (SQL 2005 and up):
xp_cmdshell Proxy Account
When it is called by a user that is not a member of the sysadmin fixed server role, xp_cmdshell connects to Windows by using the account name and password stored in the credential named ##xp_cmdshell_proxy_account##. If this proxy credential does not exist, xp_cmdshell will fail.
The proxy account credential can be created by executing sp_xp_cmdshell_proxy_account. As arguments, this stored procedure takes a Windows user name and password. For example, the following command creates a proxy credential for Windows domain user SHIPPING\KobeR that has the Windows password sdfh%dkc93vcMt0.
So your user logs in with whatever user rights (not sysadmin!) and executes the stored procedure, which calls xp_cmdshell, which will "pick up" whatever proxy rights have been configured. Again, awkward, but it sounds like it'd do what you'd want it to do. (A possible limiting factor is that you only get the one proxy account, so it has to fit all possible needs.)
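For completeness, a sketch of setting up that proxy credential and letting a non-sysadmin role use it (the account name and password are just the BOL example values quoted above; the role name is hypothetical):

-- Create the ##xp_cmdshell_proxy_account## credential (run this as sysadmin)
EXEC sp_xp_cmdshell_proxy_account 'SHIPPING\KobeR', 'sdfh%dkc93vcMt0';

-- Non-sysadmin callers also need EXECUTE permission on xp_cmdshell in master
-- (the login must have a user mapped in master)
USE master;
GRANT EXECUTE ON xp_cmdshell TO SomeExportRole;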
Honestly, it sounds to me like the best solution would be to:
Identify the source of the call to the stored procedure,
Have the procedure return the data to be written to the file (you can do all your formatting and layout in the procedure if need be), and
Have the calling routine manage all the file preparation steps (which could be as simple as redirecting data returned from SQL into an opened file)
So, what does launch the call to the stored procedure?