Execute custom sql procedure in MS Master Data Services 2012 Database - sql

My goal is to launch a stored procedure in another database (not the MDS database) when a user commits a Version, in order to start an ETL process. For that, in the MDS database I added a call to my custom stored procedure inside [mdm].[udpVersionSave].
But when I try to commit a Version in the MDS web interface, nothing happens - the Version does not become committed. BTW, when I launch the procedure directly from the MDS database, it works.
My guess is that the problem is in user/login access. But I have tried a lot of combinations of granting access - nothing helped.
UPD.:
The code which launches the stored procedure:
execute [AdventureWorksDW2012].[dbo].[sp__test_insert_data] @code = N'3';
Code of procedure sp__test_insert_data:
insert into AdventureWorksDW2012.dbo._test_insert_data(col_ver)
values (@code);
Procedure sp__test_insert_data works fine when I launch it manually under sa.
Any ideas?
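One thing worth checking before digging further into permissions: if the custom call inside [mdm].[udpVersionSave] raises an error, the whole version-commit transaction can be rolled back silently. A minimal sketch that isolates the failure, reusing the names from the question (the logging table is hypothetical):

```sql
-- Inside [mdm].[udpVersionSave]: wrap the cross-database call so an
-- error is captured instead of aborting the MDS version commit.
BEGIN TRY
    EXECUTE [AdventureWorksDW2012].[dbo].[sp__test_insert_data] @code = N'3';
END TRY
BEGIN CATCH
    -- Hypothetical logging table; record the failure for diagnosis.
    INSERT INTO dbo._mds_etl_errors (error_message, logged_at)
    VALUES (ERROR_MESSAGE(), SYSDATETIME());
END CATCH;
```

If the CATCH block fires, the logged message usually reveals which login MDS actually executes under (SUSER_SNAME()) and which permission is missing in the target database.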

Related

No insert happens on linked server Express version when Agent Job activates stored procedure from linked Enterprise version server

Using SSMS with linked servers. I have a stored procedure on the Express instance that performs a table insert. When the stored procedure is run locally, the table insert works.
When I run an Agent Job that executes the stored procedure from the linked Enterprise instance, the table insert does not happen. The linked servers are set up properly - I have all permissions turned on for both the target table and the stored procedures.

How to rollback database when user abort process?

I have created a stored procedure such as:
CREATE PROCEDURE backupDB
AS
BEGIN
...
exec('BACKUP DATABASE ' + @targetDbName + ' TO DISK = ''C:\ABC\' + @backupFileName + '.bak''')
...
END
I would like to roll back the database when the user aborts the backup process. For example, I create a button named "Cancel", and when the user clicks it, all work done in procedure 'backupDB' will be rolled back.
So, how can I do it?
I use MS SQL Server 2008 R2 and Visual Studio 2013 with ASP.NET MVC 5.
Thanks for your help.
You can apply a transaction in your C# code or inside your stored procedure.
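As a sketch of that pattern on the T-SQL side - with the caveat that BACKUP DATABASE itself is not allowed inside an explicit transaction, so only the surrounding work (logging, bookkeeping DML) can actually be rolled back; the table name below is hypothetical:

```sql
BEGIN TRY
    BEGIN TRANSACTION;
    -- Preparatory/bookkeeping work that CAN be rolled back:
    INSERT INTO dbo.BackupLog (StartedAt) VALUES (SYSDATETIME());
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    -- On SQL Server 2008 R2 use RAISERROR to re-raise (THROW is 2012+).
    RAISERROR ('backupDB failed and was rolled back.', 16, 1);
END CATCH;

-- The BACKUP DATABASE statement itself must run outside the transaction.
```

From C#, the equivalent is wrapping the bookkeeping commands in a SqlTransaction and calling Rollback() when the user cancels; the backup operation itself cannot be undone that way.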

bulk insert works in ssms but not in other applications

Context: SQL Server 2005
I have a simple proc, which does a bulk load from an external file.
ALTER proc [dbo].[usp_test]
AS
IF OBJECT_ID('tempdb..#promo') is not null BEGIN
DROP TABLE #promo
END
CREATE TABLE #promo (promo VARCHAR(1000))
BULK INSERT #promo
FROM '\\server\c$\file.txt'
WITH
(
--FIELDTERMINATOR = '',
ROWTERMINATOR = '\n'
)
select * from #promo
I can run it in SSMS. But when I call it from another application (Reporting service 2005), it throws this error:
Cannot bulk load because the file "\\server\c$\file.txt" could not be opened. Operating system error code 5 (Access is denied.).
This is complicated because it may be related to the account used by Reporting Services, or to some Windows security issue.
But I think I can maybe impersonate the login as the one I used to create the proc, because that login can run it in SSMS. So I tried to change the proc to 'WITH EXECUTE AS SELF'; it compiles OK, but when I tried to run it in SSMS, I got:
Msg 4834, Level 16, State 4, Procedure usp_test, Line 12
You do not have permission to use the bulk load statement.
I am still in the same session, so when I run this it actually executes as 'self', which is the login I am using now - so why did I get this error? What should I do?
I know it's a bit unclear, so I'll just list the facts.
========update
I just tried using SSIS to load the file into a table so that the report can use it. The package runs OK in BIDS, but when it runs in a SQL Agent job it gets the same "access to the file is denied" error. Then I set up a proxy and let the package run under that account, and the job runs with no problem.
So I am thinking: is it that the account SSRS uses can't access the file? What account is used by SSRS? Can SSRS be set up to run under a proxy like SQL Agent does?
==============update again
Finally got it sorted
I created an SSIS package, put the package in a job (running under a proxy account to access the file), and had the proc execute the job. This does work, although it is tricky (the proc needs to judge whether the job has finished). It is too tricky to maintain, so I built it only as a proof of concept; it will not go into production.
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/761b3c62-c636-407d-99b5-5928e42f3cd8/execute-as-problem?forum=transactsql
1) The reason you get the "You do not have permission to use the bulk load statement." is because (naturally) you don't have permissions to use the bulk load statement.
You must either be a sysadmin or a bulkadmin at the server level to run BULK commands.
2) Yes, "Access is denied" usually means whatever credentials you are using to run the sproc in SSRS do not have permissions to that file. So either:
Make the file available to everyone.
Set a known credential with full access to the file on the data source running the sproc.
3) What the heck, dude.
Why not just use the text file directly as a data source in SSRS?
If that's not possible, why not perform all your ETL in one sproc run outside SSRS, and then just use a simple "select * from table" statement for SSRS?
Please do not run a BULK INSERT every time someone wants the report. If they need up to date reads of the file, use the file as a data source. If they can accept, say, a 10 minute lag in data, create a batch job or ETL process to pick the file up and put it into a database table every 10 minutes and just read from that. Write once, read many.
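On the permissions point in 1), the bulk-load right can also be granted on its own rather than via sysadmin; a sketch, with a hypothetical login name:

```sql
USE master;
-- Grant only the bulk-load permission (SQL Server 2005+):
GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\ReportUser];

-- Or add the login to the fixed server role the answer mentions:
EXEC sp_addsrvrolemember @loginame = N'DOMAIN\ReportUser',
                         @rolename = N'bulkadmin';
```

Either way the permission is server-scoped, which is also why the database-scoped EXECUTE AS SELF inside the proc does not pick it up.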

SQL Server runs SP on startup - where's the magic?

We have several SQL Server 2008 R2 environments (dev, QA, Production) with databases for an ASP.NET application.
This application uses ASP.NET membership and SQL Server session providers, thus we have an ASPState database.
The functionality of these providers was extended to restrict one active session per login. Our implementation added tables to TempDB, a stored proc ASPState.dbo.CreateTempTables to create these tables, and another stored proc Master.dbo.ASPState_Startup which calls the SP in ASPState.
On my dev machine and in production, when SQL Server is started Master.dbo.ASPState_Startup is executed and the tables are created.
I am setting up a new QA environment and cannot figure out how that happens (so in QA, the tables are not added to TempDB on startup). I have compared schema and permissions manually and via Red Gate's compare tool and find no differences there.
I checked the jobs and none call either of these stored procs.
Any ideas of where the magic is hiding?
Thanks,
Scott
sp_procoption is the "magic":
Sets or clears a stored procedure for automatic execution. A stored procedure that is set to automatic execution runs every time an instance of SQL Server is started.
EXEC sp_procoption @ProcName = 'ASPState_Startup'
    , @OptionName = 'startup'
    , @OptionValue = 'on';
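To verify which procedures are flagged this way on an instance (useful when comparing the QA environment against dev and production), these catalog queries can help:

```sql
-- Procedures in master flagged for automatic execution at startup:
SELECT name
FROM master.sys.procedures
WHERE OBJECTPROPERTY(object_id, 'ExecIsStartup') = 1;

-- The instance setting that must be on for startup procs to fire
-- (sp_procoption normally enables it automatically):
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'scan for startup procs';
```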

Stored Procedure stopped working

I have a stored procedure that I'm positive has no errors, but I recently deleted the table it references and imported a backup with exactly the same name and the same column settings (including identity) as the previous one - and now it doesn't work.
Is there any reason that deleting the table but importing a new one would break the stored procedure?
BTW: Running Microsoft SQL Server Express Edition w/ IIS.
You can try to recompile the stored procedure with:
exec sp_recompile 'YourProblemTableNameHere'
This will mark every procedure that uses the YourProblemTableNameHere table for recompilation at its next execution. But that is just a guess based on the very limited info given.
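If recompiling does not help, the dependency metadata may still point at the dropped table's old object_id. On SQL Server 2008 and later, a query like this (the procedure name is hypothetical) shows whether the procedure's references resolve - unresolved ones come back with a NULL referenced_id:

```sql
SELECT referenced_entity_name, referenced_id
FROM sys.sql_expression_dependencies
WHERE referencing_id = OBJECT_ID(N'dbo.YourProcedureName');
```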