How to close the window after a SQL file has finished running - sql

I have a script that I turned into an exe that connects to an Oracle database, then runs a SQL script. After the script is done, the window stays open. I have tried a few commands, and the SQL> prompt stays there. Thanks!
ALTER USER &user_name ACCOUNT UNLOCK;

Edit your C:\Scripts\Script2.sql file and add an exit statement at the end on a line by itself.
You may need to ensure that the prior command is correctly terminated with a ; or a / character on a line by itself.
ALTER USER &user_name ACCOUNT UNLOCK;
exit
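If the exe is a thin wrapper around a SQL*Plus invocation (an assumption; the exact wrapper isn't shown), then with exit at the end of the script the whole run closes on its own. A minimal sketch, where the user, password, connect string, and path are placeholders to adjust for your setup:

```shell
# hypothetical wrapper - replace user/password@mydb with your own connect string
# -S suppresses the SQL*Plus banner; the 'exit' inside the script ends the session
sqlplus -S user/password@mydb @C:\Scripts\Script2.sql
```

Because the script ends with exit, SQL*Plus terminates instead of dropping to the SQL> prompt, and the console window closes when the wrapper finishes.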

Related

Change the database connection programmatically

In Oracle SQL Developer, I need to switch the active database connection manually. Is there a command that will connect to a different database programmatically, assuming that the login credentials are already saved? I'm trying to avoid clicking on the drop-down menu at the top right of the window which selects the active connection.
Perhaps I should rather have a single SQL file per database? I could understand that argument. But this is to prepare to migrate some tables from one database to another, so it's nice to have all of the context in one file.
On database1, run a query on table1 which is located in schema1.
-- manually switch to database1 (looking for a command to replace this step)
ALTER SESSION SET CURRENT_SCHEMA = schema1;
SELECT * FROM table1;
On database2, run a query on table2 which is located in schema2.
-- manually switch to database2
ALTER SESSION SET CURRENT_SCHEMA = schema2;
SELECT * FROM table2;
Looks like this is well documented here
Use this command
CONN[ECT] [{<logon>| / |proxy} [AS {SYSOPER | SYSDBA | SYSASM}] [edition=value]]
You need a DDL trigger to perform an action after your pre-SQL. In pseudocode, the shape is:
CREATE TRIGGER sample
ON TABLE
AFTER event
...
THEN
  ALTER SESSION SET CURRENT_SCHEMA = schema2
  SELECT * FROM table2;
I don't know of a way to change your selected connection in SQL Developer, but there is a programmatic method for temporarily changing the connection under which the script commands are run, as @T.S. pointed out. I want to give a few examples, which might be helpful to people (as they would have been for me).
So let's say your script has part A and part B and you want to execute them one after the other but from different connections. Then you can use this:
CONNECT username1/password1@connect_identifier1;
-- Put commands A here to be executed under this connection.
DISCONNECT; -- username1
CONNECT username2/password2@connect_identifier2;
-- Put commands B here to be executed under this connection.
DISCONNECT; -- username2
The connect_identifier part identifies the database where you want to connect. For instance, if you want to connect to a pluggable database on the local machine, you may use something like this:
CONNECT username/password@localhost/pluggable_database_name;
or if you want to connect to a remote database:
CONNECT username/password@IP:port/database_name;
You can omit the password, but then you will have to input it in a prompt each time you run that section. If you want to consult the CONNECT command in more detail, this reference document may be useful.
In order to execute the commands, you would then select the code that you are interested in (including the relevant CONNECT commands) and use Run Script (F5), or use Run Script (F5) without selecting anything, which will execute the entire script file. SQL Developer will execute your commands, put the output into the Script Output tab, and then close the connection. Note that the output of SELECT commands might be unpleasant to read inside Script Output. This can be mitigated by running the following command first (just once):
SET sqlformat ansiconsole;
There is also Run Statement (Ctrl+Enter), but do note that it does not seem to work well with this workflow. It will execute and display each SELECT statement in a separate Query Result tab, which is easier to read, BUT the SELECT query will always be executed in the context of the active connection in SQL Developer (the one in the top right), not the connection established by the CONNECT statement in the code. On the other hand, INSERT commands, for instance, DO seem to be executed in the context of the CONNECT statement's connection. This (rather inconsistent) behaviour is probably not what you want, so I recommend using Run Script (F5) as described above.

Prevent SQL File From Running As a Script

I have tens of SQL files that contain individual queries and update commands. These commands are intended to be used as a starting point for manual updates to the database; they are not structured to be run as scripts.
Is there an SQL command I can place at the start of these files to stop an inadvertent click of the 'Run Script' button? Return, halt, stop?
I'm opening these in Oracle's SQL Developer, and the 'Run Command' and 'Run Script' buttons are right next to each other...
You can use the exit command at the top of your script.
If someone accidentally runs it as a script, it will immediately disconnect and will not run the rest of the statements.
Example:
exit
update t set a = 'abc';
drop table t;
...
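A related safeguard, assuming the files are run through SQL*Plus or SQL Developer's script engine, is the WHENEVER SQLERROR command, which stops a script run at the first failing statement instead of ploughing on through the rest:

```sql
-- abort the script run (and disconnect) on the first error,
-- returning the failing statement's error code to the caller
WHENEVER SQLERROR EXIT SQL.SQLCODE
```

Placed at the top of a file, this limits the damage even if someone runs the file as a script past the initial exit guard.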

bulk insert works in ssms but not in other applications

Context: SQL Server 2005
I have a simple proc, which does a bulk load from an external file.
ALTER PROC [dbo].[usp_test]
AS
IF OBJECT_ID('tempdb..#promo') IS NOT NULL
BEGIN
    DROP TABLE #promo
END

CREATE TABLE #promo (promo VARCHAR(1000))

BULK INSERT #promo
FROM '\\server\c$\file.txt'
WITH
(
    --FIELDTERMINATOR = '',
    ROWTERMINATOR = '\n'
)

SELECT * FROM #promo
I can run it in SSMS. But when I call it from another application (Reporting service 2005), it throws this error:
Cannot bulk load because the file "\\server\c$\file.txt" could not be opened. Operating system error code 5 (Access is denied.).
This is complicated because it may be related to the account used by Reporting Services, or to some Windows security issue.
But I think I can maybe impersonate the login as the one I used to create the proc, because that login can run it in SSMS. So I tried to change the proc to 'with execute as self'. It compiles OK, but when I tried to run it in SSMS, I got:
Msg 4834, Level 16, State 4, Procedure usp_test, Line 12
You do not have permission to use the bulk load statement.
I am still in the same session, so when I run this, it actually executes as the 'self', which is the login I am using now, so why did I get this error? What should I do?
I know it's a bit unclear, so I'll just list the facts.
========update
I just tried using SSIS to load the file into a table that the report can use. The package runs OK in BIDS, but when it runs in a SQL Agent job it gets the same "access to the file is denied" error. Then I set up a proxy and let the package run under that account, and the job runs with no problem.
So I am wondering: is it that the account SSRS uses can't access the file? What account is used by SSRS? Can SSRS be set up to run under a proxy like SQL Agent does?
==============update again
Finally got it sorted.
I created an SSIS package, put the package in a job (running under a proxy account to access the file), and had the proc execute the job. This does work, although it is tricky (the proc needs to judge whether the job has finished). It is too tricky to maintain, so I created it only as a proof of concept; it will not go into production.
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/761b3c62-c636-407d-99b5-5928e42f3cd8/execute-as-problem?forum=transactsql
1) The reason you get the "You do not have permission to use the bulk load statement." is because (naturally) you don't have permissions to use the bulk load statement.
You must either be a sysadmin or a bulkadmin at the server level to run BULK commands.
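A sketch of granting that permission; DOMAIN\ReportUser is a placeholder for whatever login the SSRS data source actually uses, and either statement must be run by a sysadmin:

```sql
-- add the SSRS login to the bulkadmin fixed server role (SQL Server 2005 syntax)
EXEC sp_addsrvrolemember 'DOMAIN\ReportUser', 'bulkadmin';

-- or grant the underlying server-level permission directly
GRANT ADMINISTER BULK OPERATIONS TO [DOMAIN\ReportUser];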
2) Yes, "Access is denied" usually means whatever credentials you are using to run the sproc in SSRS does not have permissions to that file. So either:
Make the file available to everyone.
Configure the data source running the sproc with a known credential that has full access to the file.
3) What the heck, dude.
Why not just use the text file directly as a data source in SSRS?
If that's not possible, why not perform all your ETL in one sproc run outside SSRS, and then just use a simple "select * from table" statement for SSRS?
Please do not run a BULK INSERT every time someone wants the report. If they need up to date reads of the file, use the file as a data source. If they can accept, say, a 10 minute lag in data, create a batch job or ETL process to pick the file up and put it into a database table every 10 minutes and just read from that. Write once, read many.

Blocking within SSMS tab

This situation can be easily reproduced on your test environment. Open SSMS and connect to your server. Open a New Query tab connected to the MYTEST database (I assume that MYTEST is online).
Don't do anything with this tab. Open a new tab connected to the same database. Type the following code in your new tab:
USE master
GO
ALTER DATABASE MYTEST
SET OFFLINE
Your code will be blocked by the process you are running from your first tab (see Activity Monitor).
Why is the execution blocked even though there is no task associated with the process in the first tab?
You'd need to tell SQL Server to kick every connection out:
ALTER DATABASE MYTEST
SET OFFLINE
WITH ROLLBACK IMMEDIATE
This is by design: a connection to a database holds a shared DB lock, whether it is executing or not.
WITH <termination>::=
Specifies when to roll back incomplete transactions when the database is transitioned from one state to another. If the termination clause is omitted, the ALTER DATABASE statement waits indefinitely if there is any lock on the database. Only one termination clause can be specified, and it follows the SET clauses.
Just run sp_lock (or whatever the new DMVs are :-) and you'll see them.
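For the DMV route, sys.dm_tran_locks is the modern counterpart of sp_lock. A sketch that lists the sessions holding database-level locks on MYTEST:

```sql
-- show which sessions hold (or wait on) a lock on the MYTEST database itself
SELECT request_session_id, request_mode, request_status
FROM sys.dm_tran_locks
WHERE resource_type = 'DATABASE'
  AND resource_database_id = DB_ID('MYTEST');
```

The idle first tab shows up here with a granted shared (S) database lock, which is what blocks the ALTER DATABASE.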

ALTER DATABASE failed because a lock could not be placed on database

I need to restart a database because some processes are not working. My plan is to take it offline and back online again.
I am trying to do this in SQL Server Management Studio 2008:
use master;
go
alter database qcvalues
set single_user
with rollback immediate;
alter database qcvalues
set multi_user;
go
I am getting these errors:
Msg 5061, Level 16, State 1, Line 1
ALTER DATABASE failed because a lock could not be placed on database 'qcvalues'. Try again later.
Msg 5069, Level 16, State 1, Line 1
ALTER DATABASE statement failed.
Msg 5061, Level 16, State 1, Line 4
ALTER DATABASE failed because a lock could not be placed on database 'qcvalues'. Try again later.
Msg 5069, Level 16, State 1, Line 4
ALTER DATABASE statement failed.
What am I doing wrong?
After you get the error, run
EXEC sp_who2
Look for the database in the list. It's possible that a connection was not terminated. If you find any connections to the database, run
KILL <SPID>
where <SPID> is the SPID for the sessions that are connected to the database.
Try your script after all connections to the database are removed.
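If there are many stray sessions, you can generate the KILL commands in one pass instead of typing them by hand. A sketch against sysprocesses (available on SQL Server 2005 and later); print and review the generated batch before executing it:

```sql
DECLARE @kill varchar(8000);
SET @kill = '';

-- build one KILL statement per session connected to the database
SELECT @kill = @kill + 'KILL ' + CONVERT(varchar(5), spid) + '; '
FROM master.dbo.sysprocesses
WHERE dbid = DB_ID('qcvalues');

PRINT @kill;    -- inspect the batch first
-- EXEC (@kill);  -- then uncomment to run it
```

Run it from a connection to master (not to the database you are taking offline), or your own session will be in the list.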
Unfortunately, I don't have a reason why you're seeing the problem, but here is a link that shows that the problem has occurred elsewhere.
http://www.geakeit.co.uk/2010/12/11/sql-take-offline-fails-alter-database-failed-because-a-lock-could-not-error-5061/
I managed to reproduce this error by doing the following.
Connection 1 (leave running for a couple of minutes)
CREATE DATABASE TESTING123
GO
USE TESTING123;
SELECT NEWID() AS X INTO FOO
FROM sys.objects s1,sys.objects s2,sys.objects s3,sys.objects s4 ,sys.objects s5 ,sys.objects s6
Connections 2 and 3
set lock_timeout 5;
ALTER DATABASE TESTING123 SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
Try this if it is "in transition" ...
http://learnmysql.blogspot.com/2012/05/database-is-in-transition-try-statement.html
USE master
GO
ALTER DATABASE <db_name>
SET OFFLINE WITH ROLLBACK IMMEDIATE
...
...
ALTER DATABASE <db_name> SET ONLINE
Just to add my two cents. I put myself into the same situation while searching for the minimum privileges a db login requires to run this statement successfully:
ALTER DATABASE ... SET SINGLE_USER WITH ROLLBACK IMMEDIATE
It seems that the ALTER statement completes successfully when executed with a sysadmin login, but it requires the connection-cleanup part when executed under a login which has "only" limited permissions like:
ALTER ANY DATABASE
P.S. I've spent hours trying to figure out why the "ALTER DATABASE.." does not work when executed under a login that has dbcreator role + ALTER ANY DATABASE privileges. Here's my MSDN thread!
I will add this here in case someone is as lucky as me.
When reviewing the sp_who2 list of processes, note the processes that run not only for the affected database but also for master. In my case, the issue that was blocking the database was related to a stored procedure that started an xp_cmdshell.
Check if you have any processes in KILL/RollBack state for master database
SELECT *
FROM sys.sysprocesses
WHERE cmd = 'KILLED/ROLLBACK'
If you have the same issue, the KILL command alone will probably not help.
You can restart the SQL Server, or a better way is to find the cmd.exe under Windows processes on the SQL Server OS and kill it.
In SQL Management Studio, go to Security -> Logins and double click your Login. Choose Server Roles from the left column, and verify that sysadmin is checked.
In my case, I was logged in on an account without that privilege.
HTH!
Killing the process ID worked nicely for me.
I ran the "EXEC sp_who2" command in a new query window, filtered the results for the "busy" database, and killed the processes with the "KILL" command. After that, everything worked again.
I know this is an old post but I recently ran into a very similar problem. Unfortunately I wasn't able to use any of the alter database commands because an exclusive lock couldn't be placed. But I was never able to find an open connection to the db. I eventually had to forcefully delete the health state of the database to force it into a restoring state instead of in recovery.
In rare cases (e.g., after a heavy transaction is committed), a running CHECKPOINT system process holding a FILE lock on the database file prevents the transition to MULTI_USER mode.
In my scenario, there was no process blocking the database in sp_who2. However, because the database is much larger than our other databases, we discovered that pending processes were still running, which is why the database in the availability group still displayed as red/offline after we tried to 'Resume Data' by right-clicking the paused database.
To check if you still have processes running just execute this command:
SELECT percent_complete FROM sys.dm_exec_requests
WHERE percent_complete > 0