I have a global temporary table (##table) which can be accessed across all sessions, but sometimes I get this error:
There is already an object named
'##table' in the database.
Why does this happen, and how do I resolve it?
Found an interesting reference (outdated URL removed; it now points to a malicious website):
Global temporary tables operate much like local temporary tables; they are created in tempdb and cause less locking and logging than permanent tables. However, they are visible to all sessions until the creating session goes out of scope (and the global ##temp table is no longer being referenced by other sessions). If two different sessions run the above code while the first is still active, the second will receive the following:
Server: Msg 2714, Level 16, State 6, Line 1
There is already an object named '##people' in the database.
I have yet to see a valid justification for the use of a global ##temp table. If the data needs to persist to multiple users, then it makes much more sense, at least to me, to use a permanent table. You can make a global ##temp table slightly more permanent by creating it in an autostart procedure, but I still fail to see how this is advantageous over a permanent table. With a permanent table, you can deny permissions; you cannot deny users from a global ##temp table.
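The permissions point can be illustrated with a short sketch (the table and principal names are hypothetical):

```sql
-- A permanent table participates in the normal permission system,
-- so access can be granted or denied per principal.
CREATE TABLE dbo.People (Name sysname NOT NULL);
DENY SELECT, INSERT ON dbo.People TO SomeUser;  -- SomeUser is a placeholder user

-- A global ##temp table has no equivalent: any session can read ##people.
```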
There is already an object named '##table' in the database.
You would typically get this error if you are running a CREATE TABLE statement, which would obviously fail because '##table' already exists in the database.
It seems that at some point in your code, the CREATE TABLE logic for this global table is being invoked again, leading to this error.
Do you have the details of the exact statement that results in this error?
So the WHY part has been answered and here is how to resolve it:
Do a check to see if the temp table exists before creating it:
if object_id('tempdb..##table') is null
begin
    --create table ##table...
end
I found a pretty interesting post about how to check for the existence of a temp table while Googling: http://sqlservercodebook.blogspot.com/2008/03/check-if-temporary-table-exists.html
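A fuller sketch of that check (the column list is a placeholder):

```sql
-- Only create the global temp table if no other session has created it yet.
-- Note the 'tempdb..' prefix: temp tables live in tempdb, not the current database.
IF OBJECT_ID('tempdb..##table') IS NULL
BEGIN
    CREATE TABLE ##table (id int, value varchar(50));  -- placeholder columns
END
```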
Related
I have a query which conceptually can be described like this:
CREATE TABLE ##MyTable (
    -- columns
)

INSERT INTO ##MyTable (...)
/* inserted SELECT */

WHILE ....
BEGIN
    -- do some actions using data from the temp table
END

EXEC msdb.dbo.sp_send_dbmail
    -- other parameters needed for email sending ...
    @query = N'select ... FROM ##MyTable;'

-- drop the temporary table
DROP TABLE ##MyTable
So, I select some data into the global temp table, then work with it, then send the email, and finally delete the temp table.
This query is used as a task which launches periodically to automate some processes in my DB.
The part I'm unsure about is the global temporary table. If I plan to use this table (with this name) only in this automation script, can I be sure there will be no collisions or other similar bugs? It seems like there should not be, since no users or program connections are going to use this table, so the logic is simple: my task launches once a week, creates this table, then deletes it. But is it really so, or am I missing something that makes a global temporary table a bad idea here?
PS: I've tried to use a local temp table, but sp_send_dbmail returns an error (as far as I understand, the table has already been deleted by the time sp_send_dbmail runs):
Msg 22050, Level 16, State 1, Line 0
Error formatting query, probably invalid parameters
Msg 14661, Level 16, State 1, Procedure sp_send_dbmail, Line 504
Query execution failed: Msg 208, Level 16, State 1, Server SERVER\SERVER, Line 1
Invalid object name '#MyTable'.
You are correct that a session temporary table can't be used with sp_send_dbmail. To quote from the docs, emphasis mine:
[ @query= ] 'query' Is a query to execute. The results of the query
can be attached as a file, or included in the body of the e-mail
message. The query is of type nvarchar(max), and can contain any valid
Transact-SQL statements. Note that the query is executed in a
separate session, so local variables in the script calling
sp_send_dbmail are not available to the query.
Global temporary tables can be created by any user so there is a possibility of a clash here. This can happen if one execution of the task takes too long and overlaps with the next run. There are three ways I might consider to resolve this.
1. Use the NEWID() function to create the name of the temporary table. This would ensure two executions of the script create tables with different names.
2. Use a permanent table with a column in it that is uniquely set by each execution of the script, so that it can be referred to by the query passed to sp_send_dbmail.
3. Use sp_getapplock to create a lock that other scripts could check to see if the table is in use. (Note that if the script executions routinely overlap, this may cause a backlog to build up.)
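The first suggestion could look something like this sketch (all names are illustrative). Since the table name is only known at run time, both the table and the @query string passed to sp_send_dbmail have to be built with dynamic SQL:

```sql
-- Build a run-specific table name so two overlapping executions cannot collide.
DECLARE @suffix nvarchar(36) = REPLACE(CONVERT(nvarchar(36), NEWID()), '-', '');
DECLARE @table  sysname       = N'##MyTable_' + @suffix;
DECLARE @sql    nvarchar(max);

SET @sql = N'CREATE TABLE ' + QUOTENAME(@table) + N' (id int, payload nvarchar(100));';
EXEC sp_executesql @sql;

-- ... populate and process the table here (also via dynamic SQL) ...

-- sp_send_dbmail parameters must be variables, not expressions,
-- so assemble the query string first.
DECLARE @query nvarchar(max) = N'SELECT id, payload FROM ' + QUOTENAME(@table) + N';';
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'MyProfile',            -- placeholder mail profile
    @recipients   = N'someone@example.com',  -- placeholder recipient
    @subject      = N'Weekly report',
    @query        = @query;

SET @sql = N'DROP TABLE ' + QUOTENAME(@table) + N';';
EXEC sp_executesql @sql;
```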
A global temporary table means that any other user can also try to create the same global temporary table, which would cause a collision.
In most cases, creating and using a permanent table has served us well. You get a lot of advantages with a permanent table: you can keep a history of what was done, and if you think the data will grow, you can set up housekeeping to delete data older than some number of days or weeks.
In our projects our guidance is: either create a "real" temporary table or a "real" permanent table.
1. Don't use a global temporary table. Convert your query output to an HTML body instead; this may help: https://dba.stackexchange.com/questions/83776/need-to-send-a-formatted-html-email-via-database-mail-in-sql-server-2008-r2
2. Use a global temporary table, but reduce the chance of collision:
a. Try @Martin Brown's suggestion.
b. If your WHILE loop takes some time to finish, create a local temporary table for it first. Only dump the output into the global temp table right before calling Database Mail, and drop it immediately after the mail is sent.
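Option b might look like this sketch (table and column names are illustrative):

```sql
-- Do the slow work in a session-local temp table, invisible to other sessions.
CREATE TABLE #Work (id int, payload nvarchar(100));
-- ... long-running WHILE loop populating #Work ...

-- Copy into a global temp table only for the brief moment Database Mail needs it.
SELECT id, payload INTO ##MailData FROM #Work;

DECLARE @query nvarchar(max) = N'SELECT id, payload FROM ##MailData;';
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'MyProfile',            -- placeholder mail profile
    @recipients   = N'someone@example.com',  -- placeholder recipient
    @query        = @query;

DROP TABLE ##MailData;  -- minimize the window in which a collision is possible
```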
I've written a SQL query for a report that creates a permanent table and then performs a bunch of inserts and updates to get all the data, according to company policy. It runs fine in SQL Server Management Studio and in Crystal Reports 2008 on my machine. However, when I schedule it to run on the server with SAP BusinessObjects Central Management Console, it fails with the error "Associated statement not prepared."
I have found that changing this permanent table to be a temp table makes the query work. Why would this be?
Some research shows that this error is sometimes sent instead of the true error. Other people reporting it talk of foreign key and (I would also assume) duplicate key errors.
Things I would check:
Does your permanent table have any unique constraints that might be violated? Or any foreign key constraints?
Are you creating indexes on the table after it has been created?
Are you creating any views over this permanent table?
What happens if the table already exists before the job is run?
What happens to the table if the job fails?
Are there any intermediate steps (such as within a stored procedure) that might involve additional temp or permanent tables?
ETA: Also check what schema the permanent table belongs to: is it usually created with "dbo"? Are you specifying that explicitly? Is there any chance that there might be a permissions problem?
That is often a generic error. Are you able to run it on the server as the account that it is scheduled to run as? It is most likely a permission error or constraint issue.
Assuming you really need a regular table, why not create the permanent table once, instead of creating it every time you run the query?
Recreating a regular user table each time the query runs does not seem right. But to make it work, you could try recreating the table in a separate batch or query (e.g. put GO in the script to split it into separate batches).
As for why it happens, my guess is statement caching. The server compiles the query and caches the result for some time in case the same query has to run again. My speculation is that it tries to run the compiled query, which refers to the table you have already dropped and recreated under the same name: the name is the same, but physically it is a new table. You could be hitting a server bug this way. This is just speculation; it could be a different kind of problem.
Without seeing code it's a guess, but since you are creating a permanent table every time you run the report, I assume you must be dropping the table at some point? (Or you'd have a LOT of tables building up over time.)
I suggest a couple angles to consider:
1) Make certain to prefix table names (perhaps with a session ID or something similar) if you are concerned about concurrency/locking issues and the like, so that each report run has a table exclusive to itself.
2) If you are dropping the table at the end, instead adjust your logic to leave the table alone and write code that drops it when you (re)start the operation. It's possible the report is still clinging to the table and you are destroying it prematurely.
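The second suggestion, sketched (table names are hypothetical):

```sql
-- At the START of the report run: clear any leftover table from the previous run,
-- rather than dropping it at the end where the report may still be reading it.
IF OBJECT_ID('dbo.ReportData', 'U') IS NOT NULL
    DROP TABLE dbo.ReportData;

SELECT *
INTO dbo.ReportData
FROM dbo.SourceTable;   -- placeholder source query

-- No DROP at the end: the table stays until the next run replaces it.
```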
Is there a way in Oracle to create a table that only exists while the database is running and is only stored in memory? So if the database is restarted I will have to recreate the table?
Edit:
I want the data to persist across sessions. The reason being that the data is expensive to recreate but is also highly sensitive.
Using a temporary table would probably help performance compared to what happens today, but it's still not a great solution.
You can create a 100% ephemeral table, usable for the duration of a session (typically shorter than the database's run time), called a TEMPORARY table. The entire purpose of an in-memory table is faster reads. You will have to re-populate the table for each session, as the table's contents are forgotten once the session completes.
Not exactly, no.
Oracle has the concept of a "global temporary table". With a global temporary table, you create the table once, as with any other table. The table definition will persist permanently, as with any other table.
The contents of the table, however, will not be permanent. Depending on how you define it, the contents will persist for either the life of the session (ON COMMIT PRESERVE ROWS) or the life of the transaction (ON COMMIT DELETE ROWS).
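For example (table and column names are illustrative):

```sql
-- The definition persists like any table; the rows are private to each session.
CREATE GLOBAL TEMPORARY TABLE my_gtt (
    id   NUMBER,
    data VARCHAR2(100)
) ON COMMIT PRESERVE ROWS;   -- rows survive until the session ends

-- Use ON COMMIT DELETE ROWS instead to keep rows only until the transaction commits.
```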
See the documentation for all the details:
http://docs.oracle.com/cd/E11882_01/server.112/e25494/tables003.htm#ADMIN11633
Hope that helps.
You can use Oracle's trigger mechanism to invoke a stored procedure when the database starts up or shuts down.
That way you could have the startup trigger create the table, and the shutdown trigger drop it.
You'd probably also want the startup trigger to handle cases where the table exists and truncate it just in case the server stopped suddenly and the shutdown trigger wasn't called.
Oracle trigger documentation
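A minimal sketch of that approach, assuming suitable privileges; all object names are illustrative. DDL inside PL/SQL has to go through EXECUTE IMMEDIATE:

```sql
CREATE OR REPLACE TRIGGER trg_create_scratch
AFTER STARTUP ON DATABASE
BEGIN
    -- Drop any leftover copy first, in case the shutdown trigger never ran
    -- (e.g. the server stopped suddenly); ignore the error if it doesn't exist.
    BEGIN
        EXECUTE IMMEDIATE 'DROP TABLE scratch_data';
    EXCEPTION
        WHEN OTHERS THEN NULL;  -- table may not exist yet
    END;
    EXECUTE IMMEDIATE 'CREATE TABLE scratch_data (id NUMBER, payload VARCHAR2(200))';
END;
/

CREATE OR REPLACE TRIGGER trg_drop_scratch
BEFORE SHUTDOWN ON DATABASE
BEGIN
    EXECUTE IMMEDIATE 'DROP TABLE scratch_data';
END;
/
```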
Using Oracle's Global Temporary Tables, you can create a table in memory and have it delete the data at the end of the transaction, or the end of the session.
If I understand correctly, you have some data that needs to be processed when the database is brought online and left available only as long as the database is online. The only use-case I can think of that would require this is if you're encrypting some data and you want to ensure that the unencrypted data is never written to disk.
If this is actually your use-case, I would recommend forgetting about trying to create your own solution for this and, instead, make use of Oracle's encrypted tablespaces or Transparent Data Encryption.
Given the following:
if object_id('MyTable') is null create table MyTable( myColumn int )
Is it not possible that two separate callers could both evaluate object_id('MyTable') as null and so both attempt to create the table.
Obviously one of the two callers in that scenario would fail, but ideally no caller should fail, rather one should block and the other should create the table, then the blocked caller will see object_id('MyTable') as not null and proceed.
On what can I apply exclusive lock, such that I'm not locking more than is absolutely required?
After your initial check, use a TRY...CATCH when creating the table. If the error is that the table already exists, proceed; if not, you have a bigger problem.
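That could be sketched like this (error 2714 is "There is already an object named ..."; THROW requires SQL Server 2012+, so use RAISERROR on older versions):

```sql
IF OBJECT_ID('dbo.MyTable', 'U') IS NULL
BEGIN
    BEGIN TRY
        CREATE TABLE dbo.MyTable (myColumn int);
    END TRY
    BEGIN CATCH
        -- 2714: another caller created it between our check and our CREATE; that's fine.
        IF ERROR_NUMBER() <> 2714
            THROW;  -- anything else is a real problem, so re-raise it
    END CATCH
END
```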
Usually CREATE TABLE is run from setup and installation scripts, and it is unreasonable to expect installation scripts to allow for concurrent installation from separate connections.
I recommend you use a session-scoped app lock acquired at the beginning of your install/upgrade procedure; see sp_getapplock.
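A sketch of that approach (the lock resource name is arbitrary; all callers just have to agree on it):

```sql
BEGIN TRAN;

DECLARE @rc int;
EXEC @rc = sp_getapplock
    @Resource    = N'create_MyTable',  -- arbitrary agreed-upon lock name
    @LockMode    = N'Exclusive',
    @LockOwner   = N'Transaction',
    @LockTimeout = 10000;              -- ms to wait; -1 waits forever

IF @rc >= 0  -- 0 = granted immediately, 1 = granted after waiting
BEGIN
    -- Only one caller at a time reaches this point; the check is now race-free.
    IF OBJECT_ID('dbo.MyTable', 'U') IS NULL
        CREATE TABLE dbo.MyTable (myColumn int);
END

COMMIT;  -- releases the transaction-owned app lock
```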
I don't think you should be worrying about this.
DDL statements don't run under a transaction. Also, the 2nd caller will fail, if the table already was created by a call from the 1st caller.
I don't allow users to create tables; in general that's a bad practice. If they need to insert data, the table is already there. If you are worried about two people creating the same table, are you also worried about whether their data is crossing? I don't know what your proc does, but if it does something like delete the records if the table exists and then insert, you could get strange results if two users were on at the same time. In general, though, if you need to create a table at run time, it is usually a sign that your design needs work.
I have a stored procedure in SQL Server 2005. The stored procedure creates temporary tables at the beginning and deletes them at the end. I am now debugging the SP in VS 2005. Partway through the SP, I would like to see the contents of the temporary table. Can anybody help me with viewing the contents of the temporary table at run time?
Thanks
Vinod T
There are several kinds of temporary tables; I think you could use one that is not dropped after the SP uses it. Just make sure you don't call the same SP twice, or you'll get an error for trying to create an existing table. Or just drop the temp table after you see its contents. So instead of using a table variable (@table), use #table or ##table.
From http://arplis.com/temporary-tables-in-microsoft-sql-server/:
Local Temporary Tables
Local temporary tables are prefixed with a single number sign (#) as the first character of their names, like (#table_name).
Local temporary tables are visible only in the current session; in other words, they are visible only to the current connection for the user.
They are deleted when the user disconnects from the instance of Microsoft SQL Server.
Global temporary tables
Global temporary tables are prefixed with a double number sign (##) as the first character of their names, like (##table_name).
Global temporary tables are visible to all sessions; in other words, they are visible to any user after they are created.
They are deleted when all users referencing the table disconnect from Microsoft SQL Server.
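The difference in a nutshell (run each half in a separate session to see it):

```sql
-- Session 1:
CREATE TABLE #local (i int);    -- visible only to session 1
CREATE TABLE ##global (i int);  -- visible to every session

-- Session 2:
SELECT * FROM ##global;  -- works
SELECT * FROM #local;    -- fails: Invalid object name '#local'
```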
Edit the stored procedure to temporarily select * from the temp tables (possibly into another table or file, or just to the output pane) as it runs..?
You can then change it back afterwards. If you can't mess with the original procedure, copy it and edit the copy.
I built a few stored procedures which allow you to query the content of a temp table created in another session.
See sp_select project on github.
The content of the table can be displayed by running exec sp_select 'tempdb..#temp' from no matter which session.
Bottom line: the default Visual Studio/Microsoft debugger is not in the same session as the SQL code being executed and debugged.
So you can ONLY look at #temp tables by switching them to global ##temp tables, permanent tables, or whatever technique you like best that works across sessions.
Note: this is VERY different from normal language debuggers, and I suspect it is kept that way by Microsoft on purpose. I've seen third-party SQL debugger tools decades ago that didn't have this problem.
There is no good technical reason why the debugger cannot be in the same session as your SQL code, which would allow you to examine all produced constructs, including #temp tables.
To expand on previous suggestions that you drop the data into a permanent table, you could try the following:
-- Get rid of the table if it already exists
if object_id('TempData') is not null
    drop table TempData

select * into TempData from #TempTable
This helped me:

SELECT * FROM #Name

USE [tempdb]
GO
SELECT * FROM syscolumns
WHERE id = ( SELECT id FROM sysobjects WHERE [name] LIKE '#Name%' )

This gives the column details of the temp table.