I have a temporary table called table1 created like this:
create table tempdb..table1 (id int)
The table was not created by the owner of the tempdb database.
When I tried earlier to access the table with this query, inside a stored procedure (just for testing):
select top 10 * from tempdb..table1
I got this error:
Msg 208, Level 16, State 1:
Server 'SERVER', Procedure 'storedProcedure', Line 30:
tempdb..table1 not found. Specify owner.objectname or use sp_help to check whether the object
exists (sp_help may produce lots of output).
However, about an hour later the same stored procedure ran without any problems.
The table was not dropped and recreated during that hour, and I cannot find any reason for this strange behavior. I could fix it by applying some kind of naming hack, but I do not want to insert hacks into a quite sensitive flow in which a lot of users can drop and create the table.
Can someone explain this behavior so I can avoid it from now on?
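For what it's worth, the error text ("Specify owner.objectname") suggests name resolution is the issue: tempdb..table1 is resolved first against the calling user's own objects, then against dbo, so a table created under a different owner can be invisible to one login and visible to another. A sketch of how to check, assuming a reasonably modern server with catalog views (on very old servers the equivalents are sysobjects and sysusers):

```sql
-- Sketch: find out which schema/owner the table was actually created under.
SELECT s.name AS owner_name, o.name AS table_name
FROM tempdb.sys.objects AS o
JOIN tempdb.sys.schemas AS s
    ON s.schema_id = o.schema_id
WHERE o.name = 'table1';

-- Then qualify the owner explicitly instead of relying on tempdb..table1:
-- SELECT TOP 10 * FROM tempdb.<owner_name>.table1;
```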
Related
IF OBJECT_ID('tempdb..#mytesttable') IS NOT NULL
DROP TABLE #mytesttable
SELECT id, name
INTO #mytesttable
FROM mytable
Question: is it good practice to check for the temp table's existence (e.g. OBJECT_ID('tempdb..#mytesttable')) the first time I create this temp table inside a procedure?
What is the best practice in terms of performance?
It is good practice to check whether the table exists, and the check has no meaningful performance impact. In any case, temp tables are automatically dropped once procedure execution completes in SQL Server. SQL Server also appends a unique number to each temp table's name, so even if the same procedure executes more than once at the same time, it will not cause any issue: visibility depends on the session, and each execution of the procedure runs in its own session.
Yes, it is a good practice. I am always doing this at the beginning of the routine. In SQL Server 2016 + objects can DIE (Drop If Exists), so you can simplify the call:
DROP TABLE IF EXISTS #mytesttable;
Also, even though your temporary objects are destroyed at the end of the routine, and even though the SQL engine gives unique names to temporary tables, there is still another behavior to consider.
If you name your temporary tables with the same name, it is possible to get an error or corrupt your data when nested procedure calls are involved (one stored procedure calls another). This is due to the fact that temporary tables are visible in the current execution scope: #table created in one procedure will be visible in the second, called procedure (which can modify its data or even drop it).
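A minimal sketch of the scope behavior described (the procedure and table names are made up for illustration):

```sql
CREATE PROCEDURE dbo.inner_proc
AS
BEGIN
    -- #t was created by the caller, but it is visible here because
    -- inner_proc runs inside the caller's execution scope.
    INSERT INTO #t (id) VALUES (2);
    -- DROP TABLE #t;  -- this would also be allowed, breaking the caller
END;
GO

CREATE PROCEDURE dbo.outer_proc
AS
BEGIN
    CREATE TABLE #t (id int);
    INSERT INTO #t (id) VALUES (1);
    EXEC dbo.inner_proc;   -- modifies the caller's #t
    SELECT id FROM #t;     -- returns both rows, 1 and 2
END;
```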
I have a query which conceptually can be described like this:
CREATE TABLE ##MyTable (
-- columns
)
INSERT INTO ##MyTable (...)
/*inserted SELECT */
WHILE ....
BEGIN
-- do some actions using data from temp table
END
EXEC msdb.dbo.sp_send_dbmail
-- other data needed for email sending ...
#query = N'select ... FROM ##MyTable;',
-- drop the temporary table
DROP TABLE ##MyTable
So, I select some data into the global temp table, then work with it, then send the email, and finally drop the temp table.
This query is used as a task, which launches periodically to automate some processes in my DB.
The part I have doubts about is the global temporary table. If I plan to use this table (with this name) only in this automation script, can I be sure that there will be no collisions or similar bugs? It seems like there should not be, since no users or program connections are going to use this table, so the logic is simple: my task launches once a week, creates this table, then deletes it. But is that really so, or am I missing something, and is it a bad idea to use a global temporary table here?
PS: I've tried to use local temp tables, but sp_send_dbmail returns an error (as far as I understand, the table is already deleted by the time sp_send_dbmail launches):
Msg 22050, Level 16, State 1, Line 0
Error formatting query, probably invalid parameters
Msg 14661, Level 16, State 1, Procedure sp_send_dbmail, Line 504
Query execution failed: Msg 208, Level 16, State 1, Server SERVER\SERVER, Line 1
Invalid object name '#MyTable'.
You are correct that a session temporary table can't be used with sp_send_dbmail. To quote from the docs, emphasis mine:
[ #query= ] 'query' Is a query to execute. The results of the query
can be attached as a file, or included in the body of the e-mail
message. The query is of type nvarchar(max), and can contain any valid
Transact-SQL statements. Note that the query is executed in a
separate session, so local variables in the script calling
sp_send_dbmail are not available to the query.
Global temporary tables can be created by any user, so there is a possibility of a clash here. This can happen if one execution of the task takes too long and overlaps with the next run. There are three ways I might consider resolving this.
Use the NEWID() function to create the name of the temporary table. This would ensure two executions of the script create tables with different names.
Use a permanent table with a column in it that is uniquely set by each execution of the script so that it can be referred to by the query passed to sp_send_dbmail.
I might consider using sp_getapplock to create a lock that other scripts could check to see if the table was in use. (Note if the script executions routinely overlap this may cause a backlog to build up).
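A sketch of the first option: build a per-run global temp table name from NEWID() so that overlapping executions cannot clash (the name pattern is an assumption, and the elided column lists are left as in the question):

```sql
-- Generate a unique suffix for this run.
DECLARE @suffix nvarchar(32) =
    REPLACE(CONVERT(nvarchar(36), NEWID()), '-', '');
DECLARE @table sysname = N'##MyTable_' + @suffix;

-- The table must be created and queried through dynamic SQL,
-- since its name is only known at run time.
DECLARE @sql nvarchar(max) =
    N'SELECT ... INTO ' + QUOTENAME(@table) + N' FROM ...;';
EXEC sys.sp_executesql @sql;

-- Pass the same name into the mail query, which runs in a separate
-- session but can still see the global temp table:
DECLARE @query nvarchar(max) =
    N'SELECT * FROM ' + QUOTENAME(@table) + N';';
-- EXEC msdb.dbo.sp_send_dbmail ..., @query = @query;
```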
A global temporary table means that any other user can also try to create the same global temporary table, which would cause a collision.
In most cases, creating and using a permanent table has served us well. You get a lot of advantages with a permanent table: you can keep a history of things done, and if you think the data will grow, you can set up housekeeping to delete data older than some number of days or weeks.
In our projects our guidance is: either create a "real" temporary table or a "real" permanent table.
Don't use a global temporary table: convert your query output to an HTML body instead; this may help you: https://dba.stackexchange.com/questions/83776/need-to-send-a-formatted-html-email-via-database-mail-in-sql-server-2008-r2
Use a global temporary table, but reduce the chance of collision:
a. try @Martin Brown's suggestions above;
b. if your WHILE loop takes some time to finish, populate a local temporary table first, dump the output to the global temp table right before calling Database Mail, and drop it immediately after the mail is sent.
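A sketch of option b (the profile name, recipient, and columns are assumptions): keep the long-running work in a local temp table, and only expose a short-lived global copy around the sp_send_dbmail call:

```sql
CREATE TABLE #work (id int, info nvarchar(100));

-- ... the long-running WHILE loop populates #work here ...

-- Expose a global copy only for the duration of the mail call,
-- since sp_send_dbmail's @query runs in a separate session.
SELECT * INTO ##MailData FROM #work;

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'MyMailProfile',        -- assumed profile name
    @recipients   = N'someone@example.com',  -- assumed recipient
    @query        = N'SELECT id, info FROM ##MailData;';

DROP TABLE ##MailData;  -- minimize the window in which a clash is possible
```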
I'm having some issues with a query that worked just fine in SQL Server 2008 R2; after upgrading to 2014 it no longer does. Any help would be appreciated. The error I'm getting is:
Msg 213, Level 16, State 7, Line 1
Column name or number of supplied values does not match table definition.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
Here is the Query:
use AccessControl
set nocount on
declare #MaintResults table
(
Result int identity(1,1),
Cardholder_Count varchar(15),
Events_Count varchar(15),
User_Count varchar(15),
etc...
)
The query altogether is about 10 pages long. Its purpose is to pull a decent amount of information out of a database into a temporary table, where it can be viewed. I'm honestly not sure why I'm getting this error, because the name of the database is in fact "AccessControl", so why am I getting the error I am?
For us to solve this for you, we will need to see the entire statement.
As the error plainly states, either you are selecting a different number of columns than you insert into the table variable (though you stated nothing has changed except the upgrade to SQL Server), or one of the types in the select statement is no longer compatible with the table variable. This could be as simple as one or two of the fields being the wrong way around between your select and insert statements.
I'm not going to analyse too deeply what caused this without a lot more information from the OP, but if you have upgraded your DB and your application maintains the schema, it could be that some minor schema changes were also introduced, or that some implicit type conversion that worked in the previous version is no longer supported.
Run the SELECT that fills the table variable on its own, and manually review that the types are OK and that they match your table variable.
You can cheat by saving the SELECT as a view and then comparing the column definitions of the view to your table variable.
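For illustration, a minimal way to reproduce Msg 213 with a table variable, plus the explicit-column-list habit that makes such mismatches easy to spot (the table and columns here are made up):

```sql
DECLARE @t table (a int, b int);

-- Fails with Msg 213: the table has two columns, three values supplied.
-- INSERT INTO @t SELECT 1, 2, 3;

-- Naming the columns makes the intent explicit, so a swapped or
-- missing column in the SELECT shows up immediately:
INSERT INTO @t (a, b) SELECT 1, 2;
```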
I saved a SQL table before deleting some information from it, with the SQL statement:
select * into x_table from y_table
After doing some operations, I want to get back some information from the table I saved with the query above. Unfortunately, SQL Server Management Studio (SSMS) shows an error saying that the table does not exist.
However, when I type a DROP statement, the table is recognized: its name is not underlined.
Any idea why this table is recognized by the DROP TABLE statement and not the SELECT statement? This seems strange to me.
It may be that the table isn't underlined in your drop table command because its name is still in your IntelliSense cache. Select Edit -> IntelliSense -> Refresh Local Cache in SSMS (or just press Ctrl+Shift+R) and see if the table name is underlined then.
Edit:
Another possibility is that your drop table command might be in the same batch as another statement that creates the table, in which case SSMS won't underline it because it knows that even though the table doesn't exist now, it will exist by the time that command is executed. For instance:
None of the tables one, two, or three existed in my database when I took this screenshot. If I highlight line 6 and try to run it by itself, it will fail. Yet you can see that two is not underlined on line 6 because SSMS can see that if I run the whole script, the table will be created on line 5. On the other hand, three is underlined on line 9 because I commented out the code that would have created it on line 8.
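The screenshot isn't reproduced here, but based on the description the batch would look roughly like this sketch (table names and line positions are as described; the exact statements are assumptions):

```sql
SELECT 1 AS x INTO one;        -- creates one
SELECT 2 AS x INTO two;        -- line 5: creates two
DROP TABLE two;                -- line 6: not underlined, SSMS sees line 5
-- SELECT 3 AS x INTO three;   -- line 8: commented out
DROP TABLE three;              -- line 9: underlined, three is never created
```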
All of that said, I think we might be making too much of this problem. If you try to select from a table and SQL Server tells you it doesn't exist, then it doesn't exist. You can't rely on IntelliSense to tell you that it does; the two examples above are probably not the only ways that IntelliSense might mislead you about the current status of a table.
If you want the simplest way to know whether an object with a given name (like x_table) exists, just use:
select object_id('x_table');
If this query returns null, x_table doesn't exist, regardless of what IntelliSense is telling you. If it returns non-null, then there is some object out there with that name, and then the real question is why your select statement is failing. And to answer that, I'd need to see the statement.
In a lot of posts like this, you'll see that you have to copy in two statements (note that this is MySQL syntax; it does not work in SQL Server):
CREATE TABLE newtable LIKE oldtable;
INSERT newtable SELECT * FROM oldtable;
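For SQL Server specifically, the same backup can be made in a single statement, just like the one in the question (CREATE TABLE ... LIKE is MySQL-only):

```sql
-- SQL Server: copies both structure and data in one statement
SELECT * INTO newtable FROM oldtable;
```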
I have a ##table, which can be accessed across all sessions, but sometimes I get this error:
There is already an object named
'##table' in the database.
Why, and how do I resolve it?
Found an interesting reference (outdated URL referencing what is now a malicious website removed):
Global temporary tables operate much like local temporary tables; they are created in tempdb and cause less locking and logging than permanent tables. However, they are visible to all sessions, until the creating session goes out of scope (and the global ##temp table is no longer being referenced by other sessions). If two different sessions try the above code, if the first is still active, the second will receive the following:
Server: Msg 2714, Level 16, State 6, Line 1
There is already an object named '##people' in the database.
I have yet to see a valid justification for the use of a global ##temp table. If the data needs to persist to multiple users, then it makes much more sense, at least to me, to use a permanent table. You can make a global ##temp table slightly more permanent by creating it in an autostart procedure, but I still fail to see how this is advantageous over a permanent table. With a permanent table, you can deny permissions; you cannot deny users from a global ##temp table.
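To illustrate the permissions point (the user and table names here are made up): with a permanent table you can control who reads or writes it, which is not possible with a global ##temp table:

```sql
-- Permanent table: access can be granted or denied per user or role.
CREATE TABLE dbo.SharedData (id int, payload nvarchar(100));
DENY SELECT, INSERT ON dbo.SharedData TO some_user;

-- A global temp table offers no such control: any session can read,
-- modify, or even drop ##SharedData.
```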
There is already an object named '##table' in the database.
You would typically get this error if you are running a CREATE TABLE statement, which would obviously fail because '##table' already exists in the database.
Seems to me that maybe at some point in your code, the CREATE TABLE logic for this global table is being invoked again leading to this error.
Do you have the details of the exact statement that results in this error?
So the WHY part has been answered and here is how to resolve it:
Do a check to see if the temp table exists before creating it:
if object_id('tempdb..##table') is null begin
--create table ##table...
end
I found a pretty interesting post from Googling about how to check the existence of a temp table: http://sqlservercodebook.blogspot.com/2008/03/check-if-temporary-table-exists.html