Dynamically created temporary table does not persist - sql-server

I want to create a temporary table in a dynamic query and use it afterwards. It will be created from a permanent table:
create table t (a integer);
insert into t values (1);
And the temp table creation is like this:
declare @command varchar(max) = '
select *
into #t
from t
;
select * from #t;
';
execute (@command);
When @command is executed, the select from the temporary table works.
Now, if I select from the temporary table, an error message is shown:
select * from #t;
Invalid object name '#t'
If the temporary table is created outside of the dynamic query it works:
select top 0 *
into #t
from t
declare @command varchar(max) = '
insert into #t
select *
from t
';
execute (@command);
select * from #t;
Is it possible to persist a dynamically created temporary table?

You are close in your assumption that EXECUTE is carried out in a different session.
According to the MSDN documentation for EXECUTE:
Executes a command string or character string within a Transact-SQL
batch
So your temporary table only exists inside the scope of the SQL executed by the EXECUTE command.

You can also create global temporary tables. For example, ##MyTemp.
But global temporary tables are visible to all SQL Server connections.
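For example, a minimal sketch reusing the table t from the question; remember to drop ##t yourself, since it lingers until the creating session ends and no other session is still using it:
declare @command varchar(max) = 'select * into ##t from t;';
execute (@command);
select * from ##t;   -- works: the global temp table outlives the dynamic batch
drop table ##t;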

Related

Why doesn't this alter after insert statement work?

I have a stored procedure with dynamic SQL that I have embedded as below:
delete from #temp_table
begin tran
set @sql = 'select * into #temp_table from sometable'
exec (@sql)
commit tran
begin
set @sql = 'alter table #temp_table add column1 float'
exec(@sql)
end
update #temp_table
set column1 = column1*100
select *
into Primary_Table
from #temp_table
However, I noticed that all the statements work but the alter does not. When I run the procedure, I get an error message: "Invalid column name 'column1'"
What am I doing wrong here?
EDIT: I realized I didn't mention that the first insert is dynamic SQL as well. Updated it.
An alternate approach I tried, which throws the same error:
delete from #temp_table
begin tran
set @sql = 'select * into #temp_table from sometable'
exec (@sql)
commit tran
alter table #temp_table add column1 float
update #temp_table set column1 = column1*100
Local temporary tables exhibit something like dynamic scope. When you create a local temporary table inside a call to exec, it goes out of scope (and out of existence) on the return from exec.
EXEC (N'create table #x (c int)')
GO
SELECT * FROM #x
Msg 208, Level 16, State 0, Line 4
Invalid object name '#x'.
The select is parsed after the dynamic SQL that creates #x is run, but #x is not there because it was dropped on exit from exec.
Update
Depending on the situation there are different ways to work around the issue.
Put everything into the same string:
DECLARE @Sql NVARCHAR(MAX) = N'SELECT 1 AS source INTO #table_name;
ALTER TABLE #table_name ADD target FLOAT;
UPDATE #table_name SET target = 100 * source;';
EXEC (@Sql);
Create the table ahead of the dynamic SQL that populates it.
CREATE TABLE #table_name (source INT);
EXEC (N'insert into #table_name (source) select 1;');
ALTER TABLE #table_name ADD target FLOAT;
UPDATE #table_name SET target = 100 * source;
In this option, the alter table statement can be removed by adding the additional column to the create table statement. Note also that the alter table and update statements could be in separate invocations of dynamic SQL, if that was beneficial to your context.
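A sketch of that variation, declaring both columns up front so no ALTER TABLE (dynamic or otherwise) is needed, using the same illustrative names as above:
CREATE TABLE #table_name (source INT, target FLOAT);
EXEC (N'insert into #table_name (source) select 1;');
UPDATE #table_name SET target = 100 * source;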
1) It should be ALTER TABLE #temp... Not ALTER #temp.
2) Even if #1 weren't an issue, you're adding column1 as a NULLable column with no default value and, in the next statement, setting its value to itself * 100...
NULL * 100 = NULL
3) Why are you using dynamic SQL to alter the #temp table? It can just as easily be done with a regular ALTER TABLE script... or, better yet, the column can be included in the original table definition.
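A one-line check of point 2, for what it's worth:
select cast(null as float) * 100 as result;   -- returns NULL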
This is because the #temp_table reference in the outer batch is a different temp table than the one created in dynamic SQL. Consider:
use tempdb
drop table if exists sometable
drop table if exists #temp_table
go
create table sometable(id int, a int)
create table #temp_table(id int, b int)
exec( 'select * into #temp_table from sometable; select * from #temp_table;' )
select * from #temp_table
Outputs
id a
----------- -----------
(0 rows affected)
id b
----------- -----------
(0 rows affected)
A temp table created in a nested batch is scoped to the nested batch and automatically dropped afterwards. A "nested batch" is either a dynamic SQL query or a stored procedure. This behavior is documented under CREATE TABLE, but the documentation only mentions stored procedures; dynamic SQL behaves the same.
If you create the temp table in a top-level batch, you can access it in dynamic SQL; you just can't create a new temp table in dynamic SQL and see it in the outer batch or in subsequent same-level dynamic SQL. So try to use INSERT INTO instead of SELECT INTO.
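A minimal sketch of that INSERT INTO approach, reusing sometable from the example above (the temp table name here is just illustrative):
create table #loaded(id int, a int)   -- created in the outer batch
exec( 'insert into #loaded select id, a from sometable;' )
select * from #loaded                 -- the outer batch sees the inserted rows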

Temporary table not created from dynamic query execution

If I run this dynamic query:
declare @test nvarchar(1000) = 'select * into #tmp7 from bauser'
execute(@test)
and then try to query #tmp7 with:
select * from #tmp7
error is thrown:
Invalid object name '#tmp7'.
However if I run the same query manually:
select * into #tmp7 from bauser
Everything is OK. Temporary table is created and filled with results.
Why is it not working with dynamic query execution?
SCOPE!
The temporary table exists only in the scope of the dynamically executed query.
If you do want to make the select, put it inside the dynamic query:
declare @test nvarchar(1000) = 'select * into #tmp7 from bauser
select * from #tmp7'
execute(@test)
You can also check whether such an object exists by using this:
select * from tempdb.sys.sysobjects so where so.name like '%tmp7%'
See this similar question
SQL Server 2005 and temporary table scope
Edit
A temp table IS A TABLE, so yes, you can add columns, indexes, etc. Those tables in fact reside in the tempdb database and you can even "find" them (they show up with strange long names), but they are destroyed after the execution of your EXEC.
Maybe your problem is with the dynamic approach itself, or it is not related to this question at all. Try posting a new question with what you have and what you need to do, to get further assistance.
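If you're curious, a quick way to see those strange long names while a local temp table is still alive (a sketch, assuming a table created as #tmp7):
select name, create_date
from tempdb.sys.tables
where name like '#tmp7%'   -- shows something like #tmp7_______...____00000000000A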
If you create a temp table using dynamic SQL, it will not be available outside of the dynamic SQL scope.
You need to create it outside of the dynamic SQL and then use INSERT INTO to populate the table.
-- use this trick to create the temp table easily.
SELECT * INTO #tmp7
FROM bauser
WHERE 1=2
declare @test nvarchar(1000) = 'insert into #tmp7 select * from bauser'
execute(@test)

Moving table data to another table if the table does not exist

I am generating a dynamic script to move data into another database; if the table is not present, I want to create it. The script runs perfectly if executed directly, but it gives an error if the script string is executed via the EXECUTE statement. I have tried EXEC as well.
declare @temp as varchar(max)
set @temp='select * into Allocation_Archive.dbo.Users from Users'
execute @temp
Error
Database 'select * into Allocation_Archive' does not exist. Make sure that the name is entered correctly.
Option 1
Add a USE database statement to your variable or specify the full object name including database and schema.
Wrap the @temp variable in parentheses
For example:
declare @temp as varchar(max)
set @temp='select * into Allocation_Archive.dbo.Users from ThisDatabase.dbo.Users'
execute (@temp)
Option 2
Don't use:
EXECUTE #temp
Instead use
EXEC sp_executesql #temp
You will also need to change your @temp variable to be nvarchar(max)
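Putting Option 2 together (object names taken from the question; note the nvarchar(max) and the sp_executesql call):
declare @temp nvarchar(max);
set @temp = N'select * into Allocation_Archive.dbo.Users from dbo.Users';
exec sp_executesql @temp;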

MSSQL Stored Procedure Creating A Temp Table Dynamically

We're trying to write some automated reports to execute SQL statements we have stored in a table. The table data is normally used in a stored procedure called by the triggers, with data passed in via temp tables (created in the trigger statements). Each row has a table name, an SQL statement that works on #TempInserted and #TempDeleted (which correspond to the Inserted and Deleted objects from the trigger), and then some e-mail columns that determine where to send the output.
This all works fine from the trigger statements, as each creates each temp table once, during execution:-
SELECT * INTO #TempInserted FROM INSERTED
SELECT * INTO #TempDeleted FROM DELETED
Then the trigger calls the TriggerHandler stored procedure, passing the table name through as a parameter.
..
However, when I try to create these dynamically from a general stored procedure in order to fire off these statements as reports (so we don't duplicate the statements), in a batch, I'm hitting a problem:-
SELECT * INTO #TempInserted FROM ...
works fine from a defined table, or object (e.g. "FROM INSERTED"), but I've found that it can't get its schema from a dynamic query.
For example, I can do
SELECT TOP 1 * INTO #Test FROM TableA
SELECT * FROM #Test
DROP TABLE #Test
But I can't then do
EXECUTE sp_executesql N'SELECT TOP 1 * INTO #Test FROM TableA'
SELECT * FROM #Test
DROP TABLE #Test
because then #Test is local to the EXECUTE context, and not its parent.
I can, however, do the insert in the EXECUTE (or a stored procedure) because the temp table is in scope, if I've already created the table schema:-
SELECT * INTO #Test FROM TableA WHERE 1 = 2 -- create an empty schema
EXECUTE sp_executesql N'INSERT INTO #Test SELECT TOP 10 * FROM TableA'
SELECT * FROM #Test
DROP TABLE #Test
So, that's OK, but my problem comes when I want to dynamically create that schema, depending on the table name we're running the reports for. The INSERT works:-
SELECT * INTO #Test FROM TableA WHERE 1 = 2 -- create an empty schema
DECLARE @Table NVARCHAR(20) = 'TableA'
DECLARE @SQL NVARCHAR(200) = N'INSERT INTO #Test SELECT TOP 10 * FROM ' + @Table
EXECUTE sp_executesql @SQL
SELECT * FROM #Test
DROP TABLE #Test
But only if the temp table already has a schema. If I try to conditionally create the schema, depending on the table selected, I get a parsing error:-
DECLARE @Table NVARCHAR(20) = 'TableA'
IF @Table = 'TableA'
SELECT * INTO #Test FROM TableA WHERE 1 = 2 -- create an empty schema
IF @Table = 'TableB'
SELECT * INTO #Test FROM TableB WHERE 1 = 2 -- create an empty schema
DECLARE @SQL NVARCHAR(200) = N'INSERT INTO #Test SELECT TOP 10 * FROM ' + @Table
EXECUTE sp_executesql @SQL
SELECT * FROM #Test
DROP TABLE #Test
gives "There is already an object named '#Test' in the database." - so the query parser isn't following the structure of the query, which only actually creates the temp table once. This also holds true if you do
SELECT * INTO #Test FROM ....
DROP TABLE #Test
SELECT * INTO #Test FROM ....
So, is there a way in SQL Server 2012, of either being able to do
SELECT * INTO #Test FROM (dynamic SQL statement)
or to bypass the parser thinking you're creating the object twice
DECLARE @Table NVARCHAR(20) = 'TableA'
IF @Table = 'TableA'
SELECT * INTO #Test FROM TableA WHERE 1 = 2 -- create an empty schema
IF @Table = 'TableB'
SELECT * INTO #Test FROM TableB WHERE 1 = 2 -- create an empty schema
or to dynamically create the locally scoped temp table, from an existing database table's schema, where the table name is stored in a variable (all the examples I've found of this use the "SELECT * INTO #Test" code, which as I mentioned requires a statically defined object to create from)?
-------edit--------
For a bit of context, here's an example of why we're doing this:-
A trigger may fire producing a warning e-mail if a certain item type is transacted into a certain location. This works with our current triggers. The reason we're doing this is so that we can, in future, write a UI so the users can add other item types to this list themselves, rather than us having to update the trigger - this also means that we can control/validate the SQL being generated, behind the scenes of a point-and-click interface so that our users don't need to know any SQL and that we can be sure that nothing malicious or that will cause errors will be used.
We also can't do this in the BLL because it's from our ERP system and this would then mean we'd have to make changes to base objects, which is obviously undesirable if it can be avoided.
There is the potential for some of these e-mails to be missed/ignored/forgotten/not-actioned, so the users requested the same information on a periodic basis, as well as as-at the transaction occurring:-
So, next, we want to produce, for some of these trigger statements, daily/weekly/monthly reports. Now, obviously, it would be ideal if we could use the existing SQL trigger statements we have set up as then if one were changed it would then automatically affect the periodical reports - stay DRY. It would also mean that if we set up a new trigger, we could automatically include it in the reports by merely inserting a reference to the trigger code, along with the table name, frequency, etc, into the table that drives the periodical reports stored procedure. Again, in future, we could then write a UI, so that users can then request and schedule these reports themselves, with no intervention required from us.
I suspect I'm stuck in a catch-22 situation here. However, I've found a way around it that isn't too messy. I extract the item processing code into another stored procedure, and then compound execution of that onto the dynamic "SELECT INTO" statement - that way it runs in the same execution instance and thus has access to the temp table created in, and local to, that instance:-
SET @SQL = 'SELECT * INTO #TestTable FROM ' + @Table + ' WHERE ' + @WhereClause
SET @SQL = @SQL + '; EXEC ReportProcess'
EXECUTE sp_executesql @SQL
The ReportProcess stored procedure then has access to the temporary table and can process it accordingly.
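A slightly fleshed-out sketch of that workaround (ReportProcess is the question's procedure; the variable values and the quotename() call are assumptions added here):
DECLARE @Table NVARCHAR(128) = N'TableA';
DECLARE @WhereClause NVARCHAR(MAX) = N'1 = 1';
DECLARE @SQL NVARCHAR(MAX);
-- build the SELECT INTO and append the proc call so both run in the same
-- execution scope, which is what gives ReportProcess access to #TestTable
SET @SQL = N'SELECT * INTO #TestTable FROM ' + QUOTENAME(@Table)
         + N' WHERE ' + @WhereClause
         + N'; EXEC ReportProcess;';
EXECUTE sp_executesql @SQL;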

nested insert exec workaround

I have 2 stored procedures usp_SP1 and usp_SP2. Both of them make use of insert into #tt exec sp_somesp. I wanted to create a 3rd stored procedure which will decide which stored proc to call. Something like:
create proc usp_Decision
(
@value int
)
as
begin
if (@value = 1)
exec usp_SP1 -- this proc already has insert into #tt exec usp_somestoredproc
else
exec usp_SP2 -- this proc too has insert into #tt exec usp_somestoredproc
end
Later, I realized I needed some structure defined for the return value from usp_Decision so that I can populate the SSRS dataset field. So here is what I tried:
Within usp_Decision I created a temp table and tried to do "insert into #tt exec usp_SP1". This didn't work out: error "insert exec cannot be nested".
Within usp_Decision I tried passing a table variable to each of the stored procs, updating the table within the stored procs, and doing "select * from ". That didn't work out either: a table variable passed as a parameter cannot be modified within the stored proc.
Please suggest what can be done.
Can you modify usp_SP1 and usp_SP2?
If so, in usp_Decision, create a local temporary table with the proper schema to insert the results:
create table #results (....)
Then, in the called procedure, test for the existence of this temporary table. If it exists, insert into the temporary table. If not, return the result set as usual. This helps preserve existing behavior, if the nested procedures are called from elsewhere.
if object_id('tempdb..#results') is not null begin
insert #results (....)
select .....
end
else begin
select ....
end
When control returns to the calling procedure, #results will have been populated by the nested proc, whichever one was called.
If the result sets don't share the same schema, you may need to create two temporary tables in usp_Decision.
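A concrete sketch of this pattern, under the assumption that both procs can be edited; dbo.SomeTable and the column list are placeholders, not from the question:
create proc usp_SP1
as
begin
    if object_id('tempdb..#results') is not null
    begin
        insert #results (id, amount)
        select id, amount from dbo.SomeTable   -- placeholder source query
    end
    else
    begin
        select id, amount from dbo.SomeTable   -- original behavior preserved
    end
end
go
create proc usp_Decision (@value int)
as
begin
    create table #results (id int, amount decimal(18, 2))
    if (@value = 1)
        exec usp_SP1   -- populates #results instead of returning a result set
    else
        exec usp_SP2
    select * from #results
end
go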
Have you had a look at table-valued user-defined functions (either inline or multi-statement)? Similar to HLGEM's suggestion, this will return a set which you may not have to insert anywhere.
Not a fan of global temp tables in any event (other processes can read these tables and may interfere with the data in them).
Why not have each proc use a local temp table and select * from that table as the last step?
Then you can insert into a local temp table in the calling proc.
A simple example:
create proc usp_mytest1
as
select top 1 id into #test1
from MYDATABASE..MYTABLE (nolock)
select * from #test1
go
--drop table #test
create proc usp_mytest2
as
select top 10 MYTABLE_id into #test2
from MYDATABASE..MYTABLE (nolock)
select * from #test2
go
create proc usp_mytest3 (@myvalue int)
as
create table #test3 (MYTABLE_id int)
if @myvalue = 1
Begin
insert #test3
exec ap2work..usp_mytest1
end
else
begin
insert #test3
exec ap2work..usp_mytest2
end
select * from #test3
go
exec ap2work..usp_mytest3 1
exec ap2work..usp_mytest3 0
See this blog article for one workaround (it uses OPENROWSET to essentially create a loopback connection on which one of the INSERT EXEC calls happens)
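A hedged sketch of that loopback idea (the provider name, connection string, and proc name are assumptions, and the instance must allow 'Ad Hoc Distributed Queries'):
select *
into #tt
from openrowset('SQLNCLI',
                'Server=(local);Trusted_Connection=yes;',
                'set fmtonly off; exec dbo.usp_SP1;')
Because the inner proc runs over the loopback connection, the outer batch only sees an ordinary rowset, so no INSERT EXEC nesting occurs.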
