Insert Results From Dynamic SQL Into 2 Table Variables - sql-server

I need to execute some dynamic SQL that will return 2 result sets and store the result sets in a table variable.
Let's say I have 2 tables (crappy schema but illustrates my issue)
Table1:
ItemID int
ItemName nvarchar(50)
Table2:
ItemId int
Quantity int
I generate some dynamic SQL that looks like this:
DECLARE @sql varchar(max);
SET @sql = 'SELECT * FROM Table1; SELECT * FROM Table2;';
Then I create my table variables:
DECLARE @tbl1 TABLE (ItemId int, ItemName nvarchar(50))
DECLARE @tbl2 TABLE (ItemId int, Quantity int)
Then I want to execute that dynamic SQL and insert the results into the table variables I just declared. If there were just one result set in the dynamic SQL, I could simply run this:
INSERT INTO @tbl1 EXECUTE ('SELECT * FROM Table1;')
However, this obviously fails as soon as I use the @sql variable that returns multiple result sets. Is this even possible?

Use temporary tables (not table variables):
CREATE TABLE #Table1(...)
CREATE TABLE #Table2(...)
DECLARE @MyDynamicSql NVARCHAR(MAX) = N'
INSERT INTO #Table1(...)
SELECT ...
INSERT INTO #Table2(...)
SELECT ...'
EXEC(@MyDynamicSql)
A couple of things to watch out for when writing this kind of spaghetti code, however:
Re-running the same code (outside of a stored procedure) will necessitate dropping (or at least truncating) the temp tables
Addendum to the above: the SQL parser will throw errors at you if you change the structure of the temp tables, even if you have a DROP statement (read: you have to drop the tables before running the batch to change their structures)
If your process is a subroutine of one which already declared temp tables by the same name... make sure to avoid that; I spent hours trying to figure this one out, thinking I was going crazy.
Temp tables created inside dynamic SQL need to be global (two #'s) to persist for the parent scope to access (documentation)
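Put together for the question's schema, a minimal runnable sketch of the above (assuming Table1 and Table2 exist as described in the question):
-- Temp tables created in the outer scope are visible inside EXEC()
CREATE TABLE #Table1 (ItemId int, ItemName nvarchar(50));
CREATE TABLE #Table2 (ItemId int, Quantity int);

DECLARE @MyDynamicSql nvarchar(max) = N'
    INSERT INTO #Table1 (ItemId, ItemName)
    SELECT ItemID, ItemName FROM Table1;
    INSERT INTO #Table2 (ItemId, Quantity)
    SELECT ItemId, Quantity FROM Table2;';

EXEC (@MyDynamicSql);

SELECT * FROM #Table1;
SELECT * FROM #Table2;

-- Drop explicitly so the batch can be re-run (see the first caveat above)
DROP TABLE #Table1;
DROP TABLE #Table2;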

Related

SQL Server : what happens if I run the same stored procedure simultaneously that has a select into same temporary table

Any idea what happens if I run the same stored procedure (using JMeter) simultaneously, and in that stored procedure there is a query
SELECT INTO #temp
Will the second stored procedure run after first stored procedure is done?
Or will the temp table be created twice (I heard there are local temp tables in SQL Server)?
Sorry for the dumb question, I cannot find any answer on Google.
Thanks
A temporary table only exists in the scope it was created in (and "subscopes" of that scope) and only persists for the duration of that scope.
So, for example, if you were to run the below, you wouldn't get any errors:
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int);';
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int);';
That's because the table #temp would only exist within the scope of the "dynamic" statement, and would cease to exist as soon as it completes.
On the other hand, something like the below would fail (This is wrong, see my edit at the bottom):
CREATE TABLE #temp (ID int);
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int);';
DROP TABLE #temp;
That's because the "dynamic" statement has access to the "outer" scope, and so would see that #temp already exists and generate an error.
Running 2 statements at the same time on the same connection isn't possible, so you won't be able to call the same Stored Procedure at the same time from one connection. This means that the two concurrent calls will have different scopes, and will therefore each reference their own object #temp, specific to their scope.
You could again test this with a similar idea. Run the below, and then open a new connection and run it again (before the other is complete). You'll notice that they both succeed:
CREATE TABLE #temp (ID int);
WAITFOR DELAY '00:00:30'; --Waits 30 seconds
--While the WAITFOR happens, open another connection and run all this SQL at the same time
DROP TABLE #temp;
Side note: Global Temporary tables do not behave the same way, but I am specifically referencing only (local) temporary tables here, not global ones.
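For contrast, a quick sketch of that difference: a global temporary table created inside dynamic SQL does survive for the caller, for as long as the creating session is alive:
EXEC sys.sp_executesql N'CREATE TABLE ##temp (ID int); INSERT ##temp VALUES (1);';
SELECT * FROM ##temp; -- Succeeds: the global temp table outlived the inner scope
DROP TABLE ##temp;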
EDIT: It appears I am wrong, sort of, on inner scopes. You actually get some very odd behaviour. Take the below:
CREATE TABLE #temp (ID int);
INSERT INTO #temp VALUES(1);
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int); SELECT * FROM #temp;';
SELECT *
FROM #temp
DROP TABLE #temp;
This will return 2 datasets, one with no rows, one with 1 row. If, however, you remove the CREATE in the deferred statement then you get 1 row from both:
CREATE TABLE #temp (ID int);
INSERT INTO #temp VALUES(1);
EXEC sys.sp_executesql N'SELECT * FROM #temp;';
SELECT *
FROM #temp
DROP TABLE #temp;
This occurs on SQL Server 2019, but I am sure I recall that this behaviour isn't how it was on previous versions. Perhaps I am recalling (very) old behaviour.

SQL Server: Optimizing Updating Multiple Records from Inputs in Stored Procedure

I've come across an efficiency problem when a stored procedure is called to update multiple records in a single table. The actual problem involves hundreds of parameters, but here is a simplified version.
CREATE PROCEDURE UpdateData
@ID1 int, @Value1 int,
@ID2 int, @Value2 int,
@ID3 int, @Value3 int
AS
BEGIN
-- Update record with ID1
-- Update record with ID2
-- Update record with ID3
END
I see three methods of doing this:
Create an update query for each ID-Value pair
Create a while loop to go through all the inputs and pass them through a single update query
Insert the inputs into a local table and do the update in a single query using that table
I'm unsure at what point creating the table would be more or less efficient than the other options, as I know databases are good at doing things in parallel, but creating a temporary table also uses time.
How can I compare these three methods aside from running and timing them?
Is there a method you recommend?
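For reference, here is roughly what method 3 would look like for the simplified procedure (dbo.MyData and its Value column are placeholder names for the real table):
CREATE PROCEDURE UpdateData
    @ID1 int, @Value1 int,
    @ID2 int, @Value2 int,
    @ID3 int, @Value3 int
AS
BEGIN
    -- Collect the ID-Value pairs, then update in one set-based statement
    DECLARE @Pairs TABLE (ID int PRIMARY KEY, Value int);

    INSERT INTO @Pairs (ID, Value)
    VALUES (@ID1, @Value1), (@ID2, @Value2), (@ID3, @Value3);

    UPDATE d
    SET    d.Value = p.Value
    FROM   dbo.MyData AS d
    INNER JOIN @Pairs AS p ON p.ID = d.ID;
END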

How to use table variable in dynamic sql? OR create temporary table from user defined table type?

I am using SQL Server 2016 and I have created a user-defined table type as below:
CREATE TYPE [dbo].[UDTT_Items] AS TABLE(
[ItemId] int identity(1, 1),
[ItemCode] [varchar](10) NULL,
[ItemName] [varchar](255) NULL,
[StockQty] decimal(18, 3) NULL,
PRIMARY KEY CLUSTERED
(
[ItemId] ASC
) WITH (IGNORE_DUP_KEY = OFF)
)
GO
In my stored procedure I can create a table variable like this:
declare @tblItems UDTT_Items
I can insert data into this table variable and run select queries against it:
select * from @tblItems
The problem I face is when I need to use this table in dynamic SQL. For example, if I try to run the above select statement from an EXECUTE clause:
EXECUTE sp_executesql N'select * from @tblItems'
It gives me this error message:
Must declare the table variable "@tblItems".
I tried to use a temporary table (with #) inside the dynamic SQL, and it works fine, but I don't know if I can create a temporary table from an already user-defined table type. I need something like this:
create #tblItems UDTT_Items
But that does not work either.
Can anybody suggest a workaround for this issue, either by using a table variable in dynamic SQL, or by creating a temp table from a user-defined table type?
I can think of the following workarounds to solve this using your UDTT:
1. Declare the UDTT variable within your dynamic script, and then you can retrieve results from there:
EXECUTE sp_executesql
N'
DECLARE @dynvariable [UDTT];
insert @dynvariable values (1);
select * from @dynvariable';
2. Pass the UDTT variable to sp_executesql, but then it is readonly, meaning you can only select from it within the dynamic script:
DECLARE @variable [UDTT];
insert @variable values (1);
EXECUTE sp_executesql
N'select * from @dynvariable',
N'@dynvariable [UDTT] READONLY',
@dynvariable = @variable;
3. I think it's not possible to 'create a temp table from a UDTT', so your approach would be to dynamically create the temp table using the system catalog information for your UDTT (columns, types, etc.); see the sketch after this list.
4. Reading that you want to have "dynamic" pivot code, the most appropriate approach would be to dynamically generate the pivot statement based on the column info and values of the target table.
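On point 3, a rough sketch of that workaround for the question's UDTT_Items type, reading the column definitions from the system catalog (simplified: it only handles a few type families, ignores the IDENTITY property and keys, and creates a global ## table so it survives the dynamic scope):
DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- Build a column list from the type's definition in the catalog
SELECT @cols = COALESCE(@cols + ', ', '') +
       QUOTENAME(c.name) + ' ' + t.name +
       CASE WHEN t.name IN ('varchar', 'char') THEN '(' + CAST(c.max_length AS varchar(10)) + ')'
            WHEN t.name = 'decimal' THEN '(' + CAST(c.precision AS varchar(10)) + ', ' + CAST(c.scale AS varchar(10)) + ')'
            ELSE '' END
FROM sys.table_types AS tt
JOIN sys.columns AS c ON c.object_id = tt.type_table_object_id
JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE tt.name = 'UDTT_Items'
ORDER BY c.column_id;

-- Global (##) so the table is visible after the dynamic batch ends
SET @sql = N'CREATE TABLE ##tblItems (' + @cols + N');';
EXEC sp_executesql @sql;

SELECT * FROM ##tblItems; -- accessible in the outer scope because it is global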

Options besides table valued parameters for passing table variable?

We have many stored procedures that are used for reports. All these procedures follow the format below. In essence, the SP does some work, and the final result is inserted into a @table variable:
ALTER procedure dbo.usp_GetComplexData
as
declare @MyTable table
(
Col1 varchar(20),
Col2 varchar(20)
)
-- Here the SP is doing lots of work with lots of tables.
-- The result is inserted into @MyTable
SELECT Col1, Col2 FROM @MyTable
Now I need to send via email (in html format) the results of these stored procedures.
I also have SpCustomTable2HTML (found at Symantec) that converts any table into an HTML table. It doesn't need the table schema to do its work; it simply takes the table's name and returns an HTML table.
So here's the stored procedure:
ALTER procedure usp_sendHtmlReportViaEmail
as
DECLARE @HTML1 NVARCHAR(MAX)
IF OBJECT_ID('tempdb..#Results') IS NOT NULL
DROP TABLE #Results
SELECT TOP 50 * INTO #Results FROM MyTable
EXEC SpCustomTable2HTML '#Results', @HTML1 OUTPUT, '', ''
EXEC sp_send_dbmail @profile_name='My profile',
@recipients='test@Example.com',
@subject='Test message',
@body_format = 'HTML',
@body=@HTML1
I would like to somehow call usp_sendHtmlReportViaEmail from usp_GetComplexData, sending it @MyTable as a parameter. I was reading about table-valued parameters, but that requires creating a TVP type for each table shape that I would pass. I don't want to create a specific TVP for each table that will be passed to usp_sendHtmlReportViaEmail.
Are there any other options?
Thanks.
If you're determined to use SQL, then you should look into using a global temporary table. You have to make sure you clean up after your code executes to avoid using too many resources, but it might be a valid approach for you.
At the end of your usp_GetComplexData procedure, just insert the data into a ##TemporaryTable and use that as the parameter to usp_sendHtmlReportViaEmail. Since I don't know what exactly you do with the table variable, I won't replace it, but you could potentially replace it with the temporary table instead.
ALTER PROCEDURE usp_GetComplexData
AS BEGIN
DECLARE @MyTable TABLE
(/*... Columns ...*/);
-- Do complex data stuff...
SELECT
*
INTO
##MyTempTable
FROM
@MyTable;
EXECUTE usp_sendHtmlReportViaEmail '##MyTempTable';
SELECT
*
FROM
@MyTable;
END
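Note that for that EXECUTE call to work, usp_sendHtmlReportViaEmail needs to accept the table name as a parameter, roughly like this (a sketch built from the question's own code; SpCustomTable2HTML's signature is taken from the question as-is):
ALTER PROCEDURE usp_sendHtmlReportViaEmail
    @tableName sysname
AS
BEGIN
    DECLARE @HTML1 NVARCHAR(MAX);

    -- SpCustomTable2HTML resolves the table by name and returns an HTML table
    EXEC SpCustomTable2HTML @tableName, @HTML1 OUTPUT, '', '';

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'My profile',
        @recipients = 'test@Example.com',
        @subject = 'Test message',
        @body_format = 'HTML',
        @body = @HTML1;
END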

SQL Server Bulk Insert with FOREIGN KEY parameter (not existant in txt file, ERDs included)

Okay so I have a table ERD designed like so... for regular bulk inserts
[ERD image (source: iforce.co.nz)]
And a tab-delimited (\t) text file with information about each customer (about 100,000+ records).
# columnA columnB columnC
data_pointA data_pointB data_pointC
And a stored procedure that currently does its intended job fine.
CREATE PROCEDURE import_customer_from_txt_para @filelocation varchar(100)
AS BEGIN
TRUNCATE TABLE dbo.[customer_stg]
DECLARE @sql nvarchar(4000) = '
BULK INSERT customer_stg
FROM ''' + @filelocation + '''
WITH
(
FIRSTROW=14,
FIELDTERMINATOR=''\t'',
ROWTERMINATOR=''\n''
)';
print @sql;
exec(@sql);
END
But my question is about the relationship between customer_table and customer_stg: is it possible to include a customer_id within the customer_stg bulk insert, with something like the following? (I'm not sure how to apply the foreign key parameter @customer_sk to the bulk insert.)
CREATE PROCEDURE import_customer_from_txt_para @filelocation varchar(100), @customer_sk int
AS BEGIN
TRUNCATE TABLE dbo.[customer_stg]
DECLARE @sql nvarchar(4000) = '
BULK INSERT customer_stg
FROM ''' + @filelocation + '''
WITH
(
FIRSTROW=14,
FIELDTERMINATOR=''\t'',
ROWTERMINATOR=''\n''
)';
print @sql;
exec(@sql);
END
Preferably, after each bulk insert I'd like to be able to relate the data between the two tables.
[ERD image (source: iforce.co.nz)]
Bulk inserts will either insert NULL or the default value for unspecified columns (based on the KEEPNULLS argument), which of course will not work for your situation, assuming you have (or will create) a constraint. I assume this is the case because otherwise you could just update your table directly after you run the insert.
I see two ways around this:
- If you have the ability, you can just macro-edit the text file before you run the bulk insert. Since I'm assuming that's not what the question is after...
- First of all, you will need to add your FK column to your _stg table if it's not already there. Then, in your stored procedure, create a temp table with the three columns specified in the input file:
CREATE TABLE #Temp_STG
(
columnA varchar(100), -- placeholder types; use the real ones from the file
columnB varchar(100),
columnC varchar(100)
)
Then, bulk insert into that table. Then you can insert from the temp table into the main _stg table, adding the extra column:
INSERT dbo.Customer_STG
SELECT
T.columnA,
T.columnB,
T.columnC,
@customer_sk -- your customer key parameter
FROM #Temp_STG AS T
Make sure you drop the temp table when you're done.
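Put together, the procedure might look roughly like this (a sketch; the column types are placeholders, and the temp table is created in the outer scope so the dynamic BULK INSERT can see it):
CREATE PROCEDURE import_customer_from_txt_para @filelocation varchar(100), @customer_sk int
AS BEGIN
    TRUNCATE TABLE dbo.[customer_stg];

    CREATE TABLE #Temp_STG
    (
        columnA varchar(100), -- placeholder types
        columnB varchar(100),
        columnC varchar(100)
    );

    -- The dynamic batch can see #Temp_STG because it was created in the outer scope
    DECLARE @sql nvarchar(4000) = '
    BULK INSERT #Temp_STG
    FROM ''' + @filelocation + '''
    WITH
    (
        FIRSTROW=14,
        FIELDTERMINATOR=''\t'',
        ROWTERMINATOR=''\n''
    )';
    exec(@sql);

    INSERT dbo.customer_stg
    SELECT T.columnA, T.columnB, T.columnC, @customer_sk
    FROM #Temp_STG AS T;

    DROP TABLE #Temp_STG;
END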
As a side note, do you need to use dynamic SQL for this task? It's generally best to avoid unless absolutely necessary.
I suppose another option would be setting the default value for the column to whatever you want, and turning KEEPNULLS off. But, I would definitely NOT recommend doing this when you can just use the solution described above.
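For completeness, that would look something like the below (the FK column name customer_sk is assumed), though as said, the temp table approach above is preferable because a default is static rather than per-call:
-- Hypothetical: give the FK column a static default so a plain BULK INSERT fills it
ALTER TABLE dbo.customer_stg
    ADD CONSTRAINT DF_customer_stg_customer_sk DEFAULT (0) FOR customer_sk;
-- Then run BULK INSERT without the KEEPNULLS option so the default is applied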
See more: http://msdn.microsoft.com/en-us/library/ms188365.aspx
