We have many stored procedures that are used for reports. All these procedures follow the same format. In essence, the SP does its work and the final result is inserted into a table variable:
ALTER procedure dbo.usp_GetComplexData
as
declare @MyTable table
(
Col1 varchar(20),
Col2 varchar(20)
)
-- Here the SP is doing lots of work with lots of tables.
-- The result is inserted into @MyTable
SELECT Col1, Col2 from @MyTable
Now I need to send the results of these stored procedures via email (in HTML format).
I also have SpCustomTable2HTML (found at Symantec) that converts any table into an HTML table. It doesn't need the table schema to do its work; it simply takes the table name and returns an HTML table.
So here's the stored procedure:
ALTER procedure usp_sendHtmlReportViaEmail
as
DECLARE @HTML1 NVARCHAR(MAX)
IF OBJECT_ID('tempdb..#Results') IS NOT NULL
drop TABLE #results
select top 50 * into #results From MyTable
EXEC SpCustomTable2HTML '#results', @HTML1 OUTPUT, '', ''
EXEC sp_send_dbmail @profile_name='My profile',
@recipients='test@Example.com',
@subject='Test message',
@body_format = 'HTML',
@body=@HTML1
I would like to somehow call usp_sendHtmlReportViaEmail from usp_GetComplexData by sending it @MyTable as a parameter. I was reading about table-valued parameters, but that requires creating a table type (TVP) for each table I would pass. I don't want to create a specific TVP for each table that will be passed to usp_sendHtmlReportViaEmail.
Are there any other options?
Thanks.
If you're determined to use SQL, then you should look into using a global temporary table. You have to make sure you clean up after your code executes to avoid using too many resources, but it might be a valid approach for you.
At the end of your usp_GetComplexData procedure, just insert the data into a ##TemporaryTable and use that as the parameter to usp_sendHtmlReportViaEmail. Since I don't know what exactly you do with the table variable, I won't replace it, but you could potentially replace it with the temporary table instead.
ALTER PROCEDURE usp_GetComplexData
AS BEGIN
DECLARE @MyTable TABLE
(/*... Columns ...*/);
-- Do complex data stuff...
SELECT *
INTO ##MyTempTable
FROM @MyTable;
EXECUTE usp_sendHtmlReportViaEmail '##MyTempTable';
SELECT *
FROM @MyTable;
END
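For this to work, usp_sendHtmlReportViaEmail also has to accept the table name and clean up the global temp table once the mail is sent. A minimal sketch, assuming SpCustomTable2HTML and Database Mail are configured as in the question; the @TableName parameter and the final cleanup EXEC are my additions, not part of the original procedure:
ALTER PROCEDURE usp_sendHtmlReportViaEmail
    @TableName sysname   -- e.g. '##MyTempTable'
AS
BEGIN
    DECLARE @HTML1 NVARCHAR(MAX);

    -- Convert the named table into an HTML table
    EXEC SpCustomTable2HTML @TableName, @HTML1 OUTPUT, '', '';

    -- Send the HTML as the mail body
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'My profile',
        @recipients   = 'test@Example.com',
        @subject      = 'Test message',
        @body_format  = 'HTML',
        @body         = @HTML1;

    -- Clean up the global temp table so it doesn't linger
    EXEC ('DROP TABLE ' + @TableName);
END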
I have a stored procedure. I want to pass a CSV string to it and use this CSV in a WHERE ... IN clause.
I have done it by passing XML to the stored procedure like this:
WHERE MyColumn IN (SELECT AOS.s.value('(./text())[1]', 'bigint')
FROM (VALUES(@XML))V(X)
CROSS APPLY
V.X.nodes('/ArrayOfLong/long') AOS(s))
Result-wise, this stored procedure works fine, but it's slow. I want to improve the performance. When I run this stored procedure and get the execution plan, I get a warning.
I have also followed this answer, but I cannot use this solution due to permission issues, as mentioned in the comments.
I am looking for a simple, clean, and optimized solution.
The parameter should look like '1,2,3,4,5,6,7' and it will be used like this: WHERE MyColumn IN (1,2,3,4,5,6,7)
You should use a Table Valued Parameter for this.
First, create a table type. I usually keep a few standard ones. In this case you probably want a primary key, which gets you a free index.
CREATE TYPE dbo.IdList AS TABLE (Id int PRIMARY KEY);
Then you use it in your procedure like this
CREATE OR ALTER PROC dbo.YourProc
@ids dbo.IdList READONLY
AS
SELECT s.Something
FROM Somewhere s
WHERE s.MyColumn IN (
SELECT i.Id FROM @ids i
);
You call it like this from T-SQL
DECLARE @tmp dbo.IdList;
INSERT @tmp (Id) VALUES
(1),
(2),
(3);
EXEC YourProc @ids = @tmp;
In client applications, there is normally special handling for TVPs. Use those rather than injecting INSERT statements into your query.
You may also need to add permissions for non-admin users
GRANT EXECUTE ON TYPE::dbo.IdList TO db_datareader;
Load the CSV into a table using SSIS and then just join your original table to the newly loaded table with the CSV data.
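For example (a sketch, assuming the SSIS package loads the CSV values into a staging table; dbo.CsvIds and its Id column are placeholder names, while Somewhere and MyColumn are the names from the answer above):
-- Staging table populated by the SSIS package (hypothetical name)
CREATE TABLE dbo.CsvIds (Id int PRIMARY KEY);

-- The report query then becomes a plain join instead of a WHERE ... IN
SELECT s.Something
FROM Somewhere s
INNER JOIN dbo.CsvIds c
    ON c.Id = s.MyColumn;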
Any idea what happens if I run the same stored procedure simultaneously (using JMeter), and in that stored procedure there is a query like:
SELECT INTO #temp
Will the second execution of the stored procedure run only after the first one is done?
Or will the temp table be created twice (I heard there are local temp tables in SQL Server)?
Sorry for the dumb question; I cannot find any answer on Google.
Thanks
A temporary table only exists in the scope it was created in (and "subscopes" of that scope) and only persists for the duration of that scope.
So, for example. If you were to run the below, you wouldn't get any errors:
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int);';
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int);';
That's because the table #temp would only exist within the scope of the "dynamic" statement, and would cease to exist as soon as it completes.
On the other hand, something like the below would fail (This is wrong, see my edit at the bottom):
CREATE TABLE #temp (ID int);
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int);';
DROP TABLE #temp;
That's because the "dynamic" statement has access to the "outer" scope, and so would see that #temp already exists and generate an error.
Running 2 statements at the same time on the same connection isn't possible, so two simultaneous calls to the same stored procedure must come from different connections. This means that both will have different scopes, and each will therefore reference its own #temp object that is specific to its scope.
You could again test this with a similar idea. Run the below, and then open a new connection and run it again (before the other is complete). You'll notice that they both succeed:
CREATE TABLE #temp (ID int);
WAITFOR DELAY '00:00:30'; --Waits 30 seconds
--While the WAITFOR happens, open another connection and run all this SQL at the same time
DROP TABLE #temp;
Side note: global temporary tables do not behave the same way, but I specifically reference only local temporary tables here, not global ones.
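For contrast, here is a small sketch of the global variant: a ##temp table created inside the dynamic batch is still visible afterwards, because a global temp table lives until the session that created it ends (and its name is shared with every other session):
EXEC sys.sp_executesql N'CREATE TABLE ##temp (ID int);';
SELECT * FROM ##temp;   -- succeeds: ##temp outlives the dynamic batch
DROP TABLE ##temp;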
EDIT: It appears I was wrong, sort of, about inner scopes. You actually get some very odd behaviour. Take the below:
CREATE TABLE #temp (ID int);
INSERT INTO #temp VALUES(1);
EXEC sys.sp_executesql N'CREATE TABLE #temp (ID int); SELECT * FROM #temp;';
SELECT *
FROM #temp
DROP TABLE #temp;
This will return 2 datasets, one with no rows, one with 1 row. If, however, you remove the CREATE in the deferred statement then you get 1 row from both:
CREATE TABLE #temp (ID int);
INSERT INTO #temp VALUES(1);
EXEC sys.sp_executesql N'SELECT * FROM #temp;';
SELECT *
FROM #temp
DROP TABLE #temp;
This occurs on SQL Server 2019, but I'm sure I recall that this behaviour isn't how it was on previous versions. Perhaps I am recalling (very) old behaviour.
How to insert into a temp table that is already created inside of a stored procedure
ALTER PROCEDURE [dbo].[Report_1]
    @ID INT
AS
BEGIN
CREATE TABLE #Temp
(
col1 INT,
col2 INT,
col3 VARCHAR(50)
)
INSERT INTO #Temp
EXEC [spSelection] @ID
..do stuff
..do stuff
..do stuff
SELECT * FROM #temp
END
The problem I am having is that I will reuse spSelection in the future, and if I change it to return more columns for a different stored procedure, then Report_1 will fail.
So I need a way to dynamically create the table, or to select only specific columns from the output of exec [spSelection] @ID, or to have Report_1 read from a temp table created in spSelection.
I have tried to use a global temp table, and that will not work because it can be used by other stored procedures at the same time. If I create dynamic SQL:
DECLARE @sql NVARCHAR(MAX) = N'
create table #Temp(
col1 int, col2 int, col3 varchar(50)
)';
EXEC sp_executesql @sql;
I cannot access the #Temp table outside of the dynamic SQL string.
One alternative is to change your SP to make the insert inside the SP:
ALTER PROCEDURE [spSelection]
    @ID INT
AS
BEGIN
-- Validate that your temporary table was created (the insert will fail otherwise)
IF OBJECT_ID('tempdb..#Temp') IS NULL
BEGIN
RAISERROR ('The table #Temp must be created before executing this SP', 16, 1)
RETURN
END
..do stuff
..do stuff
..do stuff
INSERT INTO #Temp (
col1,
col2,
col3)
SELECT
/*Columns*/
END
GO
ALTER PROCEDURE [dbo].[Report_1]
    @ID INT
AS
BEGIN
CREATE TABLE #Temp
(
col1 INT,
col2 INT,
col3 VARCHAR(50)
)
EXEC [spSelection] @ID -- Will insert into #Temp
..do stuff
..do stuff
..do stuff
SELECT * FROM #temp
END
This approach will fail if you eventually add new columns to the #Temp table and insert them inside the SP without updating the CREATE TABLE in every SP that calls it.
There is no definitive solution here; please read this excellent paper about all the possible ways to share data between SPs, with the pros and cons of each (the solution I posted here is listed as 4. Using a table).
Instead of creating a stored procedure for selecting results, you can create a view and use a SELECT ... INTO clause to create the temp table dynamically at run time.
You cannot use a stored procedure in a SELECT statement.
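A rough sketch of that idea, assuming the SELECT that currently lives in spSelection can be expressed as a view; vw_Selection and SomeSourceTable are placeholder names:
CREATE VIEW dbo.vw_Selection
AS
SELECT t.col1, t.col2, t.col3
FROM dbo.SomeSourceTable t;
GO

-- Report_1 no longer needs a hard-coded CREATE TABLE #Temp;
-- the temp table takes its shape from the view at run time
SELECT *
INTO #Temp
FROM dbo.vw_Selection
WHERE col1 = 1;  -- or filter on @ID inside Report_1

SELECT * FROM #Temp;
DROP TABLE #Temp;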
I need to execute some dynamic SQL that will return 2 result sets and store the result sets in table variables.
Let's say I have 2 tables (crappy schema but illustrates my issue)
Table1:
ItemID int
ItemName nvarchar(50)
Table2:
ItemId int
Quantity int
I generate some dynamic sql that looks like this:
DECLARE @sql varchar(max);
SET @sql = 'SELECT * FROM Table1; SELECT * FROM Table2';
Then I create my table variables:
DECLARE @tbl1 TABLE (ItemId int, ItemName nvarchar(50))
DECLARE @tbl2 TABLE (ItemId int, Quantity int)
Then I want to execute that dynamic SQL and insert the results into the table variables I just declared. If there were just one result set in the dynamic SQL I could simply run this:
INSERT INTO @tbl1 EXECUTE ('SELECT * FROM Table1;')
However, this obviously fails as soon as I use the @sql variable that returns multiple result sets. Is this even possible?
Use temporary tables (not table variables):
CREATE TABLE #Table1(...)
CREATE TABLE #Table2(...)
DECLARE @MyDynamicSql NVARCHAR(MAX) = N'
INSERT INTO #Table1(...)
SELECT ...
INSERT INTO #Table2(...)
SELECT ...'
EXEC(@MyDynamicSql)
A couple of things to watch out for when writing this kind of spaghetti code however:
Re-running the same code (outside of a stored procedure) will necessitate dropping (or at least truncating) the temp tables first (see the sketch after this list)
Addendum to the above: the SQL parser will throw errors at you if you change the structure of the temp tables, even if you have a DROP statement (read: you have to drop the tables before running the batch in order to change their structure)
If your process is a subroutine of one which already declared temp tables by the same name... Make sure to avoid that, I spent hours trying to figure this one out thinking I was going crazy.
Temp tables created inside dynamic SQL only last for the duration of that inner batch; for the parent scope to access them they need to be global (##) temp tables (documentation)
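For the first caveat, a guarded drop at the top of the batch handles the re-run problem. A sketch; DROP TABLE IF EXISTS needs SQL Server 2016 or later, while the OBJECT_ID form works on older versions:
-- SQL Server 2016+
DROP TABLE IF EXISTS #Table1, #Table2;

-- Older versions
IF OBJECT_ID('tempdb..#Table1') IS NOT NULL DROP TABLE #Table1;
IF OBJECT_ID('tempdb..#Table2') IS NOT NULL DROP TABLE #Table2;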
I have a stored procedure that returns a result set. After that I insert this result set into a real (permanent) table, and then I use that real table to create SSRS reports.
So, something like this:
CREATE PROCEDURE Test
AS
DECLARE @TempTable TABLE(..)
INSERT INTO @TempTable
SELECT...
FROM ...
WHERE ...
SELECT * FROM @TempTable
--============================
INSERT INTO RealTable EXEC [dbo].[Test]
How can I modify this stored procedure so that every time it is executed it truncates the existing data in the table and then inserts fresh data?
So I need something like that:
create procedure Test
as
TRUNCATE TABLE RealTable
DECLARE @TempTable TABLE(..)
INSERT INTO @TempTable
SELECT...
FROM...
WHERE...
SELECT * FROM @TempTable INTO RealTable
Or should I just create an agent job that would run something like:
Truncate Table RealTable
INSERT INTO RealTable EXEC [dbo].[Test]
Am I on the right track in terms of logic?
Don't TRUNCATE. Use a MERGE statement.
CREATE PROCEDURE Test
AS
MERGE RealTable TRGT
USING SourceTable SRCE
ON SRCE.[Column] = TRGT.Column --use columns that can be joined together
WHEN MATCHED THEN UPDATE
SET TRGT.Column1 = SRCE.Column1,
TRGT.Column2 = SRCE.Column2
....................
WHEN NOT MATCHED BY TARGET THEN INSERT
VALUES
(
SRCE.Column1,
SRCE.Column2,
.....................
)
WHEN NOT MATCHED BY SOURCE THEN
DELETE;
What's the purpose of the truncate if you are inserting the same data?
What should happen if you have more than 1 concurrent user?
Another thing you can do:
1.
insert into TargetTable
select * from SourceTable
2.
rebuild indexes on TargetTable (sketched after these steps)
3.
exec sp_rename SourceTable, SourceTable_Old
exec sp_rename TargetTable, SourceTable
drop table SourceTable_Old
This is an old way of refreshing an entire table's data without much impact, from back when table variables were not an option.
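Step 2, rebuilding the indexes, would be something like this in T-SQL (a sketch using the TargetTable name from the steps above):
ALTER INDEX ALL ON TargetTable REBUILD;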
This is what you probably need, as you are directly inserting from @TempTable into RealTable:
create procedure Test
as
BEGIN
TRUNCATE TABLE RealTable
INSERT INTO RealTable
SELECT...
FROM someothertable
WHERE...
END