View a temporary table's data when debugging an MS SQL function - sql-server

I'm currently debugging an MS SQL function (SQL Server 2008).
In this function, I have a variable declared this way:
DECLARE @TempTable TABLE ( Id INT UNIQUE );
Then, I insert some records using an insert into...select statement.
When debugging, I would like to see the records in this table.
Is there a way to do this?
Thanks

I built a procedure that will display the contents of a temp table from another database connection (which is not possible with normal queries).
Note that it uses DBCC PAGE and the default trace to access the data, so only use it for debugging purposes.
You can use it by putting a breakpoint in your code, opening a second connection and calling:
exec sp_select 'tempdb..#mytable'

One possible solution, that may not be the best, is to:
Create a permanent table that is the same as the temporary table
Modify the function so that it dumps the data from the temporary table into the permanent table at the point where the temp table contains the data you're interested in seeing
When the function ends, open up the new permanent table and you'll have a copy of the temporary table's state.
This requires that you have permission to create new tables and modify the function.
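A minimal sketch of that dump step, using hypothetical names (note that user-defined functions disallow side effects, so you may need to temporarily convert the function to a stored procedure while debugging):
-- One-time setup: a permanent copy target (hypothetical name)
CREATE TABLE dbo.TempTableDebug ( Id INT );

-- At the point of interest inside the (temporarily converted) routine:
INSERT INTO dbo.TempTableDebug ( Id )
SELECT Id
FROM @TempTable;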


How to do an inner join rather than a for-each loop in SSIS?

On the ETL server I have a DW user table.
On the prod OLTP server I have the sales database. I want to pull the sales only for users that are present in the user table on the ETL server.
Presently I am using an Execute SQL Task to fetch the DW users into an SSIS System.Object variable, then using a for-each loop to loop through each item (user id) in this variable and, via a Data Flow Task, fetch the OLTP sales table for each user and dump it into the DW staging table. The for-each loop is taking a long time to run.
I want to be able to do an inner join so that the response is quicker, but I can't do this since they are on separate servers. Neither can I use a global temp table to make the inner join, for the same reason.
I tried collecting the DW users into a comma-separated string variable and then using it (via STRING_SPLIT) to query the OLTP side, but this also takes a long time in the pre-execute phase (not sure why exactly), even for a small number of users.
I am also aware of the Lookup transform, but that too would result in all OLTP rows being brought to the DW ETL server just to test the lookup condition.
Is there any alternate approach to be able to do an inner join by taking the list of users into the source?
Note: I do not have write permissions on the OLTP db.
Based on the comments, I think we can use a temporary table to solve this.
Can you help me understand this restriction? "Neither can I use a global temp table to make the inner join, for the same reason."
The restriction is that the OLTP server and the DW server are separate, so there can't be a global temp table common to both servers. Hope that makes sense.
The general pattern we're going to follow is:
Execute SQL Task to create a temporary table on the OLTP server (a sketch of the statement follows this list)
A Data Flow task to populate the new temporary table. Source = DW. Destination = OLTP. Ensure Delay Validation = True
Modify existing Data Flow. Modify source to be a query that uses the temporary table i.e. SELECT S.* FROM oltp.sales AS S WHERE EXISTS (SELECT * FROM #SalesPerson AS SP WHERE SP.UserId = S.UserId); Ensure Delay Validation = True
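For step 1, a sketch of the statement the Execute SQL Task could run (the #SalesPerson name and UserId column come from the query above; adjust the type to match your user table):
CREATE TABLE #SalesPerson ( UserId INT NOT NULL );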
A long form answer on using temporary tables (global to set the metadata, regular thereafter)
A note first on why I don't normally use temp tables in SSIS. Temporary tables live in tempdb, and your OLTP and DW connection managers likely do not point to tempdb. To reference a temporary table, local or global, in SSIS you either define an additional connection manager for the same server that points explicitly at tempdb so you can use the drop-down in the source/destination components (technically accurate but clumsy), or you use an SSIS variable to hold the name of the table and use the "from variable" option in the source/destination components (the best option, with maximum flexibility).
Soup to nuts example
I will use WideWorldImporters as my OLTP system and WideWorldImportersDW as my DW system.
One-time task
Open SQL Server Management Studio, SSMS, and connect to your OLTP system. Define a global temporary table with a unique name and the expected structure. Leave your connection open so the table structure remains intact during initial development.
I used the following statement.
DROP TABLE IF EXISTS ##SO_70530036;
CREATE TABLE ##SO_70530036 ( EmployeeId int NOT NULL );
Keep track of your query because we'll use it later on. As I advocate in my SSIS answers, perform the smallest task, test that it works, and then go on to the next; it's the only way to debug.
Connection Managers
Define two OLE DB Connection Managers. WWI_DW points to the named instance DEV2019UTF8 and WWI_OLTP points to DEV2019EXPRESS. Right-click on WWI_OLTP and select Properties. Find the property RetainSameConnection and flip it from the default of False to True. This ensures the same connection is used throughout the package; since temporary tables go out of scope when the connection goes away, closing and reopening a connection mid-package would result in a fatal error.
These two databases are on different instances, so we can't cheat and directly commingle data.
Variables
Define 4 variables in SSIS, all of type String.
TempTableName - I used a value of ##SO_70530036 but use whatever value you specified in the One-time task section.
QuerySourceEmployees - This will be the query you run to generate the candidate set of data to go into the temporary table. I used SELECT TOP (3) E.[WWI Employee ID] AS EmployeeId FROM Dimension.Employee AS E WHERE E.[Is SalesPerson] = CAST(1 AS bit);
QueryDefineTables - Remember the drop/create statements from the one-time task? We're going to use the essence of them, but use the expression builder to let us dynamically swap in the table name. I clicked the ellipses, ..., on the Expression section and used the following: "DROP TABLE IF EXISTS " + @[User::TempTableName] + "; CREATE TABLE " + @[User::TempTableName] + "( EmployeeId int NOT NULL);" You should be able to copy the Value from the row and paste it into SSMS to confirm it works.
QuerySales - This is the actual query you're going to use to pull your filtered set of sales data. Again, we'll use the Expression to allow us to dynamically reference the temporary table name. The prettified version of the expression would look something like
"SELECT
SI.InvoiceID
, SI.SalespersonPersonID
, SO.OrderID
, SOL.StockItemID
, SOL.Quantity
, SOL.OrderLineID
FROM
Sales.Invoices AS SI
INNER JOIN
Sales.Orders AS SO
ON SO.OrderID = SI.OrderID
INNER JOIN
Sales.OrderLines AS SOL
ON SO.OrderID = SOL.OrderID
WHERE
EXISTS (SELECT * FROM " + @[User::TempTableName] + " AS TT WHERE TT.EmployeeID = SI.SalespersonPersonID);"
Again, you should be able to pull the Value from the three queries and run them independently and verify they work.
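For reference, with TempTableName = ##SO_70530036, the evaluated expressions should look like the following (a sketch derived from the variables above):
-- QueryDefineTables, evaluated
DROP TABLE IF EXISTS ##SO_70530036;
CREATE TABLE ##SO_70530036 ( EmployeeId int NOT NULL );

-- and the tail of the evaluated QuerySales
WHERE
    EXISTS (SELECT * FROM ##SO_70530036 AS TT WHERE TT.EmployeeID = SI.SalespersonPersonID);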
Execute SQL Task
Add an Execute SQL Task to the Control Flow. I named mine SQL Create temporary table. My Connection Manager is WWI_OLTP, and I changed the SQLSourceType to Variable and the SourceVariable to User::QueryDefineTables.
Every time your package runs, the first thing it will do is create the temporary table, which is good, because SSIS is a metadata-driven ETL engine and the next two steps would fail if the table didn't exist.
Data Flow Task - Prime the pump
This data flow is where we'll transfer DW data back to the OLTP system so we can filter in the source system.
Drag a Data Flow Task onto the Control Flow. I named mine DFT Load Temp and, before you click into it, right-click on the Task, find the DelayValidation property, and change it from the default of False to True. Normally, a package validates all metadata before execution begins; the idea is that you want to know everything is good before any data starts moving. Since we're using temporary tables, we need to tell the execution engine, "trust us, it'll be ready."
Double click inside the Data Flow Task.
Add an OLE DB Source. I named mine OLESRC SourceEmployees and used the connection manager WWI_DW. The data access mode changes to SQL command from variable, and then I select my variable User::QuerySourceEmployees.
Add an OLE DB Destination. I named mine OLEDST TempTableName and double clicked to configure it. The Connection Manager is WWI_OLTP and again, since the table lives in tempdb, we can't select it from the drop down. Change the Data access mode to Table name or view name variable - fast load and then select your variable name User::TempTableName. Click the Mapping tab and ensure source columns map to destination columns.
Data Flow Task - Transfer data
Finally, we will pull our source data, nicely filtered against the data from our target system.
Add an OLE DB Source. I named it OLESRC QuerySales. The Connection Manager is WWI_OLTP. Data access mode again changes to SQL command from variable and the variable name is User::QuerySales
From here, do whatever else you need to do to make the magic happen.
Instead of having 270k rows from an unfiltered query, I have 67k, as there are only 3 employees in the temporary table.
But wait, there's more!
Close out Visual Studio, open it back up, and try to touch something in the data flows. Suddenly, there are red Xs everywhere! Any time you close a data flow component, it fires a revalidate-metadata operation and, guess what, it can't do that because the connection to the temporary table is gone.
The package will run fine and will not throw VS_NEEDSNEWMETADATA, but editing/maintenance becomes a pain.
If you switched from global temporary table to local, switch the table name variable's value back to a global and then run the define statement in SSMS. Once that's done, then you can continue editing the package.
I assure you, the local temporary table does work once you have the metadata set and you use queries via variables for source/destination.
No need for the global temporary table hack, or the SET FMTONLY OFF hack (which no longer works).
Just specify the result set metadata in the SQL query with WITH RESULT SETS, e.g.:
EXEC ('
    CREATE TABLE #t
    (
        ID INT,
        Name VARCHAR(150),
        Number VARCHAR(15)
    );

    INSERT INTO #t (ID, Name, Number)
    SELECT object_id, name, 12
    FROM sys.objects;

    SELECT * FROM #t;
')
WITH RESULT SETS
(
    (
        ID INT,
        Name VARCHAR(150),
        Number VARCHAR(15)
    )
);
If you need to parameterize the query, there's a bit of a catch, because there are some limitations in how SSIS discovers parameters. SSIS runs sp_describe_undeclared_parameters, which doesn't really work with batches that call sp_executesql, because sp_executesql handles parameters in a unique way, one which you couldn't replicate with a user stored procedure.
So to parameterize the query you'll either need to pass the parameter values into the query using the "query from variable" and SSIS expressions, or push all this TSQL into a stored procedure.
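A hedged sketch of the stored-procedure route, reusing the query from the example above (the procedure name and parameter are hypothetical):
CREATE PROCEDURE dbo.GetObjectList  -- hypothetical name
    @NamePattern sysname
AS
BEGIN
    SET NOCOUNT ON;

    CREATE TABLE #t
    (
        ID INT,
        Name VARCHAR(150),
        Number VARCHAR(15)
    );

    INSERT INTO #t (ID, Name, Number)
    SELECT object_id, name, 12
    FROM sys.objects
    WHERE name LIKE @NamePattern;

    SELECT ID, Name, Number FROM #t;
END
The SSIS source then becomes EXEC dbo.GetObjectList @NamePattern = ? WITH RESULT SETS ((ID INT, Name VARCHAR(150), Number VARCHAR(15))); supplying the metadata the same way while letting SSIS map the ? parameter.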

Is it possible to share Local Temp table between SQLCLR triggers across a Linked Server?

I need to implement a Distributed transaction for a third party product. I have two SQL Servers and two SQLCLR triggers. I want to access the local temp table value from the second trigger context, which is on another instance. Is it possible?
//Server 1
[Microsoft.SqlServer.Server.SqlTrigger(Name = "SqlTrigger1", Target = "Table1", Event = "FOR INSERT")]
public static void SqlTrigger1()
{
    using (SqlConnection conn = new SqlConnection("context connection=true"))
    {
        conn.Open();
        // Create #Temp table
        // Insert some data
        // Fire the trigger on Server 2 via the linked server
    }
}

//Server 2
[Microsoft.SqlServer.Server.SqlTrigger(Name = "SqlTrigger2", Target = "Table1", Event = "FOR INSERT")]
public static void SqlTrigger2()
{
    using (SqlConnection conn = new SqlConnection("context connection=true"))
    {
        conn.Open();
        // Read the #Temp table created on Server 1 -- is this possible?
    }
}
The immediate answer has nothing to do with SQLCLR. It is not even conceptually possible to access a local temporary table (or stored procedure) across instances because like any other object, they are local to the instance that they are created on. And when using a Linked Server, there is no way to access the calling session, so a reference back to the local temporary table on Server 1 will never be accessible by code running on Server 2.
Also, while it is at least possible to access a global temporary table between instances (because those are visible to all sessions), that would still require an additional Linked Server to be created on Server 2 that points back to Server 1 because that is where the global temporary table would exist. That's a bit messy, and offers no advantages over creating a real table (unless you create the global temporary table to include a newly created GUID value as part of its name, but then you still need to transfer that value over to Server 2 in order to build the correct reference back to Server 1, which will need to happen in Dynamic SQL).
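A rough sketch of that GUID-in-the-name variant, purely illustrative:
DECLARE @Guid UNIQUEIDENTIFIER = NEWID(),
        @Sql  NVARCHAR(MAX);

-- Build a uniquely named global temp table, e.g. ##Transfer_<guid>
SET @Sql = N'CREATE TABLE '
         + QUOTENAME(N'##Transfer_' + CONVERT(NVARCHAR(36), @Guid))
         + N' ( Id INT NOT NULL );';
EXEC (@Sql);

-- @Guid must still be passed to Server 2 so it can rebuild the same name
-- and reference it through a linked server pointing back at Server 1.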
Clarification from the O.P.:
When a user runs the query INSERT INTO dbo.Account (Name) VALUES ('something'), I intercept this with a CLR trigger and execute the same query on Server 2: INSERT INTO Server2.dbo.Account (Name) VALUES ('something'). And I need to share context in this transaction, for example a GUID variable.
There is no such thing as a "shared context" between instances. Whatever data and/or values are needed in both places need to be passed to the remote instance. In this case, you can package up the data as XML in an NVARCHAR(MAX) variable, execute a stored procedure on Server 2 passing in that NVARCHAR(MAX) value, convert it back to XML in the stored procedure, and unpack it using .nodes(). Then you can additionally pass in individual scalar values as other parameters to the remote stored procedure. For example:
DECLARE @DataToTransfer NVARCHAR(MAX),
        @SomeGuid UNIQUEIDENTIFIER;

SET @DataToTransfer = (
    SELECT *
    FROM inserted
    FOR XML RAW('row')
);

EXEC [LinkedServerName].[DatabaseName].[SchemaName].[StoredProcedureName]
    @Param1 = @DataToTransfer,
    @Param2 = @SomeGuid;
The approach shown above works quite well. I have used it to transfer millions of rows per day from 18 production servers to a single archive server. Calling a remote stored procedure has fewer locking issues than attempting straight DML / INSERT statements over the Linked Server. Also, this approach allows for sending both the table of data (packaged as XML) and individual variable values (e.g. the GUID you mentioned).
The remote stored procedure -- referenced in the EXEC in the example code above -- will be executed locally on Server 2, so it can create a local temporary table that the trigger on the remote table will have access to, or it can use SET CONTEXT_INFO or, on SQL Server 2016 (or newer), sp_set_session_context.
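A minimal sketch of what that remote procedure might look like (the procedure, schema, and Name column are hypothetical; dbo.Account comes from the clarification above):
CREATE PROCEDURE [SchemaName].[StoredProcedureName]
    @Param1 NVARCHAR(MAX),
    @Param2 UNIQUEIDENTIFIER
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Data XML = CONVERT(XML, @Param1);

    -- Unpack the FOR XML RAW('row') payload into a local temp table
    -- that this session's triggers can also see
    SELECT T.row.value('@Name', 'NVARCHAR(100)') AS [Name]
    INTO   #TransferData
    FROM   @Data.nodes('/row') AS T(row);

    -- The INSERT fires the trigger on the target table, which can read
    -- #TransferData and use @Param2 (e.g. stored via SET CONTEXT_INFO)
    INSERT INTO dbo.Account (Name)
    SELECT [Name]
    FROM #TransferData;
END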
Also, as you may have noticed, none of this has anything to do with SQLCLR. I see no reason to introduce the additional complexity of having this in SQLCLR when you will be using none of the benefits of SQLCLR triggers / objects.
With a local temp table, no; but with a global temp table (two pound signs: ##globalTemp instead of #temp) you should be able to. That being said, it's likely not a good idea, because you never know whether the ##globalTemp table will exist or not. Who should be creating it?
There are two types of temporary tables: local and global. Local temporary tables are visible only to their creators during the same connection to an instance of SQL Server as when the tables were first created or referenced. Local temporary tables are deleted after the user disconnects from the instance of SQL Server. Global temporary tables are visible to any user and any connection after they are created, and are deleted when all users that are referencing the table disconnect from the instance of SQL Server.
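A minimal illustration of that visibility difference (run each batch from a separate connection to the same instance):
-- Connection 1
CREATE TABLE ##globalTemp ( Id INT );
INSERT INTO ##globalTemp VALUES (1);

-- Connection 2
SELECT * FROM ##globalTemp;  -- works: global temp tables are visible to all sessions
-- A local #temp table created by connection 1 would not be visible here.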

Create a copy of a table within the same database with SSIS

I want to create a copy of a table, say TestTable, with a new name, say TestTableNew, in the same database with the use of an SSIS package. I've created a "Transfer SQL Server Objects Task" for this with the source database specified as both the SourceDatabase and the DestinationDatabase. When I run this task, the original table TestTable is overwritten with a new -empty- TestTable.
This might well be something really obvious that I've overlooked, but can I somehow specify another name for the destination table somewhere in this transfer task? Or should I solve this in another way?
You can't use the "Transfer SQL Server Objects Task" to copy a table to the same database because there isn't an option to specify the new table name. You would be copying table "TestTable" to table "TestTable", which will fail because they both have the same name.
You can set the "DropObjectsFirst" property to true, but that will make you lose your original table and its data, which I think is what happened in your test; otherwise you would have received a failure message.
The best option here is to use an "Execute SQL Task" to create the structure of your TestTableNew based on your TestTable and then do a simple OleDBSource -> OleDBDestination transformation to load all the data from one table to another.
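A minimal sketch of what that Execute SQL Task could run (this clones the column definitions only; indexes, constraints, and defaults would need separate handling):
SELECT TOP (0) *
INTO dbo.TestTableNew
FROM dbo.TestTable;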
My knowledge of SSIS is very limited, but I assume you can run SQL commands, passing in parameters, and therefore generate something like the following dynamically:
SELECT *
INTO TestTableNew
FROM TestTable;

How to merge table from access to SQL Express?

I have one table named "Staff" in Access and also have this table (same name) in SQL Server 2008.
Both tables have thousands of records. I want to merge records from the Access table into the SQL Server table without affecting the existing records in SQL Server. Normally, I just export using the ODBC driver, and that works fine if the table doesn't exist in SQL Server. Please advise. Thanks.
A simple append query from the local access table to the linked sql server table should work just fine in this case.
So, just drop in the first (from) table into the query builder. Then change the query type to append, and you are prompted for the append table name.
From that point on, just drop in the columns you want (do not drop in the PK column, as it need not be used nor transferred in this case).
You can also type in the sql directly in the query builder. Either way, you will wind up with something like:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT ADMINID, Amount, Notes, Status
FROM custsql1;
This may help: http://www.red-gate.com/products/sql-development/sql-compare/
Or you could write a simple program to read from each data set and do the comparison, adding, updating, and deleting, etc.
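If overlapping records are the concern, a hedged variant of the append query above skips rows that already exist on the SQL Server side (assuming ADMINID identifies a record):
INSERT INTO dbo_custsql ( ADMINID, Amount, Notes, Status )
SELECT c.ADMINID, c.Amount, c.Notes, c.Status
FROM custsql1 AS c
WHERE NOT EXISTS
    (SELECT 1 FROM dbo_custsql AS d WHERE d.ADMINID = c.ADMINID);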

How to empty a FoxPro DB table?

How can I empty a whole FoxPro DB table using a trigger or stored procedure?
The only way to truly empty a FoxPro table is to first get exclusive access (i.e., no one else has the file open and you can USE yourTable EXCLUSIVE); then you can use the ZAP command.
You can also:
USE yourTable
DELETE ALL
But the records won't be fully removed until a PACK is issued.
