Retrieving column in deleted/inserted tables referencing another table - sql-server

It may save some time to skip the "why" on this.
I would like to compile a string within a SQL Server trigger that contains all the values in the inserted or deleted tables, including their column names. I'm not used to doing something this extensive in SQL so I may be approaching this entirely wrong.
Something like:
CREATE OR ALTER TRIGGER [dbo].[TRIGGER_NAME]
ON [MY_TABLE_NAME]
AFTER UPDATE, INSERT
AS
DECLARE @columns TABLE (COLUMN_NAME VARCHAR(100), DATA_TYPE VARCHAR(100))
INSERT INTO @columns
SELECT
COLUMN_NAME, DATA_TYPE
FROM
INFORMATION_SCHEMA.COLUMNS
WHERE
TABLE_NAME = 'MY_TABLE_NAME'
DECLARE @string nvarchar(MAX)
SET @string = (SELECT
STRING_AGG(CONCAT('s:', LEN(COLUMN_NAME), ':"',
COLUMN_NAME, '";', inserted.COLUMN_NAME), ';')
FROM @columns)
This doesn't tell the full story, but the critical piece is the inserted.COLUMN_NAME part. The value in COLUMN_NAME is a varchar and I want that value to be evaluated as the name of the column in the inserted table.
I explored writing a function or stored procedure, but fear I may be overthinking this. It feels like a basic thing to do. In JavaScript I'd simply say something like inserted[Column_Name], but SQL is a whole different beast...
--EDIT--
What I want is to be able to evaluate the value held in a variable as a column name. i.e.
DECLARE @var varchar(50)
SET @var = 'name';
SELECT TABLE_NAME.@var (should get the column named 'name' from the table named TABLE_NAME)
Ultimate goal is to...
Say there is a table like this:

Row NUM | Email         | Name
--------|---------------|-----
1       | Jack@name.com | Jack
2       | Jill@name.com | Jill

I want to be able to convert this table into a single string like this, without knowing the names of the columns ahead of time:
'Email:Jack@name.com,Name:Jack;Email:Jill@name.com,Name:Jill;'
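A sketch of one way to approach this inside the trigger, assuming SQL Server 2017+ for STRING_AGG. Note that the inserted pseudo-table is not visible to dynamic SQL, so it is copied into a temp table first (table and column names follow the example above):

```sql
-- Snapshot "inserted" so the dynamic SQL below can see it
SELECT * INTO #ins FROM inserted;

DECLARE @cols nvarchar(MAX), @sql nvarchar(MAX), @out nvarchar(MAX);

-- Build "'Col:', [Col]" fragments for every column of the table
SELECT @cols = STRING_AGG(
           CONCAT('''', COLUMN_NAME, ':'', ', QUOTENAME(COLUMN_NAME)),
           ', '','', ') WITHIN GROUP (ORDER BY ORDINAL_POSITION)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'MY_TABLE_NAME';

-- One CONCAT per row; rows are joined with ';'
SET @sql = N'SELECT @out = STRING_AGG(CONCAT(' + @cols
         + N'), '';'') + '';'' FROM #ins;';

-- @out ends up holding the 'Col:value,...;' string for all affected rows
EXEC sp_executesql @sql, N'@out nvarchar(MAX) OUTPUT', @out = @out OUTPUT;
```

CONCAT also handles the implicit conversion of non-character columns and treats NULLs as empty strings, which keeps the generated SQL short.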

Related

How to get the datatype of a column of a view in SQL Server?

I want to get the datatype of a column of a view in SQL Server. Is there an efficient way to do that?
I have to get the Database Name, Schema, View Name all dynamically from one database, look for the view in another database and find the data type of the column in the third database.
E.g.
SELECT @db2 = Name FROM db1.schema.databases
SELECT @c = Name FROM db1.schema.columns
SELECT @v = Name FROM db1.schema.views
SELECT @datatype = query to get {datatype} of column {c} from {db2}.{schema}.{v}
Here column {c} of {db2}.{schema}.{v} can refer to another database, say {db3}.
Please suggest.
Don't know exactly what you need, but this might help you:
USE master;
GO
CREATE VIEW dbo.TestView AS
SELECT * FROM master..spt_values;
GO
--This is the view's output
SELECT * FROM dbo.TestView;
GO
--Set your variables
DECLARE @db2 VARCHAR(100) = 'master';
DECLARE @c VARCHAR(100) = 'type';
DECLARE @vSchema VARCHAR(100) = 'dbo';
DECLARE @vName VARCHAR(100) = 'TestView'
--The query will display the DATA_TYPE and all other columns returned by INFORMATION_SCHEMA.COLUMNS
SELECT c.DATA_TYPE
,c.*
FROM master.INFORMATION_SCHEMA.COLUMNS AS c
WHERE c.TABLE_NAME=@vName
AND c.TABLE_SCHEMA=@vSchema
AND c.COLUMN_NAME=@c
AND c.TABLE_CATALOG=@db2; --forgot this in the first post...
--Clean-Up
GO
DROP VIEW dbo.TestView;
It's a bit fuzzy that the COLUMNS view returns tables and views as if they were the same. The advantage: you can use the same approach to check a table's columns...
Hint: INFORMATION_SCHEMA.COLUMNS is just a built-in view over the corresponding sys catalog views.
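Since the database name itself is only known at runtime, the same lookup can be made fully dynamic with sp_executesql. This is a sketch reusing the variable names from the snippet above; only the database name has to be spliced into the string, everything else can stay parameterized:

```sql
DECLARE @db2 sysname = N'master',
        @vSchema sysname = N'dbo',
        @vName sysname = N'TestView',
        @c sysname = N'type';

DECLARE @datatype nvarchar(128), @sql nvarchar(MAX);

SET @sql = N'SELECT @dt = DATA_TYPE
             FROM ' + QUOTENAME(@db2) + N'.INFORMATION_SCHEMA.COLUMNS
             WHERE TABLE_SCHEMA = @s AND TABLE_NAME = @v AND COLUMN_NAME = @col;';

EXEC sp_executesql @sql,
     N'@s sysname, @v sysname, @col sysname, @dt nvarchar(128) OUTPUT',
     @s = @vSchema, @v = @vName, @col = @c, @dt = @datatype OUTPUT;

SELECT @datatype AS DATA_TYPE;
```

QUOTENAME on the spliced database name guards against injection through that one non-parameterizable piece.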

Dynamic table in SQL Server

I have a really weird and complex requirement that I need help with. I have a table, let's say Tasks, that contains all the tasks for a user/system. I need to filter the tasks per user and show them in the UI. But here is the catch: the Tasks table contains a column base_table that stores the name of the (real SQL Server) table it is based on. It also stores the base table id, which points to a particular record in that base table. Now I need to apply some filter to the base table, and if a row satisfies it, the task gets retrieved.
I did try to put up a procedure which would hit a select query against base table and also check conditions.
CREATE PROCEDURE gautam_dtTable_test
(@TableName AS nvarchar(max))
AS
BEGIN TRY
declare @sql nvarchar(max)
declare @ret tinyint
set @ret = 0
set @sql = 'select @var = 1 where exists (select top 1 Id from ' + @TableName + ' where some_condition)';
exec sp_executesql @sql, N'@var tinyint output', @ret output
return @ret
END TRY
BEGIN CATCH
return 0
END CATCH
I have used the procedure to take a table name as input, check some conditions, and return a flag, 1/0 kind of thing. I also want to use try-catch so that if there is any error, it returns false.
That's why I used a procedure, not a function. But it seems we can't use this procedure inside a SQL statement. Overall, what I have in mind is:
Select *
from tasks
where some_conditions
and procedure/function_to_check(tasks.base_table)
Key issues with my approach
The base_table name could be invalid, and so could some of the columns in it, so I would love to use a try-catch.
I need to embed it as a sub-query to avoid parallel operations, but that seems tough when the procedure/function uses EXEC and sp_executesql.
Any kind of help is appreciated. Thanks in advance!
The question as stated is a bit unclear, so I am going to make some assumptions here. It looks like you are trying to achieve the following:
First, it seems you want to return only the tasks in your tasks table whose base_table column value references a valid SQL Server table.
Secondly, if I understand the post correctly, based on the where-clause condition passed to the tasks table, you are trying to determine whether the same columns exist in your base table.
The first part is certainly doable. However, the second part is not, since it would require the query to somehow parse itself to determine which columns are being filtered on.
The following query show how you can retrieve only tasks for which there is a valid corresponding table.
SELECT *
FROM [dbo].[tasks] ts
CROSS APPLY (
SELECT [name]
FROM sys.objects WHERE object_id = OBJECT_ID('[dbo].' + QUOTENAME(ts.base_table)) AND type in (N'U')
) tb
If the field(s) you are trying to filter on are known up front (i.e. you are not trying to parse them from the query itself), then you can modify the above query to pass the desired columns you want to check as follows:
DECLARE @columnNameToCheck NVARCHAR(50) = 'col2'
SELECT ts.*
FROM [dbo].[tasks] ts
CROSS APPLY (
SELECT [name]
FROM sys.objects WHERE object_id = OBJECT_ID('[dbo].' + QUOTENAME(ts.base_table)) AND type IN (N'U')
) tb
CROSS APPLY (
SELECT [name]
FROM sys.columns WHERE object_id = OBJECT_ID('[dbo].' + QUOTENAME(ts.base_table)) AND [name] = @columnNameToCheck
) col

SQL Query Results to Local Variable without unique identifier

I'm relatively new to SQL and I'm trying to write a query that will assign the result of multiple rows into a local variable
DECLARE @x VARCHAR(MAX)
SET @x = (SELECT someCol
FROM table)
--Does important stuff to the @x variable
SELECT @x
During my research I realized that this won't work because the subquery can only return one value, while my query will return multiple results. However, I cannot do something like this:
DECLARE @x VARCHAR(MAX)
SET @x = (SELECT someCol
FROM table
WHERE id = 'uniqueIdentifier')
--Does important stuff to the @x variable
SELECT @x
The reason I can't use a where clause is that I need to do this for the entire table and not just one row. Any ideas?
EDIT: I realized my question was too broad so I'll try to reformat the code to give some context
SELECT col_ID, col_Definition
FROM myTable
If I were to run this query, col_Definition would return a large varchar which holds a lot of information, such as the primary key of another table that I'm trying to obtain. Let's say, for example, I did:
DECLARE @x VARCHAR(MAX)
SET @x = (SELECT col_Definition
FROM myTable
WHERE col_ID = 1)
--Function to do the filtering of the varchar variable that works as expected
SELECT @x AS [Pk of another table] --returns filtered col_Definition
This will work as expected because it returns a single row. However, I would like to be able to run this query so that it will return the filtered varchar for every single row in the "myTable" table.
If I understand correctly, you store a PK embedded in a string, and you eventually want to extract it and join to that table. I would put the group of records you want to work with into a temp table and then apply some logic to that varchar column to get the PK. That logic is best done set-based, but if you really want row by row, use a scalar function rather than a variable and apply it to the temp table:
select pk_column, dbo.scalarfunction(pk_column) as RowByRowWithFunction, substring(pk_column, 5, 10) as SetBasedMuchFaster
from #tempTable
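Using the asker's names from the edit (myTable, col_Definition), a set-based version runs the extraction over every row at once, with no WHERE needed. The SUBSTRING/PATINDEX math here is a hypothetical stand-in for the real parsing logic:

```sql
SELECT col_ID,
       SUBSTRING(col_Definition,
                 PATINDEX('%PK=%', col_Definition) + 4,   -- hypothetical 'PK=' marker
                 10) AS [Pk of another table]
FROM myTable;
```

The same expression can be wrapped in a view or used in a JOIN to the target table, so the variable assignment disappears entirely.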
You need to define what the 'uniqueIdentifier' is first.
I'm not sure about using a subquery, grabbing the result, and executing another query with that result, unless you do an INNER JOIN of some sort. If you are using Python or another language to process the data, use:
("SELECT someCol
FROM table
where id='%s'" % Id)

How to INSERT INTO table column with string/variable

This is the data I have pulled from powershell and inserted it into a #temptable:
Name : SULESRKMA
Location : Leisure Services - Technology Services
Shared : False
ShareName :
JobCountSinceLastReset : 0
PrinterState : 131072
Status : Degraded
Network : False
I'm looping through the data and have stripped the values from the identifiers. I'd like to use these identifiers to insert the values into a table with column names identical to the identifiers. So for example, I have a variable called @identifier = 'Name' and a temp table #printers with a column named Name. I'd like to do something like:
INSERT INTO #printers (@identifier)
SELECT --select statement
But this doesn't seem to work, unsurprisingly. Is there a way to accomplish this? (The @identifier variable will change to the other identifiers in the data over the course of the while loop.)
Any alternate suggestions that don't even involve using this sort of method are welcome. My ultimate goal is just to get this data as a row into a table.
(I'm currently using Microsoft SQL Server Management Studio if that matters)
First, it's unlikely you need to loop over anything in this situation. Think set based operations when you think about SQL.
INSERT INTO #temptable (Column1Name, Column2Name, Column3Name)
SELECT @identifier, @anotherIdentifier, @someOtherIdentifier
--optional clause (a WHERE is only valid with the SELECT form, not with VALUES)
WHERE @identifier IS NOT NULL
Or you can SELECT INTO (column aliases are required, because a bare variable has no column name):
SELECT
@identifier AS Column1Name,
@anotherIdentifier AS Column2Name,
@someOtherIdentifier AS Column3Name
INTO #temptable
It's important that you have a value in your SELECT INTO for each column in the table which you are trying to add the data to. So, for example, if there were 4 columns in #temptable and you only had 3 values to insert (columns 1, 2 , and 3) then you'd need to NULL column 4 or set it statically.
SELECT
@identifier AS Column1Name,
@anotherIdentifier AS Column2Name,
@someOtherIdentifier AS Column3Name,
NULL AS Column4Name
INTO #temptable
--or
SELECT
@identifier AS Column1Name,
@anotherIdentifier AS Column2Name,
@someOtherIdentifier AS Column3Name,
'static value' AS Column4Name
INTO #temptable
EDIT
If you want to use a variable to specify the column that you want to insert into, you have to use dynamic SQL. Here is an example:
if object_id('tempdb..#tempTable') is not null drop table #tempTable
create table #tempTable (Column1Name int, Column2Name int, Column3Name int)
declare @columnName varchar(64) = 'Column1Name'
declare @sql varchar(max)
set @sql =
'insert into #tempTable (' + @columnName + ')
select 1'
exec(@sql)
select * from #tempTable

Dynamically create temp table based on resultset from SP

I have a SP that calls another SP and the resultset needs to be filtered to only the columns that I am interested in.
DECLARE @myTable TABLE
(
Field1 INT,
Field2 INT,
Field3 INT
)
--If someSP returns say 30 columns and I need only 3, I don't want to declare all 30 in my table variable @myTable
INSERT INTO @myTable
(Field1, Field2, Field3)
EXEC someSP --Need to selectively filter recordset from SP
@InputParam1 = 'test'
If I cannot do this, I would want to create the temp table DYNAMICALLY based on the resultset from someSP. (That relieves maintenance issues: when someSP is modified to return a new column, I don't need to modify this proc as well.)
Short answer: no, you can't do that.
You have to pre-declare your temp table with the exact number of columns that will be returned from the stored proc.
The workaround is to use persistent tables. For example, you could have a permanent table in your database called someSPResults. Whenever someSP is changed to have a different number of output columns, change the format of someSPResults as part of the deployment.
Then you can either do this:
insert into dbo.someSPresults
exec someSP
Or inside someSP, you can have the results be inserted directly into the someSPresults table as a normal part of execution. You just have to make sure to identify exactly which records in the someSPresults table came from each execution of someSP, because that stored proc could be fired multiple times simultaneously, thereby dumping a lot of data into someSPresults.
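One way to tell those executions apart is to capture the output into a temp table first and then copy it into someSPresults tagged with a batch id. This is a sketch; the Col1..Col3 names are hypothetical placeholders for someSP's actual output columns:

```sql
DECLARE @batch uniqueidentifier = NEWID();

-- Must match someSP's output shape exactly
CREATE TABLE #capture (Col1 int, Col2 int, Col3 int);

INSERT INTO #capture
EXEC someSP @InputParam1 = 'test';

INSERT INTO dbo.someSPresults (BatchId, Col1, Col2, Col3)
SELECT @batch, Col1, Col2, Col3 FROM #capture;

DROP TABLE #capture;
```

The caller then filters someSPresults on its own @batch value, so concurrent executions never see each other's rows.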
cmsjr stated, "A table variable cannot be the target of a result set from another stored procedure."
I thought that was true too, but then I tested it. This code works in both 2005 and 2008:
CREATE PROCEDURE someSP (@InputParam1 varchar(100)) AS
SELECT LEN(@InputParam1), DATALENGTH(@InputParam1), @@SPID
GO
DECLARE @myTable TABLE (
Field1 INT,
Field2 INT,
Field3 INT
)
INSERT INTO @myTable (Field1, Field2, Field3)
EXEC someSP
@InputParam1 = 'test'
SELECT * FROM @myTable
I knew that would work with #temp tables, but I thought it would not work with @table variables.
That doesn't answer DotnetDude's question though.
A table variable cannot be the target of a result set from another stored procedure, also you can't perform DDL on table variables after they are declared, they will always have the same definition they were declared with. Temp table is your best bet.
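For completeness, the temp-table equivalent of the table-variable snippet above looks like this (a sketch, using the same someSP defined earlier in the thread):

```sql
CREATE TABLE #myTable (Field1 INT, Field2 INT, Field3 INT);

INSERT INTO #myTable (Field1, Field2, Field3)
EXEC someSP @InputParam1 = 'test';

SELECT * FROM #myTable;
DROP TABLE #myTable;
```

Unlike a table variable, a #temp table can also be ALTERed after creation, which is why it is the safer target when the proc's output shape may change.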
I could think of two options, but I didn't have time to test them: convert the SP into a user-defined function and use SELECT * INTO {table} FROM {function}, or use OPENROWSET:
SELECT *
INTO {tablename}
FROM OPENROWSET('SQLOLEDB',
'servername';'username';'password',
'exec dbname.{owner}.yourstoredproc') AS spResult
Both solutions should create the table on the fly, then you can simply select from it.
Based on the comments above, I'd suggest you consider a table valued function.
This can be parameterised and you can do this:
INSERT #foo (col1, col14, col29)
SELECT col1, col14, col29 FROM dbo.ufnTVF(@p1, @p2)
Otherwise, it's OPENROWSET as the "cleanest" (I use this loosely) solution
Or, you modify the resultset of your stored proc to only return the columns you want.
This implies dynamic SQL or lots of IF statements, which in some circumstances will not parse correctly (with SET FMTONLY etc.).
You could be trying to code against a 3rd-party app or system stored procs (we don't have full details), but it feels messy and wrong. SQL Server 2005 has a huge number of DMVs and catalogue (or catalog, depending on which side of the Atlantic you are on) views that remove the need for system proc calls.
If you're trying to mimic some aspects of OO design (one proc to do something for everybody), then I wouldn't. If you need a query that returns 3 columns of 30, then do so. This will run far better because unused tables and columns will be ignored in the plan; indeed, they do not need to be included at all.