Join a table whose name is stored in the first table - sql-server

I have a first table [TABLE1] with columns [ID], [Description], [DetailTable]. I want to join [TABLE1] with its [DetailTable]; the name of the detail table is stored in a column of [TABLE1].
"SELECT * FROM TABLE1 CROSS JOIN ?????"
Any suggestions?

So... if you cheat and SELECT * from the detail table, you could do something a bit like this with dynamic SQL:
-- For the example, choose either 1 or 2 to see the #foo table or the #bar table
DECLARE @Id INT = 1
-- EXAMPLE SCENARIO SETUP --
CREATE TABLE #ListOfTables
( ID INT IDENTITY(1,1) NOT NULL
,[Description] NVARCHAR(255) NOT NULL
,[DetailTable] NVARCHAR(255) NOT NULL);
CREATE TABLE #foo
(Foothing VARCHAR(20));
CREATE TABLE #bar
(Barthing VARCHAR(20));
-- TEST VALUES --
INSERT #ListOfTables VALUES ('foo','#foo'),('bar','#bar');
INSERT #foo VALUES ('A foothing Foothing');
INSERT #bar VALUES ('A barthing Barthing');
-- THE SCRIPT --
DECLARE @SQL NVARCHAR(MAX) = '';
SELECT @SQL =
' SELECT Tab.Id, Tab.[Description], Tab2.*
FROM #ListOfTables AS Tab
CROSS JOIN ' + T.DetailTable + ' AS Tab2
WHERE Tab.Id = ' + CONVERT(VARCHAR(10),@Id)
FROM #ListOfTables T
WHERE T.Id = @Id;
PRINT @SQL
EXEC sp_executesql @SQL;
-- CLEAN UP --
DROP TABLE #ListOfTables;
DROP TABLE #bar;
DROP TABLE #foo;
However, I have to agree with the comments that this is a pretty nasty way to do things. If you want to choose particular columns and the columns are different for each detail table, then things will start to get really unpleasant... Does this give you something you can start with?
Remember, the best solution will almost certainly involve redesigning things so you don't have to jump through these hoops!
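For what it's worth, that redesign usually amounts to collapsing the per-type detail tables into a single detail table keyed by the parent row, so the join no longer depends on a table name stored in the data. A minimal sketch, with the DetailRows table and its columns made up for illustration:
CREATE TABLE DetailRows
( DetailId INT IDENTITY(1,1) NOT NULL PRIMARY KEY
,Table1Id INT NOT NULL -- references TABLE1.ID
,DetailData NVARCHAR(255) NULL);
SELECT t.ID, t.[Description], d.DetailData
FROM TABLE1 t
JOIN DetailRows d ON d.Table1Id = t.ID;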

All of the detail tables must have identical schema.
Create a view that unions all the tables
CREATE VIEW vAllDetails AS
SELECT 'DETAIL1' table_name, * from DETAIL1
UNION ALL
SELECT 'DETAIL2' table_name, * from DETAIL2
UNION ALL
SELECT 'DETAIL3' table_name, * from DETAIL3
When you join against this view, SQL Server can generate a plan that uses a startup predicate expression. At first glance it looks like SQL Server is going to scan all of the detail tables, but it won't: the left-most filters include a startup predicate, so for each row read from TABLE1, a detail-table branch is executed only if its table name matches.
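For completeness, the join the answer has in mind would look something like this; a sketch assuming the [DetailTable] column in TABLE1 stores the same names used as literals in the view:
SELECT t.ID, t.[Description], d.*
FROM TABLE1 t
JOIN vAllDetails d ON d.table_name = t.DetailTable;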

Related

What is the easiest and most efficient way to find a specific value in database tables?

As per my requirement, I have to find which tables and columns contain a value like xyz@test.com. The database is very large, with more than 2,500 tables.
Can anyone please suggest an efficient way to find this kind of value in the database? I wrote a query that loops over the tables, and it took more than 9 hours to run.
9 hours is clearly a long time. Furthermore, 2,500 tables seems close to insanity to me.
Here is one approach that will run one query per table, not one per column. I have no idea how this will perform against 2,500 tables; I suspect it may be horrible. That said, I would strongly suggest a test filter first, like Table_Name like 'OD%'.
Example
Declare @Search varchar(max) = 'cappelletti' -- Exact match '"cappelletti"'
Create Table #Temp (TableName varchar(500),RecordData xml)
Declare @SQL varchar(max) = ''
Select @SQL = @SQL+ ';Insert Into #Temp Select TableName='''+concat(quotename(Table_Schema),'.',quotename(table_name))+''',RecordData = (Select A.* for XML RAW) From '+concat(quotename(Table_Schema),'.',quotename(table_name))+' A Where (Select A.* for XML RAW) like ''%'+@Search+'%'''+char(10)
From INFORMATION_SCHEMA.Tables
Where Table_Type ='BASE TABLE'
and Table_Name like 'OD%' -- **** Would REALLY Recommend a REASONABLE Filter *** --
Exec(@SQL)
Select A.TableName
,B.*
,A.RecordData
From #Temp A
Cross Apply (
Select ColumnName = a.value('local-name(.)','varchar(100)')
,Value = a.value('.','varchar(max)')
From A.RecordData.nodes('/row') as C1(n)
Cross Apply C1.n.nodes('./@*') as C2(a)
Where a.value('.','varchar(max)') Like '%'+@Search+'%'
) B
Drop Table #Temp
This returns the table name, the matching column name and value, and the full record as XML.
If it helps, the individual generated queries look like this:
Select TableName='[dbo].[OD]'
,RecordData= (Select A.* for XML RAW)
From [dbo].[OD] A
Where (Select A.* for XML RAW) like '%cappelletti%'
On a side-note, you can search numeric data and even dates.
Make a procedure that pulls each column name (as VARCHAR) together with its table name from the system tables and stores the pairs in a temp table.
Then build one dynamic query that loops over each record, comparing that column against the email address passed in as an input parameter with an = condition.
If the condition matches in any statement (checked with IF EXISTS), store that table name and column name in another temp table, and return the list of matches from that temp table at the end of execution. A rough sketch of that description follows.
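Here is one way that idea might look in T-SQL. This is only a sketch under assumptions of mine: it checks string columns only, uses an exact equality match, and the #Hits table and cursor names are made up for illustration.
DECLARE @Search NVARCHAR(256) = N'xyz@test.com';
DECLARE @Tbl NVARCHAR(600), @Col SYSNAME, @Sql NVARCHAR(MAX);
IF OBJECT_ID('tempdb..#Hits') IS NOT NULL DROP TABLE #Hits;
CREATE TABLE #Hits (TableName NVARCHAR(600), ColumnName SYSNAME);
-- Every string column of every base table, with the table name already quoted.
DECLARE col_cur CURSOR LOCAL FAST_FORWARD FOR
SELECT QUOTENAME(c.TABLE_SCHEMA) + '.' + QUOTENAME(c.TABLE_NAME), c.COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS c
JOIN INFORMATION_SCHEMA.TABLES t
ON t.TABLE_SCHEMA = c.TABLE_SCHEMA AND t.TABLE_NAME = c.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE'
AND c.DATA_TYPE IN ('varchar','nvarchar','char','nchar');
OPEN col_cur;
FETCH NEXT FROM col_cur INTO @Tbl, @Col;
WHILE @@FETCH_STATUS = 0
BEGIN
-- One IF EXISTS probe per column; record the table/column when the value is found.
SET @Sql = N'IF EXISTS (SELECT 1 FROM ' + @Tbl + N' WHERE ' + QUOTENAME(@Col) + N' = @s) INSERT #Hits VALUES (@t, @c);';
EXEC sp_executesql @Sql, N'@s NVARCHAR(256), @t NVARCHAR(600), @c SYSNAME', @s = @Search, @t = @Tbl, @c = @Col;
FETCH NEXT FROM col_cur INTO @Tbl, @Col;
END
CLOSE col_cur; DEALLOCATE col_cur;
SELECT TableName, ColumnName FROM #Hits;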

Can I use dynamically LIKE and IN together?

I want to be able to say:
SELECT * FROM myTable WHERE accountName LIKE('%john%', '%bill%', '%lory%'.....)
I want this to be dynamic: depending on user input, the list of '%name%' patterns will differ. One time there could be three names, another time just one.
You could use JOIN:
SELECT DISTINCT myTable.*
FROM myTable
JOIN (SELECT '%john%' UNION ALL
SELECT '%bill%' UNION ALL
SELECT '%lory%') sub(c) -- this could be anything: a table variable or a temp table
ON myTable.accountName LIKE sub.c;
Keep in mind that '%...%' is not SARG-able.
With table variable:
DECLARE @tab AS TABLE (c NVARCHAR(100));
INSERT INTO @tab(c) VALUES ('...');
-- ...
SELECT DISTINCT myTable.*
FROM myTable
JOIN @tab t
ON myTable.accountName LIKE t.c;
WHERE accountName LIKE('%john%', '%bill%', '%lory%'.....)
This is invalid syntax and won't work. The easiest way to do what you're trying to do would be to use a "string splitter" function such as Jeff Moden's DelimitedSplit8K.
DECLARE @Names VARCHAR(1000) = 'john, bill, lory';
SELECT
*
FROM
dbo.myTable mt
CROSS APPLY dbo.DelimitedSplit8K(@Names, ',') dsk
WHERE
mt.accountname LIKE '%' + dsk.Item + '%';
OR
SELECT
*
FROM
dbo.myTable mt
WHERE
EXISTS (
SELECT 1
FROM dbo.DelimitedSplit8K(@Names, ',') dsk
WHERE mt.accountname LIKE '%' + dsk.Item + '%'
);
HTH, Jason
Building off of Jason's and Lad's excellent solutions, you could speed things up with an indexed view.
Before I continue, it's important to note that this will slow down inserts/updates/deletes in high-traffic OLTP environments that modify these tables often. TEST, TEST, TEST!
I work in the data warehouse world, where this works out quite nicely.
First a keyword table for common search terms:
CREATE TABLE dbo.keyword (kw varchar(100) not null, constraint uq_keyword unique clustered(kw));
INSERT dbo.keyword VALUES ('john'), ('bill'), ('lory');
Next for your table:
CREATE TABLE dbo.mytable(someid int identity not null, accountname varchar(100));
INSERT dbo.mytable(accountname) VALUES
('John''s Flowers'), ('Candles by Bill'), ('Some other account'), ('The Lory Group LLC');
Now the view along with a couple important indexes:
CREATE VIEW dbo.vw_mytable_fastKWLookup
WITH SCHEMABINDING AS -- schemabinding required for indexed views
SELECT
t.someid,
t.accountname,
k.kw
FROM dbo.mytable t
CROSS JOIN dbo.keyword k
WHERE t.accountname LIKE '%'+k.kw+'%';
GO
-- required unique clustered index
CREATE UNIQUE CLUSTERED INDEX uq_cl_vw_mytable_fastKWLookup
ON dbo.vw_mytable_fastKWLookup(kw, someid);
-- A good nonclustered index
CREATE NONCLUSTERED INDEX nc_vw_mytable_fastKWLookup__kw
ON dbo.vw_mytable_fastKWLookup(kw) include (someid, accountname);
Once your indexed view is in place, you can add your search words to an IN list like so:
SELECT someid, accountname, kw
FROM vw_mytable_fastKWLookup WITH (NOEXPAND)
WHERE kw IN ('John', 'Lory', 'Bill', 'Fred');
Results:
someid      accountname          kw
----------- -------------------- -----
1           John's Flowers       john
2           Candles by Bill      bill
4           The Lory Group LLC   lory
The reward here is an execution plan that does a non-clustered index seek for %searchstring% style searches.

How to check if a value is included by a list in an effective way in sql 2008?

I would like to use something like an .Include function in SQL Server 2008, but I could not find the correct syntax for it. I have a sql query like below:
-- @values has to be a varchar list that starts and ends with a comma
declare @values varchar(max) = ',7,34,37,74,85,'
select (case when @values like '%,' + m.Id + ',%' then m.Name else null end)
from #myTable m
So the logic is: if the ID of a record matches one of the numbers in the @values list, I would like to see its name in the output. This query works fine, but I would like to find a more professional way to handle it, maybe like:
case when @values.Include(m.Id) then m.Name else null end
Any advice would be appreciated. Thanks.
In my experience, the fastest method to split a delimited string is using XQuery.
Ex:
DECLARE @values VARCHAR(50), @XML XML
SET @values = ',7,34,37,74,85,'
SET @XML = cast(('<X>'+replace(@values,',' ,'</X><X>')+'</X>') as xml)
SELECT N.value('.', 'VARCHAR(255)') as value FROM @XML.nodes('X') as T(N)
declare @table table (id varchar(5))
insert into @table(id)
values ('7')
select *
from @table y
where exists (SELECT 1 FROM @XML.nodes('X') as T(N) where N.value('.', 'VARCHAR(255)') = y.id)
If you are calling this code from an application, you might want to consider using Table-Valued Parameters and a stored procedure to do this.
First, you would need to create a table type to use with the procedure:
create type dbo.Ids_udt as table (Id int not null);
go
Then, create the procedure:
create procedure dbo.get_names_from_list (
@Ids as dbo.Ids_udt readonly
) as
begin;
set nocount, xact_abort on;
select t.Name
from t -- "t" stands in for the table that holds the names you want back
inner join @Ids i
on t.Id = i.Id
end;
go
Then, assemble and pass the list of Ids to the stored procedure using a DataTable added as a SqlParameter using SqlDbType.Structured.
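If you just want to smoke-test the procedure from T-SQL first, a quick sketch looks like this (assuming the type and procedure above, with t replaced by your real table):
DECLARE @Ids dbo.Ids_udt;
INSERT INTO @Ids (Id) VALUES (7), (34), (37), (74), (85);
EXEC dbo.get_names_from_list @Ids = @Ids;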
Table Valued Parameter Reference:
SQL Server 2008 Table-Valued Parameters and C# Custom Iterators: A Match Made In Heaven! - Leonard Lobel
Table Value Parameter Use With C# - Jignesh Trivedi
Using Table-Valued Parameters in SQL Server and .NET - Erland Sommarskog
Maximizing Performance with Table-Valued Parameters - Dan Guzman
Maximizing throughput with tvp - sqlcat
How to use TVPs with Entity Framework 4.1 and CodeFirst
Assuming that the data/list is not required to be structured as a comma-separated list, you could simply use IN, EXISTS, or SOME/ANY (see the sketch below).
If it is unavoidable, you could use JiggsJedi's approach, but since you asked for a fast way, you should try to store the data in a form that can be processed faster and does not require extra work before it can be queried.
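For illustration, if the IDs already live in a table or table variable rather than a comma-separated string, the lookup is just IN or EXISTS; a sketch assuming the #myTable from the question, with an Id column that compares to int:
DECLARE @Ids TABLE (Id INT PRIMARY KEY);
INSERT INTO @Ids (Id) VALUES (7), (34), (37), (74), (85);
-- IN
SELECT m.Name FROM #myTable m WHERE m.Id IN (SELECT Id FROM @Ids);
-- EXISTS
SELECT m.Name FROM #myTable m WHERE EXISTS (SELECT 1 FROM @Ids i WHERE i.Id = m.Id);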
IF OBJECT_ID('tempdb..#Temp') IS NOT NULL
Drop table #Temp
Create table #Temp (ID INt ,Name varchar(5))
INSERT into #Temp
SELECT 7,'AA' Union all
SELECT 34,'BA' Union all
SELECT 37,'CA' Union all
SELECT 74,'DA' Union all
SELECT 85,'TA'
DECLARE @values varchar(max) = ',,,,,,7,,34,,,74,85,,,,' -- extra commas at the start, end, or middle of the string are handled
SET @values=','+@values+','
SELECT @values= LEFT(STUFF(@values,1,1,''),LEN(@values)-2)
DECLARE @SelectValuesIn TABLE(Value INT)
INSERT INTO @SelectValuesIn
SELECT Split.a.value('.', 'VARCHAR(100)') AS Data
FROM
(
SELECT
CAST ('<M>' + REPLACE(@values, ',', '</M><M>') + '</M>' AS XML) AS Data
) AS A CROSS APPLY Data.nodes ('/M') AS Split(a);
SELECT * FROM #Temp WHERE ID IN (SELECT Value FROM @SelectValuesIn)

How to INSERT INTO table column with string/variable

This is the data I have pulled from powershell and inserted it into a #temptable:
Name : SULESRKMA
Location : Leisure Services - Technology Services
Shared : False
ShareName :
JobCountSinceLastReset : 0
PrinterState : 131072
Status : Degraded
Network : False
I'm looping through the data in a WHILE loop and have stripped the values from the identifiers. I'd like to use these identifiers to insert the values into a table whose column names match the identifiers. So, for example, I have a variable called @identifier = 'Name' and a temp table #printers with a column named Name. I'd like to do something like:
INSERT INTO #printers(@identifier)
SELECT --select statement
But this doesn't seem to work, unsurprisingly. Is there a way to accomplish this? (The @identifier variable will change to the other identifiers in the data over the course of the WHILE loop.)
Any alternate suggestions that don't even involve using this sort of method are welcome. My ultimate goal is just to get this data as a row into a table.
(I'm currently using Microsoft SQL Server Management Studio if that matters)
First, it's unlikely that you need to loop over anything in this situation. Think set-based operations when you think about SQL.
INSERT INTO #temptable (Column1Name, Column2Name, Column3Name)
VALUES (@identifier, @anotherIdentifier, @someOtherIdentifier)
-- optional filtering clauses apply when you insert from a SELECT over a source table, e.g.
-- WHERE Column1Name = 'some value' OR Column1Name = @someIdentifier
Or you can SELECT INTO
SELECT
@identifier AS Column1Name,
@anotherIdentifier AS Column2Name,
@someOtherIdentifier AS Column3Name
INTO #temptable
It's important that your SELECT ... INTO supplies a (named) value for each column you want the target table to end up with, since SELECT ... INTO creates the table from the select list. So, for example, if #temptable should have 4 columns and you only have 3 values to insert (columns 1, 2, and 3), then you'd need to NULL column 4 or set it statically.
SELECT
@identifier AS Column1Name,
@anotherIdentifier AS Column2Name,
@someOtherIdentifier AS Column3Name,
NULL AS Column4Name
INTO #temptable
--or
SELECT
@identifier AS Column1Name,
@anotherIdentifier AS Column2Name,
@someOtherIdentifier AS Column3Name,
'static value' AS Column4Name
INTO #temptable
EDIT
If you want to use a variable to specify the column that you want to insert into, you have to use dynamic SQL. Here is an example:
if object_id ('tempdb..#tempTable') is not null drop table #tempTable
create table #tempTable (Column1Name int, Column2Name int, Column3Name int)
declare @columnName varchar(64) = 'Column1Name'
declare @sql varchar(max)
set @sql =
'insert into #tempTable (' + @columnName + ')
select 1'
exec(@sql)
select * from #tempTable

SQL Query Delete rows from a table

I have a table #temp. The data in #temp is a list of table names in a database. I want to show only the names of tables that actually contain data. How can I do this without using dynamic SQL?
My sample data is as below:
create TABLE #temp (Table_Name VARCHAR(50))
insert into #temp values ('#temp1')
,('#temp2')
,('#temp3')
,('#temp4')
create TABLE #temp1 (Col1 int)
insert into #temp1 values (1)
,(3)
,(4)
create TABLE #temp2 (Col1 int)
insert into #temp2 values (7)
,(9)
,(6)
create TABLE #temp3 (Col1 int)
create TABLE #temp4 (Col1 int)
I currently delete the empty tables manually. How can I do this with a query when there are numerous empty tables?
DELETE FROM #temp
WHERE Table_Name = '#temp3'
or Table_Name = '#temp4'
This is the result I want
select * from #temp
-- It only shows the two table names which are not blank
DROP TABLE #temp
DROP TABLE #temp1
DROP TABLE #temp2
DROP TABLE #temp3
DROP TABLE #temp4
This is my old query for this question:
DECLARE @TABLE_NAME VARCHAR(50), @COMMAND VARCHAR(500), @COUNT INT, @COUNTT INT
DECLARE @CountResults TABLE (CountReturned INT)
create TABLE #TABLE_NAME (TABLE_NAME VARCHAR(50))
SELECT @COUNTT = COUNT(*) FROM #temp
WHILE @COUNTT > 0
BEGIN
SELECT TOP 1 @TABLE_NAME = Table_Name FROM #temp
SET @COMMAND = 'SELECT COUNT(*) FROM ' + @TABLE_NAME
INSERT @CountResults EXEC (@COMMAND)
SET @Count = (SELECT * FROM @CountResults)
BEGIN TRANSACTION
DELETE @CountResults
ROLLBACK TRANSACTION
IF(@Count > 0)
BEGIN
INSERT INTO #TABLE_NAME VALUES (@TABLE_NAME)
END
DELETE FROM #temp WHERE Table_Name = @TABLE_NAME
SELECT @COUNTT = COUNT(*) FROM #temp
END
SELECT * FROM #TABLE_NAME
I don't know of any way to determine whether or not a table is empty without querying that table, which in your case means dynamic SQL. Your comments make it sound like you're okay with this but are looking for a way to do this more concisely than using a loop. Here's a (limited) possibility:
declare @sql nvarchar(max);
select @sql =
-- coalesce() ensures that UNION ALL is inserted before every SELECT but the first.
coalesce(@sql + N' union all ', N'') +
-- Select each table name. Note that SQL Server allows table names that contain
-- single quotes. In this case (or in the case of plain old bad/malicious data in
-- #temp), we need to make sure those characters are enclosed within the string
-- literal we're building.
N'select ''' + replace(table_name, N'''', N'''''') +
-- Use EXISTS to make sure there are one or more records in the table.
N''' where exists (select 1 from ' + quotename(table_name) + N')'
from #temp;
exec sp_executesql @sql;
This will build and execute a query that looks like this:
select '#temp1' where exists (select 1 from [#temp1])
union all
select '#temp2' where exists (select 1 from [#temp2])
union all
select '#temp3' where exists (select 1 from [#temp3])
union all
select '#temp4' where exists (select 1 from [#temp4])
This approach has a few limitations that you should be aware of:
The query will fail if #temp contains any string which is not the name of a table or view. Normally I'd suggest mitigating this by using object_id() or querying INFORMATION_SCHEMA.TABLES, but the fact that you've loaded #temp with the names of other temp tables complicates matters.
The query will also fail if #temp contains a table name that explicitly names the table schema, e.g. dbo.Stuff, because quotename() will render it as [dbo.Stuff] rather than [dbo].[Stuff]. But if you omit quotename(), you run the risk of incorrect and/or damaging behavior if a table_name contains spaces or other problematic characters.
In short, if you just want something for personal use and are okay with making certain assumptions about the data in #temp, then something like the above ought to work. But if you want something that will work correctly and safely under any circumstances, then it's going to take some doing, enough so that even if you could avoid using some kind of a loop, doing so is unlikely to make things any less complicated.
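If you do want a rough guard against names that don't resolve to an object, despite the temp-table wrinkle, one option is to check each name with object_id(), remembering that temp tables resolve through tempdb. A sketch of the check; rows where the result is NULL could then be excluded when building the dynamic SQL above:
select table_name,
resolved_object_id = object_id(case when table_name like N'#%'
then N'tempdb..' + table_name
else table_name end)
from #temp;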
I have a method that does not use dynamic sql. It uses the sysindexes table, which according to Microsoft is subject to change at their whim. So this may not be a good candidate for a production system. But it could be a good place to start. This is also a bit easier if your source table is not a temp table, since temp tables have actual names that do not match the name used to create them.
This script worked for me on SQL Server 2008 r2.
-- drop table #MyTempTable;
Create table #MyTempTable(Table_Name varchar(50));
insert #MyTempTable values ('#MyTempTable2');
insert #MyTempTable values ('#MyTempTable3');
insert #MyTempTable values ('#MyTempTable4');
Create table #MyTempTable2 (Col1 int);
insert #MyTempTable2 values (1);
Create table #MyTempTable4 (Col1 int);
Create table #MyTempTable3 (Col1 int);
SELECT *
FROM #MyTempTable M1
JOIN tempdb.sys.tables T ON T.name LIKE (M1.Table_Name + '%')
JOIN [tempdb].[dbo].[sysindexes] S ON S.id = T.object_id
WHERE S.rowcnt > 0
It's not an ideal solution, but it satisfies your requirements. If you play around with it in your environment, it might give you some insight into a better way to achieve your larger goals. good luck.
EDIT: sysindexes will have one entry per index on the table, or, in the case of my example, one for the heap (with no index). So if your base tables have multiple indexes, you will need to modify the query a bit; maybe change the JOIN and WHERE clause to WHERE EXISTS (SELECT * FROM [tempdb].[dbo].[sysindexes] S WHERE S.id = T.object_id AND S.rowcnt > 0), as sketched below. Play with it and you should be able to get where you're going.
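Spelled out, that modification might look like this (a sketch in the same spirit as the query above):
SELECT *
FROM #MyTempTable M1
JOIN tempdb.sys.tables T ON T.name LIKE (M1.Table_Name + '%')
WHERE EXISTS (SELECT * FROM [tempdb].[dbo].[sysindexes] S
WHERE S.id = T.object_id AND S.rowcnt > 0);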
EDIT 2: Replacing sys.tables with sysobjects.
SELECT *
FROM #MyTempTable M1
JOIN [tempdb].[dbo].[sysobjects] O ON O.name LIKE (M1.Table_Name + '%')
JOIN [tempdb].[dbo].[sysindexes] S ON S.id = O.id
WHERE S.rowcnt > 0
Based on DeadZone's query, the following works for non-temp tables:
SELECT DISTINCT Table_Name
INTO #TABLE_NAME
FROM #Temp M1
JOIN [dbo].[sysobjects] O ON O.name LIKE (M1.Table_Name + '%')
JOIN [dbo].[sysindexes] S ON S.id = O.id
WHERE S.rowcnt > 0
