Querying a table with an additional column containing table name - sql-server

I am looking for a way to query a table and add a column containing the table name, without explicitly writing the actual 'tablename' within the SELECT statement. Is there a way to do this?
For example, I want:
Table name: Construction
The original columns would be Modif_num, modif_desc.
I'd like a query with these results:
MODIF_NUM TABLE_NAME MODIF_DESC
2 Construction Quality
2 Construction Quality
2 Construction Quality
2 Construction Quality
A regular SELECT * would yield:
MODIF_NUM MODIF_DESC
2 Quality
2 Quality
2 Quality
2 Quality

In this instance I would use Excel.
Column A: table name
Column B: ="select cast('"&A1&"' as nvarchar(50)) as tablename ,* into TARGETTABLE from "& A1
Then fill column A with all your table names, and copy and paste column B into SSMS.
Based on your comment, this assumes it is a one-off task. If it's not a one-off task, use the same logic to generate a batch of statements and execute them.
Ah, wait, sorry, you can't run SELECT ... INTO repeatedly into the same table, what am I thinking. It would be more like this:
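As a minimal sketch: the first generated statement would still be a SELECT ... INTO that creates TARGETTABLE, and every later one would be an INSERT ... SELECT instead. The second table name, Demolition, is just an example here, and this assumes all source tables share the same column layout.
-- First generated statement: creates TARGETTABLE and loads the first table.
select cast('Construction' as nvarchar(50)) as tablename ,* into TARGETTABLE from Construction
-- Every later statement appends to the table that now exists.
insert into TARGETTABLE
select cast('Demolition' as nvarchar(50)) as tablename ,* from Demolition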

In your select statement you can return a column based on a string, for example:
SELECT 'Construction' As Table_Name, MODIF_NUM FROM MyTable
OR
SELECT 'Construction' As Table_Name, * FROM MyTable
To bring them together, a UNION may work:
SELECT 'Construction' As Table_Name, MODIF_NUM, MODIF_DESC FROM tblConstruction
UNION ALL
SELECT 'Demolition' As Table_Name, MODIF_NUM, MODIF_DESC FROM tblDemolition
UNION ALL
SELECT 'Reconstruction' As Table_Name, MODIF_NUM, MODIF_DESC FROM tblReconstruction
Does this help?

Try this query:
SELECT TABLE_NAME, a.*
FROM [Construction] a,
INFORMATION_SCHEMA.TABLES
WHERE TABLE_NAME = 'Construction'
This cross joins the table with INFORMATION_SCHEMA.TABLES and filters that view down to the single row for Construction, so TABLE_NAME is attached to every returned row without appearing in the SELECT list itself.

Related

Is there a way to search all SQL tables by column name?

In this answer, you can search all tables for a column by column name.
Say I have a list of columns like this:
DECLARE @columnNames TABLE (Id varchar(30))
INSERT INTO @columnNames
VALUES ('xColumn1Name'), ('xColumn2Name'), ('xColumn3Name')
I want to find all tables that have at least these three columns. Is it possible to do a foreach loop with the code below, or is there a simpler way?
SELECT
COLUMN_NAME AS 'ColumnName', -- this code will get all tables with a column by name @xColumnName, but I would like to pass in a list
TABLE_NAME AS 'TableName'
FROM
INFORMATION_SCHEMA.COLUMNS
WHERE
COLUMN_NAME LIKE '@xColumnName'
ORDER BY
TableName, ColumnName;
The table must have all 3 columns named in the list, and it would be nice if I could also filter out tables that do not have a certain column or list of columns.
This is a relational division question. There are a few methods to solve it, as Joe Celko writes. The common solution is as follows:
DECLARE @columnNames TABLE (Id varchar(30))
INSERT INTO @columnNames
VALUES ('xColumn1Name'), ('xColumn2Name'), ('xColumn3Name')
select t.name
from sys.tables t
join sys.columns c on c.object_id = t.object_id
join @columnNames cn on cn.Id = c.name
group by t.object_id, t.name
having count(*) >=
(select count(*) from @columnNames);
What this says is: give me all tables where the number of columns that match the list @columnNames is at least the number of entries in that list; in other words, there is a match for every column.
This should achieve your initial goal.
SELECT
[TableName]
FROM (
SELECT
COLUMN_NAME AS 'ColumnName', -- this code will get all tables with a column by name @xColumnName, but I would like to pass in a list
TABLE_NAME AS 'TableName',
ROW_NUMBER() OVER(PARTITION BY TABLE_NAME ORDER BY COLUMN_NAME) rn
FROM
INFORMATION_SCHEMA.COLUMNS
WHERE
COLUMN_NAME IN ('xColumn1Name', 'xColumn2Name', 'xColumn3Name')
) a
WHERE rn >= 3
For a short explanation: this query looks through the information schema to find any of these columns in a table. ROW_NUMBER() then numbers the matching columns within each table. If there are 3 or more results (rn), then all 3 columns are present.
Since it is a subselect, you can also filter the outer select for particular columns if you want.
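Equivalently, the same "all three columns present" check can be written as a grouped count over the information schema; a sketch using the same hypothetical column names:
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME IN ('xColumn1Name', 'xColumn2Name', 'xColumn3Name')
GROUP BY TABLE_NAME
HAVING COUNT(DISTINCT COLUMN_NAME) = 3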
You could use INTERSECT to combine different result sets. This will give the records that are in all result sets, so in this case, the tables that have all three columns.
SELECT OBJECT_NAME(object_id) AS [Table]
FROM sys.columns
WHERE name = 'xColumn1Name'
INTERSECT
SELECT OBJECT_NAME(object_id) AS [Table]
FROM sys.columns
WHERE name = 'xColumn2Name'
INTERSECT
SELECT OBJECT_NAME(object_id) AS [Table]
FROM sys.columns
WHERE name = 'xColumn3Name'

GROUP BY T1.*? Group by all columns in Table1, left joined to Table2, with aggregate functions on T2 columns?

I have a query that merges 2 tables. Table 1 has many columns and may eventually expand. Table 2 also has several columns, but I will be performing aggregate functions on 90% of its columns. Table 1 has 300+ rows, Table 2 has 84K+ rows.
SELECT
t1.*
,t2.c2
,SUM(t2.c3)
,SUM(t2.c4)
FROM
Table1 AS t1
LEFT JOIN Table2 AS t2 ON t1.c10 = t2.c1
GROUP BY
t1.*
,t2.c2
I'm getting the error "Incorrect syntax near '*'", and it points to the line containing the GROUP BY clause.
I am aware that SELECT t1.* works, as I ran that portion before trying to aggregate the T2 columns and it behaved as expected.
Is there a way to quickly GROUP BY all the columns in T1? I know normally we would select only the needed columns, but in this case I need all the T1 columns.
Previous research has only turned up instances where one table was used, mostly by people looking to find or remove duplicate values. I'm looking to combine the 300 records of T1 with the 84K records of T2 without having to name every column from T1 in the GROUP BY section.
This method is slightly unconventional, but you can build the column list into a variable using dynamic SQL. Below is an example of how you can do it:
declare @test nvarchar(max)
set @test = ''
select @test += Column_name +',' from information_schema.columns where table_name='Table1'
DECLARE @sql nvarchar(max)
SELECT @sql = N'SELECT top 10 ' +@test+ 'NULL as a FROM Table1;'
EXEC sp_executesql @sql
You can apply the same principle and rewrite your query to build the GROUP BY list. Hope this helps.
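Applied to the original query, the same idea might look roughly like the sketch below; it only uses the hypothetical column names from the question (c1, c2, c3, c4, c10) and is not a tested implementation.
-- Build a comma-separated, t1-prefixed list of every column in Table1.
DECLARE @cols nvarchar(max) = ''
SELECT @cols += 't1.' + QUOTENAME(COLUMN_NAME) + ',' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'Table1'
SET @cols = LEFT(@cols, LEN(@cols) - 1) -- drop the trailing comma
-- Reuse the list in both the SELECT and the GROUP BY of the dynamic statement.
DECLARE @sql nvarchar(max) = N'SELECT ' + @cols + N', t2.c2, SUM(t2.c3), SUM(t2.c4)
FROM Table1 AS t1
LEFT JOIN Table2 AS t2 ON t1.c10 = t2.c1
GROUP BY ' + @cols + N', t2.c2;'
EXEC sp_executesql @sql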
Based on the article posted by @wosi, https://dba.stackexchange.com/questions/21226/why-do-wildcards-in-group-by-statements-not-work, I was able to modify the code and get the expected results. Please note I went from 80K to 70K rows because I was joining the tables on one column; the way my data was structured, I had to join on two columns. The final code looks something like this:
SELECT
t1.*
,t2.c2
,t2.c3
,t2.c4
FROM
Table1 AS t1
LEFT JOIN
(SELECT c1, c2, SUM(c3) AS c3, SUM(c4) AS c4
FROM Table2
GROUP BY c1, c2) AS t2
ON t1.c10 = t2.c1 AND t1.c15 = t2.c2
You can't use * in a GROUP BY clause. There are dynamic SQL workarounds to avoid typing all the columns in a stored procedure, but if you are writing T-SQL in a view you have to list every column.

Get the data type of columns in a select query

I know that you can get the type of a table's columns using the query below.
select COLUMN_NAME, DATA_TYPE
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = 'myTbl'
I was wondering, when you write a select query that involves 2 or more tables, whether you can do something similar. I.e. is there a way to determine the column data types of the result?
Just another option is sys.dm_exec_describe_first_result_set()
The nice thing about this is you can supply virtually any query, table, view, or even a stored procedure.
Example
Select column_ordinal
,name
,system_type_name
From sys.dm_exec_describe_first_result_set('Exec [dbo].[prc-App-Lottery-Search] ''7613''',null,null )
Returns
column_ordinal name system_type_name
1 DrawDate date
2 DrawDE varchar(1)
3 DrawAct varchar(4)
4 DrawNrm varchar(4)
5 Hits int
6 Elapsed nvarchar(4000)
Generic Example to see all columns available
Select *
From sys.dm_exec_describe_first_result_set('Select * from master..spt_values',null,null )
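To cover the multi-table case in the question, you can pass the joining query itself as the first argument. A small sketch, reusing the question's myTbl1 and myTbl2 and an assumed join column id (made up for the example):
Select name
,system_type_name
From sys.dm_exec_describe_first_result_set(N'Select a.*, b.* From myTbl1 a Join myTbl2 b On a.id = b.id', null, null)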
You can do this:
select TABLE_NAME, COLUMN_NAME, DATA_TYPE
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME IN ('myTbl1','myTbl2')
I used the statement below to see all the details about the columns in a table (shorter than selecting the metadata columns yourself):
EXEC sp_help 'tableName';

What is the easiest and most optimized way to find a specific value in database tables?

As per my requirement, I have to find which tables and columns contain some value like xyz@test.com. The database is very large, with more than 2,500 tables.
Can anyone please suggest an optimal way to find this type of value in the database? I created a loop query, and it took more than 9 hours to run.
9 hours is clearly a long time. Furthermore, 2,500 tables seems close to insanity to me.
Here is one approach that will run one query per table, not one per column. Now, I have no idea how this will perform against 2,500 tables; I suspect it may be horrible. That said, I would strongly suggest a test filter first, like Table_Name like 'OD%'.
Example
Declare @Search varchar(max) = 'cappelletti' -- Exact match '"cappelletti"'
Create Table #Temp (TableName varchar(500),RecordData xml)
Declare @SQL varchar(max) = ''
Select @SQL = @SQL+ ';Insert Into #Temp Select TableName='''+concat(quotename(Table_Schema),'.',quotename(table_name))+''',RecordData = (Select A.* for XML RAW) From '+concat(quotename(Table_Schema),'.',quotename(table_name))+' A Where (Select A.* for XML RAW) like ''%'+@Search+'%'''+char(10)
From INFORMATION_SCHEMA.Tables
Where Table_Type ='BASE TABLE'
and Table_Name like 'OD%' -- **** Would REALLY Recommend a REASONABLE Filter *** --
Exec(@SQL)
Select A.TableName
,B.*
,A.RecordData
From #Temp A
Cross Apply (
Select ColumnName = a.value('local-name(.)','varchar(100)')
,Value = a.value('.','varchar(max)')
From A.RecordData.nodes('/row') as C1(n)
Cross Apply C1.n.nodes('./@*') as C2(a)
Where a.value('.','varchar(max)') Like '%'+#Search+'%'
) B
Drop Table #Temp
If it helps, the individual generated queries would look like this:
Select TableName='[dbo].[OD]'
,RecordData= (Select A.* for XML RAW)
From [dbo].[OD] A
Where (Select A.* for XML RAW) like '%cappelletti%'
On a side-note, you can search numeric data and even dates.
Make a procedure that collects the table names and VARCHAR column names from the system tables into a temp table.
Then build a dynamic query that loops over each record, comparing the column with an equality condition against the email address passed in as an input parameter.
If the condition matches in any statement via an IF EXISTS check, store that table name and column name in another temp table, and retrieve the list of records from that temp table at the end of the execution.
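A rough sketch of the loop this describes, assuming only exact matches in varchar/nvarchar columns are of interest; the names #Candidates, #SearchMatches, and @SearchValue are made up for illustration.
DECLARE @SearchValue nvarchar(200) = N'xyz@test.com'

-- Candidate varchar/nvarchar columns pulled from the catalog (base tables only).
CREATE TABLE #Candidates (Id int IDENTITY(1,1), SchemaName sysname, TableName sysname, ColumnName sysname)
INSERT INTO #Candidates (SchemaName, TableName, ColumnName)
SELECT c.TABLE_SCHEMA, c.TABLE_NAME, c.COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS c
JOIN INFORMATION_SCHEMA.TABLES t ON t.TABLE_SCHEMA = c.TABLE_SCHEMA AND t.TABLE_NAME = c.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE' AND c.DATA_TYPE IN ('varchar', 'nvarchar')

-- Matching table/column pairs end up here.
CREATE TABLE #SearchMatches (TableName sysname, ColumnName sysname)

DECLARE @i int = 1, @max int, @sql nvarchar(max)
SELECT @max = MAX(Id) FROM #Candidates
WHILE @i <= @max
BEGIN
    -- One IF EXISTS probe per column, built and run dynamically.
    SELECT @sql = N'IF EXISTS (SELECT 1 FROM ' + QUOTENAME(SchemaName) + N'.' + QUOTENAME(TableName)
                + N' WHERE ' + QUOTENAME(ColumnName) + N' = @val) '
                + N'INSERT INTO #SearchMatches VALUES (''' + TableName + N''', ''' + ColumnName + N''')'
    FROM #Candidates
    WHERE Id = @i

    EXEC sp_executesql @sql, N'@val nvarchar(200)', @val = @SearchValue
    SET @i += 1
END

SELECT TableName, ColumnName FROM #SearchMatches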

Display the table name in the select statement

I need to display the table name in the select statement. How?
Exact question:
We have common columns in two tables. We are displaying the records by using:
select column_name from table_name_1
union
select column_name from table_name_2
But the requirement is that we need to display the source table_name along with the data.
Consider that a and c are present in table_1, and b and d are present in table_2.
We need the output in the following way, e.g.:
column_name table_name
a table_1
b table_2
c table_1
d table_2
Is this possible?
select 'table1' as table_name, * from table1
union
select 'table2' as table_name, * from table2
