Looping through all columns in a table - sql-server

I'm trying to find a way of evaluating all fields in many tables and returning the % of the data that is filled in. I need to look for specific things rather than just NULL.
So for instance in the client table, it would return all fields and say Client Name 45% completed,
Address 90% completed, etc.
I need to check for NULL, blank, UNCODED, and -1 against the row count.
Some of the tables have 30+ fields, which is why I think a loop may be best.
select
    cast(100 - ((select cast(count([ClientName]) as decimal(10,2))
                 from [dbo].[Client]
                 where [ClientName] is null
                    or [ClientName] = ''
                    or [ClientName] = 'UNCODED'
                    or [ClientName] = -1)
                /
                (select cast(count([ClientName]) as decimal(10,2))
                 from [dbo].[Client]
                 where [ClientName] is not null
                    or [ClientName] <> ''
                    or [ClientName] <> 'UNCODED'
                    or [ClientName] <> -1)) as decimal(10,2)) as '%Completed'
The below gets me the column names:
select
    c.column_id, c.name
from sys.columns c
inner join sys.objects o on c.object_id = o.object_id
where o.name = 'Client'
order by c.column_id
I'm new to SQL and trying to get my head round variables and loops but just not getting it.

You can build the dynamic SQL query you need using simple string concatenation from sys.columns; just don't try to add an ORDER BY to the concatenating query, as that behavior is undefined and in many cases will inexplicably leave out rows.
-- given these variables/parameters:
DECLARE @c int, @t nvarchar(511) = N'dbo.Client';

-- only proceed if this is actually an object (some protection from SQL injection):
IF OBJECT_ID(@t) IS NOT NULL
BEGIN
  -- we can get the row count from metadata instead of scanning the table an extra time:
  SELECT @c = SUM([rows]) FROM sys.partitions
    WHERE [object_id] = OBJECT_ID(@t) AND index_id IN (0,1);

  DECLARE @sql nvarchar(max) = N'SELECT [table] = N''' + @t + N'''',
    @col nvarchar(max) = N',' + CHAR(13) + CHAR(10) + N'[% $c$ complete] = '
      + N'CONVERT(decimal(5,2), 100.0*SUM(CASE WHEN $qc$ IS NULL '
      + N'OR $qc$ = SPACE(0) OR RTRIM($qc$) IN (''UNCODED'',''-1'') '
      + N'THEN 0 ELSE 1 END)/@c)';

  SELECT @sql += REPLACE(REPLACE(@col, N'$c$', name), N'$qc$', QUOTENAME(name))
    FROM sys.columns WHERE [object_id] = OBJECT_ID(@t);

  SELECT @sql += N' FROM @t;';
  SELECT @sql = REPLACE(@sql, N'@t', @t);

  PRINT @sql;
  --EXECUTE sys.sp_executesql @stmt = @sql, @params = N'@c int', @c = @c;
END
When you are satisfied the output looks like you expect, uncomment the EXECUTE. Note that you may have to filter out certain data types (I don't know what happens with XML, binary, or types like hierarchyid or geography, and don't have the energy to test those right now).
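For a hypothetical dbo.Client with just ClientName and Address columns, the printed statement would look roughly like this (illustrative only; the column list comes from your table):

SELECT [table] = N'dbo.Client',
[% ClientName complete] = CONVERT(decimal(5,2), 100.0*SUM(CASE WHEN [ClientName] IS NULL OR [ClientName] = SPACE(0) OR RTRIM([ClientName]) IN ('UNCODED','-1') THEN 0 ELSE 1 END)/@c),
[% Address complete] = CONVERT(decimal(5,2), 100.0*SUM(CASE WHEN [Address] IS NULL OR [Address] = SPACE(0) OR RTRIM([Address]) IN ('UNCODED','-1') THEN 0 ELSE 1 END)/@c)
FROM dbo.Client;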

Related

How to search for a value in a common column across multiple tables in a SQL Server database?

I have almost 1000 tables and most of them have a common column ItemNumber. How do I search across all the tables in the database for a value or list of values that exist in this common column, such as 350 or (350, 465)? The tables have different schemas.
Table A100

ItemNumber  Detail
230         Car
245         Plane

Table A1000

ItemNumber  ProductDescription
350         Pie
465         Cherry
This does not perform type checking, so you can get conversion errors if the target column is not the correct type. Also, this script uses LIKE; you would probably need to change that to a direct comparison.
SET NOCOUNT ON

DECLARE @ID NVARCHAR(100) = '2'
DECLARE @ColumnName NVARCHAR(100) = 'UserID'
DECLARE @Sql NVARCHAR(MAX) = N'CREATE TABLE #TempResults(TableName NVARCHAR(50), ColumnName NVARCHAR(50), ItemCount INT)'

SELECT
    @Sql = @Sql + N'INSERT INTO #TempResults SELECT * FROM (SELECT ''' + ST.Name + ''' AS TableName, ''' + C.Name + ''' AS ColumnName, COUNT(*) AS ItemCount FROM ' + ST.Name + ' WHERE ' + C.Name + '=' + @ID + ') AS X WHERE ItemCount > 0 '
FROM
    sys.columns C
    INNER JOIN sys.tables ST ON C.object_id = ST.object_id
WHERE
    C.Name LIKE '%' + @ColumnName + '%'

SET @Sql = @Sql + N'SELECT * FROM #TempResults'

EXEC sp_executesql @Sql
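One way to reduce those conversion errors (my assumption, not part of the original script) is to also join sys.types and only generate probes against compatibly typed columns:

-- Sketch: restrict the metadata query to integer-typed columns.
SELECT ST.Name AS TableName, C.Name AS ColumnName
FROM sys.columns C
INNER JOIN sys.tables ST ON C.object_id = ST.object_id
INNER JOIN sys.types T ON C.user_type_id = T.user_type_id
WHERE C.Name LIKE '%UserID%'
  AND T.name IN ('int', 'bigint', 'smallint', 'tinyint')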
You need to do this with dynamic SQL. You will need to query all 1000 tables, and make sure you are converting the values correctly if the columns are different types.
You don't need a temp table for this; you can just script one giant UNION ALL query. You must make sure to quote all dynamic names correctly using QUOTENAME.
To be able to return data for multiple items, you should create a Table Valued Parameter, which you can pass in using sp_executesql.
First create a table type
CREATE TYPE dbo.IntList AS TABLE (Id int PRIMARY KEY);
Then you create a table variable containing them, and pass it in. You can also do this in a client application and pass in a TVP.
SET NOCOUNT ON;

DECLARE @Items dbo.IntList;
INSERT @Items (Id) VALUES (350), (465);

DECLARE @Sql nvarchar(max);

SELECT
    @Sql = STRING_AGG(CONVERT(nvarchar(max), N'
SELECT
    ' + QUOTENAME(t.name, '''') + ' AS TableName,
    t.ItemNumber,
    COUNT(*) AS ItemCount
FROM ' + QUOTENAME(t.name) + ' t
JOIN @items i ON i.Id = t.ItemNumber
GROUP BY
    t.ItemNumber
HAVING COUNT(*) > 0
'),
    N'
UNION ALL
')
FROM
    sys.tables t
WHERE t.object_id IN (
    SELECT c.object_id
    FROM sys.columns c
    WHERE c.name = 'ItemNumber'
);

PRINT @Sql; -- your friend

EXEC sp_executesql
    @Sql,
    N'@items dbo.IntList READONLY',
    @items = @Items;
If you don't need the count and only want to know whether a value exists, you can change the dynamic SQL to an existence check:
....
SELECT
    @Sql = STRING_AGG(CONVERT(nvarchar(max), N'
SELECT
    ' + QUOTENAME(t.name, '''') + ' AS TableName,
    i.Id AS ItemNumber
FROM @items i
WHERE i.Id IN (
    SELECT t.ItemNumber
    FROM ' + QUOTENAME(t.name) + ' t
)
'),
    N'
UNION ALL
')
....

Update with dynamic tables

I have to write an update using dynamic SQL, because I only know the name of the column that I want to update and the names of the columns I will use to join tables in my update. But I don't know the number or the names of the tables. I will get the table names in a parameter of my procedure, in this way:
declare @Tables nvarchar(max) = N'Customer,Employee,Owner'
So I want to end up with an update like this:
update t
set [Status] = 100
from
    TemporaryTable t
    left join Customer t1 on t1.RecordId = t.RecordId
    left join Employee t2 on t2.RecordId = t.RecordId
    left join Owner t3 on t3.RecordId = t.RecordId
where
    t1.RecordId is null
    and t2.RecordId is null
    and t3.RecordId is null
I know that each table will have a column RecordId, and I want to left join these tables to my TemporaryTable on that column, but I don't know the names and number of the tables. For example I may have one, two, or ten tables with different names. I know the table names will be passed in the parameter @Tables in this way:
@Tables = N'Customer,Employee,Owner'
Is it possible to write this update in a dynamic way?
This answer addresses "... to write update using dynamic sql ..." and only shows how to generate the dynamic statement. It's based on string splitting. On SQL Server 2016+ you can use STRING_SPLIT() (the order of the substrings is not important here). For earlier versions you need to find a string splitting function.
T-SQL:
DECLARE @Tables nvarchar(max) = N'Customer,Employee,Owner'
DECLARE @join nvarchar(max) = N''
DECLARE @where nvarchar(max) = N''
DECLARE @stm nvarchar(max) = N''

SELECT
    @join = @join + CONCAT(
        N' LEFT JOIN ',
        QUOTENAME(s.[value]),
        N' t',
        ROW_NUMBER() OVER (ORDER BY (SELECT 1)),
        N' ON t',
        ROW_NUMBER() OVER (ORDER BY (SELECT 1)),
        N'.RecordId = t.RecordId'
    ),
    @where = @where + CONCAT(
        N' AND t',
        ROW_NUMBER() OVER (ORDER BY (SELECT 1)),
        N'.RecordId IS NULL'
    )
FROM STRING_SPLIT(@Tables, N',') s

SET @stm = CONCAT(
    N'UPDATE t SET [Status] = 100 ',
    N'FROM TemporaryTable t',
    @join,
    N' WHERE ',
    STUFF(@where, 1, 5, N'')
)

PRINT @stm
EXEC sp_executesql @stm
Notes:
One note that I think is important: consider passing the table names using a table-valued parameter, not as comma-separated text.
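A minimal sketch of that alternative, assuming a hypothetical type name dbo.TableNameList:

-- Hypothetical table type for passing table names (the type name is an assumption):
CREATE TYPE dbo.TableNameList AS TABLE (Name sysname PRIMARY KEY);

DECLARE @TableNames dbo.TableNameList;
INSERT @TableNames (Name) VALUES (N'Customer'), (N'Employee'), (N'Owner');
-- The generator above would then select FROM @TableNames
-- instead of FROM STRING_SPLIT(@Tables, N',').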
It seems like this will suit your needs, though I don't fully understand what you're trying to do. Here we're constructing the final SQL in two pieces (@s and @where) and then concatenating them into the final SQL at the end.
declare @Tables varchar(100) = N'Customer,Employee,Owner'

declare @tablenames table (tablename nvarchar(100))
insert @tablenames (tablename)
select value
from string_split(@Tables, ',');

declare @where varchar(max) = ''
declare @s varchar(max) = '
update t
set [Status] = 100
from TemporaryTable t'

select @s += '
left join ' + tablename + ' on ' + tablename + '.RecordId = t.RecordId'
     , @where += case when @where = '' then '' else ' and ' end + tablename + '.RecordId is null
'
from @tablenames

print @s + char(13) + ' where ' + @where
exec( @s + char(13) + ' where ' + @where)
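For the three example tables, the PRINT should emit something close to the statement the question asked for (here the joined tables are referenced by their own names rather than by t1/t2/t3 aliases):

update t
set [Status] = 100
from TemporaryTable t
left join Customer on Customer.RecordId = t.RecordId
left join Employee on Employee.RecordId = t.RecordId
left join Owner on Owner.RecordId = t.RecordId
 where Customer.RecordId is null
 and Employee.RecordId is null
 and Owner.RecordId is null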

How to insert with table creation

I am looking for a more appropriate way to execute several inserts into a nonexistent table.
To create the table beforehand is not easily possible, as I don't know the data type of the selected column.
An "insert with create" would do, but I don't think there is anything like that.
Is there any better way to do this than a SELECT INTO followed by INSERTs?
Here is the "bad" way I do it, in an example very much stripped down to demonstrate the problem.
set nocount on

declare
    @name sysname = '',
    @i int = 0,
    @sql nvarchar(4000) = ''

declare test cursor for
select top 10 a.name from sys.tables a inner join sys.columns b on a.object_id = b.object_id --and b.name = 'description'

open test
fetch next from test into @name
while (@@FETCH_STATUS <> -1)
begin
    if @i = 0 begin
        set @sql = 'select distinct top 10 description into #t1 from ' + @name + ''
        select @sql
        -- exec sp_executesql @sql
    end
    else begin
        set @sql = 'insert #t1 select distinct top 10 description from ' + @name + ''
        select @sql
        -- exec sp_executesql @sql
    end
    set @i = @i + 1
    fetch next from test into @name
end
close test
deallocate test

if object_id ('tempdb..#t1') is not null select * from #t1
This solution is "bad" because you need the statement in two places. In the case shown here that is trivial, but when the statement gets more complex it can become an issue.
You can simplify your query into this one:
set nocount on

declare
    @name sysname = '',
    @i int = 0,
    @sql nvarchar(4000) = N''

if object_id ('tempdb..#t1') is not null DROP TABLE #t1

;WITH cte AS (
    select top 10 a.[name]
    from sys.tables a
    inner join sys.columns b
        on a.object_id = b.object_id --and b.name = 'description'
)
SELECT @sql = @sql + N'UNION ALL
select distinct top 10 description
from ' + QUOTENAME([name]) + CHAR(13)
FROM cte

SELECT @sql = N';WITH cte AS (' + STUFF(@sql,1,10,N'') + N') SELECT * INTO #t1 FROM cte'

PRINT @sql
--EXEC (@sql)

select * from #t1
No cursor or while loop;
The temporary table is dropped (if it exists) before query execution;
You have a weird query, though: as it stands, it takes the first table from sys.tables and selects TOP 10 descriptions from it as many times as there are columns in that table, because of the join to sys.columns.
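For illustration, with two hypothetical tables T1 and T2 the PRINT above would emit roughly:

;WITH cte AS (select distinct top 10 description
from [T1]
UNION ALL
select distinct top 10 description
from [T2]
) SELECT * INTO #t1 FROM cte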
The SELECT INTO statement copies data from one table into a new table; this might help you.
Example:-
SELECT *
INTO newtable
FROM oldtable
WHERE condition
The above also supports joins.
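For instance (table and column names here are purely illustrative):

-- SELECT INTO over a join; the new table takes its column types from the join output.
SELECT o.*, d.Detail
INTO newtable
FROM oldtable o
INNER JOIN detailtable d ON d.Id = o.Id
WHERE d.Detail IS NOT NULL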

How do I validate a variable against multiple database tables

Does anyone know how to check a variable against all database tables with columns storing the same type of information? I have a poorly designed database that stores ssn in over 60 tables within one database. Some of the variations of columns in the various tables include:
app_ssn
ca_ssn
cand_ssn
crl_ssn
cu_ssn
emtaddr_ssn
re_ssn
sfcart_ssn
sfordr_ssn
socsecno
ssn
Ssn
SSN
I want to create a stored procedure that will accept a value and check it against every table that has 'ssn' in the name. Does anyone have an idea as to how to do this?
-- I assume that table/column names don't need to be surrounded by square braces. You may want to save matches in a table - I just select them. I also assume ssn is a char.
alter proc proc1
    @search1 varchar(500)
as
begin
    set nocount on
    declare @strsql varchar(500)
    declare @curtable sysname
    declare @prevtable sysname
    declare @column sysname

    select top 1 @curtable = table_schema+'.'+table_name, @column = column_name
    from INFORMATION_SCHEMA.COLUMNS
    where CHARINDEX('ssn',column_name) > 0
    order by table_schema+'.'+table_name + column_name

    -- make sure that at least one column has ssn in the column name
    if @curtable is not null
    begin
        while (1=1)
        begin
            set @strsql = 'select * from ' + @curtable + ' where ' + '''' + @search1 + '''' + ' = ' + @column
            print @strsql
            -- any matches for passed in ssn will match here...
            exec (@strsql)

            set @prevtable = @curtable + @column

            select top 1 @curtable = table_schema+'.'+table_name, @column = column_name
            from INFORMATION_SCHEMA.COLUMNS
            where CHARINDEX('ssn',column_name) > 0
            and table_schema+'.'+table_name + column_name > @prevtable
            order by table_schema+'.'+table_name + column_name

            -- when we run out of columns that contain ssn we are done...
            if @@ROWCOUNT = 0
                break
        end
    end
end
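Usage would look like this (the value is just an example):

EXEC proc1 @search1 = '123456789';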
What you will need to do is some research. But here is where you can start:
SELECT tbl.NAME AS TableName
,cl.NAME AS ColumnName
,IDENTITY(INT, 1, 1) AS ID
INTO #ColumnsToLoop
FROM sys.tables tbl
JOIN sys.columns cl ON cl.object_id = tbl.object_id
This will give you the table/column relation; then you can simply build a dynamic SQL string based on each row in the query above (basically loop it) and use EXEC or sp_executesql. So basically:
DECLARE @Loop int = (select min(ID) From #ColumnsToLoop), @MX int = (Select MAX(ID) From #ColumnsToLoop)
WHILE (@Loop <= @MX)
BEGIN
    DECLARE @SQL nvarchar(MAX) = 'SQL String'
    -- Construct the dynamic SQL string here
    EXEC(@SQL);
    SET @Loop += 1
END
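A hedged sketch of what that loop body might build, assuming the ssn columns are character-typed (the variable names and sample value are illustrative, not from the original answer):

-- Uses #ColumnsToLoop and @Loop from the skeleton above; @SearchSSN is hypothetical.
DECLARE @SearchSSN varchar(11) = '123456789';
DECLARE @Tbl sysname, @Col sysname;

SELECT @Tbl = TableName, @Col = ColumnName
FROM #ColumnsToLoop
WHERE ID = @Loop;

DECLARE @Stmt nvarchar(MAX) =
      N'SELECT ''' + @Tbl + N''' AS TableName FROM ' + QUOTENAME(@Tbl)
    + N' WHERE ' + QUOTENAME(@Col) + N' = @ssn';

-- Parameterize the searched value instead of concatenating it into the string:
EXEC sp_executesql @Stmt, N'@ssn varchar(11)', @ssn = @SearchSSN;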
Perhaps I went a little too crazy with this one, but let me know. I thought it would be best to return the primary key of the search results along with the table name, so you could join back to your tables. I also managed to do it without a single cursor or loop.
DECLARE @SSN VARCHAR(25) = '%99%',
        @SQL VARCHAR(MAX);

WITH CTE_PrimaryKeys
AS
(
    SELECT TABLE_CATALOG,
           TABLE_SCHEMA,
           TABLE_NAME,
           COLUMN_NAME
    FROM INFORMATION_SCHEMA.KEY_COLUMN_USAGE D
    WHERE OBJECTPROPERTY(OBJECT_ID(CONSTRAINT_NAME), 'IsPrimaryKey') = 1
),
CTE_Columns
AS
(
    SELECT A.*,
           CONCAT(A.TABLE_CATALOG,'.',A.TABLE_SCHEMA,'.',A.TABLE_NAME) AS FullTableName,
           CASE WHEN B.COLUMN_NAME IS NOT NULL THEN 1 ELSE 0 END AS IsPrimaryKey
    FROM INFORMATION_SCHEMA.COLUMNS A
    LEFT JOIN CTE_PrimaryKeys B
        ON A.TABLE_CATALOG = B.TABLE_CATALOG
        AND A.TABLE_SCHEMA = B.TABLE_SCHEMA
        AND A.TABLE_NAME = B.TABLE_NAME
        AND A.COLUMN_NAME = B.COLUMN_NAME
),
CTE_Select
AS
(
    SELECT
        'SELECT ' +
        --This returns the pk_col cast as varchar and the table name in another column
        STUFF((SELECT ',CAST(' + COLUMN_NAME + ' AS VARCHAR(MAX)) AS pk_col,''' + B.TABLE_NAME + ''' AS Table_Name'
               FROM CTE_Columns B
               WHERE A.TABLE_NAME = B.TABLE_NAME
                 AND B.IsPrimaryKey = 1
               FOR XML PATH ('')),1,1,'')
        + ' FROM ' + FullTableName +
        --This is where I list the columns that should be LIKE the desired SSN
        ' WHERE ' +
        STUFF((SELECT COLUMN_NAME + ' LIKE ''' + @SSN + ''' OR '
               FROM CTE_Columns B
               WHERE A.TABLE_NAME = B.TABLE_NAME
                 --This is where I filter so I only get desired columns
                 AND (
                      --Uncomment the COLLATE if your database is case sensitive
                      COLUMN_NAME /*COLLATE SQL_Latin1_General_CP1_CI_AS*/ LIKE '%ssn%'
                      --list your column names that don't have ssn in them
                      --OR COLUMN_NAME IN ('col1','col2')
                     )
               FOR XML PATH ('')),1,0,'') AS Selects
    FROM CTE_Columns A
    GROUP BY A.FullTableName, A.TABLE_NAME
)
--Union them all together and get rid of the last trailing "OR "
SELECT @SQL = COALESCE(@SQL,'') + SUBSTRING(Selects,1,LEN(Selects) - 3) + ' UNION ALL ' + CHAR(13) --new line for easier debugging
FROM CTE_Select
WHERE Selects IS NOT NULL

--Look at your code
SELECT SUBSTRING(@SQL,1,LEN(@SQL) - 11)
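To actually execute the generated statement rather than just look at it, reuse the same trimming expression (this assumes @SQL was built as above):

DECLARE @Final VARCHAR(MAX) = SUBSTRING(@SQL,1,LEN(@SQL) - 11);
EXEC (@Final);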

How can I inexpensively determine if a column contains only NULL records?

I have a large table with 500 columns and 100M rows. Based on a small sample, I believe only about 50 of the columns contain any values, and the other 450 contain only NULL values. I want to list the columns that contain no data.
On my current hardware, it would take about 24 hours to query every column (select count(1) from tab where col_n is not null)
Is there a less expensive way to determine that a column is completely empty/NULL?
What about this:
SELECT
    SUM(CASE WHEN column_1 IS NOT NULL THEN 1 ELSE 0 END) column_1_count,
    SUM(CASE WHEN column_2 IS NOT NULL THEN 1 ELSE 0 END) column_2_count,
    ...
FROM table_name
?
You can easily generate this query using the INFORMATION_SCHEMA.COLUMNS view.
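A sketch of that generation step, assuming SQL Server 2017+ for STRING_AGG and using the question's table name tab (adjust schema and table to taste):

-- Builds the SUM(CASE ...) query above from metadata.
DECLARE @sql nvarchar(MAX);

SELECT @sql = N'SELECT ' + STRING_AGG(CONVERT(nvarchar(MAX),
         N'SUM(CASE WHEN ' + QUOTENAME(COLUMN_NAME) + N' IS NOT NULL THEN 1 ELSE 0 END) AS '
       + QUOTENAME(COLUMN_NAME + N'_count')), N', ')
     + N' FROM dbo.tab'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = N'dbo' AND TABLE_NAME = N'tab';

PRINT @sql;
EXEC sys.sp_executesql @sql;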
EDIT:
Another idea:
SELECT MAX(column_1), MAX(column_2),..... FROM table_name
If the result contains a value, the column is populated. It should require only one table scan.
Try this one -
DDL:
IF OBJECT_ID ('dbo.test2') IS NOT NULL
DROP TABLE dbo.test2
CREATE TABLE dbo.test2
(
ID BIGINT IDENTITY(1,1) PRIMARY KEY
, Name VARCHAR(10) NOT NULL
, IsCitizen BIT NULL
, Age INT NULL
)
INSERT INTO dbo.test2 (Name, IsCitizen, Age)
VALUES
('1', 1, NULL),
('2', 0, NULL),
('3', NULL, NULL)
Query 1:
DECLARE
      @TableName SYSNAME
    , @ObjectID INT
    , @SQL NVARCHAR(MAX)

SELECT
      @TableName = 'dbo.test2'
    , @ObjectID = OBJECT_ID(@TableName)

SELECT @SQL = 'SELECT' + CHAR(13) + STUFF((
    SELECT CHAR(13) + ', [' + c.name + '] = ' +
        CASE WHEN c.is_nullable = 0
            THEN '0'
            ELSE 'CASE WHEN ' + totalrows +
                ' = SUM(CASE WHEN [' + c.name + '] IS NULL THEN 1 ELSE 0 END) THEN 1 ELSE 0 END'
        END
    FROM sys.columns c WITH (NOWAIT)
    CROSS JOIN (
        SELECT totalrows = CAST(MIN(p.[rows]) AS VARCHAR(50))
        FROM sys.partitions p
        WHERE p.[object_id] = @ObjectID
            AND p.index_id IN (0, 1)
    ) r
    WHERE c.[object_id] = @ObjectID
    FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, ' ') + CHAR(13) + 'FROM ' + @TableName

PRINT @SQL
EXEC sys.sp_executesql @SQL
Output 1:
SELECT
[ID] = 0
, [Name] = 0
, [IsCitizen] = CASE WHEN 3 = SUM(CASE WHEN [IsCitizen] IS NULL THEN 1 ELSE 0 END) THEN 1 ELSE 0 END
, [Age] = CASE WHEN 3 = SUM(CASE WHEN [Age] IS NULL THEN 1 ELSE 0 END) THEN 1 ELSE 0 END
FROM dbo.test2
Query 2:
DECLARE
      @TableName SYSNAME
    , @SQL NVARCHAR(MAX)

SELECT @TableName = 'dbo.test2'

SELECT @SQL = 'SELECT' + CHAR(13) + STUFF((
    SELECT CHAR(13) + ', [' + c.name + '] = ' +
        CASE WHEN c.is_nullable = 0
            THEN '0'
            ELSE 'CASE WHEN ' +
                'MAX(CAST([' + c.name + '] AS CHAR(1))) IS NULL THEN 1 ELSE 0 END'
        END
    FROM sys.columns c WITH (NOWAIT)
    WHERE c.[object_id] = OBJECT_ID(@TableName)
    FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 2, ' ') + CHAR(13) + 'FROM ' + @TableName

PRINT @SQL
EXEC sys.sp_executesql @SQL
Output 2:
SELECT
[ID] = 0
, [Name] = 0
, [IsCitizen] = CASE WHEN MAX(CAST([IsCitizen] AS CHAR(1))) IS NULL THEN 1 ELSE 0 END
, [Age] = CASE WHEN MAX(CAST([Age] AS CHAR(1))) IS NULL THEN 1 ELSE 0 END
FROM dbo.test2
Results:
ID Name IsCitizen Age
----------- ----------- ----------- -----------
0 0 0 1
Check whether indexing the column helps you get some performance improvement:
CREATE UNIQUE NONCLUSTERED INDEX IndexName ON dbo.TableName(ColumnName)
WHERE ColumnName IS NOT NULL;
GO
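With such a filtered index in place, a probe like this should be almost free (names follow the example above):

SELECT TOP (1) ColumnName
FROM dbo.TableName
WHERE ColumnName IS NOT NULL;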
See the question "SQL server query to get the list of columns in a table along with Data types, NOT NULL, and PRIMARY KEY constraints".
Run the SQL in its top answer and use the result to generate a new query like the one below.
Select ISNULL(column1,1), ISNULL(column2,1), ISNULL(column3,1) from table
You would not need to 'count' all of the 100M records. Backing out of the query with a TOP 1 as soon as you hit a column with a not-null value would save a lot of time while providing the same information.
500 columns?!
OK, the right answer to your question is: normalize your table.
Here's what is happening in the meantime:
You don't have an index on the column, so SQL Server has to do a full scan of your humongous table.
SQL Server will certainly read every row in full (that means every column, even if you're only interested in one).
And your rows are most likely over 8 KB... http://msdn.microsoft.com/en-us/library/ms186981%28v=sql.105%29.aspx
Seriously, normalize your table and, if needed, split it vertically (put "theme grouped" columns into separate tables, so you only read them when you need them).
EDIT: You can rewrite your query like this:
select count(col_n) from tab
and if you want to get all columns at once (better):
SELECT
COUNT(column_1) column_1_count,
COUNT(column_2) column_2_count,
...
FROM table_name
If most records are not null, maybe you can mix some of the approaches suggested (for example, checking only nullable fields) with this:
if exists (select * from table where field is not null)
This should speed up the search because EXISTS stops as soon as the condition is met; in this example a single not-null record is enough to decide the status of the field.
If the field has an index, this should be almost instant.
Normally, adding TOP 1 to this query is not needed, because the query optimizer knows that you do not need to retrieve all the matching records.
You can use this stored procedure to do the trick. You need to provide the name of the table you wish to query. Note that if you pass the @exec parameter as 1, the procedure will execute the SELECT it builds.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[SP_SELECT_NON_NULL_COLUMNS] ( @tablename varchar(100) = null, @exec int = 0)
AS BEGIN
    SET NOCOUNT ON

    IF @tablename IS NULL
        RAISERROR('CANT EXECUTE THE PROC, TABLE NAME IS MISSING',16 ,1)
    ELSE
    BEGIN
        IF OBJECT_ID('tempdb..#table') IS NOT NULL DROP TABLE #table

        DECLARE @i VARCHAR(max) = ''
        DECLARE @sentence VARCHAR(max) = ''
        DECLARE @SELECT VARCHAR(max)
        DECLARE @LocalTableName VARCHAR(50) = '[' + @tablename + ']'

        CREATE TABLE #table (ColumnName VARCHAR(max))

        SELECT @i +=
            ' IF EXISTS ( SELECT TOP 1 ' + column_name + ' FROM ' + @LocalTableName + ' WHERE ' + column_name +
            ' ' + 'IS NOT NULL) INSERT INTO #table VALUES (''' + column_name + ''');'
        FROM INFORMATION_SCHEMA.COLUMNS WHERE table_name = @tablename

        INSERT INTO #table
        EXEC (@i)

        SELECT @sentence = @sentence + ' ' + columnname + ' ,' FROM #table

        DROP TABLE #table

        IF @exec = 0
        BEGIN
            SELECT 'SELECT ' + LTRIM(LEFT(@sentence, NULLIF(LEN(@sentence)-1,-1)))
                + ' FROM ' + @LocalTableName
        END
        ELSE
        BEGIN
            SELECT @SELECT = 'SELECT ' + LTRIM(LEFT(@sentence, NULLIF(LEN(@sentence)-1,-1)))
                + ' FROM ' + @LocalTableName
            EXEC (@SELECT)
        END
    END
END
Use it like this:
EXEC [dbo].[SP_SELECT_NON_NULL_COLUMNS] 'YourTableName' , 1
