SQL Server special characters significance in fetch sp

In the following fetch stored procedure for SQL Server 2008 R2, what do the /*v*/ markers around the column names in the SELECT statement built into @Object signify?
CREATE Proc Pr_FetchPatientIssueItemsMis
As
Begin
Declare @Object varchar(MAX)
SET NOCOUNT ON
Set @Object= '(Select '
+ 'ISD.[' + /*v*/'SalesID'/*v*/ + '] SalesID,'
+ 'ISD.[' + /*v*/'ItemID'/*v*/ + '] ItemID,'
+ 'V.DisplayName ItemName,'
+ 'V.ItemCode ItemCode,'
+ 'v.CategoryID CategoryID,'
+ 'v.Category Category'
+ ' From Dbo.[' + /*v*/'V_ItemPatientIssues'/*v*/ + ']ISD'
+ ' Inner Join Dbo.[v_items]V ON ISD.[' + /*v*/'ItemID'/*v*/ + ']= V.ItemID'
+ ')OBJ'
EXEC('Select * from (Select * From ' + @Object +')XYZ')
Return 0
End

They are in-line comments. They don't do anything. The only thing that they could symbolize would be some documentary meaning that was totally up to whoever wrote it.
My off-hand guess would be that whoever wrote this uses the procedure as a template to generate other procedures, by searching for pairs of /*v*/ markers and replacing whatever sits between them with different table/column names.
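A quick way to convince yourself that the markers are inert: the two assignments below (a standalone snippet, not part of the original procedure) build exactly the same string.
DECLARE @a varchar(100) = 'ISD.[' + /*v*/'SalesID'/*v*/ + '] SalesID,';
DECLARE @b varchar(100) = 'ISD.[' + 'SalesID' + '] SalesID,';
SELECT CASE WHEN @a = @b THEN 'identical' ELSE 'different' END AS Comparison; -- returns 'identical'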

Related

How to find the list of tables which has new set of records impacted in SQL?

I am working on exporting data from one environment to another. I want to select the list of tables that have had new records inserted or existing records modified.
The database has around 200 tables, and if only 10 of them have had records impacted since yesterday, I want to filter down to just those tables. Some of these tables do not have a createdate column, so it is hard to identify the record differences with a plain SELECT query against the table.
And, if possible, I'd also like to get only the newly impacted records from the identified tables.
I tried this query, but it does not return the actual tables.
select * from sysobjects where id in (
select object_id
FROM sys.dm_db_index_usage_stats
WHERE last_user_update > getdate() - 1 )
If you've not got a timestamp or something to identify newly changed records, such as auditing, triggers, or Change Data Capture enabled on those tables, it's pretty much impossible to do.
However, reading your scenario, would it not be possible to ignore what has changed or been modified and simply export all 200 tables from one environment to the other, overwriting them at the destination?
If not, then you might only be interested in comparing data rather than identifying newly changed records, so that you can at least see which tables did not match. You can do that using EXCEPT.
Below is an example that compares two databases with the same table names and schemas. It builds a dynamic SQL statement column using EXCEPT across both databases on the fly, runs each statement in a while loop, and inserts the name of each table that was affected into a temp table.
DECLARE @Counter AS INT
    , @Query AS NVARCHAR(MAX)

IF OBJECT_ID('tempdb..#CompareRecords') IS NOT NULL DROP TABLE #CompareRecords
IF OBJECT_ID('tempdb..#TablesNotMatched') IS NOT NULL DROP TABLE #TablesNotMatched

CREATE TABLE #TablesNotMatched (ObjectName NVARCHAR(200))

SELECT
      ROW_NUMBER() OVER( ORDER BY (SELECT 1)) AS RowNr
    , t.TABLE_CATALOG
    , t.TABLE_SCHEMA
    , t.TABLE_NAME
    , Query = 'IF' + CHAR(13)
        + '(' + CHAR(13)
        + ' SELECT' + CHAR(13)
        + ' COUNT(*) + 1' + CHAR(13)
        + ' FROM' + CHAR(13)
        + ' (' + CHAR(13)
        + ' SELECT ' + QUOTENAME(t.TABLE_NAME, '''') + ' AS TableName, * FROM ' + QUOTENAME(t.TABLE_CATALOG) + '.' + QUOTENAME(t.TABLE_SCHEMA) + '.' + QUOTENAME(t.TABLE_NAME) + CHAR(13)
        + ' EXCEPT' + CHAR(13)
        + ' SELECT ' + QUOTENAME(t.TABLE_NAME, '''') + ' AS TableName, * FROM ' + QUOTENAME(t2.TABLE_CATALOG) + '.' + QUOTENAME(t.TABLE_SCHEMA) + '.' + QUOTENAME(t.TABLE_NAME) + CHAR(13)
        + ' ) AS sq' + CHAR(13)
        + ') > 1' + CHAR(13)
        + 'SELECT ' + QUOTENAME(QUOTENAME(t.TABLE_CATALOG) + '.' + QUOTENAME(t.TABLE_SCHEMA) + '.' + QUOTENAME(t.TABLE_NAME), '''') + ' AS TableNameRecordsNotMatched'
INTO #CompareRecords
FROM <UAT_DATABASE>.INFORMATION_SCHEMA.TABLES AS t
    LEFT JOIN <PROD_DATABASE>.INFORMATION_SCHEMA.TABLES AS t2 ON t.TABLE_SCHEMA = t2.TABLE_SCHEMA
        AND t.TABLE_NAME = t2.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE'

SET @Counter = (SELECT MAX(RowNr) FROM #CompareRecords)

WHILE @Counter > 0
BEGIN
    SET @Query = (SELECT cr.Query FROM #CompareRecords AS cr WHERE cr.RowNr = @Counter)

    INSERT INTO #TablesNotMatched
    EXECUTE sp_executesql @Query

    SET @Counter = @Counter - 1
END

SELECT *
FROM #TablesNotMatched
Note that when using EXCEPT, both tables must have exactly the same column count and compatible column types.
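For illustration, a minimal EXCEPT comparison might look like this (UAT_DB, PROD_DB and dbo.Customer are placeholder names, not taken from the question):
SELECT * FROM UAT_DB.dbo.Customer
EXCEPT
SELECT * FROM PROD_DB.dbo.Customer
-- Returns the rows in UAT_DB's copy that have no identical row in PROD_DB's copy;
-- both SELECTs must produce the same number of columns with comparable types.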
I hope this slightly helps.

Get data into Excel from SQL Server datasource depending on values in Excel sheet

Let's say in my Excel sheet, I have the following "table":
ZipCodes |
---------+
4059 |
5806 |
4529 |
I need to get the data from a SQL Server view WHERE column ZipCodes has one of the values in my Excel table.
I want the user to be able to add as many ZipCodes as he wants, while not adjusting the query manually.
I've looked at this question but I cannot get it working.
I'm stuck with passing a list of values to the SQL query. It works if I pass a single ZipCode.
Something like this would help me:
WHERE ZipCode IN (SELECT * FROM [sheet2$a1:a2])
but that's breaking the query.
One solution here would be to use OPENROWSET. After discussion, it appears that the location of the Excel file can and will change; this is fine, but it means we need to build a dynamic SQL solution, as OPENROWSET requires literal strings for its parameters (no variables).
Firstly, I don't know what version of the ACE drivers you have installed. As a result, in my code I am using the drivers I have installed, which are the 2010 drivers (version 12). If that's not the version you're using, you'll need to change the value that I comment.
Normally, an OPENROWSET query might look like this:
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', --This declares the Driver. You may need to change this
'Excel 8.0;HDR=YES;;Database=\\YourFileServer\yourShare\YourFile.xlsx',
'SELECT *
FROM [Sheet1$A1:G];');
This'll return all rows from columns A through to G, starting at Row 2. Row 1 will be assumed to contain the Header (HDR=YES); if you don't have headers use HDR=NO.
The problem is that we can't pass a variable here, so we need to do something more dynamic. That gets you something along the lines of:
DECLARE @File nvarchar(500); --This would be your parameter
SET @File = N'\\YourFileServer\yourShare\YourFile.xlsx';

DECLARE @SQL nvarchar(MAX);
SET @SQL = N'SELECT *' + NCHAR(10) +
           N'FROM OPENROWSET(''Microsoft.ACE.OLEDB.12.0'',' + NCHAR(10) +
           N' ' + QUOTENAME(N'Excel 8.0;HDR=YES;Database=' + @File,N'''') + N',' + NCHAR(10) +
           N' ''SELECT *' + NCHAR(10) +
           N'   FROM [Sheet1$A1:G];'');';

PRINT @SQL;
EXEC sp_executesql @SQL;
Now, finally, you want to use this data against your table/view. This might look something like the following (assuming your view is called Customer_vw, the data in the Excel file is in column A, and the column in both datasets is called ZipCode):
DECLARE @File nvarchar(500); --This would be your parameter
SET @File = N'\\YourFileServer\yourShare\YourFile.xlsx';

DECLARE @SQL nvarchar(MAX);
SET @SQL = N'WITH ExcelZips AS (' + NCHAR(10) +
           N'    SELECT ZipCode' + NCHAR(10) +
           N'    FROM OPENROWSET(''Microsoft.ACE.OLEDB.12.0'',' + NCHAR(10) +
           N'         ' + QUOTENAME(N'Excel 8.0;HDR=YES;Database=' + @File,N'''') + N',' + NCHAR(10) +
           N'         ''SELECT *' + NCHAR(10) +
           N'           FROM [Sheet1$A1:A];''))' + NCHAR(10) +
           N'SELECT [YourColumns]' + NCHAR(10) +
           N'FROM Customer_vw C' + NCHAR(10) +
           N'     JOIN ExcelZips EZ ON C.ZipCode = EZ.ZipCode --Note that EZ.ZipCode will not show in intellisense' + NCHAR(10) +
           N'WHERE ...;'; --You'll need to complete the WHERE here, and add any ORDER BY etc.

PRINT @SQL;
EXEC sp_executesql @SQL;
Note that I have PRINT statements in the queries. These are your friends. I would personally suggest commenting out the EXEC statements first and just using the PRINT statements. Check that the output of the PRINT looks correct. If it does, run it; if you get an error, troubleshoot the printed output rather than the dynamic SQL. Once you've fixed the non-dynamic SQL, propagate the changes back into the dynamic SQL.
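As a minimal, self-contained illustration of that workflow (the literal statement below is just a stand-in for the dynamic SQL built above):
DECLARE @SQL nvarchar(MAX) = N'SELECT 1 AS SanityCheck;'; -- stand-in for the generated statement
PRINT @SQL;                 -- inspect the generated statement first
--EXEC sp_executesql @SQL;  -- re-enable once the printed SQL runs cleanly on its own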
Hopefully that explains everything. If you have any questions, please do ask.

Cursor in T-SQL doesn't work

I am using a cursor to randomly copy data from one table (Family_tree) to two other tables (Family_tree1 + @time and Family_tree2 + @time). The code executes successfully, but no rows are inserted, even though there IS data to copy in the source table.
I am using Microsoft SQL Server Management Studio. Here's the part of the script with the cursor:
---creating two tables beforehand
-- (@time and @sqlString are declared and set earlier in the script, omitted here)
DECLARE @random int
DECLARE
    @first_name nvarchar(20),
    @last_name AS nvarchar(20),
    @date_of_birth AS nchar(10),
    @date_of_death AS nchar(10),
    @place_of_death AS nvarchar(30),
    @credit_card AS nchar(16),
    @ID_member AS int,
    @FK_gender AS nchar(3)

DECLARE curs CURSOR FOR
    SELECT first_name, last_name, date_of_birth, date_of_death, place_of_death, credit_card, ID_member, FK_gender
    FROM Family_tree

OPEN curs
FETCH NEXT FROM curs INTO @first_name, @last_name, @date_of_birth, @date_of_death, @place_of_death, @credit_card, @ID_member, @FK_gender

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @random = RAND() * 1000000

    IF @random % 2 = 1
    BEGIN
        SET @sqlString = 'INSERT INTO [Family_tree1' + @time + '] (first_name, last_name, date_of_birth, date_of_death, place_of_death, credit_card, ID_member, FK_gender)
        VALUES ('
        + @first_name + ',' + @last_name + ',' + @date_of_birth + ',' + @date_of_death + ',' + @place_of_death + ',' + @credit_card + ','
        + CAST(@ID_member AS nvarchar) + ',' + @FK_gender + ')'
    END
    ELSE
    BEGIN
        SET @sqlString = 'INSERT INTO [Family_tree2' + @time + '] (first_name, last_name, date_of_birth, date_of_death, place_of_death, credit_card, ID_member, FK_gender)
        VALUES (' + @first_name + ',' + @last_name + ',' + @date_of_birth + ',' + @date_of_death + ',' + @place_of_death + ',' + @credit_card + ','
        + CAST(@ID_member AS nvarchar) + ',' + @FK_gender + ')'
    END

    EXECUTE(@sqlString)

    FETCH NEXT FROM curs INTO @first_name, @last_name, @date_of_birth, @date_of_death, @place_of_death, @credit_card, @ID_member, @FK_gender
END

CLOSE curs
DEALLOCATE curs
END;
I am new to T-SQL and will appreciate any advice!
Thank You in advance (:
If you "have" to do it in the cursor, then you need to watch out for nulls. You also need to watch out for the fact that you're using strings and these should be quoted when inserted into a VALUES clause.
Instead of
VALUES (' + @first_name + ',
You need something like:
VALUES (' + COALESCE('''' + REPLACE(@first_name,'''','''''') + '''','NULL') + ',
And so on for the rest of your values. This replaces any single quotes within the value with doubled-up quotes, then wraps the whole string in single quotes. NULLs survive through all of that processing, so we then also use COALESCE to replace the NULL with a NULL literal in the eventual string.¹
Before running the cursor in anger, I'd suggest you do this for one row and print the string rather than executing it, to check that it "looks right".
I'd also suggest you look into using better data types - dates of birth/death would be much better dealt with as actual date values rather than strings.
¹ Guido suggested ISNULL in the comments, which is similar to COALESCE but has some odd limitations, and I'd usually recommend against using it. They also suggested that the replacement should be an empty string, but here that would result in VALUES(... ,, ...) at the position of the NULL value, which would generate an error.
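To make the quoting concrete, here is a small standalone sketch with made-up values, showing what that expression produces for a value containing a quote and for a NULL:
DECLARE @first_name nvarchar(20) = N'O''Brien'; -- value containing a single quote
DECLARE @last_name  nvarchar(20) = NULL;        -- missing value

SELECT 'VALUES ('
     + COALESCE('''' + REPLACE(@first_name, '''', '''''') + '''', 'NULL') + ','
     + COALESCE('''' + REPLACE(@last_name,  '''', '''''') + '''', 'NULL')
     + ')'
-- Produces: VALUES ('O''Brien',NULL)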

T-sql and json and XML

Does anybody know how to get intellisense working in query windows in SSMS for JSON methods?
I'm just getting started with querying remote databases, and for some reason when I attempt to type methods such as isjson() in the T-SQL editor, none of the functions show up.
Do I need to import any packages?
You can produce JSON-formatted output by creating this procedure:
CREATE PROCEDURE [dbo].[sp_query_json]
    @query nvarchar(4000)
AS
BEGIN
    SET NOCOUNT ON;

    Declare @query_adjust nvarchar(max)
    set @query_adjust = 'select ' + '''['''
        + ' + replace(replace(replace(replace(replace(('
        + @query + ' for xml raw ),'
        + '''<row ''' + ',' + '''{"''' + '),'
        + '''/>''' + ',' + '''},''' + '),'
        + '''" ''' + ',' + '''","''' + '),'
        + '''="''' + ',' + '''":"''' + ') + '
        + ''']''' + ','
        + ''',]''' + ',' + ''']''' + ')'

    EXECUTE sp_executesql @query_adjust
END
You can test it with a query like this:
EXECUTE [dbo].[sp_query_json] 'select id,name from table'
This runs the query against the real table or view, converts the result to XML using FOR XML RAW, and then replaces the XML tags with the corresponding JSON punctuation.
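For illustration, assuming a hypothetical table dbo.Person with two rows, the call and its expected output would look roughly like this:
-- Hypothetical table purely for illustration
CREATE TABLE dbo.Person (id int, name nvarchar(50));
INSERT INTO dbo.Person VALUES (1, N'Ann'), (2, N'Bob');

EXECUTE [dbo].[sp_query_json] 'select id, name from dbo.Person';
-- Returns a single string column along the lines of:
-- [{"id":"1","name":"Ann"},{"id":"2","name":"Bob"}]
Note that every value comes back as a JSON string, because FOR XML RAW renders all attribute values in quotes.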

SSMS: Create a table script from results of a procedure

I have a dynamic SQL query. I'd like to keep the result in a temp table, but the query returns over 100 fields, so I'd prefer not to type the CREATE TABLE script manually.
Is it possible in SQL Server Management Studio to script the results returned by the procedure into a CREATE TABLE script?
So let's say that in a new query panel I execute my script and the results are returned below it. The option I'm looking for is something like right-clicking the results and selecting "Script into a table", or something similar to what you get when you select "Edit top 200 rows" on a table, where in the returned results you can right click -> Pane -> SQL and edit the SQL that generated the results.
Is anything like that possible in SSMS for the results of a procedure? If external tools are required, are there any that are free? I'm using SQL Server 2012 and SSMS 2012, and some tools that were available for previous versions are licensed for 2012.
Below is a mocked-up version of the core code calling the procedure:
exec dbo.spDynamic
    @ID ,
    @ID2 ,
    @Parameters ,
    @prefixLogic,
    @selectprefix,
    @selectsuffix,
    @Table_1,
    @Function_1,
    @SelectExtra_1,
    @Where_1,
    @Function_2,
    @SelectExtra_2,
    @Where_2,
    @On,
    @finalWhere,
    @tempSchema,
    @tempTable,
    @tempWhere
And here's the essential part of the spDynamic:
Declare @sql nvarchar(max)

set @sql = @parameterLogic
    + ' ' + @prefixlogic
    + @selectprefix
    + ' SELECT ' + @DeltaLogic
    + ' FROM ( Select * ' + Case When @ID < 0 Then Replace(@SelectExtra_1, '<ID>', CAST(@ID AS nvarchar))
                                 When @ID = 0 Then Replace(@SelectExtra_2, '<ID>', CAST(@ID AS nvarchar))
                                 Else Replace(@Table_1, '<ID>', CAST(@ID AS nvarchar)) End
    + ' From ' + Case When @ID < 0 Then @Function_1 + ' ' + @Where_1
                      When @ID = 0 Then @Function_2 + ' ' + @Where_2
                      Else @tempSchema + @tempTable
                           + ' ' + Replace(@tempWhere, '<ID>', CAST(@ID AS nvarchar)) End
    + ' ) A '
    + ' FULL OUTER JOIN '
    + ' ( Select * ' + Case When @ID2 < 0 Then Replace(@SelectExtra_1, '<ID>', CAST(@ID2 AS nvarchar))
                            When @ID2 = 0 Then Replace(@SelectExtra_2, '<ID>', CAST(@ID2 AS nvarchar))
                            Else Replace(@Table_1, '<ID>', CAST(@ID2 AS nvarchar)) End
    + ' From ' + Case When @ID2 < 0 Then @Function_1 + ' ' + @Where_1
                      When @ID2 = 0 Then @Function_2 + ' ' + @Where_2
                      Else @tempSchema + @tempTable
                           + ' ' + Replace(@tempWhere, '<ID>', CAST(@ID2 AS nvarchar)) End
    + ' ) B ' + @on
    + @finalWhere
    + @selectsuffix

exec sp_sqlexec @sql
One solution could be to execute the stored procedure inside an OPENROWSET function as described here.
This allows you to create the table on-the-fly, for example by writing:
SELECT * INTO MyTempTable
FROM OPENROWSET('SQLNCLI', 'Server=localhost;Trusted_Connection=yes',
'EXEC myStoredProc')
If you don't want actual data in MyTempTable, you can add WHERE 1 = 0 at the end. Afterwards, you can script out MyTempTable just like any other table.
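The "structure only" variant mentioned there would look like this (same placeholder server and procedure names as in the example above):
SELECT * INTO MyTempTable
FROM OPENROWSET('SQLNCLI', 'Server=localhost;Trusted_Connection=yes',
                'EXEC myStoredProc')
WHERE 1 = 0; -- copies the column definitions but no rows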
If you use the SSMSBoost toolbar, it has a "script grid data" option.
Alternatively, this will create a table called NewTable:
SELECT *
INTO NewTable
FROM AnotherTable
Could you alter your dynamic SQL to do this?
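For instance, a minimal, self-contained sketch of that idea (dbo.SpDynamicResult and dbo.SomeSourceTable are placeholder names): put an INTO clause inside the dynamic string so that executing it materialises the result as a real table.
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT * INTO dbo.SpDynamicResult FROM dbo.SomeSourceTable;';
EXEC sp_executesql @sql;
-- dbo.SpDynamicResult can then be scripted out from SSMS like any other table.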
