We have an MS-SQL script that must be safe to re-run several times. It updates a table if a (now obsolete) column exists, then deletes the column; to make it re-runnable, the script first checks for the existence of the column before performing the updates.
The script works as expected on our dev database, updating the data on the first run, then displaying the message 'Not updating' on subsequent runs.
On our test database the script runs fine on the first run, but errors with "Invalid column name 'OldColumn'" on subsequent runs; if I comment out the UPDATE and ALTER statements it runs as expected.
Is there a way to force the script to run even if there's a potential error, or is it something to do with how the database was set up? (Fingers crossed I'm not looking like a complete noob!)
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
           WHERE TABLE_NAME = 'MyTable' AND COLUMN_NAME = 'OldColumn')
BEGIN
    PRINT 'Updating and removing old column...'
    UPDATE MyTable SET NewColumn = 'X' WHERE OldColumn = 1;
    ALTER TABLE MyTable DROP COLUMN OldColumn;
END
ELSE
    PRINT 'Not updating'
GO
As a workaround, you could wrap the UPDATE in EXEC so that it isn't compiled until the batch actually reaches it:
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
           WHERE TABLE_NAME = 'MyTable' AND COLUMN_NAME = 'OldColumn')
BEGIN
    PRINT 'Updating and removing old column...'
    EXEC ('UPDATE MyTable SET NewColumn=''X'' WHERE OldColumn=1;');
    ALTER TABLE MyTable DROP COLUMN OldColumn;
END
ELSE
    PRINT 'Not updating'
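As a side note, a shorter existence check is COL_LENGTH, which returns NULL when the column is missing; the EXEC wrapper is still what avoids the compile-time error. A minimal sketch using the same table and column names:

```sql
-- COL_LENGTH returns NULL if the column does not exist
IF COL_LENGTH('dbo.MyTable', 'OldColumn') IS NOT NULL
BEGIN
    PRINT 'Updating and removing old column...';
    -- EXEC defers compilation of the UPDATE until this line actually runs
    EXEC ('UPDATE MyTable SET NewColumn = ''X'' WHERE OldColumn = 1;');
    ALTER TABLE MyTable DROP COLUMN OldColumn;
END
ELSE
    PRINT 'Not updating';
```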
Related
I am trying to execute a procedure with a parameter; depending on the value of the parameter, three different IF conditions decide which query runs against a linked server.
But when I execute the procedure, SQL Server seems to check that the tables inside all of the IF blocks exist before starting. Only one of the tables exists, which is exactly why I am using the parameter, so it shouldn't fail; but I nonetheless get the following error:
Msg 7314, Level 16, State 1, Line 25
The OLE DB provider "Microsoft.ACE.OLEDB.16.0" for linked server "LinkedServer" does not contain the table "D100". The table either does not exist or the current user does not have permissions on that table.
So with this code, assume that the parameter is 300; then I get the message above.
Do you know if there is a way to limit the query so that it does not check all the tables, but only the one whose IF condition is met?
ALTER PROCEDURE [dbo].[Import_data]
    @p1 int = 0
AS
BEGIN
    SET NOCOUNT ON;
    IF (@p1 = 100)
    BEGIN
        DROP TABLE IF EXISTS Table1
        SELECT [Field1], [Field2], [Field3], [Field4], [Field5], [Field6]
        INTO Table1
        FROM [LinkedServer]...[D100]
    END
    IF (@p1 = 200)
    BEGIN
        DROP TABLE IF EXISTS Table2
        SELECT [Field1], [Field2], [Field3], [Field4], [Field5], [Field6]
        INTO Table2
        FROM [LinkedServer]...[D200]
    END
    IF (@p1 = 300)
    BEGIN
        DROP TABLE IF EXISTS Table3
        SELECT [Field1], [Field2], [Field3], [Field4], [Field5], [Field6]
        INTO Table3
        FROM [LinkedServer]...[D300]
    END
END
I have tried googling it, but I mostly found workarounds such as calling a sub-procedure, which doesn't seem like a clean solution to me.
Okay, it seems that I found the answer. Even with an IF statement, SQL Server validates the entire query before executing it, so the way to overcome this is to use a dynamic SQL query.
"SQL Server Dynamic SQL is a programming technique that allows you to construct SQL statements dynamically at runtime. It allows you to create more general purpose and flexible SQL statement because the full text of the SQL statements may be unknown at compilation."
This is how the query looks now; instead of multiple IF statements, the statement text changes dynamically depending on the parameter.
DECLARE @SQL NVARCHAR(MAX)
SET @SQL = N'DROP TABLE IF EXISTS Table1;
SELECT [Field1]
      ,[Field2]
      ,[Field3]
      ,[Field4]
      ,[Field5]
      ,[Field6]
INTO Table1
FROM [LinkedServer]...[D' + CONVERT(nvarchar(3), @p1) + N']'
EXEC sp_executesql @SQL
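Since the parameter value is concatenated into the statement text, it may also be worth rejecting unexpected values before building the string. A small sketch (the error message is illustrative, not from the original post):

```sql
-- Guard against values that don't correspond to a known source table
IF @p1 NOT IN (100, 200, 300)
BEGIN
    RAISERROR('Unexpected value for @p1: %d', 16, 1, @p1);
    RETURN;
END
```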
At one point a few years ago I set up a project in Visual Studio to output to an Azure SQL V12 database, and tried to use sqlcmd with migration scripts (I mention this to point out that at one point everything did work, and everything discussed below made it into production).
My postdeployment script looks something like this:
PRINT N'Starting post deployment';
:setvar WorkDirectory "F:\path\to\scripts"
PRINT N'Starting 1.01';
:r $(WorkDirectory)\1.01\Deployment.sql
PRINT N'Starting 1.02';
:r $(WorkDirectory)\1.02\Deployment.sql
PRINT N'Starting 1.03'; :r $(WorkDirectory)\1.03\Deployment103.sql
PRINT N'Starting 1.04'; :r $(WorkDirectory)\1.04\Deployment104.sql
After not making any changes to the project (not consciously, anyway), I've discovered that when I try to debug/generate the database and apply the migration scripts, I get an error because a column is referenced that doesn't exist on the table. The following is the offending part of the Deployment104.sql file:
IF NOT EXISTS (SELECT *
               FROM sys.columns
               WHERE Name = N'MissingColumn'
                 AND Object_ID = Object_ID(N'TableA'))
BEGIN
    ALTER TABLE TableA
    ADD MissingColumn INT NOT NULL
END
GO
IF NOT EXISTS (SELECT 1
               FROM information_schema.Routines
               WHERE ROUTINE_NAME = 'LIST_CODES'
                 AND ROUTINE_TYPE = 'PROCEDURE'
                 AND ROUTINE_SCHEMA = 'dbo')
BEGIN;
    EXEC('CREATE PROCEDURE dbo.LIST_CODES as set nocount on;')
END
GO
ALTER PROCEDURE [dbo].LIST_CODES
AS
SELECT *
FROM CodesTable ct
LEFT JOIN TableA ta ON ct.CodeId = ta.CodeId
WHERE ta.MissingColumn = 1999
ORDER BY ct.CodeId
GO
The DB ends up partially created at this point. However, when I then run that first check for whether the column exists, I actually get a row back, implying that the column does exist in TableA. And yet I don't see it in Server Explorer (VS 2019 preview 16.4), and obviously the next part of the script doesn't see it either.
What am I doing wrong, and/or how can I make the script and the db agree as to which columns exist and which do not?
EDIT: as an update, while running the PostDeploymentScript fails, running the problematic script itself (Deployment104 in this case) works just fine, after which both the column and the stored procedure show up exactly as expected in the server explorer and can be referenced by other queries.
I believe the issue is that you are trying to add a non-null column to an existing table. You would need to add a default value (for example 0) or use a data type that supports NULL. For example:
BEGIN
    ALTER TABLE TableA
    ADD MissingColumn int NOT NULL DEFAULT 0
END
There is also a stray semicolon after the second BEGIN.
Current:
IF NOT EXISTS ( SELECT 1 FROM information_schema.Routines
                WHERE ROUTINE_NAME = 'LIST_CODES' AND ROUTINE_TYPE = 'PROCEDURE' AND ROUTINE_SCHEMA = 'dbo' )
BEGIN;
EXEC('CREATE PROCEDURE dbo.LIST_CODES as set nocount on;')
END
GO
Fixed:
IF NOT EXISTS ( SELECT 1 FROM information_schema.Routines
                WHERE ROUTINE_NAME = 'LIST_CODES' AND ROUTINE_TYPE = 'PROCEDURE' AND ROUTINE_SCHEMA = 'dbo' )
BEGIN
EXEC('CREATE PROCEDURE dbo.LIST_CODES as set nocount on;')
END
GO
After these changes I was able to get the statement to run in SQL Fiddle. Here is the link: http://sqlfiddle.com/#!18/51253/3
The answer ended up being splitting the command into its own batch; unfortunately this wasn't detectable from the posted code. The last instruction in Deployment103.sql was an ALTER PROCEDURE, which has to appear by itself in a batch, but the terminating GO was missing.
It looks like T-SQL won't complain about this (because in the file, it does look like it's alone in its batch). That led to the next instructions not working correctly, until the first GO statement in Deployment104.sql.
Adding the following GO resolved the issues (as does terminating Deployment103.sql with a GO):
PRINT N'Starting 1.03'; :r $(WorkDirectory)\1.03\Deployment103.sql
PRINT N'Starting 1.04';
GO
:r $(WorkDirectory)\1.04\Deployment104.sql
There are a couple of data migration scripts in my SSDT project.
The first one stores data from one table in a temporary table:
IF EXISTS
(
    SELECT *
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE table_name = N'DocumentEvent'
      AND column_name = N'Thumbprint'
)
BEGIN
    IF NOT EXISTS
    (
        SELECT * FROM INFORMATION_SCHEMA.TABLES
        WHERE TABLE_NAME = N'tmp_DocumentEventCertificates'
    )
    BEGIN
        CREATE TABLE tmp_DocumentEventCertificates
        (
            [EventId] UNIQUEIDENTIFIER NOT NULL,
            [Thumbprint] nvarchar(100)
        )
    END
    INSERT INTO
        tmp_DocumentEventCertificates
    SELECT
        [EventId],
        [Thumbprint]
    FROM
        [DocumentEvent]
    WHERE
        [Thumbprint] IS NOT NULL
END
The second one transfers the data from the temporary table to another table:
IF EXISTS
(
    SELECT * FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_NAME = N'tmp_DocumentEventCertificates'
)
BEGIN
    UPDATE
        [DocumentAttachment]
    SET
        [DocumentAttachment].[Certificate_Thumbprint] = tmp.[Thumbprint]
    FROM
        tmp_DocumentEventCertificates AS tmp
    WHERE
        ([DocumentAttachment].[EventId] = tmp.[EventId]) AND
        ([DocumentAttachment].[ParentDocumentAttachmentId] IS NOT NULL)
    DROP TABLE tmp_DocumentEventCertificates
END
Column [Thumbprint] is being removed from [DocumentEvent] table.
Column [Certificate_Thumbprint] is being added to [DocumentAttachment] table.
Data must be transferred from [DocumentEvent].[Thumbprint] to [DocumentAttachment].[Certificate_Thumbprint].
These scripts work as expected when the database is in the state that requires the migration above, that is, when [DocumentEvent].[Thumbprint] exists and [DocumentAttachment].[Certificate_Thumbprint] does not.
But once the database has been migrated, all attempts to deploy the dacpac fail with an
"Invalid column name 'Thumbprint'" error.
I'm almost sure this happens because SQLCMD tries to compile the deployment script as a whole, which can only succeed while [DocumentEvent].[Thumbprint] exists.
But what is the workaround?
It looks like the IF EXISTS in the first script can't help.
Yes, you are right, it's a compilation error.
If the column does not exist, your script cannot be compiled.
IF EXISTS and other control-flow constructs are not taken into account at compile time.
You should wrap the code producing the compilation error in EXEC:
exec(
'INSERT INTO
tmp_DocumentEventCertificates
SELECT
[EventId],
[Thumbprint]
FROM
[DocumentEvent]
WHERE
[Thumbprint] IS NOT NULL')
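If you prefer it, sp_executesql defers compilation in the same way and also supports parameters; a sketch of the same INSERT, assuming the names from the question:

```sql
-- Name resolution happens only when the inner batch runs
EXEC sp_executesql N'
    INSERT INTO tmp_DocumentEventCertificates
    SELECT [EventId], [Thumbprint]
    FROM [DocumentEvent]
    WHERE [Thumbprint] IS NOT NULL;';
```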
Below is an excerpt of a SQL query that I am using to update a table to the correct data types if needed.
If NOT Exists(Select * From Information_Schema.Columns
Where Table_Name = N'RXINFO'
And Table_Schema = N'scriptassist'
And Column_Name = N'LastChanged'
And DATA_Type = N'TIMESTAMP'
AND IsNull(CHARACTER_MAXIMUM_LENGTH, 0) = 0)
BEGIN
Print 'LastChanged Field needed type updating'
Alter Table [scriptassist].[RXINFO] Alter Column LastChanged TIMESTAMP
END
Currently the problem is as follows:
If I run the statement with the ALTER TABLE present, SQL Server throws this error at me:
Msg 4927, Level 16, State 1, Line 12
Cannot alter column 'LastChanged' to be data type timestamp.
The problem isn't that it can't change the data type; the problem is that it is attempting to process that code block regardless of the evaluation of the condition.
It should evaluate to false in this case.
If I take it out, nothing happens; the print statement doesn't even fire.
The only thing that I can think of thus far is that somehow MS SQL is evaluating the SQL beforehand and determining whether all the code paths can execute, and since they can't, it throws the error. However, this doesn't make that much sense.
SQL Server parses your SQL before it executes it. The error is raised during parsing.
To delay parsing until the line is actually run, use exec:
exec ('Alter Table [scriptassist].[RXINFO] Alter Column LastChanged TIMESTAMP')
I believe you're getting this error because SQL Server cannot convert the previous data type of your LastChanged column to an actual timestamp data type. You'll need to drop and then re-add the column instead.
If NOT Exists(Select * From Information_Schema.Columns
Where Table_Name = N'RXINFO'
And Table_Schema = N'scriptassist'
And Column_Name = N'LastChanged'
And DATA_Type = N'TIMESTAMP'
AND IsNull(CHARACTER_MAXIMUM_LENGTH, 0) = 0)
BEGIN
Print 'LastChanged Field needed type updating'
Alter Table [scriptassist].[RXINFO] Drop Column LastChanged
Alter Table [scriptassist].[RXINFO] Add LastChanged TimeStamp
END
I have a query to add a column if it doesn't exist. It works the first time, but if I run it again, it fails, saying that the column exists?!?
IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = 'TABLE_NAME' AND COLUMN_NAME = 'COLUMN_NAME')
BEGIN
ALTER TABLE TABLE_NAME ADD COLUMN_NAME nchar(3) NULL;
END
And when I run the query against INFORMATION_SCHEMA.COLUMNS, it returns nothing.
I've also tried the
IF NOT EXISTS (SELECT * FROM SYS.COLUMNS WHERE NAME = N'COLUMN_NAME' AND OBJECT_ID = OBJECT_ID(N'TABLE_NAME'))
version, which exhibits the same behavior (it works once, and fails on the second run).
At what point do the sys tables get updated, and what is the foolproof way to test whether a column exists?
I believe it was a problem with USE due to our screwy database setup. The information_schema query was hitting master while the update targeted another database. D'oh!
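For anyone hitting the same mismatch: pinning the script to the intended database with an explicit USE makes the existence check and the ALTER agree. A sketch, where MyDatabase is a placeholder for the actual database name:

```sql
USE MyDatabase;  -- placeholder; pin the target database explicitly
GO
IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = 'TABLE_NAME' AND COLUMN_NAME = 'COLUMN_NAME')
BEGIN
    ALTER TABLE TABLE_NAME ADD COLUMN_NAME nchar(3) NULL;
END
```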