Can't use LONG data type - sql-server

I'm trying to get some legacy SQL 2005 code to work on SQL 2012 Express. However, whenever I set the compatibility_level to 90, I error out when I try to use older data types. In theory the following code should work:
USE wsus_results
GO
ALTER DATABASE wsus_results
SET compatibility_level = 90
GO
CREATE TABLE ScTable (
TblName VARCHAR(255) NULL,
TblType VARCHAR(255) NULL,
FieldCnt INTEGER NULL,
RecordCnt LONG NULL,
Description LONGVARCHAR NULL,
TblId AUTOINCREMENT PRIMARY KEY)
GO
But, I get the following error:
Msg 2715, Level 16, State 6, Line 2 Column, parameter, or variable #4:
Cannot find data type LONG.
I'm sure there's something simple I'm missing, and I just need a nudge in the right direction. This isn't a permission issue and as far as I can tell, the SET compatibility_level = 90 executes fine with no errors. Still, I get an error when using LONG.

LONG is not a valid data type in any version of SQL Server. And changing compatibility level will not affect your ability to use old or new data types. This only affects the way certain language constructs are parsed.
Perhaps you meant DECIMAL or BIGINT.
And to pre-empt further questions: LONGVARCHAR and AUTOINCREMENT are not valid data types either (check the documentation instead of guessing). Where did you get this script, and who suggested it should work in SQL Server? I think you may have been pranked. Try this instead:
USE wsus_results;
GO
ALTER DATABASE wsus_results
SET compatibility_level = 110;
GO
CREATE TABLE dbo.ScTable -- schema prefix is important!
(
TblName VARCHAR(255),
TblType VARCHAR(255),
FieldCnt INT,
RecordCnt BIGINT,
Description VARCHAR(MAX),
TblId INT IDENTITY(1,1) NOT NULL PRIMARY KEY
);
GO
As an aside, is every other column in the table really nullable? Does your table name really need a suffix Table? What does Sc mean? Why not actually call the table what it represents (such as SocialCows or ScientificCholesterol) instead of obfuscating the name and adding a meaningless suffix just to incur more typing?

How to cast a text column into integer column in MS SQL Express?

I am using Microsoft SQL Express and SQL Server Management Studio.
I am following a tutorial to create a small table from scratch and enter some values as per the code below. The tutorial teaches how to correctly cast a column if it was incorrectly declared in the first place.
CREATE TABLE transactions(
transaction_date date,
amount integer,
fee text
);
SELECT * FROM transactions;
INSERT INTO transactions (transaction_date, amount, fee)
VALUES ('2018-09-24', 5454, '30');
The 'fee' column was wrongly created as text. I am trying to typecast this column into integer using the code below, but it gives the following error. Any suggestions?
SELECT transaction_date, amount + CAST (fee AS integer) AS net_amount
FROM transactions;
Explicit conversion from data type text to int is not allowed.
The error is telling you the problem here; you can't explicitly (or implicitly) convert/cast a text value to an int. text has been deprecated for 16 years, so you should not be using it. It was replaced by varchar(MAX) way back in 2005 (along with nvarchar(MAX) for ntext and varbinary(MAX) for image).
Instead, you'll need to convert the value to a varchar first, and then an int. I also recommend using TRY_CONVERT for the latter, as a value like '3.0' will fail to convert:
SELECT TRY_CONVERT(int,CONVERT(varchar(MAX),fee))
FROM dbo.transactions;
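A quick illustration of the difference (the literal values here are just examples): TRY_CONVERT returns NULL instead of raising an error when a value cannot be converted:

```sql
SELECT TRY_CONVERT(int, '30')  AS clean_value,   -- converts to 30
       TRY_CONVERT(int, '3.0') AS decimal_text,  -- NULL: '3.0' is not a valid int literal
       TRY_CONVERT(int, 'abc') AS garbage_text;  -- NULL
```

Any value that comes back NULL is one you'd want to inspect before fixing the table.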
Of course, what you should really be doing is fixing the table:
ALTER TABLE dbo.transactions ADD TextFee varchar(MAX) NULL; --To retain any data that couldn't be converted
GO
UPDATE dbo.transactions
SET fee = TRY_CONVERT(int,CONVERT(varchar(MAX),fee)), -- non-numeric values become NULL
TextFee = fee; -- assignments use the pre-update value, so TextFee keeps the original text
GO
ALTER TABLE dbo.transactions ALTER COLUMN fee int;
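After running the fix above, a quick sanity check (using the TextFee column the script added) flags any rows whose original text could not be parsed as an integer:

```sql
-- Rows where the original fee text could not be converted:
-- fee was set to NULL by TRY_CONVERT, but TextFee preserved the raw value.
SELECT transaction_date, amount, TextFee
FROM dbo.transactions
WHERE fee IS NULL AND TextFee IS NOT NULL;
```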

Unable to run "INSERT INTO" from Azure SQL external table

In my Azure SQL DB I have an external table - let's call this tableName_origData - and I have another table which we'll refer to as tableName.
tableName was created using a generated CREATE script from tableName_origData (in its original location) so I can be sure that all the column types are identical.
However, when I run
INSERT INTO tableName (
[list of column names]
)
SELECT
[same list of column names]
FROM
tableName_origData
I encounter the following exception:
Large object column support is limited to only nvarchar(max) data
type.
As far as my understanding of Azure SQL's data types goes, I don't have anything larger than NVARCHAR(MAX). Furthermore, the message implies that NVARCHAR(MAX) is supported (and I can see that the same script works on other tables which contain NVARCHAR(MAX)).
Can anyone better explain the cause of this exception and what I might need to do in order to insert its data into an identical table?
Here is a list of all the column types used in the table(s):
BIGINT x 3
NCHAR(20) x 1
NVARCHAR(45) x 5
NVARCHAR(100) x 14
NVARCHAR(MAX) x 10
External tables are read-only: the developer can select data from them, but cannot perform any form of DML processing against them.
To work around this limitation, you can use the technique described here:
https://technology.amis.nl/2005/04/05/updateable-external-tables/
Warning: except for the simplest of uses, we do not recommend this technique for any serious application.

SQL Server snapshot replication: error on table creation

I receive the following error (taken from replication monitor):
The option 'FILETABLE_STREAMID_UNIQUE_CONSTRAINT_NAME' is only valid when used on a FileTable. Remove the option from the statement. (Source: MSSQLServer, Error number: 33411)
The command attempted is:
CREATE TABLE [dbo].[WP_CashCenter_StreamLocationLink](
[id] [bigint] NOT NULL,
[Stream_id] [int] NOT NULL,
[Location_id] [numeric](15, 0) NOT NULL,
[UID] [uniqueidentifier] NOT NULL
)
WITH
(
FILETABLE_STREAMID_UNIQUE_CONSTRAINT_NAME=[UC_StreamLocation]
)
Now, for me there are two things unclear here.
The table already existed on the subscriber, and I've set @pre_creation_cmd = N'delete' for the article, so I don't expect the table to be dropped and re-created. In fact, the table still exists on the subscriber side, although the CREATE TABLE command failed to complete. What am I missing? Where does this CREATE TABLE command come from, and why?
I don't understand why this FILETABLE_STREAMID_UNIQUE_CONSTRAINT_NAME option appears in the creation script. I tried generating a CREATE TABLE script from the table in SSMS and indeed, it's there. But what's weird is that I can't drop and re-create the table this way either - I get the very same error message.
EDIT: Ok, I guess now I know why the table is still there - I noticed begin tran in sql server profiler.
If your table on the publisher is truly not defined as a FileTable, then the issue has to do with the column named "Stream_id". I believe there is a known issue in SQL 2012 where if you have a column named "Stream_id", which is kind of reserved for FileTable/FileStream, it will automatically add that constraint, and unfortunately break Replication. The workaround here is to rename the column to something other than "Stream_id".
Another workaround is to set the schema option to not replicate constraints (guessing this will work). If you require constraints on the subscriber, you can then try to manually apply them on the subscriber after the fact (or script them out and use @post_snapshot_script).
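For the first workaround, the rename itself is a one-liner; the new column name StreamRef_id below is just an example, and any referencing views, procedures, and application code would need to be updated to match:

```sql
-- Rename the column so SQL Server no longer treats it as a FileTable stream id.
EXEC sp_rename 'dbo.WP_CashCenter_StreamLocationLink.Stream_id', 'StreamRef_id', 'COLUMN';
```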

SQL Server exception in Liquibase update while creating table with column type varchar(2147483647)

Using Liquibase 3.1.1, I am trying to migrate DB scripts between any two SQL Server databases dynamically.
I am using the diff and update commands of Liquibase. It works well with databases that do not use a varchar(max) column.
For a DB using a varchar(max) column, Liquibase uses VARCHAR(2147483647) in place of varchar(max) when creating the changelog, and when it updates the target DB, the SQL Server exception below is thrown.
liquibase.exception.DatabaseException: Error executing SQL CREATE TABLE [ASSET_ATTRIBUTE] ([ASSET_ATTR_ID] BIGINT IDENTITY (1, 1) NOT NULL, [ASSET_GUID] UNIQUEIDENTIFIER NOT NULL, [APP_ID] VARCHAR(30), [ATTR_TYPE] VARCHAR(30) NOT NULL, [ATTR_NAME] VARCHAR(50) NOT NULL, **[ATTR_VALUE_STRING] VARCHAR(2147483647)**, [ATTR_VALUE_DATE] datetime, CONSTRAINT [PK_ASSET_ATTRIBUTE] PRIMARY KEY ([ASSET_ATTR_ID])): The size (2147483647) given to the column 'ATTR_VALUE_STRING' exceeds the maximum allowed for any data type (8000).
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:61)
at liquibase.executor.jvm.JdbcExecutor.execute(JdbcExecutor.java:106)
at liquibase.database.AbstractJdbcDatabase.execute(AbstractJdbcDatabase.java:1189)
at liquibase.database.AbstractJdbcDatabase.executeStatements(AbstractJdbcDatabase.java:1172)
at liquibase.changelog.ChangeSet.execute(ChangeSet.java:352)
... 4 more
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: The size (2147483647) given to the column 'ATTR_VALUE_STRING' exceeds the maximum allowed for any data type (8000).
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1515)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:792) and some more
Even though it's a SQL Server exception, I need to fix it through Liquibase.
Is there any fix for this?
Why is Liquibase not storing the column type as varchar(max) instead of varchar(2147483647), so that the update won't throw the exception?
Thanks in advance

Error SQL72014: Conversion failed when converting the varchar value to data type int

I'm having a problem with inserting test data into a table. I am getting the error listed in the title. I have done a lot of googling without finding a solution.
Within a Visual Studio 2012 solution, I have several projects, one of which is a database project. I am defining several tables. The one in question is:
CREATE TABLE [dbo].[tblKppHierarchyPcl]
(
[ID] NUMERIC(18,0) NOT NULL,
[Name] VARCHAR(500),
[PartStructureKey] NUMERIC(18,0) NOT NULL,
[PartNumber] VARCHAR(500) NOT NULL,
[ParentPartNumber] VARCHAR(500) NULL,
[TargetCost] DECIMAL(30,4) NULL,
[UnitCost] DECIMAL(30,4) NULL,
[CostMaturityID] INT NULL,
[Ratio] DECIMAL(16,2) NULL,
[Contribution] DECIMAL(16,2) NULL,
[ChildPartTargetWeight] NUMERIC(18,2) NULL,
[ChildPartWeight] NUMERIC(18,2) NULL,
CONSTRAINT [FK_tblKppHierarchyPcl_tblCostMaturity] FOREIGN KEY (CostMaturityID)
REFERENCES tblCostMaturity([CostMaturityID])
)
Using a Script.PostDeployment1.sql file, I am trying to populate the table with test data like so:
INSERT INTO [dbo].[tblKppHierarchyPcl]
([ID]
,[Name]
,[PartStructureKey]
,[PartNumber]
,[ParentPartNumber]
,[TargetCost]
,[UnitCost]
,[CostMaturityID]
,[Ratio]
,[Contribution]
,[ChildPartTargetWeight]
,[ChildPartWeight]) VALUES
(61090,'Coolant Quick Disconnect',125216,'FS-252-6FO','H432677DB-1',27.03,70.61,2,2.61,0.01,0,NULL)
I am trying to push the data to the database via SqlPublish.
My problem is this: When the post-deployment script tries to insert the data, I get the following error:
Error SQL72014: .Net SqlClient Data Provider: Msg 245, Level 16, State 1, Line 76 Conversion
failed when converting the varchar value 'Coolant Quick Disconnect' to data type int.
So it has a problem with inserting 'Coolant Quick Disconnect' into the Name column. The Name column is CLEARLY a varchar column but somehow it thinks it's an int column.
Any ideas?
EDIT: I am using SQL Server 2012. There are no triggers on this table.
The problem is due to the column's datatype originally being an int, then the column's datatype being changed to a varchar, combined with the database project caching the previous schema. Deploying twice will refresh the database project's schema so that it detects the new datatype.
