sqlpackage breaks when trying to alter a FILESTREAM column - dacpac

I have a SQL Database project, where I have recently amended a FILESTREAM column to allow nulls. I am trying to publish the project's dacpac to a database where the column already exists as a FILESTREAM column but is currently NOT NULL.
I am performing the publish using the sqlpackage.exe command line tool (version 12.0.2882.1). However, it produces the error:
Error SQL72014: .Net SqlClient Data Provider: Msg 4990, Level 16, State 1, Line 1 Cannot alter column 'document' in table 'Document' to add or remove the FILESTREAM column attribute.
Error SQL72045: Script execution error. The executed script:
ALTER TABLE [dbo].[Document] ALTER COLUMN [document] VARBINARY (MAX) NULL;
The SQL in the error suggests the generated script is trying to remove the FILESTREAM attribute on the column, which I guess is why SQL Server is complaining... but why is it doing this? My DB project still has the column marked as FILESTREAM.

Fixed by upgrading sqlpackage.exe and its associated libraries from version 12.0.2882.1 to 12.0.3021.1.
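Presumably the fixed version emits the same ALTER with the FILESTREAM attribute retained, changing only the nullability. A sketch of what that would look like, based on the failing script above:
-- FILESTREAM is restated, so only the nullability changes (avoids Msg 4990)
ALTER TABLE [dbo].[Document] ALTER COLUMN [document] VARBINARY (MAX) FILESTREAM NULL;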

Creating a SQL memory-optimised table in Visual Studio

I'm trying to set up a SQL database project (using Visual Studio 2017).
I want it to produce a dacpac that could be run against either SQL Server or Azure SQL and would create a database from scratch with a single publish.
However, I'm running into a couple of issues:
I can't create a memory-optimised table if there is no existing database (and tables). When running this (SQL Server):
CREATE TABLE [dbo].[TicketReservationDetail] (
TicketReservationID BIGINT NOT NULL,
TicketReservationDetailID BIGINT IDENTITY NOT NULL,
Quantity INT NOT NULL,
FlightID INT NOT NULL,
Comment NVARCHAR (1000),
CONSTRAINT [PK_TicketReservationDetail] PRIMARY KEY NONCLUSTERED HASH
([TicketReservationDetailID]) WITH (BUCKET_COUNT=10000000)
) WITH (MEMORY_OPTIMIZED=ON);
GO
ALTER DATABASE [$(DatabaseName)] ADD FILEGROUP [mod] CONTAINS MEMORY_OPTIMIZED_DATA
It gives me this:
(63,1): SQL72014: .Net SqlClient Data Provider: Msg 10794, Level 16, State 125, Line 1 The operation 'AUTO_CLOSE' is not supported with databases that have a MEMORY_OPTIMIZED_DATA filegroup.
(63,0): SQL72045: Script execution error. The executed script:
ALTER DATABASE [$(DatabaseName)]
ADD FILEGROUP [mod] CONTAINS MEMORY_OPTIMIZED_DATA;
An error occurred while the batch was being executed.
However, if I publish a disk table first, then the SQL above works just fine.
I've tried a pre-deployment script where I create the disk-table schema;
it publishes the disk table, then fails when it tries to publish the actual memory-optimised table schema.
I've used the sample database from: Ticket-reservations
Any ideas or suggestions would be appreciated.
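
In case it is useful: Msg 10794 is complaining about the AUTO_CLOSE database option, which is incompatible with a MEMORY_OPTIMIZED_DATA filegroup. A sketch of one workaround, assuming the target database currently has AUTO_CLOSE ON (common for LocalDB/Express databases), e.g. in a pre-deployment script:
-- AUTO_CLOSE must be OFF before the memory-optimized filegroup can be added
ALTER DATABASE [$(DatabaseName)] SET AUTO_CLOSE OFF;
GO
ALTER DATABASE [$(DatabaseName)] ADD FILEGROUP [mod] CONTAINS MEMORY_OPTIMIZED_DATA;
The same effect should be achievable declaratively by turning off Auto Close under the database project's Database Settings, so the generated deployment script sets the option before adding the filegroup.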

Cannot import SQL Azure bacpac to 2016 CTP

I'm very familiar with the process of exporting from Azure SQL V12 down to my dev box and then importing to my local SQL Server (2014) instance. I'm spinning up a new Win10 box and have installed the SQL Server 2016 CTP. I'm connecting to that same Azure instance and can operate against it -- and can export a .bacpac just as with 2014.
But when I try to import to local I'm getting:
Could not import package.
Warning SQL72012: The object [FOO33_Data] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Warning SQL72012: The object [FOO33_Log] exists in the target, but it will not be dropped even though you selected the 'Generate drop statements for objects that are in the target database but that are not in the source' check box.
Error SQL72014: .Net SqlClient Data Provider: Msg 547, Level 16, State 0, Line 3 The ALTER TABLE statement conflicted with the FOREIGN KEY constraint "FK_dbo.Address_dbo.User_idUser". The conflict occurred in database "FOO33", table "dbo.User", column 'idUser'.
Error SQL72045: Script execution error. The executed script:
PRINT N'Checking constraint: FK_dbo.Address_dbo.User_idUser [dbo].[Address]';
ALTER TABLE [dbo].[Address] WITH CHECK CHECK CONSTRAINT [FK_dbo.Address_dbo.User_idUser];
Since this question was also asked and answered on MSDN, I will share the answer here.
https://social.msdn.microsoft.com/Forums/azure/en-US/0b025206-5ea4-4ecb-b475-c7fabdb6df64/cannot-import-sql-azure-bacpac-to-2016-ctp?forum=ssdsgetstarted
Text from linked answer:
I suspect what's going wrong here is that the export operation was performed using a DB instance that was changing while the export was on-going. This can cause the exported table data to be inconsistent because, unlike SQL Server's physical backup/restore, exports do not guarantee transactional consistency. Instead, they're essentially performed by connecting to each table in the database in turn and running select *.
When a foreign key relationship exists between two tables and the read table data is inconsistent, it results in an error during import after the data is written to the database and the import code attempts to re-enable the foreign key.
We suggest using the database copy mechanism (create database copyDb as copy of originalDb), which guarantees a copy with transactional consistency, and then exporting from the non-changing database copy.
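In practice the suggested approach boils down to something like the following (a sketch; copyDb and originalDb are the placeholder names from the quoted answer), run against the Azure server before exporting:
-- creates a transactionally consistent copy to export the .bacpac from
CREATE DATABASE copyDb AS COPY OF originalDb;
-- after the copy completes and the export is done, the copy can be removed
DROP DATABASE copyDb;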

Store SPSS data in a SQL database based on uniqueidentifier

I imported a table from a SQL database into an SPSS dataset and edited the table by adding a new column to it after some statistics. But when I try to export the table back to the database with the new column, matching the primary key between the dataset and the database table, which is of type uniqueidentifier, this error is shown to me:
> Error # 6492
> The ODBC subsystem has issued an error which prevents the processing of SAVE
> TRANSLATE ODBC request.
> Execution of this command stops.
> [Microsoft][ODBC SQL Server Driver][SQL Server]Conversion failed when
> converting from a character string to uniqueidentifier.
How do I resolve this error?
uniqueidentifier is probably a type that the ODBC driver doesn't know how to convert to, as it is a database-internal type. Try writing your data to a new table without converting, and then do an update of the database table from within the database, converting the identifier type inside the database. You can put the necessary SQL in your SAVE TRANSLATE syntax using its SQL subcommand.
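A sketch of the in-database update described above, with placeholder names (dbo.Target is the original table; dbo.SpssStaging is the table SAVE TRANSLATE writes, with the key still held as text):
UPDATE t
SET    t.NewStat = s.NewStat
FROM   dbo.Target AS t
       JOIN dbo.SpssStaging AS s
         ON t.Id = CONVERT(uniqueidentifier, s.IdText);  -- conversion happens inside the database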

Change database options during EF migration

We have some POCO classes and migrations enabled for our data access layer, and we have created an initial migration. Note that we use the CreateDatabaseIfNotExists database initializer.
But in the database I would like to have a MessageBody field that uses SQL FILESTREAM; because of the limitations of EF for FILESTREAM, we try to do it manually in the migration script.
There we execute the following SQL command:
Sql("alter table [msg].[Message] add [MessageBody] varbinary(max) FILESTREAM not null");
But I have to set the FILESTREAM options, so I would like to execute the following command during / before the migration:
ALTER DATABASE CURRENT SET FILESTREAM ( NON_TRANSACTED_ACCESS = FULL)
But when adding this before the creation of the tables, I get the error: ALTER DATABASE statement not allowed within multi-statement transaction.
What's the best way to adapt database options when you want to create the database automatically? Is it possible to intercept the migration process to execute some SQL before it runs the actual migrations?
To fix the error you got, you only need to invoke the Sql method with its additional bool parameter (suppressTransaction) set to true:
Sql("alter table [msg].[Message] add [MessageBody] varbinary(max) FILESTREAM not null", true);
This will cause your query to be executed outside of the migration's transaction, which is why the ALTER DATABASE statement is then allowed.
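In effect, the statements then run in this order (a sketch; the access level and table are the ones from the question):
-- runs outside the migration's transaction (suppressTransaction: true),
-- because ALTER DATABASE is not allowed within a multi-statement transaction
ALTER DATABASE CURRENT SET FILESTREAM ( NON_TRANSACTED_ACCESS = FULL );
-- runs inside the migration's transaction as usual
ALTER TABLE [msg].[Message] ADD [MessageBody] VARBINARY (MAX) FILESTREAM NOT NULL;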

Error exporting a SQL Server database

I have SQL Server and SQL Server Management Studio 2012. After I create tables, when I export the database to the "online" server, every primary-key, auto-increment field becomes a normal field, so when I try to add a row to the database I get the following error:
"Cannot insert the value NULL into column 'taskID', table 'lawyersDB.dbo.tasks'; column does not allow nulls. INSERT fails.
The statement has been terminated."
To solve this problem I have to manage the database "online", go to each table, and set the primary-key and auto-increment properties again. Since I have many tables, this process takes a lot of time.
So, any idea how to solve this problem?
Note: my hosting is on an Arvix company server.
This is done in SQL Server Management Studio, not in your hosting account.
Try this:
You need to set this, or else, if you have a non-nullable column with no default value and you provide no value, the insert will error.
To set up auto-increment in SQL Server Management Studio:
Open your table in Design
Select your column and go to Column Properties
Under Identity Specification, set (Is Identity) = Yes and Identity Increment = 1
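With many tables, the designer route gets tedious; the script equivalent of those steps looks like the sketch below, using the table and column named in the error message (the remaining columns are placeholders). Note that IDENTITY cannot be added to an existing column with ALTER COLUMN, so the table has to be created (or recreated) with it, which is what the designer does behind the scenes:
-- taskID comes from the error message; the other columns are illustrative
CREATE TABLE dbo.tasks (
    taskID    INT IDENTITY (1, 1) NOT NULL PRIMARY KEY,  -- primary key, auto-increment
    title     NVARCHAR (200) NOT NULL,
    createdOn DATETIME NOT NULL DEFAULT GETDATE()
);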
