"Value cannot be null. Parameter name: reportedElement" when adding a new Always Encrypted column to an existing table - sql-server

Using Visual Studio database projects (SSDT), I added a new column to an existing table. I am using Always Encrypted to encrypt individual columns. When I add the column and try to publish, I get a popup in Visual Studio that says "Value cannot be null. Parameter name: reportedElement".
If I don't encrypt the column, it works.
If I clear the existing data out of the table, it works.
But just trying to add a new nullable encrypted column does not publish. It will not even generate the script that would be applied.
I enabled DacFx and SSDT logging and viewed the logs with Windows Event Viewer, but I just see the same error "Value cannot be null. Parameter name: reportedElement".
This is what the added column definition looks like.
[MyNewColumn] INT ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [DefaultColumnEncryptionKey], ENCRYPTION_TYPE = DETERMINISTIC, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL
I expect Visual Studio to publish successfully, adding my new nullable encrypted column, but the actual behavior is a popup that states "Value cannot be null. Parameter name: reportedElement".

I had the exact same issue, except in my case I had decrypted the column to perform a lookup on it that I couldn't do while it was encrypted (this is a local development DB).
The solution was to perform the encryption manually via SSMS and then run the publish. I'm not sure why VS can't publish the changes. The encryption keys are stored in the local certificate store and VS is running as admin, but it may not be able to access the keys to encrypt the data, while SSMS can.
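For reference, a minimal sketch of adding the column manually from a query window in SSMS, reusing the definition from the question (the table name [dbo].[MyTable] is a placeholder):

-- Placeholder table name; the column definition matches the question.
-- Assumes the column encryption key [DefaultColumnEncryptionKey] already exists
-- and its column master key is accessible to the SSMS session.
ALTER TABLE [dbo].[MyTable]
ADD [MyNewColumn] INT
    ENCRYPTED WITH (
        COLUMN_ENCRYPTION_KEY = [DefaultColumnEncryptionKey],
        ENCRYPTION_TYPE = DETERMINISTIC,
        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
    ) NULL;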

Related

Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED)) (SQLEditors) after modifying column type

I'm using Microsoft SQL Server Management Studio. After modifying a column type from varchar to int, I tried to update the table, but it threw the following error:
Saving changes is not permitted. The change you have made requires the
following table to be dropped and re-created. You have either made
changes to a table that can't be recreated or enabled the option
prevent saving changes that require the table to be re-created. (followed by a list of 3 tables that reference this table via foreign keys)
I tried to fix it via Tools >> Options >> Designers by unchecking "Prevent saving changes that require table re-creation",
as suggested in this question:
Sql Server 'Saving changes is not permitted' error ► Prevent saving changes that require table re-creation
After that, the update no longer throws an error, but when I open the table design after modifying the column type, it throws:
Catastrophic failure (Exception from HRESULT: 0x8000FFFF (E_UNEXPECTED)) (SQLEditors)
I tried re-registering the MSXML libraries, but the error is the same:
RegSvr32 msxml3.dll
RegSvr32 msxml6.dll
Per the comments, updating here:
Revert the changes to what they were before the error and work on a different solution to fix the problem.
One alternative (a T-SQL sketch follows the list):
1. Create a new column in the table that you want the data to be inserted into/converted to.
2. Update that new column from the existing column that holds the data you are trying to convert.
3. Verify that the data in the new column is correct.
4. Drop the old column.
5. Rename the new column to the old column's name.
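A minimal sketch of those steps, assuming a table [dbo].[MyTable] with a varchar column [OldCol] being converted to int (all names are placeholders; TRY_CONVERT requires SQL Server 2012+):

-- 1. Add the new column with the target type.
ALTER TABLE [dbo].[MyTable] ADD [NewCol] INT NULL;
GO
-- 2. Copy/convert the data; TRY_CONVERT yields NULL for values that do not parse.
UPDATE [dbo].[MyTable] SET [NewCol] = TRY_CONVERT(INT, [OldCol]);
GO
-- 3. Verify: list rows whose values failed to convert.
SELECT * FROM [dbo].[MyTable] WHERE [NewCol] IS NULL AND [OldCol] IS NOT NULL;
GO
-- 4. Drop the old column.
ALTER TABLE [dbo].[MyTable] DROP COLUMN [OldCol];
GO
-- 5. Rename the new column to the old name.
EXEC sp_rename 'dbo.MyTable.NewCol', 'OldCol', 'COLUMN';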

SQL Server Back-End, Access Front-End ODBC Error

I can read/write/update the table fine in SSMS, and I can open/read/write the table fine if I open the table itself in Access 2013, but if I try to query the table, I get the generic Access error message of
ODBC -- call failed
This table has 558,672 rows in it. I have tried using a DSN-less connection with VBA as well as manually linking the table through the toolbar in Access. What is causing Access to throw this error?
EDIT
I have also tried to compact and repair the database to no avail.
EDIT #2
It seems that only one element (a subform) is throwing the ODBC error. The peculiar thing is that the main form is based on the same data source as the subform, yet only the subform throws the error.
I had this problem before; here are the things I had to change so the table could be opened and edited from MS Access:
1. Your tables should have a primary key. In the column properties, set Identity Specification to Yes, with an identity increment of 1. I would prefer to add a completely new column with the int data type for this.
2. No NULL values in boolean (bit) fields; everything should be 1 or 0. Also set a default constraint of 0. A sketch of both changes follows.
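A minimal sketch of both changes, assuming a table [dbo].[MyTable] with a bit column [IsActive] (all names are placeholders):

-- Add an int identity primary key; existing rows are numbered automatically.
ALTER TABLE [dbo].[MyTable]
ADD [ID] INT IDENTITY(1,1) NOT NULL
    CONSTRAINT [PK_MyTable] PRIMARY KEY;

-- Remove NULLs from the bit column and default future rows to 0.
UPDATE [dbo].[MyTable] SET [IsActive] = 0 WHERE [IsActive] IS NULL;
ALTER TABLE [dbo].[MyTable]
ADD CONSTRAINT [DF_MyTable_IsActive] DEFAULT (0) FOR [IsActive];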

SSDT does not publish column COLLATION change

It seems that SSDT does not publish a column COLLATION change, even though it detects the change during the comparison process.
The issue is that if you change the COLLATION of a specific column in a table and try to publish the change, SSDT will ignore it when creating the publish script.
Here is a similar issue described on the MSDN forums, reported long ago, that still reproduces.
I have been using SSDT version 14.0.60629.0
Does the SSDT still have this issue, or is there a valid workaround?
Update
This issue is only for the columns which are using a User-Defined Data Type.
Update
(added steps to reproduce, and corrected the question text):
Steps to reproduce:
1. Start with a database and note the collations (this is the one I have, a DB on my Dev server).
The current COLLATION setup is:
Server: SQL_Latin1_General_CP1_CI_AS
Database: SQL_Latin1_General_CP1_CI_AS
Table: SQL_Latin1_General_CP1_CI_AS
User-Defined Data Type (dt_Source AS varchar(20)): SQL_Latin1_General_CP1_CI_AS
Column (Source AS dt_Source): SQL_Latin1_General_CP1_CI_AS
2. Then change the database collation:
USE master;
ALTER DATABASE [<db_name>] COLLATE SQL_Latin1_General_CP1250_CS_AS
The new COLLATION setup will be:
Server: SQL_Latin1_General_CP1_CI_AS
Database: SQL_Latin1_General_CP1250_CS_AS
Table: SQL_Latin1_General_CP1250_CS_AS
User-Defined Data Type (dt_Source AS varchar(20)): SQL_Latin1_General_CP1250_CS_AS
Column (Source AS dt_Source): SQL_Latin1_General_CP1_CI_AS
The previous column collation (SQL_Latin1_General_CP1_CI_AS) remains, and the SSDT compare mechanism will not be able to detect any change.
This will lead to an error message if I try to create a foreign key constraint on this column, referencing another, newly populated column in another table, because the publish script from the comparison was built without knowing the true collation.
For example, this produces an error because the column collations are different:
ALTER TABLE [FCT].[Inventory] WITH NOCHECK
ADD CONSTRAINT [FK_Inventory_Source] FOREIGN KEY ([Source]) REFERENCES [DIM].[Source] ([SourceCode]);
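To confirm the mismatch on the target, and fix it by hand if needed, a sketch using the names from the repro above; note that a column bound to the alias type must be re-typed to its base type before its collation can be altered:

-- Check the actual collation of the columns in the table.
SELECT c.name AS column_name, c.collation_name
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID(N'FCT.Inventory');

-- Hypothetical manual fix: re-type the column to the base type with the new collation.
ALTER TABLE [FCT].[Inventory]
    ALTER COLUMN [Source] varchar(20) COLLATE SQL_Latin1_General_CP1250_CS_AS;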
Make sure you ENABLE "Script database collation" in the publish settings (tab: General).
Source: https://dba.stackexchange.com/questions/128002/ssdt-publish-window-what-does-checkbox-enable-mean
Then it might take multiple publications:
first it fixes the collation at the DB level, and later at the table/column level.
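If you publish from the command line, the corresponding SqlPackage publish property should be ScriptDatabaseCollation (a sketch with placeholder names; verify the property against your SqlPackage version):

SqlPackage /Action:Publish /SourceFile:MyDb.dacpac /TargetServerName:MyServer /TargetDatabaseName:MyDb /p:ScriptDatabaseCollation=True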

SQL Server 2008 R2 import from access fails on datetime column

I'm trying to import an Access 2010 DB table (*.accdb) into SQL Server 2008 R2,
as a new table, not into an existing one.
I'm doing it through SQL Server's "Import and Export Data Wizard".
I have two datetime columns that I know for a fact contain some invalid datetime values (don't ask me how some genius managed to enter bad values there).
So I thought I can map these columns to nvarchar(max) columns in the wizard and deal with the problem later.
But unfortunately after the mapping, I get this message:
Found 2 unknown column type conversion(s)
The package will not be run.
OK, so I just found a very weird workaround (a bug in the wizard?):
When you choose the destination column type to be "nvarchar", the size is set automatically to "max" and you can't change that.
I switched to "nchar" (next on the combo box list) and the size was set to "50" (the default).
Then I switched back to "nvarchar" and the size was still "50"; I pressed the "Next" button and voila: I could choose to ignore bad values and was able to run the import.
Weirdest workaround ever.
In addition to the workaround described in another answer, another approach would be to import a query that does the DateTime -> Text conversion, for example:
SELECT CStr(Field1) AS Field1Text, Field2 FROM Table1
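Once the values are in SQL Server as text, the bad ones can be filtered and the rest converted later; a minimal sketch, assuming the imported table kept the name Table1 with a text column Field1Text (ISDATE is used because the target is 2008 R2, which lacks TRY_CONVERT):

-- Convert clean values back to DATETIME; leave NULL where ISDATE flags the text as invalid.
SELECT Field1Text,
       CASE WHEN ISDATE(Field1Text) = 1
            THEN CONVERT(DATETIME, Field1Text)
       END AS Field1
FROM dbo.Table1;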

VS SchemaCompare: Making Table Updates

Does anyone know how the Schema Compare in Visual Studio (currently using 2010) determines how to handle [SQL Server 2008 R2] database table updates (column data type, optionality, etc.)?
The options are to:
Use separate ALTER TABLE statements
Create a new table, copy the old data into the new table, and rename the old table before renaming the new one to assume the proper name
I'm asking because we have a situation involving a TIMESTAMP column (for optimistic locking). If Schema Compare uses the new-table approach, the TIMESTAMP column values will change and cause problems for anyone holding the old TIMESTAMP values.
I believe Schema Compare employs the same CREATE-COPY-DROP-RENAME (CCDR) strategy as VSTSDB described here: link
Should be able to confirm this by running a compare and scripting out the deploy, no?
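For illustration, a minimal sketch of the CCDR pattern and why it breaks TIMESTAMP consumers (all table and column names are hypothetical):

-- 1. CREATE: new table with the changed column definition.
CREATE TABLE dbo.tmp_Orders
(
    OrderId INT NOT NULL PRIMARY KEY,
    Qty     BIGINT NOT NULL,   -- the altered column type
    RowVer  ROWVERSION         -- cannot be copied: values are regenerated on insert
);
-- 2. COPY: ROWVERSION/TIMESTAMP columns cannot appear in the insert list,
--    so every row receives a fresh value, invalidating old optimistic-lock tokens.
INSERT INTO dbo.tmp_Orders (OrderId, Qty)
SELECT OrderId, Qty FROM dbo.Orders;
-- 3. DROP and 4. RENAME.
DROP TABLE dbo.Orders;
EXECUTE sp_rename N'dbo.tmp_Orders', N'Orders';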
