Why is Entity Framework ignoring existing GUIDs on insert? - sql-server

I have a table defined in SQL Server with a GUID primary key and a default of newsequentialid():
CREATE TABLE [dbo].[mything](
    [id] [uniqueidentifier] NOT NULL,
    [foo] [varchar](32) NULL,
    CONSTRAINT [PK_mything] PRIMARY KEY CLUSTERED
    (
        [id] ASC
    )
)
GO
ALTER TABLE [dbo].[mything]
ADD CONSTRAINT [DF_mything_id]
DEFAULT (newsequentialid()) FOR [id]
GO
But when I add an entity with the GUID primary key already set, it ends up as a record in the database with a new GUID primary key.
var anEntity = new mything
{
    id = new Guid("17870C25-FC04-EB11-80E9-000C29F38B54"),
    foo = "Some stuff",
};
dbContext.mythings.Add(anEntity);
dbContext.SaveChanges();
EF seems to ignore the GUID that was provided in the entity and writes the record with a new GUID.
What I would expect is that if the provided record had a null GUID, it would be populated with a new one, and if it was not null, it would be used unchanged. But that's not what I'm seeing happen.
I'm seeing duplicate records, with different GUIDs, instead of primary key violation exceptions, because the GUIDs I'm providing in my EF entities are being ignored.
Why could this be happening?
Please tell me this isn't by design!
===
OK, this does seem to be by design.
First, this isn't happening in SQL Server. If I insert a record with the id field set, it inserts, or fails with a primary key violation if a record with that id already exists. SQL Server only generates a new GUID for the id when the insert doesn't supply one and the column default kicks in.
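For instance, a quick T-SQL sketch against the table from the question behaves exactly as described:

-- Explicit id: inserted as provided
INSERT INTO dbo.mything (id, foo)
VALUES ('17870C25-FC04-EB11-80E9-000C29F38B54', 'explicit guid');

-- Same id again: fails with a primary key violation, as expected
INSERT INTO dbo.mything (id, foo)
VALUES ('17870C25-FC04-EB11-80E9-000C29F38B54', 'duplicate guid');

-- No id supplied: the DEFAULT generates a new sequential GUID
INSERT INTO dbo.mything (foo)
VALUES ('defaulted guid');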
But in EF, the value in the id field is ignored, and a new GUID is generated every time.
It was suggested to me that EF was behaving this way so as to follow the same pattern as when using autoincrement keys. And that does seem to be the case.
In SQL Server, if you try to provide a value to an autoincrement key field on an insert, you get an error:
Cannot insert explicit value for identity column in table
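For comparison, here is a minimal sketch of that identity behaviour in plain T-SQL (table name is just illustrative); SQL Server rejects the explicit value unless you opt in with SET IDENTITY_INSERT:

CREATE TABLE dbo.identity_demo (
    id  INT IDENTITY(1,1) PRIMARY KEY,
    foo VARCHAR(32) NULL
);

-- Fails with "Cannot insert explicit value for identity column..."
INSERT INTO dbo.identity_demo (id, foo) VALUES (42, 'explicit id');

-- Succeeds only after explicitly opting in
SET IDENTITY_INSERT dbo.identity_demo ON;
INSERT INTO dbo.identity_demo (id, foo) VALUES (42, 'explicit id');
SET IDENTITY_INSERT dbo.identity_demo OFF;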
But in EF, the value you provide is ignored, and a new value is generated.
So in this respect, Entity Framework is consistent. Consistently wrong, but consistent.

The first step I'd take to narrow this down is to capture a Profiler/Extended Events trace at the database to see exactly what EF is sending to the server.
Your expectation of the database behaviour is correct, so I'd want to understand where it is breaking down first.
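If Profiler isn't available, a minimal Extended Events session is enough for this (session and object names below are just illustrative); reproduce the insert from the application, then inspect the captured events:

CREATE EVENT SESSION ef_insert_trace ON SERVER
ADD EVENT sqlserver.rpc_completed
    (ACTION (sqlserver.client_app_name, sqlserver.sql_text)),
ADD EVENT sqlserver.sql_batch_completed
    (ACTION (sqlserver.client_app_name, sqlserver.sql_text))
ADD TARGET package0.ring_buffer;
GO
ALTER EVENT SESSION ef_insert_trace ON SERVER STATE = START;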

Related

Convert VARBINARY to Int or BigInt

My question is fairly simple, and I understand that some legacy database designs are not as good as we would expect these days.
My legacy table does not have a primary key to drive a delta load, so I'm trying to use hashing to create a unique key. HASHBYTES returns VARBINARY, and I can't use a VARBINARY type as a
primary key (not sure about this).
Ref URL on MSDN:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/94231bb4-ccab-4626-a9fb-325264bb883f/can-varbinary700-column-be-used-as-primary-key?forum=transactsql
Hence, I'm converting the hash to INT or BIGINT. The problem is that the conversion produces both negative and positive values (due to the range).
My question is:
How can I convert a VARBINARY(100) value to an INT or BIGINT (positive value) and set it as the primary key on one of my tables?
Edit Note:
I tried to use the VARBINARY column as the primary key for the delta load in an SSIS Lookup task. I got the error:
"Violation of PRIMARY KEY constraint 'PK__DMIN__607056C02FB7E7DE'. Cannot insert duplicate key in object 'dbo.DMIN_'. The duplicate key value is (0x00001195764c40525bcaf6baa922091696cd8886).".
However, when I checked the table for duplicate keys, there were none. So why is this error showing up?
Please note that the first SSIS execution worked fine; the error only appears during the second execution (at the "lookup match output").
Please help. Thanks.
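On the conversion itself, here is a hedged sketch of one way to derive a non-negative BIGINT from the hash (table and column names are illustrative). Note that truncating a hash to 8 bytes re-introduces a small collision risk, so it is not a guaranteed unique key:

SELECT
    src.*,
    -- CAST keeps the low-order 8 bytes of the hash; ABS forces a positive value
    -- (the single bigint value that would overflow ABS is astronomically unlikely)
    ABS(CAST(HASHBYTES('SHA2_256', CONCAT(src.col1, '|', src.col2)) AS BIGINT)) AS surrogate_key
FROM dbo.legacy_table AS src;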
In projects I've worked on before, we've always used GUIDs as our primary keys, using the uniqueidentifier type in SQL Server.
The main problem with this, however, is that using a uniqueidentifier column as your clustered index can degrade the performance of your database over time, so recently we've taken the following approach (based on this article):
Create column: guid, uniqueidentifier, nonnull, default value newsequentialid(), PK
Create column: id, bigint, nonnull, identity(1,1)
Create a non clustered index on the guid column, unique
Create a clustered index on the id column, unique
That way when you insert into this new table, you don't have to worry about the keys or identities.
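Spelled out as DDL, that design might look something like this (table, constraint, and index names are just illustrative):

CREATE TABLE dbo.mynewtable (
    [guid] UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT DF_mynewtable_guid DEFAULT NEWSEQUENTIALID(),
    [id]   BIGINT IDENTITY(1,1) NOT NULL,
    CONSTRAINT PK_mynewtable PRIMARY KEY NONCLUSTERED ([guid])
);

CREATE UNIQUE CLUSTERED INDEX IX_mynewtable_id ON dbo.mynewtable ([id]);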
If you need some form of reference between the old database and the new, and you CAN modify the structure of the old database, you can create a uniqueidentifier column there (or char(36) if it doesn't support uniqueidentifier) and assign a GUID to each row, and THEN create an additional uniqueidentifier column in the new database so you have that reference, and insert the value into it. If that makes sense.

Postgres INSERT INTO... SELECT violates foreign key constraint

I'm having a really, really strange issue with postgres. I'm trying to generate GUIDs for business objects in my database, and I'm using a new schema for this. I've done this with several business objects already; the code I'm using here has been tested and has worked in other scenarios.
Here's the definition for the new table:
CREATE TABLE guid.public_obj
(
    guid uuid NOT NULL DEFAULT uuid_generate_v4(),
    id integer NOT NULL,
    CONSTRAINT obj_guid_pkey PRIMARY KEY (guid),
    CONSTRAINT obj_id_fkey FOREIGN KEY (id)
        REFERENCES obj (obj_id)
        ON UPDATE CASCADE ON DELETE CASCADE
)
However when I try to backfill this using the following code, I get a SQL state 23503 claiming that I'm violating the foreign key constraint.
INSERT INTO guid.public_obj (guid, id)
SELECT uuid_generate_v4(), o.obj_id
FROM obj o;
ERROR: insert or update on table "public_obj" violates foreign key constraint "obj_id_fkey"
SQL state: 23503
Detail: Key (id)=(-2) is not present in table "obj".
However, if I do a SELECT on the source table, the value is definitely present:
SELECT uuid_generate_v4(), o.obj_id
FROM obj o
WHERE obj_id = -2;
"0f218286-5b55-4836-8d70-54cfb117d836";-2
I'm baffled as to why postgres might think I'm violating the fkey constraint when I'm pulling the value directly out of the corresponding table. The only constraint on obj_id in the source table definition is that it's the primary key. It's defined as a serial; the select returns it as an integer. Please help!
Okay, apparently the reason this is failing is because unbeknownst to me the table (which, I stress, does not contain many elements) is partitioned. If I do a SELECT COUNT(*) FROM obj; it returns 348, but if I do a SELECT COUNT(*) FROM ONLY obj; it returns 44. Thus, there are two problems: first, some of the data in the table has not been partitioned correctly (there exists unpartitioned data in the parent table), and second, the data I'm interested in is split out across multiple child tables and the fkey constraint on the parent table fails because the data isn't actually in the parent table. (As a note, this is not my architecture; I'm having to work with something that's been around for quite some time.)
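For anyone hitting the same thing, a couple of quick queries make the partitioning visible (tableoid is the system column identifying the physical table a row is stored in):

-- Rows visible through the parent vs. stored in the parent itself
SELECT count(*) AS including_children FROM obj;
SELECT count(*) AS parent_only        FROM ONLY obj;

-- Which physical table actually stores a particular row
SELECT tableoid::regclass AS physical_table, obj_id
FROM obj
WHERE obj_id = -2;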
The partitioning is by implicit type (there are three partitions, each of which contains rows relating to a specific subtype of obj) and I think the eventual solution is going to be creating GUID tables for each of the subtypes. I'm going to have to handle the stuff that's actually in the obj table probably by selecting it into a temp table, dropping the rows from the obj table, then reinserting them so that they can be partitioned properly.

Integrity constraint violation in cakephp

I have created a table with a composite primary key, but editing a record gives an integrity constraint violation error in CakePHP.
Integrity constraint violation : 1062 Duplicate entry while saving Composite Primary Key data of model
Integrity constraint violations mean that you are trying to save a duplicate of a unique value in the database. Primary keys have to be unique.
Do you have your Primary Key field in your database set to auto increment? If you do not, that may be your problem.
Otherwise, when you insert a record, it's probably going to insert a row with PK of 0. Then when it tries to insert another record, it will try to insert another row with PK of 0, thus not being unique, and throwing the Integrity Constraint violation.
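A small MySQL sketch of that failure mode (table name is hypothetical):

-- A plain (non auto increment) integer key
CREATE TABLE posts (
    id    INT NOT NULL,
    title VARCHAR(255) NULL,
    PRIMARY KEY (id)
);

-- If the application never supplies id, both rows fall back to the same value
INSERT INTO posts (id, title) VALUES (0, 'first');
INSERT INTO posts (id, title) VALUES (0, 'second');  -- Duplicate entry '0' for key 'PRIMARY'

-- Declaring the key AUTO_INCREMENT avoids this
ALTER TABLE posts MODIFY COLUMN id INT NOT NULL AUTO_INCREMENT;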
However
You mentioned that you are doing an edit. If you are doing an edit, then you are probably not passing the edited row's primary key when saving it to the database, so Cake tries to do a CREATE instead, resulting in another duplicate row ID.
Make sure you do this:
$this->Model->id = $id; // Where $id is the Primary Key of the row being edited.
Alternatively, you can do this:
$data['Model']['id'] = $id;
$this->Model->save($data);
You can capture the $id by either storing it as a hidden field in your edit form, or as a URL parameter passed to the action.

Creating New Foreign Key (SQL Server)

I am having a bit of trouble creating a foreign key in my DB. Here is a paraphrased model of what my tables look like:
NOTE
* (PK) NOTE_ID BIGINT
* TITLE VARCHAR(200)
* DATE DATETIME
* SERIES_ID BIGINT
SERIES
* (PK) SERIES_ID BIGINT
* TITLE VARCHAR(200)
* DESCR VARCHAR(1000)
I am trying to create a "has a" relationship between NOTE and SERIES by SERIES_ID. I thought that setting up a foreign key between the two tables by SERIES_ID would be the solution, but when I attempt to create it I get the following error:
ERROR: There are no primary or candidate keys in the referenced table 'dbo.SERIES' that match the referencing column list in the
foreign key 'FK_SERIES_NOTE'. Could not create constraint
I'm using the web database manager that comes with the GoDaddy SQL Server I set up, so I'm not sure what underlying query it's trying to run, or I would post it.
At the end of the day, this is all to create a relationship so that the NHibernate mappings for my Note object will contain a one-to-one relationship to a Series object. I may not even be trying to tackle this the correct way with the foreign key, though.
Am I going about this the correct way?
EDIT:
In an attempt to pare the tables down to a simpler example, I removed what I thought were several non-critical columns. However, I ended up leaving out a field that was actually part of the composite primary key on the SERIES table. Because I was trying to assign the foreign key to only one part of that composite key, it was not allowed.
In the end, I took another look at the structure of my table and found that I don't actually need the other piece of the composite key - after removing it, the assignment of the foreign key works great.
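In other words, a foreign key has to reference all columns of the referenced table's primary (or candidate) key. If SERIES had kept its composite key, the constraint would have needed to cover both columns, something like this (the second key column is hypothetical):

ALTER TABLE NOTE
    ADD EDITION_ID BIGINT NULL;

ALTER TABLE NOTE
    ADD CONSTRAINT FK_SERIES_NOTE
    FOREIGN KEY (SERIES_ID, EDITION_ID)
    REFERENCES SERIES (SERIES_ID, EDITION_ID);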
If you can, try running the following statement in a query analyzer and see the resulting error message (I guess @Damien_The_Unbeliever is right):
ALTER TABLE NOTE ADD CONSTRAINT FK_SERIES_NOTE
FOREIGN KEY (SERIES_ID) REFERENCES SERIES(SERIES_ID)
--ON DELETE CASCADE
-- uncomment the preceding line if you want a delete on a series
-- to automatically delete all notes in that series
Hope this will help

Create nonclustered primary keys using NHibernate and SchemaExport

We're using SchemaExport via ActiveRecord. By default it generates a table like this:
create table List (
    Id UNIQUEIDENTIFIER not null,
    Name NVARCHAR(255) null,
    OwnerId UNIQUEIDENTIFIER null,
    primary key ( Id ))
SQL Server then defaults to adding a clustered index for the primary key. But I want this to be nonclustered. I want to add a clustered index to OwnerId as this will be much more efficient.
Now, I could run a script afterwards to create a non-clustered index. This would involve dropping the original primary key constraint and adding a non-clustered one. However, SchemaExport has already helpfully created all my foreign key constraints which stop me dropping the primary key.
So I need to drop the foreign keys, which have an unhelpful name like FK4BAD9607D2BEDDB5, then recreate them (can I do this again automatically?). It's all a bit of a headache.
It would be a lot easier if I could just get in there somehow and add a nonclustered specification to the primary key as it generates it. Is there a relevant bit of the export tool I can override to do this?
Thanks
I believe your best option is to use SchemaExport to create the script and then modify it manually.
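If you go the manual route, the hand-edited part of the script might look something like this (constraint and index names are illustrative):

create table List (
    Id UNIQUEIDENTIFIER not null,
    Name NVARCHAR(255) null,
    OwnerId UNIQUEIDENTIFIER null,
    constraint PK_List primary key nonclustered (Id)
);

create clustered index IX_List_OwnerId on List (OwnerId);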
Otherwise, you'll need to override Dialect.GetAddPrimaryKeyConstraintString.
