I've got a Breeze ContextProvider talking to an EF 6.1.1, database-first application with SQL Server, and I'm having some trouble with it. I can INSERT a new record, but when I update it, not all of the changed columns get written to the database.
I have a generated POCO that looks like this:
public partial class Inventory
{
public Inventory()
{
}
public int Id { get; set; }
public System.DateTime EnteredAt { get; set; }
public string EnteredBy { get; set; }
public string UpdatedBy { get; set; }
public Nullable<System.DateTime> UpdatedAt { get; set; }
public string Comment { get; set; }
}
When I update an entity on the client side (setting the Comment property) and send it to Breeze, I do some very simple sets in an EFContextProvider BeforeSaveEntities override:
protected override Dictionary<Type, List<EntityInfo>> BeforeSaveEntities(Dictionary<Type, List<EntityInfo>> saveMap)
{
    // only one inventory is ever sent in
    if (saveMap.ContainsKey(typeof(Inventory)))
    {
        var source = saveMap[typeof(Inventory)].First().Entity as Inventory;

        // set up the user and time fields
        if (source.Id <= 0)
        {
            // new record
            source.EnteredBy = _defaultUserName;
            source.EnteredAt = DateTime.Now;
        }
        else
        {
            // existing record
            source.UpdatedBy = _defaultUserName;
            source.UpdatedAt = DateTime.Now;
        }
    }

    return saveMap;
}
But when the change gets committed, the changed UpdatedBy value never makes it into the database.
I turned on EF6 SQL logging and, sure enough, the UPDATE statement omits the column entirely:
UPDATE [dbo].[Inventory]
SET [Comment] = @0, [UpdatedAt] = @1
WHERE ([Id] = @2)
-- @0: '1532' (Type = AnsiString, Size = 250)
-- @1: '2/4/2016 10:32:58 PM' (Type = DateTime2)
-- @2: '100344' (Type = Int32)
-- Executing at 2/4/2016 3:33:06 PM -07:00
-- Completed in 7 ms with result: 1
Of course, UpdatedBy is NULL in the database for this update.
I can't figure out why this particular column will not go through when an 'adjacent' column, set at the same time, does. I also don't know whether this is a Breeze problem or an EF problem, since if I go back in and use EF and the DbContext directly, everything works fine.
I also tried deleting the table from the EDMX file and re-adding it, to no avail. I re-verified that the column is present on the table in the EDMX file.
As a workaround, I currently have to go back in, re-read the changed record directly, update it, and then save it again.
Any advice would be appreciated.
Can you post the BeforeSaveEntity method and a sample of how the entity is manipulated on the client side in your question?
Secondly, make sure you're updating the OriginalValuesMap for the entity info in your BeforeSave method, like so:
source.UpdatedBy = "Joe User";
source.UpdatedAt = DateTime.Now;
source.Comment = "1532";
entityInfo.OriginalValuesMap["UpdatedBy"] = null;
entityInfo.OriginalValuesMap["UpdatedAt"] = null;
entityInfo.OriginalValuesMap["Comment"] = null;
I'm wondering if the UpdatedAt property is manipulated on the client before calling saveChanges, and therefore that property is already identified as modified in the OriginalValuesMap.
Please refer to this question; it should answer yours:
how to secure add or delete entities with breezejs
There are two options here:
1. Change the entity that is actually in the saveMap, not a new entity.
2. Create a new entity as 'source' and replace the entity in the saveMap with it.
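To illustrate the first option, here is a minimal sketch of a BeforeSaveEntities override that mutates the entity already in the saveMap and records the server-side changes in the OriginalValuesMap, so the EFContextProvider includes those columns in the UPDATE (a sketch based on the property names in the question, not a guaranteed fix):

protected override Dictionary<Type, List<EntityInfo>> BeforeSaveEntities(Dictionary<Type, List<EntityInfo>> saveMap)
{
    List<EntityInfo> infos;
    if (saveMap.TryGetValue(typeof(Inventory), out infos))
    {
        foreach (var info in infos)
        {
            var inventory = (Inventory)info.Entity;

            // mutate the entity that is already in the saveMap
            inventory.UpdatedBy = _defaultUserName;
            inventory.UpdatedAt = DateTime.Now;

            // tell Breeze these properties were changed server-side so they
            // are treated as modified and written in the UPDATE statement
            info.OriginalValuesMap["UpdatedBy"] = null;
            info.OriginalValuesMap["UpdatedAt"] = null;
        }
    }
    return saveMap;
}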
I have the following two models within my Blazor Server project:
Vergadering:
public class Vergadering
{
[Key]
public int Id { get; set; }
public string Naam { get; set; }
public DateTime DatumTijd { get; set; }
public ICollection<Bestuurslid> Aanwezigen { get; set; }
public string? Notulen { get; set; }
public ICollection<Vergadering>? HoofdVergadering { get; set; }
public ICollection<Vergadering>? GekoppeldeVergaderingen { get; set; }
public ICollection<Bestand>? Bestanden { get; set; }
public string? UserLastEditId { get; set; }
public IdentityUser? UserLastEdit { get; set; }
public DateTime? LastEdit { get; set; }
public ICollection<VergaderingAgendaItem>? vergaderingAgendaItems { get; set; }
}
VergaderingAgendaItem:
public class VergaderingAgendaItem
{
public int Id { get; set; }
public string Omschrijving { get; set; }
public bool Afgerond { get; set; }
public int? ParentId { get; set; }
public VergaderingAgendaItem? Parent { get; set; }
public int VergaderingId { get; set; }
public Vergadering Vergadering { get; set; }
public string? UserAangedragenId { get; set; }
public IdentityUser? UserAangedragen { get; set; }
}
This results in three tables:
Vergaderingen
VergaderingAgendaItems
VergaderingVergadering
In my repository I have the following update method:
public async Task ChangeAfgerondStatusAsync(VergaderingAgendaItem item)
{
    using (var _db = _factory.CreateDbContext())
    {
        _db.VergaderingAgendaItems.Update(item);
        await _db.SaveChangesAsync();
    }
}
Whenever the Vergadering does not have a GekoppeldeVergadering, this update method causes no problems. But whenever the Vergadering does have a GekoppeldeVergadering and I update a VergaderingAgendaItem of that Vergadering, I get this error:
An error occurred while saving the entity changes. See the inner exception for details.
Looking at the console window that opens while running the project, I saw the following error:
An exception occurred in the database while saving changes for context type 'AVA_ZICHT.Data.ApplicationDbContext'.
Microsoft.EntityFrameworkCore.DbUpdateException: An error occurred while saving the entity changes. See the inner exception for details.
Microsoft.Data.SqlClient.SqlException (0x80131904): Violation of PRIMARY KEY constraint 'PK_VergaderingVergadering'. Cannot insert duplicate key in object 'dbo.VergaderingVergadering'. The duplicate key value is (4, 3).
Why does EF Core try to insert into the VergaderingVergadering table when my method only calls Update on a VergaderingAgendaItem?
When handed a detached entity and told to Update it, EF will consider any associated entities as well. Since those references aren't tracked by the DbContext, the context will see those entities as new items to be inserted. This can result in duplicate key exceptions (as you are seeing) or inserting duplicate data with new PKs if those keys are set up as Identity columns.
One way to get around this issue is to use Automapper configured to just update the columns you expect to change:
public async Task ChangeAfgerondStatusAsync(VergaderingAgendaItem item)
{
    using (var _db = _factory.CreateDbContext())
    {
        var existingItem = _db.VergaderingAgendaItems.Single(x => x.Id == item.Id);
        Mapper.Map(item, existingItem);
        await _db.SaveChangesAsync();
    }
}
Alternatively, this can be done manually by copying values from item to existingItem. existingItem is a tracked entity, so once it's updated, just call SaveChanges. The advantage of this over Update is that the resulting UPDATE SQL statement will only include the columns that have actually changed, and no UPDATE is executed at all if nothing has changed.
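A minimal sketch of that manual approach, assuming only the Afgerond flag (and perhaps the description) is meant to change; copy whichever properties your use case actually allows to be updated:

public async Task ChangeAfgerondStatusAsync(VergaderingAgendaItem item)
{
    using (var _db = _factory.CreateDbContext())
    {
        // load the tracked copy from the database
        var existingItem = await _db.VergaderingAgendaItems
            .SingleAsync(x => x.Id == item.Id);

        // copy over only the fields this operation is allowed to change
        existingItem.Afgerond = item.Afgerond;
        existingItem.Omschrijving = item.Omschrijving;

        // EF only emits an UPDATE for the columns that actually changed
        await _db.SaveChangesAsync();
    }
}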
This assumes we only want to copy fields from that entity and none of the child/related entities. If you want to alter the collections/associations, you will need to eager-load them and handle them separately. For instance, the UserLastEdit reference is likely something you would want to eager-load so that it can be updated with the current User record.
My general advice is to avoid working with detached entities for concerns like this and instead use POCO view models. The trouble with detached entities is that they are often incomplete representations of entity state; at worst, something deserialized from view state and cast into an entity object. View models can also be scaled down to just the data your client needs and the data that is allowed to change, so when one comes back to the server there is no confusion about what it is versus what it pretends to be.

Another consideration when applying updates, which matters in multi-user systems, is detecting stale data. Writing updates like this is a "last in wins" approach; ideally you should check that the current concurrency token in the database matches the token/version that was read when this user's copy was loaded. The attraction of detached entities is the thought of avoiding a round trip to the DB when performing an update, but in all honesty the round trip is justified to ensure that the record is actually valid, that the user is actually allowed to update it, and that it hasn't been updated by someone else while this user was editing it.
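For illustration, a stale-data check in EF Core can be built on a rowversion concurrency token; this is a sketch assuming you add a RowVersion property to the entity and round-trip it through your view model:

// on the entity (requires System.ComponentModel.DataAnnotations)
public class VergaderingAgendaItem
{
    // ... existing properties ...

    [Timestamp]                        // maps to a SQL Server rowversion column
    public byte[] RowVersion { get; set; }
}

// in the update method, before SaveChangesAsync
existingItem.Afgerond = item.Afgerond;
_db.Entry(existingItem).Property(x => x.RowVersion).OriginalValue = item.RowVersion;

try
{
    await _db.SaveChangesAsync();
}
catch (DbUpdateConcurrencyException)
{
    // the row changed since this user read it: reload, merge,
    // or surface the conflict instead of silently overwriting
}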
I am creating a website for a warehouse using ASP.NET Core MVC and Entity Framework. There are over 5000 tools and equipment in this warehouse.
I have a model class like this:
public class Tool
{
[Key]
public long Id { get; set; }
public string Name { get; set; }
public string Description { get; set; }
}
I have another table that keeps a log of all of the inputs and outputs of tools, which looks like this:
public class Transaction
{
public long Id { get; set; }
public string FormId { get; set; }
public DateTime Date { get; set; }
public bool IsInput { get; set; } // true for an input, false for an output
public float Quantity { get; set; }
public Tool Item { get; set; } //equipment
public string Description { get; set; }
}
At least 300 rows will be added to the Transaction table each day, and within 3 years it will have over 300,000 rows. To get the quantity of an individual tool I did something like this:
database.Transactions
    .Where(x => x.Item.Id == ID)
    .Where(x => x.IsInput == true)
    .Select(x => x.Quantity).Sum()    // all inputs for this tool
-
database.Transactions
    .Where(x => x.Item.Id == ID)
    .Where(x => x.IsInput == false)
    .Select(x => x.Quantity).Sum();   // minus all outputs for this tool
I am concerned that after some time (a few years) this query will become really time consuming, especially if it has to iterate through all of the tools in the warehouse. One option would be to create a fresh table at the end of each warehouse counting period and initialize all of the tools with their closing stock quantity, so that the number of rows in the Transaction table does not have to grow indefinitely.
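Something like the following is roughly what I have in mind; ToolStockSnapshot is just a name I made up for the closing-stock table:

public class ToolStockSnapshot
{
    public long Id { get; set; }

    public Tool Item { get; set; }             // the tool this snapshot belongs to

    public DateTime PeriodEnd { get; set; }    // end of the counting period

    public float ClosingQuantity { get; set; } // stock level at PeriodEnd
}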
But how to do this? What should I search for?
If my approach is not correct please correct me.
One more thing: this is not the only purpose of the website, which means there are other tables doing other things that I don't want to be affected by this process. In other words, I cannot create a new database.
I am new to all of this so please keep it as simple as possible.
Thanks
Well, first, you are using a database management system that easily handles the number of rows you mentioned.
Second, you don't need to call the Where method twice; you don't even need it here.
Also, you don't need Sum, since you are filtering by the ID.
When you use Where to filter by an ID, it returns an IEnumerable of your class type containing a single object.
I would use SingleOrDefault instead of Where to achieve this; it returns the object directly.
Here is an example:
Transaction trans = _context.Transactions.SingleOrDefault(x => x.Id == ID && x.IsInput == true);
where _context is your context object and Transactions is your DbSet property.
Then you can use the object to access the quantity:
var quantity = trans.Quantity;
OK, I'm at a loss; being new to Breeze, I'm still learning the ropes. My project uses the HotTowel template for AngularJS and Breeze from John Papa.
Here's what I'm trying to achieve: I have master/detail tables in my database. An "Agency" has many people it can "Notify". Here are the EF classes for the server side:
public class Agency {
public Agency() {
this.Notifies = new HashSet<Notify>();
}
public long Id { get; set; }
[Required, MaxLength(50)]
public string Name { get; set; }
<<removed unneeded details>>
public bool Active { get; set; }
public virtual ICollection<Notify> Notifies { get; set; }
}
public class Notify
{
public long Id { get; set; }
public long? AgencyId { get; set; }
public string Name { get; set; }
<<removed unneeded details>>
public virtual Agency Agency { get; set; }
}
Now the Maps:
public class AgencyMaps : EntityTypeConfiguration<Agency>
{
internal AgencyMaps()
{
HasKey(x => x.Id);
}
}
public class NotifyMap : EntityTypeConfiguration<Notify>
{
internal NotifyMap()
{
HasKey(x => x.Id);
HasOptional(x => x.Agency)
.WithMany(p => p.Notifies)
.HasForeignKey(i => i.AgencyId);
}
}
Now on the client side I use breeze to create new entities like this:
// create a new entity
function create() {
return manager.createEntity(entityName);
}
// create a new notify entity
function createNotify(){
return manager.createEntity(entityNameNotify);
}
Then there are two scenarios I need to achieve:
- First, where I retrieve an existing agency and add additional people to notify
- Second, where I create a new agency and add people to notify
Both fail in the same place.
Note: I'm using SQL Server and my Id fields are bigint (long) at this point in time.
I'm retrieving the "Agency" entity and placing it in a variable called "vm.agency". "vm.agency" has a navigation property called "notifies" with an entity type of "Notify". So when I want to create and add a new person, I call this function:
function addNotifyRec() {
    if (vm.agency !== undefined) {
        var notifyRec = datacontext.agency.createNotify(); // <<< fails here
        notifyRec.agencyId = vm.agency.id;
        notifyRec.name = vm.notify.name;
        <<removed unneeded details>>
        vm.agency.notifies.push(notifyRec);
        logSuccess("New person to notify added");
    } else {
        logError("Agency is undefined");
    }
}
As soon as createNotify() is called I get the error "Ids can not be autogenerated for entities with multipart keys".
So I'm stuck. It seems to me this is a pretty common scenario, so I am obviously not understanding the Breeze framework well enough to implement it. If you can point me in the right direction I'd appreciate your help.
UPDATE 4/9/2014
I'm thinking I could eliminate this issue altogether if I switch over to GUID ids and generate them client-side. Is this correct thinking?
What's interesting here is that Breeze thinks Notify.Id and Notify.AgencyId form a multi-part primary key, but they actually don't: Id is the PK and AgencyId is an FK. The only thing I can think of is to try removing the EntityTypeConfiguration for both Agency and Notify, specifically the parts that specify HasKey and HasForeignKey. This Fluent API configuration shouldn't be required, as EF will pick up your configuration by convention instead.
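For illustration, with conventional property names the Notify class from the question should work without any explicit key or foreign-key configuration (a sketch, not tested against your exact model):

// With conventional names, no EntityTypeConfiguration should be needed at all:
public class Notify
{
    public long Id { get; set; }         // recognised as the primary key by convention
    public long? AgencyId { get; set; }  // recognised as the FK for the Agency navigation by convention
    public string Name { get; set; }

    public virtual Agency Agency { get; set; }
}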
I took a different approach to working around my issue. Since I have the luxury of changing the id types, I swapped the bigint ids out for uniqueidentifier (Guid) types and removed the auto-generation of the ids in SQL. Now I just create my own ids using breeze.core.getUuid() when a new record is created. I'm not sure this is the most efficient way to work around the issue, but it seems to be working fine.
I have two tables: Word and Adjective, both with some properties. The primary key of both tables is ID; Adjective.ID also references Word.ID as a foreign key, so there is a 1:1 relationship.
I also have a generic repository for any kind of table, with an Update function:
public void Update(T entity) {
var entry = DatabaseContext.Entry(entity);
DatabaseSet.Attach(entity);
entry.State = EntityState.Modified;
}
I take a value from the database, convert it into a ViewModel looking like this (of course it's actually a little more complex):
public class WordModel {
public int ID { get; set; }
public string OriginalWord { get; set; }
}
public class AdjectiveModel : WordModel {
public string Translation { get; set; }
}
Then I alter the values of the OriginalWord and Translation properties, convert the model back, and write it to the database. After conversion I have an object like this:
Word = {
ID = 1
OriginalWord = y
Adjective = {
ID = 1
Translation = z
}
}
Upon updating however, only one table gets updated.
Database.Words.Update(Word) only updates the OriginalWord value in the Word table,
Database.Adjectives.Update(Word.Adjective) only updates the Translation value in the Adjective table.
When running the updates for both tables sequentially I get an InvalidOperationException: An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key.
Creating a new database entry works perfectly.
I cannot believe I have to update both tables on their own and then save the context for each. I created the database repository via a tutorial that obviously didn't explain well enough what's going on with the DbSet and the DbContext, which leaves me a little helpless here.
Sadly I have no link (it was quite a while ago that I created the database project).
So, what am I doing wrong here?
Your entity Word contains an entity Adjective, so Word is the root of the object graph. Generally, here's what you should keep in mind in the following situations:

All objects in the graph are new (new word and new adjective):
use myDbContext.Words.Add(myNewWordObjectGraph); to get the state you want.

Only the root is new (new word and a pre-existing, unmodified adjective):
use myDbContext.Entry(myNewWord).State = EntityState.Added; to get the state you want.

The root is modified and some nodes are modified (word and adjective both exist in the DB and both have been modified):
use myDbContext.Entry(myWord).State = EntityState.Modified; and myDbContext.Entry(myAdjective).State = EntityState.Modified;, i.e. call myDbContext.Entry(myObject).State = EntityState.Modified; for each modified object in the graph, whether it's the root or some other node. A sketch for your Word/Adjective case follows below.

The root is unchanged and/or modified and some nodes are added while others are unchanged and/or modified:
use myDbContext.MyRootObjectDbSet.Add(myRootObject); this will mark all objects in the graph as EntityState.Added, including the unchanged and/or modified ones, so the next step is to correct the state of each unchanged and/or modified object: myDbContext.Entry(myObject).State = ThisObjectSCorrectState;.
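Here is a minimal sketch of that third situation applied to the Word/Adjective case from the question (assuming both rows already exist and both were modified while detached; MyDbContext stands in for your actual context type):

using (var db = new MyDbContext())
{
    // marking the detached root as Modified attaches the reachable graph,
    // then the related node's state is corrected as well
    db.Entry(word).State = EntityState.Modified;
    db.Entry(word.Adjective).State = EntityState.Modified;

    db.SaveChanges();   // produces one UPDATE for Word and one for Adjective
}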
I hope that helps.
EDIT
Calling DbSet.Attach(...) just adds the object to the set of objects tracked by EF. If you modify an object before calling DbSet.Attach(...), the modifications won't be persisted to the DB when you call SaveChanges(); attaching the object as-is first, and then modifying it, is the way to make EF aware of the modifications.
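A small sketch of that ordering, using the Word entity from the question (db being your DbContext instance, and word a previously loaded, now detached entity):

// attach first, while the entity still matches what's in the database
db.Words.Attach(word);

// now modify; the change tracker records OriginalWord as changed
word.OriginalWord = "y";

db.SaveChanges();   // the UPDATE includes only the modified column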
Based on the way your update method is defined, I would assume your repository looks something like this:
// Not thread-safe, as it contains a transient object (the DbContext).
public class Repository<T> : IRepository<T> where T : class
{
    private readonly MyDbContext context;

    public Repository(MyDbContext context)
    {
        this.context = context;
    }

    //...
    public void Update(T entity) { ... }
    public void Commit() { context.SaveChanges(); }
}
I would suggest changing the update method to the following :
public void Update(T entity)
{
    context.Entry(entity).State = EntityState.Modified;
}
This Update method would then be called for each object you updated in the graph, using repositories that share the same DbContext instance.
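A usage sketch, under the assumption that both repositories are constructed over the same DbContext instance (type and variable names are illustrative):

using (var context = new MyDbContext())
{
    var words = new Repository<Word>(context);
    var adjectives = new Repository<Adjective>(context);

    // mark both modified detached entities against the same change tracker
    words.Update(word);
    adjectives.Update(word.Adjective);

    // one SaveChanges flushes both UPDATEs in a single transaction
    words.Commit();
}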
After reading the question linked below, I got a sense of how to set DateCreated and DateModified columns in Entity Framework and use them in my application. In the old SQL world, though, the trigger approach is more popular because it is more secure from a DBA's point of view.
So, any advice on which way is best practice? Should these be set in Entity Framework for the sake of application integrity? Or should I use triggers, as that makes more sense from a data-security point of view? Or is there a way to compose triggers in Entity Framework? Thanks.
EF CodeFirst: Rails-style created and modified columns
BTW, even though it doesn't matter much, I am building this app using ASP.NET MVC with C#.
Opinion: triggers are like hidden behaviour; unless you go looking for them, you usually won't realise they are there. I also like to keep the DB as 'dumb' as possible when using EF, since I'm using EF precisely so my team won't need to maintain SQL code.
For my solution (a mix of ASP.NET WebForms and MVC in C#, with the business logic in another project that also contains the DataContext):
I recently had a similar issue, and although my situation was more complex (Database First, so it required a custom TT file), the solution is mostly the same.
I created an interface:
public interface ITrackableEntity
{
DateTime CreatedDateTime { get; set; }
int CreatedUserID { get; set; }
DateTime ModifiedDateTime { get; set; }
int ModifiedUserID { get; set; }
}
Then I just implemented that interface on any entities that needed it (because my solution was Database First, I updated the TT file to check whether the table had those four columns, and if so added the interface to the generated output).
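For a Code First model the equivalent is simply implementing the interface on the entity class yourself; a rough sketch (Invoice is a made-up entity name):

public class Invoice : ITrackableEntity
{
    public int Id { get; set; }
    public decimal Total { get; set; }

    // audit columns populated in the SaveChanges override shown further down
    public DateTime CreatedDateTime { get; set; }
    public int CreatedUserID { get; set; }
    public DateTime ModifiedDateTime { get; set; }
    public int ModifiedUserID { get; set; }
}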
UPDATE: here are my changes to the TT file, where I updated the EntityClassOpening() method:
public string EntityClassOpening(EntityType entity)
{
    var trackableEntityPropNames = new string[] { "CreatedUserID", "CreatedDateTime", "ModifiedUserID", "ModifiedDateTime" };
    var propNames = entity.Properties.Select(p => p.Name);
    var isTrackable = trackableEntityPropNames.All(s => propNames.Contains(s));

    var inherits = new List<string>();
    if (!String.IsNullOrEmpty(_typeMapper.GetTypeName(entity.BaseType)))
    {
        inherits.Add(_typeMapper.GetTypeName(entity.BaseType));
    }
    if (isTrackable)
    {
        inherits.Add("ITrackableEntity");
    }

    return string.Format(
        CultureInfo.InvariantCulture,
        "{0} {1}partial class {2}{3}",
        Accessibility.ForType(entity),
        _code.SpaceAfter(_code.AbstractOption(entity)),
        _code.Escape(entity),
        _code.StringBefore(" : ", String.Join(", ", inherits)));
}
The only thing left was to add the following to my partial DataContext class:
public override int SaveChanges()
{
    // fix trackable entities
    var trackables = ChangeTracker.Entries<ITrackableEntity>();
    if (trackables != null)
    {
        // added
        foreach (var item in trackables.Where(t => t.State == EntityState.Added))
        {
            item.Entity.CreatedDateTime = System.DateTime.Now;
            item.Entity.CreatedUserID = _userID;
            item.Entity.ModifiedDateTime = System.DateTime.Now;
            item.Entity.ModifiedUserID = _userID;
        }
        // modified
        foreach (var item in trackables.Where(t => t.State == EntityState.Modified))
        {
            item.Entity.ModifiedDateTime = System.DateTime.Now;
            item.Entity.ModifiedUserID = _userID;
        }
    }
    return base.SaveChanges();
}
Note that I saved the current user ID in a private field on the DataContext class each time I created it.
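For completeness, capturing that user ID might look like this; MyDataContext and the constructor parameter are assumptions, so adapt it to however your app resolves the current user:

public partial class MyDataContext
{
    private readonly int _userID;

    // chain to the generated parameterless constructor so the model's
    // connection string is still used, then remember who is acting
    public MyDataContext(int currentUserID) : this()
    {
        _userID = currentUserID;
    }
}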
As for DateCreated, I would just add a default constraint on that column, set to SYSDATETIME(), which takes effect when inserting a new row into the table.
For DateModified, personally, I would probably use triggers on those tables.
In my opinion, the trigger approach:
- makes it easier: I don't have to remember to set DateModified every time I save an entity
- makes it "safer": it will also set DateModified if someone finds a way around my application and modifies the data in the database directly (e.g. using Access or Excel).
Entity Framework 6 has interceptors, which can be used to set created and modified dates. I wrote an article on how to do it: http://marisks.net/2016/02/27/entity-framework-soft-delete-and-automatic-created-modified-dates/
I agree with marc_s: it is much safer to have the trigger(s) in the database. In my company's databases, I require every table to have Date_Modified and Date_Created columns, and I even have a utility function to automatically create the necessary triggers.
When using with Entity Framework, I found I needed to use the [DatabaseGenerated] annotation with my POCO classes:
[Column(TypeName = "datetime2")]
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public DateTime? Date_Modified { get; set; }
[Column(TypeName = "datetime2")]
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public DateTime? Date_Created { get; set; }
I was attempting to use stored procedure mapping on an entity, and EF was creating @Date_Modified and @Date_Created parameters on my insert/update sprocs, giving the error:
Procedure or function has too many arguments specified.
Most of the examples show using [NotMapped], which will allow select/insert to work, but then those fields will not show up when that entity is loaded!
Alternatively, you can just make sure any sprocs contain the @Date_Modified and @Date_Created parameters, but that goes against the design of using triggers in the first place.