How do I get this column as similar to a PERSISTED COMPUTED column in the database?
My current attempt (it loads all CompCol rows as null when seeding):
public class Call
{
    public Call()
    {
    }

    [Key]
    public int Id { get; set; }

    [DatabaseGenerated(DatabaseGeneratedOption.Computed)]
    public string CompCol
    {
        get
        {
            return "ABC-" + Convert.ToString(Id).PadLeft(5, '0');
        }
        protected set { }
    }
}
The solution I found was to:
Make sure automatic migrations are turned off. This is so that VS generates a migration (fluent API code) for us to customise further instead of just running it. So, in the configuration class:
public Configuration()
{
    AutomaticMigrationsEnabled = false;
}
Add the field to the class and mark it as computed like so. The setter is private because we obviously cannot write to a computed field:
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public string BreakdownNo { get; private set; }
Then do an add-migration [xyz-name] in the Package Manager Console to generate the migration code, which will appear under the migrations folder with the given name.
Inside the migration, comment out the generated code in Up() and add custom SQL like so:
public override void Up()
{
    //AddColumn("dbo.Calls", "BreakdownNo", c => c.String());
    Sql("ALTER TABLE dbo.Calls ADD BreakdownNo AS ('BD'+RIGHT('00000'+ CAST(Id AS VARCHAR), 6))");
}
Do an update-database in the PM and it should add the computed column properly.
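It is also worth keeping Down() in sync so the migration can be rolled back cleanly; a minimal sketch, assuming the same table and column names as above:

public override void Down()
{
    // Remove the computed column added in Up() so the migration can be reverted.
    DropColumn("dbo.Calls", "BreakdownNo");
}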
FURTHER NOTES: If you get the formula wrong, you will have to revert the migration by doing an update-database -targetMigration: [name of migration to go back to], then do another add-migration, amend your formula there, and finish off with update-database. There may be a better way, but this is what I found and used.
I did not however find a way to make the field persisted yet.
Why not call SQL like this:
public class demo
{
    void demoMethod()
    {
        Model1 model = new Model1(); // Model1 : DbContext
        model.Database.ExecuteSqlCommand("alter table Results drop column Total; alter table Results add Total AS (Arabic + English + Math + Science)");
    }
}
I ran into some trouble using the method proposed in the accepted answer. I offer an alternative solution that worked for me.
I encountered a failure when running this query:
oDb.LogEntries.SingleOrDefault(Function(LogEntry) LogEntry.LogTime = dDate)
The error message:
The 'MinutesOffline' property on 'LogEntry' could not be set to a 'System.Int32' value. You must set this property to a non-null value of type 'System.Single'.
As we can see, EF 6.2 is trying to write a value to the property. Whether this is due to EF internally attempting to write to the Private Set, I don't know. It almost seems like it. But the end result is what matters: the statement fails.
Instead of setting the column to DatabaseGeneratedOption.Computed, I ended up ignoring it altogether: Builder.Entity(Of LogEntry).Ignore(Function(LogEntry) LogEntry.MinutesOffline).
This enabled me to create a read-only property:
Public ReadOnly Property MinutesOffline As Single
    Get
        Return IIf(Me.Scale < 1, 5, 0)
    End Get
End Property
It also has the added benefit that we don't have to comment out any lines in the generated migration.
We still have to make the custom Sql() call in Up():
ALTER TABLE [LogEntries] ADD [MinutesOffline] AS (IIF([Scale] < 1, 5, 0)) PERSISTED
...and the PERSISTED keyword does work here. This becomes a persisted computed column.
YMMV
--EDIT--
I found out why I was getting the casting error; it had nothing to do with migrations and everything to do with my code. I wasn't properly casting the computed column at creation time:
ALTER TABLE [LogEntries] ADD [MinutesOffline] AS (IIF([Scale] < 1, 5, 0)) PERSISTED
The proper syntax for doing so is this:
ALTER TABLE [LogEntries] ADD [MinutesOffline] AS (CAST((IIF([Scale] < 1, 5, 0)) AS REAL)) PERSISTED
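In the migration, that corrected statement goes into the same Sql() call in Up() as in the accepted answer; a minimal sketch, assuming the same table and column names:

public override void Up()
{
    // Add MinutesOffline as a persisted computed column, cast to REAL so it
    // matches the Single property on the entity.
    Sql("ALTER TABLE [LogEntries] ADD [MinutesOffline] AS (CAST((IIF([Scale] < 1, 5, 0)) AS REAL)) PERSISTED");
}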
Accordingly, I've reverted the Ignore() calls and switched everything back over to the method proposed in the accepted answer.
Hat tip to JotaBe for the assistance.
I'm new to using EF to handle data in SQL. In an MVC Core project we're testing EF (Microsoft.EntityFrameworkCore, version 2.2.3) to handle data.
When trying to update data and the update failed for some reason (missing fields, etc.), it seemed like EF actually deleted the record from the database (MSSQL 2014) instead of throwing an update error...
Is it possible?
Code for updating:
public void Update(Contact contact)
{
    _dbContext.Update(contact);
    _dbContext.SaveChanges();
}
When trying to update data and the update failed for some reason (missing fields, etc.), it seemed like EF actually deleted the record from the database (MSSQL 2014) instead of throwing an update error...
Is it possible?
It should not.
Test it; try to debug here:
_dbContext.Update(contact);
_dbContext.SaveChanges();
var updated = _dbContext.Contacts.FirstOrDefault(x => x.Id == contact.Id); //debug here
Check if it has a value. If there is still none, these are the scenarios I can think of that may have caused your problem:
Investigate the missing field, especially if it is not nullable.
Is the _dbContext used here using the same connection string as everywhere else?
Is the [Key] attribute present on your Contact entity?
public class Contact
{
    [Key]
    public int Id { get; set; }
}
Have you overridden the SaveChanges function?
Does the Contact you are passing contain a Key, and is it not 0?
Is a delete function called after Update?
Try using SQL Profiler to look at the generated SQL and check whether it really produced an UPDATE statement and whether it is pointing at the right [Key].
If it is still not working properly, you could do this:
public void Update(Contact contact)
{
    var selectedContactToBeUpdated = _dbContext.Contacts.FirstOrDefault(x => x.Id == contact.Id);
    if (selectedContactToBeUpdated != null)
    {
        // copy over only the properties that should change
        selectedContactToBeUpdated.PropertyToBeUpdated1 = contact.PropertyToBeUpdated1;
        selectedContactToBeUpdated.PropertyToBeUpdated2 = contact.PropertyToBeUpdated2;
        // additional properties
        _dbContext.SaveChanges();
    }
}
In the scenario above, it will only generate an UPDATE statement with the fields you have changed.
I am beginning to use Dapper and love it so far. However, as I venture further into complexity, I have run into a big issue with it. The fact that you can pass an entire custom object as a parameter is great. However, when I add another custom object as a property, it no longer works, as it tries to map the object as a SQL parameter. Is there any way to have it ignore custom objects that are properties of the main object being passed through? Example below:
public class CarMaker
{
    public string Name { get; set; }
    public Car Mycar { get; set; }
}
Property Name maps fine, but property Mycar fails because it is a custom object. I will have to restructure my entire project if Dapper can't handle this, which... well, blows, haha.
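One plain-Dapper workaround (a sketch under assumptions, not taken from the answers below) is to pass only the scalar values as an anonymous object instead of the whole CarMaker; the SQL text, table name, and connection variable here are illustrative:

// Dapper only sees the anonymous object's Name property;
// the nested Car object is never inspected.
var rowsAffected = connection.Execute(
    "INSERT INTO CarMakers (Name) VALUES (@Name)",
    new { carMaker.Name });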
Dapper extensions has a way to create custom maps, which allows you to ignore properties:
public class MyModelMapper : ClassMapper<MyModel>
{
    public MyModelMapper()
    {
        // use a custom schema
        Schema("not_dbo_schema");
        // have a custom primary key
        Map(x => x.ThePrimaryKey).Key(KeyType.Assigned);
        // use a different property name from the database column
        Map(x => x.Foo).Column("Bar");
        // ignore this property entirely
        Map(x => x.SecretDataMan).Ignore();
        // optional, map all other columns
        AutoMap();
    }
}
Here is a link
There is a much simpler solution to this problem.
If the property Mycar is not in the database, and it probably is not, then simply remove the { get; set; } and the "property" becomes a field, which is automatically ignored by DapperExtensions (see the sketch below). If you are actually storing this information in a database and it is a multi-valued property that is not serialized into JSON or a similar format, I think you are probably asking for complexity that you don't want. There is no SQL equivalent of the object "Car", and the properties in your model must map to something that SQL recognizes.
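A minimal sketch of that change, reusing the CarMaker class from the question:

public class CarMaker
{
    public string Name { get; set; } // auto-property: mapped as a column

    // No { get; set; } here -- Mycar is now a field, so DapperExtensions ignores it.
    public Car Mycar;
}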
UPDATE:
If "Car" is part of a table in your database, then you can read it into the CarMaker object using Dapper's QueryMultiple.
I use it in this fashion:
dynamic reader = dbConnection.QueryMultiple("Request_s", param: new { id = id }, commandType: CommandType.StoredProcedure);
if (reader != null)
{
    result = reader.Read<Models.Request>()[0] as Models.Request;
    result.reviews = reader.Read<Models.Review>() as IEnumerable<Models.Review>;
}
The Request Class has a field as such:
public IEnumerable<Models.Review> reviews;
The stored procedure looks like this:
ALTER PROCEDURE [dbo].[Request_s]
(
    @id int = null
)
AS
BEGIN
    SELECT *
    FROM [biospecimen].requests as bn
    where bn.id = coalesce(@id, bn.id)
    order by bn.id desc;

    if @id is not null
    begin
        SELECT *
        FROM [biospecimen].reviews as bn
        where bn.request_id = @id;
    end
END
In the first read, Dapper ignores the field reviews, and in the second read, Dapper loads the information into the field. If a null set is returned, Dapper will load the field with a null set just like it will load the parent class with null contents.
The second select statement then reads the collection needed to complete the object, and Dapper stores the output as shown.
I have been implementing this in my Repository classes in situations where a target parent class has several child classes that are being displayed at the same time.
This prevents multiple trips to the database.
You can also use this approach when the target class is a child class and you need information about the parent class it is related to.
After reading the question in the attached link, I got a sense of how to set DateCreated and DateModified columns in Entity Framework and use them in my application. In the old SQL days, though, triggers were the more popular way because they are more secure from a DBA's point of view.
So, any advice on which way is best practice? Should it be set in Entity Framework for the purpose of application integrity? Or should I use a trigger, as that makes more sense from a data security point of view? Or is there a way to compose a trigger in Entity Framework? Thanks.
EF CodeFirst: Rails-style created and modified columns
BTW, even though it doesn't matter much, I am building this app using ASP.NET MVC C#.
Opinion: Triggers are like hidden behaviour; unless you go looking for them, you usually won't realise they are there. I also like to keep the DB as 'dumb' as possible when using EF, since I'm using EF so my team won't need to maintain SQL code.
For my solution (mix of ASP.NET WebForms and MVC in C# with Business Logic in another project that also contains the DataContext):
I recently had a similar issue, and although for my situation it was more complex (DatabaseFirst, so required a custom TT file), the solution is mostly the same.
I created an interface:
public interface ITrackableEntity
{
    DateTime CreatedDateTime { get; set; }
    int CreatedUserID { get; set; }
    DateTime ModifiedDateTime { get; set; }
    int ModifiedUserID { get; set; }
}
Then I just implemented that interface on any entities I needed to (because my solution was DatabaseFirst, I updated the TT file to check if the table had those four columns, and if so added the interface to the output).
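For an entity where you add the interface by hand (rather than via the TT file), it is just a matter of declaring it; a minimal sketch with a hypothetical entity name:

public class Order : ITrackableEntity
{
    public int Id { get; set; }

    // the four tracking columns required by ITrackableEntity
    public DateTime CreatedDateTime { get; set; }
    public int CreatedUserID { get; set; }
    public DateTime ModifiedDateTime { get; set; }
    public int ModifiedUserID { get; set; }
}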
UPDATE: here's my changes to the TT file, where I updated the EntityClassOpening() method:
public string EntityClassOpening(EntityType entity)
{
    var trackableEntityPropNames = new string[] { "CreatedUserID", "CreatedDateTime", "ModifiedUserID", "ModifiedDateTime" };
    var propNames = entity.Properties.Select(p => p.Name);
    var isTrackable = trackableEntityPropNames.All(s => propNames.Contains(s));

    var inherits = new List<string>();
    if (!String.IsNullOrEmpty(_typeMapper.GetTypeName(entity.BaseType)))
    {
        inherits.Add(_typeMapper.GetTypeName(entity.BaseType));
    }
    if (isTrackable)
    {
        inherits.Add("ITrackableEntity");
    }

    return string.Format(
        CultureInfo.InvariantCulture,
        "{0} {1}partial class {2}{3}",
        Accessibility.ForType(entity),
        _code.SpaceAfter(_code.AbstractOption(entity)),
        _code.Escape(entity),
        _code.StringBefore(" : ", String.Join(", ", inherits)));
}
The only thing left was to add the following to my partial DataContext class:
public override int SaveChanges()
{
    // fix trackable entities
    var trackables = ChangeTracker.Entries<ITrackableEntity>();
    if (trackables != null)
    {
        // added
        foreach (var item in trackables.Where(t => t.State == EntityState.Added))
        {
            item.Entity.CreatedDateTime = System.DateTime.Now;
            item.Entity.CreatedUserID = _userID;
            item.Entity.ModifiedDateTime = System.DateTime.Now;
            item.Entity.ModifiedUserID = _userID;
        }
        // modified
        foreach (var item in trackables.Where(t => t.State == EntityState.Modified))
        {
            item.Entity.ModifiedDateTime = System.DateTime.Now;
            item.Entity.ModifiedUserID = _userID;
        }
    }
    return base.SaveChanges();
}
Note that I saved the current user ID in a private field on the DataContext class each time I created it.
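A minimal sketch of what capturing that user ID might look like; the context name and constructor are assumptions, not code from the original answer:

public partial class MyDataContext
{
    // captured once when the context is created, then used in SaveChanges()
    private readonly int _userID;

    public MyDataContext(int userID)
        : this() // chain to the generated default constructor
    {
        _userID = userID;
    }
}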
As for DateCreated, I would just add a default constraint on that column set to SYSDATETIME() that takes effect when inserting a new row into the table.
For DateModified, personally, I would probably use triggers on those tables.
In my opinion, the trigger approach:
makes it easier; I don't have to worry about remembering to set DateModified every time I save an entity
makes it "safer" in that it will also apply the DateModified if someone finds a way around my application to modify data in the database directly (using e.g. Access or Excel or something).
Entity Framework 6 has interceptors which can be used to set created and modified dates. I wrote an article about how to do it: http://marisks.net/2016/02/27/entity-framework-soft-delete-and-automatic-created-modified-dates/
I agree with marc_s - much safer to have the trigger(s) in the database. In my company's databases, I require each table to have Date_Modified and Date_Created fields, and I even have a utility function to automatically create the necessary triggers.
When using this with Entity Framework, I found I needed to use the [DatabaseGenerated] annotation on my POCO classes:
[Column(TypeName = "datetime2")]
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public DateTime? Date_Modified { get; set; }
[Column(TypeName = "datetime2")]
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public DateTime? Date_Created { get; set; }
I was attempting to use stored procedure mapping on an entity, and EF was creating @Date_Modified and @Date_Created parameters on my insert/update sprocs, which produced the error:
Procedure or function has too many arguments specified.
Most of the examples show using [NotMapped], which will allow select/insert to work, but then those fields will not show up when that entity is loaded!
Alternatively, you can just make sure any sprocs contain the @Date_Modified and @Date_Created parameters, but this goes against the design of using triggers in the first place.
I've got a one-to-many relationship set up (e.g. a Person with many Phone Numbers). In my get query I have this.ObjectContext.Person.Include("PhoneNumbers"), and in the generated metadata I include public EntityCollection<PhoneNumbers> PhoneNumbers { get; set; }. I have also set up a DTO with this and the other properties I need:
[Include]
[Association("Name","thisKey","otherKey")]
public IEnumerable<PhoneNumbers> PNums { get; set; }
I can retrieve all the data alright, and display it in silverlight, but when I create a new one I run into problems. I've got this kind of thing going on:
private void Button_Click(object sender, System.Windows.RoutedEventArgs e)
{
    if (dgMMs.SelectedItem != null)
    {
        PhoneNumbers wb = new PhoneNumbers();
        wb.this = tbThis.Text;
        wb.that = tbThat.Text;
        wb.other = tbOther.Text;
        wb.whatnot = tbwhatnot.Text;
        ((Person)dgMMs.SelectedItem).PNums.Add(wb);
    }
}
Then I get this error when calling TDataSource.SubmitChanges();:
Message = "Submit operation failed
validation. Please inspect
Entity.ValidationErrors for each
entity in EntitiesInError for more
information."
Alright, so I did that, and sure enough there is an error, but I don't quite understand why. I have a non-nullable last_modified_by field in the database which I didn't set when I created the entity and added it to the EntityCollection, and I guess this is causing it. But my question is why RIA doesn't call the Insert method in the service I've created, because I want to set that field there, like so:
public void InsertPhoneNumber(PhoneNumbers pnum)
{
    pnum.last_modified = DateTime.Today;
    pnum.last_modified_by = Thread.CurrentPrincipal.Identity.Name;
    if ((pnum.EntityState != EntityState.Detached))
    {
        this.ObjectContext.ObjectStateManager.ChangeObjectState(pnum, EntityState.Added);
    }
    else
    {
        this.ObjectContext.PhoneNumbers.AddObject(pnum);
    }
}
But it's like RIA adds my object and calls its own Insert method. So I rolled with it at first and just set the property in the UI, but then it would give me this error:
Message = "Submit operation failed. An
error occurred while updating the
entries. See the inner exception for
details. Inner exception message:
Cannot insert explicit value for
identity column in table
'iset_trkr_writeback' when
IDENTITY_INSERT is set to OFF."
I never set the identity field to anything; I thought RIA would do this for me. But when I debug and take a look, it has 0 for the value. At least this time it calls my insert method in my service... Maybe I'm missing something big in my process, but I really could use some help. Thanks :)
Are you using Entity Framework? If so, you need a [Key] attribute on at least one field in your metadata, or create an identity/PK column (int/guid) and then update the metadata.
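A minimal sketch of what marking the key in the metadata might look like for the PhoneNumbers entity from the question (the metadata class and Id property are assumptions based on the usual RIA Services buddy-class pattern):

[MetadataTypeAttribute(typeof(PhoneNumbers.PhoneNumbersMetadata))]
public partial class PhoneNumbers
{
    internal sealed class PhoneNumbersMetadata
    {
        private PhoneNumbersMetadata() { }

        // mark the primary key so RIA Services can track the entity
        [Key]
        public int Id { get; set; }
    }
}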
Does anyone know a way I could set, through mapping, the default value of a column, so that, for example, when I generate the DB from mappings I get a DateTime column with getdate() as its default value?
So far I have tried this (it looks exactly like what I need), but it doesn't work:
this.Map(x => x.LastPersistedOn, "DateModified")
    .Access.Property()
    .Default("getdate()");
I just tried setting some default values and it worked as expected. I am using Fluent NHibernate as retrieved from Git on 24.05.2010, so updating your copy may solve your problem.
Mapping
public class SampleEntity
{
    public virtual DateTime DateTimeProperty { get; set; }
}
With
public class SampleEntityMap : ClassMap<SampleEntity>
{
    public SampleEntityMap()
    {
        Map(x => x.DateTimeProperty, "DateTimeColumn")
            .Access.Property() // actually not necessary
            .Not.Nullable()
            .Default("getDate()");
    }
}
This will produce the following SQL (from the output to the console):
create table SampleEntity(
    DateTimeColumn DATETIME default getDate() not null
)
--
Dom
The way to do this is to assign the current DateTime in code rather than using a default value in the database, then treat it as a normal column. It seemed a bit strange to me at first, coming from a model-driven design background, but managing default values at the POCO level is the DDD way to do it.
It would be good to hear others' opinions too.
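A minimal sketch of assigning the value in code, reusing the SampleEntity from the earlier answer (the constructor-based approach is an illustration, not code from the original post):

public class SampleEntity
{
    public SampleEntity()
    {
        // set the value when the object is created, instead of relying on a
        // getdate() default in the database
        DateTimeProperty = DateTime.Now;
    }

    public virtual DateTime DateTimeProperty { get; set; }
}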