Feel free to tell me that this question needs to be moved and I will move it. I just don't know where else to go for help.
My current workflow is:
Create the database first (database Actual)
Run the scaffold command, which creates my models
Create a Visual Studio Database project
Import the database (database project)
Whenever I need to make a change to the database, I follow the steps below:
Change the database project
Run a Schema Compare
Verify and update the database Actual
Rerun the scaffold command with -Force to rebuild all the models.
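For reference, the scaffold step might look like this in the Package Manager Console (the connection string, database name, and output folder below are illustrative):

Scaffold-DbContext "Server=.;Database=Actual;Trusted_Connection=True;" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models -Force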
What problems (if any) am I leaving myself open to down the road?
I am not seeing the value of database migrations, since I am updating the database first and using the database project to provide source control and some protection.
I always used to use the graphical database tool, but obviously with Core that is no longer an option.
I have also considered Devart's Entity Developer as an ORM.
Your thoughts and feedback are VERY much appreciated.
So the biggest problem is what happens when I need to make changes to the model. Take something simple like:
using System;
using System.ComponentModel.DataAnnotations;

public partial class UserInfo
{
    public int Id { get; set; }
    [Required]
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string UserName { get; set; }
    public string Password { get; set; }
    public DateTime RecordCreated { get; set; }
}
My '[Required]' will obviously be gone after a -Force.
Joe
That is the correct "database first" workflow for EF Core, and you would not use migrations in that scenario. Be sure to place customizations to your entities or DbContext in separate partial class files so they don't get clobbered when you regenerate the entities.
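For example, here is a minimal sketch assuming EF Core 3 or later, where the scaffolded DbContext declares an OnModelCreatingPartial hook (the context name MyAppContext is hypothetical); this lives in a separate file that -Force will not overwrite:

using Microsoft.EntityFrameworkCore;

public partial class MyAppContext
{
    partial void OnModelCreatingPartial(ModelBuilder modelBuilder)
    {
        // Fluent API equivalent of [Required] on UserInfo.FirstName
        modelBuilder.Entity<UserInfo>()
            .Property(u => u.FirstName)
            .IsRequired();
    }
}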
I always used to use the graphical database tool, but obviously with Core that is no longer an option.
With this workflow you can use any graphical design tool you want for your database schema.
I dropped the CATALOG_FILENAME column from the database and I see this error:
Schema specified is not valid. Errors:
The relationship 'J_DBModel.FK__CATALOG_T__CATEG__41B8C09B' was not loaded because the type 'J_DBModel.CATALOG_TBL' is not available.
The following information may be useful in resolving the previous error:
The required property 'CATALOG_FILENAME' does not exist on the type 'Javad_New.Models.CATALOG_TBL'.
In my model, CATALOG_FILENAME still exists:
using System;

public partial class CATALOG_TBL
{
    public long CATALOG_ID { get; set; }
    public Nullable<long> CATEGORY_FK { get; set; }
    public string CATALOG_TITLE { get; set; }
    public string CATALOG_DESC { get; set; }
    public string CATALOG_CATEGORY { get; set; }
    public Nullable<int> CATALOG_PAGENO { get; set; }
    public Nullable<bool> CATALOG_RTL_FLAG { get; set; }
    public string CATALOG_FILENAME { get; set; }
    public Nullable<System.DateTime> CATALOG_DATE { get; set; }
    public virtual CATEGORY_TBL CATEGORY_TBL { get; set; }
}
In the code-first pattern, never delete or change things directly in SQL; you should manage schema changes through migrations.
In your case, add the column back to the table in SQL, then delete the property from the C# class and add a new migration.
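As a hedged sketch (assuming EF6 code-first and the Package Manager Console; the migration name is illustrative), removing the property and running Add-Migration RemoveCatalogFilename would generate something like:

using System.Data.Entity.Migrations;

public partial class RemoveCatalogFilename : DbMigration
{
    public override void Up()
    {
        // Drops the column the class no longer declares
        DropColumn("dbo.CATALOG_TBL", "CATALOG_FILENAME");
    }

    public override void Down()
    {
        AddColumn("dbo.CATALOG_TBL", "CATALOG_FILENAME", c => c.String());
    }
}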
Make changes to the database
The first step is to make the necessary changes to your database. In this example, the only change being made is to a column name. Simple enough. It is important to note that if you have any stored procedures, etc., that rely on that column name, those should be updated as well.
Open the .edmx file
The model exists in your project as an .edmx file. Open the Solution Explorer (ctrl + alt + L) and type .edmx into the search bar at the top. Your model should appear (it’s the only one with that extension). Open it up and you will see a diagram of connected tables.
Delete the old table from the model
Before updating the model, it is necessary to delete the existing version of the table(s) you changed from the model. In the .edmx, right-click in the area between tables and select Model Browser. It will open a file tree of all the tables contained in your model. Delete every instance of the table you changed in your database. I find it helpful to enter the name of the modified table in the search bar at the top of the model browser, which will show you all tables in the model with that name. Delete them! Note that attempting to Update Model from Database without deleting the old tables first will not truly update the model, and your app may not compile correctly.
Update Model from Database
Now that your old table is deleted, right-click in the .edmx and select Update Model From Database, which opens the update wizard.
If I had a simple model:

using Newtonsoft.Json;

public class Company
{
    [JsonProperty("id")]
    public int Id { get; set; }

    [JsonProperty("name")]
    public string Name { get; set; }
}
How would I convert this model into a SQL Server table with Entity Framework code-first?
A complete answer to your question would end up being a full tutorial.
That said, I suggest that you visit the following website and get yourself familiar with Entity Framework Code First concepts:
http://www.entityframeworktutorial.net/code-first/entity-framework-code-first.aspx
The tutorial mentioned above has both a conceptual overview and code samples.
Once you are done with tutorial, you will understand the following code snippet:
using System.Data.Entity; // Microsoft.EntityFrameworkCore in EF Core

public class ApplicationContext : DbContext
{
    public DbSet<Company> Companies { get; set; }
}
Then you need to create an appropriate migration and apply it to your database, which will end up creating a "Companies" or "Company" table (depending on the EF version you are using).
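As a rough sketch (assuming EF6 and the Package Manager Console: Enable-Migrations, Add-Migration InitialCreate, Update-Database), the generated migration would look something like:

using System.Data.Entity.Migrations;

public partial class InitialCreate : DbMigration
{
    public override void Up()
    {
        CreateTable(
            "dbo.Companies",
            c => new
            {
                Id = c.Int(nullable: false, identity: true),
                Name = c.String(),
            })
            .PrimaryKey(t => t.Id);
    }

    public override void Down()
    {
        DropTable("dbo.Companies");
    }
}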
I need to design a domain that has two simple entities:
public class User
{
    public virtual int Id { get; protected set; }
    public virtual string Email { get; protected set; }
    public virtual Country Country { get; protected set; }
    ...
}

public class Country
{
    public virtual int Id { get; protected set; }
    public virtual string Name { get; protected set; }
    ...
}
It's all nice and clear in the domain world, but the problem is that User and Country are persisted in two different databases on two different servers (though they are both MS SQL 2005 servers).
So, how should I correctly implement persistence of entities across different SQL servers in NHibernate?
Using IDs instead of object references? Yeah, that's simple, but it hits the whole domain model hard, making the domain object more like a DTO. It would also require that IUserRepository get its hands on ICountryRepository to load a User entity.
Linked servers? Hmm... somehow I don't like them (distributed transactions and no XML columns). What should I be aware of when using them, and more importantly, how should I configure NHibernate to work effectively with linked servers?
Maybe some other solution?
I've heard of people using the schema property in a class mapping to contain the linked server name (like otherserver.dbo), but I don't know anyone who hasn't run into one problem or another when doing that.
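For illustration, a hedged sketch of that trick with Fluent NHibernate (the linked server name otherserver is hypothetical):

using FluentNHibernate.Mapping;

public class CountryMap : ClassMap<Country>
{
    public CountryMap()
    {
        // Abusing the schema setting to point at the linked server
        Schema("otherserver.dbo");
        Table("Country");
        Id(x => x.Id);
        Map(x => x.Name);
    }
}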
There are a few DDD bootstrapping frameworks that allow you to transparently map entities to different databases (resulting in multiple ISessionFactories, which they will manage for you). NCommon is one I would recommend. This assumes, however, that Country only exists in one database, and User only exists in another.
As for transactions... well, if you use a TransactionScope and configure the DTC, that might work. NCommon uses a UnitOfWork API that also wraps TransactionScope.
You would have to change User so that Country is just an ID. Here's why. You'd end up with two session factories, one that has a mapping for Country and the other that has a mapping for User. If you don't make that change, NHibernate will complain that there is no mapping for Country when you save User (since they are stored in two different DBs).
Now, you could instruct NHibernate to ignore the Country property and keep it so your domain doesn't change. However, the next time you load User from the database, Country will be null.
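A minimal sketch of that ID-only shape (the CountryId property name is illustrative):

public class User
{
    public virtual int Id { get; protected set; }
    public virtual string Email { get; protected set; }
    // Plain FK value; the Country entity itself lives in the other database
    public virtual int CountryId { get; protected set; }
}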
You could use NHibernate.Shards from NHContrib.
I'm looking at different options for storing log entries for easier querying/reporting.
Currently I write scripts that parse and find the data, but the data is becoming more and more in demand, so it's becoming worth it to put the log data in a database.
Log entries are composed of key-value pairs, such as {"timestamp":"2012-04-24 12:34:56.789", "Msg":"OK"} (simplified example).
I'm sure that eventually the log format will be extended to, say, {"timestamp":"2012-04-24 12:34:56.789", "Msg":"OK", "Hostname":"Bubba"}, which means that the "schema" or "document definition" will need to change. Also, we're a Windows + .NET shop.
Hence, I was primarily looking for some NoSQL engine and found RavenDB attractive to use from .NET.
However, I have a hard time finding information about how it, and other NoSQL databases, work with heterogeneous records.
What would be a good fit in your opinion?
With RavenDB you can just store the different types of docs and it will be able to handle the "changes" in schema. Because it is in fact "schema-free", you can write indexes that will only index the fields that are there. See this blog post for some extra info. It's talking about migrations, but the same applies here.
Also, the dynamic fields option will help you here. So, given a doc with arbitrary properties:
using System.Collections.Generic;

public class LogEntry
{
    public string Id { get; set; }
    public List<Attribute> Attributes { get; set; }
}

public class Attribute
{
    public string Name { get; set; }
    public string Value { get; set; }
}
You can write queries like this:
var logs = session.Advanced.LuceneQuery<LogEntry>("LogEntry/ByAttribute")
    .WhereEquals("Msg", "OK")
    .ToList();
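For completeness, a hedged sketch of the index that query assumes ("LogEntry/ByAttribute"), built with RavenDB's dynamic fields via CreateField:

using System.Linq;
using Raven.Client.Indexes;

public class LogEntry_ByAttribute : AbstractIndexCreationTask<LogEntry>
{
    public LogEntry_ByAttribute()
    {
        // Each attribute becomes its own index field, so "Msg" is queryable by name
        Map = logs => from log in logs
                      select new
                      {
                          _ = log.Attributes.Select(a => CreateField(a.Name, a.Value))
                      };
    }
}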
I am working on a Silverlight application, and I am using WCF RIA Services and NHibernate.
Currently, I have an entity with a one-to-many relationship to another entity.
public class Employer
{
    [Key]
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
}

public class Person
{
    [Key]
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }

    [Include]
    [Association("PersonCurrentEmployer", "CurrentEmployerId", "Id", IsForeignKey = true)]
    public virtual Employer CurrentEmployer { get; set; }

    public virtual int? CurrentEmployerId { get; set; }
}
The property CurrentEmployerId is set for no insert and no update in the mappings.
On the Silverlight side, I set the CurrentEmployer property of the person to an existing employer and submit the changes.
personEntity.CurrentEmployer = megaEmployer;
dataContext.SubmitChanges();
On the server side, the person entity's CurrentEmployerId is set to megaEmployer.Id but the CurrentEmployer is null. Because I am using the CurrentEmployer property and not the CurrentEmployerId to save the relationship, the relationship isn't changed.
Is there a way to force RIA to send the CurrentEmployer object with the save or do I have to use the CurrentEmployerId on the server side to load the employer and set it to the CurrentEmployer?
The reason you're not seeing your CurrentEmployer on the client side is that you don't have your association set up correctly.
RIA Services doesn't work with references in the usual way, so referencing your Employer on the client side doesn't work. RIA Services works with entity sets and creates the "references" based on the association attributes. Your Employer needs a property with an association back to the Person, as follows:
public class Employer
{
    private Person person;

    [Key]
    public virtual int Id { get; set; }
    public virtual string Name { get; set; }
    public virtual int PersonID { get; set; }

    [Include]
    [Association("PersonCurrentEmployer", "PersonID", "Id", IsForeignKey = false)]
    public virtual Person Person
    {
        get
        {
            return this.person;
        }
        set
        {
            this.person = value;
            if (value != null)
            {
                this.PersonID = value.Id;
            }
        }
    }
}
Is there a way to force RIA to send the CurrentEmployer object with the save or do I have to use the CurrentEmployerId on the server side to load the employer and set it to the CurrentEmployer?
I'm running into this problem as well. Basically, you either have to use the [Composition] attribute (which I wouldn't recommend), or load the entity from the database, server-side. Composition muddies up the client data model and doesn't take care of all the cases you need to worry about. (There is a lot more on Composition in the RIA forums at forums.silverlight.net.)
[UPDATE] Once you implement a second-level cache, the worry of reading supporting entities from the database mostly goes away, as they will be loaded from cache. Also, if you only need a proxy for NHibernate to not complain, then look into ISession.Load (as opposed to Get), which will return an NH proxy without immediately selecting from the database. (If you try to access another property of the proxy, NH will select the rest; you can find more on this on Ayende's blog.) [/UPDATE]
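For example, a minimal sketch (session is an NHibernate ISession):

// Load returns an uninitialized proxy; no select is issued until a
// property other than the identifier is accessed.
var employer = session.Load<Employer>(person.CurrentEmployerId);
person.CurrentEmployer = employer;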
The biggest problem I'm having is getting NHib to actually save and load the relationship. (I'm also using Fluent.) The response from the responsible parties has so far been "waah, you can't do that; it looks like RIA wasn't developed with NHib in mind" .. which is a crap answer, IMHO. Instead of helping me figure out how to map it, they're telling me I'm doing it wrong for having a foreign key in my entity (NHib shouldn't care that I have my FK in my entity)...
I want to share what I did to make this work, because 'official' support for this scenario was ... let's just say unhelpful at best, and downright rude at worst.
Incidentally, you had the same idea I had: making the foreign key not insert/update. BUT, I've also made it Generated.Always(), so it will always read the value back.
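For example, a hedged sketch of that mapping with Fluent NHibernate (names mirror the question; the column name CurrentEmployerId is assumed):

using FluentNHibernate.Mapping;

public class PersonMap : ClassMap<Person>
{
    public PersonMap()
    {
        Id(x => x.Id);
        Map(x => x.Name);
        References(x => x.CurrentEmployer, "CurrentEmployerId");
        // Read-only FK: never written by NHibernate, always read back
        Map(x => x.CurrentEmployerId, "CurrentEmployerId")
            .Not.Insert()
            .Not.Update()
            .Generated.Always();
    }
}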
Additionally, I override DomainService.Submit() and DomainService.ExecuteChangeSet(). I start an NHibernate transaction in Submit (though I'm not yet sure it does what I expect).
Instead of putting my save logic in the InsertSomeEntity() or UpdateSomeEntity() methods, I'm doing it all inside ExecuteChangeSet. This is because of NHibernate and its need to have the entity graph fully bi-directional and hydrated out before performing any actions, which includes loading entities from the database or session when a child item comes across the wire from RIA Services. (I started down the path of writing methods to fetch the various other pieces of the graph as the specialized methods needed them, but I found it easier to do it all in a single method. Moreover, I was running into the problem of RIA wanting me to perform the inserts/updates against the child objects first, which for new items is a problem.)
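A hedged sketch of those overrides (assuming WCF RIA Services' DomainService; the service name and the way the ISession is obtained are illustrative):

using System.ServiceModel.DomainServices.Server;
using NHibernate;

public class PersonDomainService : DomainService
{
    private readonly ISession session; // hypothetical: injected or created in Initialize()

    public PersonDomainService(ISession session)
    {
        this.session = session;
    }

    public override bool Submit(ChangeSet changeSet)
    {
        // Wrap the whole changeset in a single NHibernate transaction
        using (var tx = session.BeginTransaction())
        {
            var result = base.Submit(changeSet);
            if (result)
                tx.Commit();
            else
                tx.Rollback();
            return result;
        }
    }

    protected override bool ExecuteChangeSet()
    {
        // Hydrate the graph (load existing entities, wire up both directions)
        // before the base implementation dispatches to the CUD methods.
        return base.ExecuteChangeSet();
    }
}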
I want to make a comment about the Composition attribute. I still stand by my previous comment about not recommending it for standard child entity collections; HOWEVER, it works GREAT for supporting NHibernate components, because otherwise RIA will never send back the parent instance (of the composition), which is required for NHibernate to work right.
I didn't provide any code here because I would have to do some heavy redacting, but it's not a problem to share if you would like to see it.