Proper change-logging impossible with Entity Framework?

I'd like to log all changes made to an SQL Azure database using Entity Framework 4.
However, I have failed to find a proper solution so far.
So far I can track the entities themselves by overriding SaveChanges() and using the ObjectStateManager to retrieve all added, modified and deleted entities. This works fine. Unfortunately I don't seem to be able to retrieve any useful information out of RelationshipEntries. Our database model has several many-to-many relationships, where I want to log new / modified / deleted entries too.
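For illustration, here is a minimal sketch of the SaveChanges() override described above, assuming an EF4 ObjectContext generated from the model (the context name and the logging helpers are placeholders); relationship changes, including many-to-many link rows, surface as ObjectStateEntry instances with IsRelationship set:
using System.Data;
using System.Data.Objects;

// The other part of this partial class is the EF4-generated context ("MyEntities" is a placeholder name).
public partial class MyEntities
{
    public override int SaveChanges(SaveOptions options)
    {
        var entries = ObjectStateManager.GetObjectStateEntries(
            EntityState.Added | EntityState.Modified | EntityState.Deleted);

        foreach (ObjectStateEntry entry in entries)
        {
            if (entry.IsRelationship)
            {
                // Many-to-many link rows end up here: entry.EntitySet.Name is the association set;
                // for added links CurrentValues holds the pair of EntityKeys, for deleted links use OriginalValues.
                LogRelationshipChange(entry);
            }
            else
            {
                // Ordinary entities: log the entity key, state and changed properties,
                // together with the authenticated user's name.
                LogEntityChange(entry);
            }
        }
        return base.SaveChanges(options);
    }

    // Placeholders for whatever writes the audit records (e.g. to Azure Table Storage).
    private void LogRelationshipChange(ObjectStateEntry entry) { /* ... */ }
    private void LogEntityChange(ObjectStateEntry entry) { /* ... */ }
}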
I want to store all changes in Azure Storage, to be able to follow changes made by a user and perhaps roll back to a previous version of an entity.
Is there any good way to accomplish this?
Edit:
Our scenario is that we're hosting a RESTful web service that contains all business logic and stores the data in the Azure SQL database. A client must be authenticated as a user with the web service, and I need to record which user changed the data.

See FrameLog, an Entity Framework logging library that I wrote for this purpose. It is open-source, including for commercial use.
Even if you don't want to use the library, you can look at the code to see one way of handling relationship logging. It handles all relationship multiplicities.
In particular, see the code for the private methods logRelationshipChange and logForeignKeyChange in the ChangeLogger class.

You can do it with a tracing provider.

You may want to consider just using a database trigger for this. Whenever a value in a table is changed, copy the row to another Archive table. It has worked pretty well for me.
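For what it's worth, a rough sketch of such a trigger (the table and column names are made up, and the T-SQL is only one possible shape); here it is deployed from C# via ExecuteStoreCommand, but it could just as well be run once as a plain script:
using System.Data.Objects;

public static class AuditTriggerInstaller
{
    // Hypothetical table/column names; 'deleted' holds the pre-change rows for both UPDATE and DELETE.
    private const string CreateTriggerSql = @"
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER UPDATE, DELETE
AS
BEGIN
    INSERT INTO dbo.Orders_Archive (OrderId, CustomerId, Total, ArchivedAtUtc)
    SELECT OrderId, CustomerId, Total, SYSUTCDATETIME()
    FROM deleted;
END";

    public static void Install(ObjectContext context)
    {
        // Runs the DDL against the database the context is connected to.
        context.ExecuteStoreCommand(CreateTriggerSql);
    }
}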

Related

Managing multiple datasources in CakePHP

I'm planning to develop a web application in CakePHP that shows information in graphics and cards. I chose CakePHP because the information we need to show is very structured, so the model approach makes it easier to manage the data; I also have some experience with MVC from ASP.NET, and I like how simple the routing is to use.
So, my problem is that each of the organizations that could use the app would have its own database with a different schema than the one we need. I can't just set their connection string in the app.php file, because their database won't match my model.
And the organization's datasource couldn't fit my model for a lot of reasons: the tables don't have the same names, the schema is different, the fields of my entities are spread across separate tables, and they may have the information in different databases or even in a different DBMS!
I want to know if there's a way to build an interface that achieves this, in such a way that the CakePHP Model/Entity can use the data regardless of the source. Do you have any suggestions on how to do that? Does CakePHP have an option that makes this possible? Should I use PHP with some kind of markup language like JSON or XML? Maybe MySQL has a utility to transform data from different sources into a view, and I can make CakePHP use the view instead of the table?
If you have an answer, please be as detailed as you can.
These other options are possible if the interface can't be built:
- Use another framework that handles this more easily and has the features I mentioned above.
- Make the organizations change their databases to match my model (I don't like this one, and they probably won't do it).
- Transfer the data into the application's own database.
Additional information:
The data shown in the graphics comes from university students. Each university has its own database with its own structure and its own applications using that database, which is why changing the structure isn't easy. I just want to make it as easy as possible for any school to configure its own database.
EDIT:
The version is CakePHP 3.2.
An important note is that it doesn't need all CRUD operations, only "reading". I hope that makes the solution easier.
I don't think your "question" can be answered properly; it doesn't contain enough information or detail. I guess there is something that will stay the same for all organizations, while their data and business logic will differ. But I'll try.
And the organization's datasource couldn't fit my model for a lot of reasons: the tables don't have the same names, the schema is different, the fields of my entities are spread across separate tables, and they may have the information in different databases or even in a different DBMS!
The model is a whole layer, so if you have completely different table schemas, your business logic, which is part of that layer, will be different as well. Simply changing the database connection alone won't help you then. The data needs to be shown in the views as well, so the views will have to differ too.
So what you could try, and what your second image shows, is to implement a layer that contains interfaces and base classes. Then create a Cake plugin for each organization that uses these interfaces and base classes, and write some code that conditionally uses the right plugin depending on whatever criterion is checked (my guess: the domain or sub-domain). You will have to define the intermediate interfaces in such a way that you can access any organization the same way at the API level.
And one technical thing: you can define the connection of a table object in the model layer. Any entity knows about its origin, but you should not implement business logic inside an entity, nor change the connection through an entity.
EDIT: The version is CakePHP 3.2. An important note is that it doesn't need all CRUD operations, only "reading". I hope that makes the solution easier.
If that's true, either use the CRUD plugin (yes, you can use only the R part of it) or write some code, like a class that describes the organization and is used to create your table objects and views on the fly.
Overall it's a pretty interesting problem, but IMHO too broad for a simple answer or solution to be given here. I think this would require some discussion and analysis to find the best solution. If you're interested in consulting, you can contact me; check my profile.
I found a way without coding any interface. In fact, it uses some features already included in the DBMS and CakePHP.
In the case where the schema doesn't fit the model, you can create views that match the table names and column names the model expects. By definition, views work like tables, so CakePHP queries the same table and column names and the DBMS does the work.
I made a test with views in MySQL and it worked fine. You can also combine the data from different tables.
MySQL views
SQL Server views.
If the organization uses another DBMS, you just change the datasource in app.php and create the views if necessary.
If the data is distributed across different DBMSs, CakePHP lets you set a datasource for each table; you just add it to app.php and reference it in the table class where required.
Finally, in case you just need the "reading" option, create a database user with access limited to the views and only SELECT privileges.
USING:
CakePHP 3.2
SQL Server 2016
MySQL 5.7

Standard practice/API for sharing database data without giving direct database access

We would like to give some of our customers the option to read data from our central database. The data is live and new records are being added every few seconds. Our database is MySQL running on Amazon RDS.
I was wondering what is the common practice for doing so.
One option would be to grant them SELECT rights on specific tables, but in that case they would be able to access other customers' data as well.
I have tried searching with keywords like database, interface, and API, and some others, but I couldn't find a good answer.
Thanks!
Use a REST API to expose specific tables for CRUD operations. You can control access through it too.
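A very rough sketch of that idea, assuming an ASP.NET Web API front end (any stack would do) and made-up names; each customer authenticates with an API key and only ever sees rows scoped to their own account, so nobody needs direct SELECT rights on the tables:
using System.Collections.Generic;
using System.Web.Http;

public class OrdersController : ApiController
{
    private readonly IOrderRepository _orders;   // assumed abstraction over the MySQL database

    public OrdersController(IOrderRepository orders)
    {
        _orders = orders;
    }

    // GET api/orders?apiKey=...
    public IHttpActionResult Get(string apiKey)
    {
        int? customerId = _orders.ResolveCustomer(apiKey);
        if (customerId == null)
            return Unauthorized();

        // Only the caller's own rows are exposed.
        IEnumerable<OrderDto> rows = _orders.GetForCustomer(customerId.Value);
        return Ok(rows);
    }
}

// Hypothetical supporting types.
public class OrderDto { public int Id { get; set; } public decimal Total { get; set; } }

public interface IOrderRepository
{
    int? ResolveCustomer(string apiKey);
    IEnumerable<OrderDto> GetForCustomer(int customerId);
}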

asp.net code first automatic database updates

I am creating an application in C# ASP.NET using Code First Entity Framework that will use a different database for each customer (in other words, every customer has their own database, which is generated on first use).
I am trying to figure out a way to update all these databases automatically whenever I apply changes to my objects. In other words, how would I approach a cleanstep system in Code First EF?
Currently I am using the DropCreateDatabaseIfModelChanges initializer to define a simple database that allows me to test my application whenever a schema change occurs. However, this method drops the database, which is obviously unacceptable in the case of customer databases.
I have to assume hundreds of customers, so updating all databases by hand is not an option.
I do not mind writing code that copies the data into a new database.
I think the best solution would be a way to back up a database somehow and then reinsert all data into the newly created database. Even better would be a way that automatically updates the schema without dropping the database. However, I have no idea how to approach this. Can anyone point me in the right direction?
The link posted by Joakim was helpful. It requires you to update to EF 4.3.1 (don't forget your references in other projects if you have them), after which you can run the command that enables migrations. To automatically update the schema from code you can use:
// 'Configuration' is the migrations configuration class created by Enable-Migrations
// (it derives from DbMigrationsConfiguration<YourContext>).
Configuration configuration = new Configuration();
DbMigrator migrator = new DbMigrator(configuration);
migrator.Update();                        // applies any pending (or automatic) migrations
Database.SetInitializer<DbContext>(null); // disable the default initializer; use your own context type here
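For completeness, a sketch of what the migrations Configuration class created by Enable-Migrations typically looks like with automatic migrations switched on (the context name is a placeholder), plus the alternative of letting an initializer migrate each customer database on first use:
using System.Data.Entity;
using System.Data.Entity.Migrations;

// Placeholder context representing a customer database.
public class MyContext : DbContext { }

internal sealed class Configuration : DbMigrationsConfiguration<MyContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = true;          // apply schema changes without explicit migration classes
        AutomaticMigrationDataLossAllowed = false;  // refuse changes that would drop columns/data
    }
}

// Alternative to calling DbMigrator yourself: migrate each database to the
// latest version the first time its context is used.
// Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyContext, Configuration>());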

Finding best approach to SaaS with multiple databases, a shared resource, and EF

First, please excuse my grammar mistakes.
OK, this is what I already know :-)
I want to use EF and MVC 4, with an AngularJS UI. I need a database per user / group of users; my application may grow to 5000+ users. They all also share a single database, and when a user searches for something, the results should come both from the shared database and from the user's own database.
Performance is extremely important.
In my research I found that EF can connect to different databases, but I couldn't find any proper way of doing so without writing tons of code.
Scenarios:
A new user registers, and the system builds a new database for him.
A user logs in, and the system returns data from his database and the shared database.
A user logs in, BUT the system database has been upgraded, so the user's database should be upgraded too.
Now I know that there is no easy method to achieve all of my goals,
but can you please direct me to what suits me best?
Again sorry for my English!
Thank you! :-)
In my experience, we have worked on several SaaS applications that use a shared database [central repository] containing all the user [tenant] data, and an application database [tenant-based] for every user group.
This works with ease in EF and there is no performance overhead. You should not use cross-database queries; instead, focus on optimizing the EF code in your data access layer, and have separate services handle the task of merging the user data from the shared and tenant databases.
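A minimal sketch of that shape (context, entity and connection-string names are all made up): one fixed context for the shared database, one context whose connection string is resolved per tenant at runtime, and a service method that merges results in memory instead of using cross-database queries:
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Shared/central database: one fixed connection string.
public class SharedContext : DbContext
{
    public SharedContext() : base("name=SharedDb") { }   // assumed connection-string name
    public DbSet<Product> SharedProducts { get; set; }
}

// Tenant database: connection string chosen at runtime for the logged-in user.
public class TenantContext : DbContext
{
    public TenantContext(string tenantConnectionString) : base(tenantConnectionString) { }
    public DbSet<Product> Products { get; set; }
}

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class SearchService
{
    public static List<Product> Search(string tenantConnectionString, string term)
    {
        using (var shared = new SharedContext())
        using (var tenant = new TenantContext(tenantConnectionString))
        {
            // Two ordinary queries, merged in memory.
            var results = shared.SharedProducts.Where(p => p.Name.Contains(term)).ToList();
            results.AddRange(tenant.Products.Where(p => p.Name.Contains(term)).ToList());
            return results;
        }
    }
}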
Maybe you should also analyze the application, find the data that is updated infrequently, cache it, and render it from a distributed cache like AppFabric.
With respect to synchronizing the user database and the centralized database, in the service tier you can get this done by wrapping those calls in a .NET TransactionScope to preserve atomicity.
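And a small sketch of the TransactionScope idea, reusing the placeholder contexts from the previous sketch; note that spanning two connections like this may escalate to a distributed transaction, which your environment needs to support:
using System.Transactions;

public static class ProvisioningService
{
    // Either both databases are updated, or neither is.
    public static void RegisterUser(string tenantConnectionString, string userName)
    {
        using (var scope = new TransactionScope())
        {
            using (var shared = new SharedContext())
            using (var tenant = new TenantContext(tenantConnectionString))
            {
                // ... add the user record to the central repository and seed the tenant database ...
                shared.SaveChanges();
                tenant.SaveChanges();
            }
            scope.Complete();
        }
    }
}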
Please post your understanding and any further clarifications in a reply.

multi_schema and side effect problems

I am working on a project in which we need to define agencies in other cities. We have the same application but a separate database schema for each agency. I use one session factory. For each request we get the person's username, and from that we can tell which agency they belong to. We change the PostgreSQL search_path for that.
The problem now is with the cache. Since we are constantly switching schemas, the cache does not seem to work.
Our jobs (written with Quartz scheduling) also seem to have problems because we are constantly switching schemas.
Any ideas?
Can you not use a session factory per schema?
