It's easy to use SQL Server Profiler to trace stored procedure activity. But how do I trace the SQL queries issued by LINQ via Entity Framework? I need to identify the queries (LINQ code) that consume the most time and are called most frequently, and are therefore the first candidates for optimization.
Add this key to your connection string:
Application Name=EntityFramework
Then filter on this application name in Profiler.
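For example (server and database names here are just placeholders), the connection string would look like:
Data Source=.\SQLEXPRESS;Initial Catalog=MyDatabase;Integrated Security=True;Application Name=EntityFramework
In Profiler, add a column filter on ApplicationName equal to EntityFramework so that only the Entity Framework traffic is captured.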
Adding to @ErikEJ's answer: if you are using .NET Core, then you are using EF Core, and there is no Database.Log property. You should override OnConfiguring in your DbContext class and call
optionsBuilder.LogTo(Console.WriteLine);
Sample:
public class AppDbContext : DbContext
{
    public DbSet<User> Users { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
    {
        optionsBuilder.LogTo(Console.WriteLine);
    }
}
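If the console output is too noisy, LogTo also has overloads that filter by category and level. A minimal sketch, assuming EF Core 5 or later (LogLevel comes from Microsoft.Extensions.Logging), that logs only the SQL commands:
optionsBuilder.LogTo(
    Console.WriteLine,
    new[] { DbLoggerCategory.Database.Command.Name },   // only database commands (the generated SQL)
    LogLevel.Information);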
I've found the DbContext.Database.Log property useful. See the MSDN article Logging and Intercepting Database Operations:
The DbContext.Database.Log property can be set to a delegate for any method that takes a string. Most commonly it is used with any TextWriter by setting it to the “Write” method of that TextWriter. All SQL generated by the current context will be logged to that writer. For example, the following code will log SQL to the console:
using (var context = new BlogContext())
{
    context.Database.Log = Console.Write;

    // Your code here...
}
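If you want to keep the output for later analysis instead of watching the console, the same property can point at any TextWriter, for example a StreamWriter (the file name here is just an illustration):
using (var writer = new StreamWriter("ef-sql.log", append: true))
using (var context = new BlogContext())
{
    // Every SQL statement generated by this context is appended to the log file.
    context.Database.Log = writer.Write;

    // Your code here...
}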
What gets logged?
When the Log property is set, all of the following will be logged:
The approximate amount of time it took to execute the command. Note that this is the time from sending the command to getting the result object back. It does not include time to read the results.
SQL for all different kinds of commands. For example:
Queries, including normal LINQ queries, eSQL queries, and raw queries from methods such as SqlQuery
Inserts, updates, and deletes generated as part of SaveChanges
Relationship loading queries such as those generated by lazy loading
Parameters
Whether or not the command is being executed asynchronously
A timestamp indicating when the command started executing
Whether or not the command completed successfully, failed by throwing an exception, or, for async, was canceled
Some indication of the result value
Is there a central location in the JDBCTemplate (or related) where SQL manipulations can be performed immediately before they are sent to the DB?
I want to prepend a comment-line to each and every SQL statement that gets issued to the RDBMS.
Hope there is a dedicated extension point. Otherwise, I would need to write my own class that inherits from JDBCTemplate and adds my custom logic, which I want to avoid.
Is there a central location in the JDBCTemplate (or related) where SQL manipulations can be performed immediately before they are sent to the DB?
Yes, the DataSource.
With datasource-proxy you can intercept queries on DataSource level using a custom QueryTransformer:
private static class MyQueryTransformer implements QueryTransformer {
    @Override
    public String transformQuery(TransformInfo transformInfo) {
        String query = transformInfo.getQuery();
        // transform query
        return query;
    }
}
and supplying it to ProxyDataSourceBuilder:
ProxyDataSourceBuilder.create()
    ...
    .queryTransformer(new MyQueryTransformer())
    ...
See also datasource-proxy-examples
I am attempting to transition to using MongoDB Transactions via Spring Data Mongo now that MongoDB 4.0 supports transactions, and Spring Data Mongo 2.1.5.Release supports it as well.
According to the Spring Data Mongo Documentation, you should be able to use the Spring MongoTransactionManager and have the MongoTemplate recognize and participate in ongoing transactions: https://docs.spring.io/spring-data/mongodb/docs/2.1.5.RELEASE/reference/html/#_transactions_with_mongotransactionmanager
However, this following test fails:
@Autowired
private TestEntityRepository testEntityRepository;

@Autowired
private MongoTemplate mongoTemplate;

@BeforeTransaction
public void beforeTranscation() {
    cleanAndInitDatabase();
}

@Test
@Transactional
public void transactionViaAnnotation() {
    TestEntityA entity1 = new TestEntityA();
    entity1.setValueA("a");
    TestEntityA entity2 = new TestEntityA();
    entity2.setValueA("b");

    testEntityRepository.save(entity1);
    testEntityRepository.save(entity2);

    // throw new RuntimeException("prevent commit");

    List<TestEntityA> entities = testEntityRepository.findAll(Example.of(entity1));
    Assertions.assertEquals(1, entities.size()); // SUCCEEDS

    entities = testEntityRepository.findAll(Example.of(entity2));
    Assertions.assertEquals(1, entities.size()); // SUCCEEDS

    entities = mongoTemplate.findAll(TestEntityA.class);
    Assertions.assertEquals(2, entities.size()); // FAILS - expected: <2> but was: <0>
}
It appears that the testEntityRepository works fine with the transaction. The asserts succeed, and if I uncomment the exception line, neither of the records is persisted to the database.
However, trying to use the mongoTemplate directly to do a query doesn't work as it appears to not participate in the transaction.
The documentation I have linked shows using the template directly within a @Transactional method like I am attempting. However, the text says
MongoTemplate can also participate in other, ongoing transactions.
which could be interpreted to mean the template can be used with different transactions, and not necessarily the implicit transaction. But that is not what the example would indicate.
Any ideas what is happening or how to get the template to participate in the same implicit transaction?
Let’s assume that the primary components in your application are an Angular client, which calls an ASP.NET Web API, which uses Entity Framework to perform CRUD operations on your database. So, for example, in your API controllers, the Post (Add) method adds a new entity to the database context and then commits it to the database by calling the Entity Framework SaveChanges method.
This works fine when only one record needs to be added to the database at a time.
But, what if, for example, you want to add several records of different entity types to your database in one transaction? Where do you implement the Database.BeginTransaction and Database.CommitTransaction/RollbackTransaction? If you add a service layer to accomplish this, then what does the Angular client call?
PLEASE SEE BELOW FOR FURTHER DETAIL AND QUESTIONS.
I want to provide more detail about my current approach to solving this problem and ask the following questions:
(1) Is this a good approach, or is there a better way?
(2) My approach does not port to .NET Core, since .NET Core does not support OData yet (see https://github.com/OData/WebApi/issues/229). Any thoughts or ideas about this?
I have stated the problems that I faced and the solutions that I chose below. I will use a simple scenario where a customer is placing an order for several items – so, there is one Order record with several OrderDetail records. The Order record and associated OrderDetail records must be committed to the database in a single transaction.
Problem #1: What is the best way to send the Order and OrderDetail records from the Angular client to the ASP.NET Web API?
Solution #1: I decided to use OData batching, so that I could send all the records in one POST. I am using the datajs library to perform the batching (https://www.nuget.org/packages/datajs).
Problem #2: How do I wrap a single transaction around the Order and OrderDetail records?
Solution #2: I set up an OData batch endpoint in my Web API, which involved the following:
(1) In the Web API configuration, configure a batch request route.
// Configure the batch request route.
config.Routes.MapODataServiceRoute(
    routeName: "batch",
    routePrefix: "batch",
    model: builder.GetEdmModel(),
    pathHandler: new DefaultODataPathHandler(),
    routingConventions: conventions,
    batchHandler: new TransactionalBatchHandler(GlobalConfiguration.DefaultServer));
(2) In the Web API, implement a custom batch handler, which wraps a database transaction around the given OData batch. The batch handler starts the transaction, calls the appropriate ODataController to perform the CRUD operation, and then commits/rolls back the transaction, depending on the results.
/// <summary>
/// Custom batch handler specialized to execute batch changesets in OData $batch requests with transactions.
/// The requests will be executed in the order they arrive, which means that the client is responsible for
/// correctly ordering the operations to satisfy referential constraints.
/// </summary>
public class TransactionalBatchHandler : DefaultODataBatchHandler
{
    public TransactionalBatchHandler(HttpServer httpServer)
        : base(httpServer)
    {
    }

    /// <summary>
    /// Executes the batch request and wraps the execution of the whole changeset within a transaction.
    /// </summary>
    /// <param name="requests">The <see cref="ODataBatchRequestItem"/> instances of this batch request.</param>
    /// <param name="cancellation">The <see cref="CancellationToken"/> associated with the request.</param>
    /// <returns>The list of responses associated with the batch request.</returns>
    public async override Task<IList<ODataBatchResponseItem>> ExecuteRequestMessagesAsync(
        IEnumerable<ODataBatchRequestItem> requests,
        CancellationToken cancellation)
    {
        if (requests == null)
        {
            throw new ArgumentNullException("requests");
        }

        IList<ODataBatchResponseItem> responses = new List<ODataBatchResponseItem>();

        try
        {
            foreach (ODataBatchRequestItem request in requests)
            {
                OperationRequestItem operation = request as OperationRequestItem;

                if (operation != null)
                {
                    responses.Add(await request.SendRequestAsync(Invoker, cancellation));
                }
                else
                {
                    await ExecuteChangeSet((ChangeSetRequestItem)request, responses, cancellation);
                }
            }
        }
        catch
        {
            foreach (ODataBatchResponseItem response in responses)
            {
                if (response != null)
                {
                    response.Dispose();
                }
            }

            throw;
        }

        return responses;
    }

    private async Task ExecuteChangeSet(
        ChangeSetRequestItem changeSet,
        IList<ODataBatchResponseItem> responses,
        CancellationToken cancellation)
    {
        ChangeSetResponseItem changeSetResponse;

        // Since IUnitOfWorkAsync is a singleton (Unity PerRequestLifetimeManager) used by all our ODataControllers,
        // we simply need to get a reference to it and use it for managing transactions. The ODataControllers
        // will perform IUnitOfWorkAsync.SaveChanges(), but the changes won't get committed to the DB until the
        // IUnitOfWorkAsync.Commit() is performed (in the code directly below).
        var unitOfWorkAsync = GlobalConfiguration.Configuration.DependencyResolver.GetService(typeof(IUnitOfWorkAsync)) as IUnitOfWorkAsync;

        unitOfWorkAsync.BeginTransaction();

        // This sends each request in the changeSet to the appropriate ODataController.
        changeSetResponse = (ChangeSetResponseItem)await changeSet.SendRequestAsync(Invoker, cancellation);
        responses.Add(changeSetResponse);

        if (changeSetResponse.Responses.All(r => r.IsSuccessStatusCode))
        {
            unitOfWorkAsync.Commit();
        }
        else
        {
            unitOfWorkAsync.Rollback();
        }
    }
}
You do not need to implement Database.BeginTransaction and Database.CommitTransaction/RollbackTransaction if you are using Entity Framework. Entity Framework implements the Unit of Work pattern. The only thing you should care about is to work with a different DbContext instance for every web request, exactly one instance per request, and to call SaveChanges only once, after you have made all the changes you need.
In case of any exception during SaveChanges, all the changes will be rolled back.
The Angular client should not care about this; it only sends the data and checks whether everything succeeded.
This is very easy to do if you use an IoC framework like Unity and have your DbContext injected into your Controller or Service.
In this case you should use the following settings (if you use Unity):
container.RegisterType<DbContext, YourDbContext>(new PerRequestLifetimeManager(), ...);
Then you can do this if you want to use it in a Controller:
public class YourController : Controller
{
    private readonly DbContext _db;

    public YourController(DbContext context)
    {
        _db = context;
    }

    ...
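As a sketch of the point above (the Order entity, its Id property, and the action name are placeholders): add the Order together with its OrderDetail records to the same context and call SaveChanges once, and Entity Framework will commit them in a single transaction.
public ActionResult CreateOrder(Order order)
{
    // The OrderDetail records travel with the Order via its navigation property,
    // so this single SaveChanges call persists everything or nothing.
    _db.Set<Order>().Add(order);
    _db.SaveChanges();
    return Json(new { order.Id });
}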
No need to over-complicate things. Add the code to the WebApi project. Pass around your Transaction object and re-use it. See https://msdn.microsoft.com/en-us/library/dn456843(v=vs.113).aspx for an example.
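A minimal sketch of that pattern with EF6 (the context and DbSet names are placeholders):
using (var context = new YourDbContext())
using (var transaction = context.Database.BeginTransaction())
{
    try
    {
        // Add all the related records, then commit them together.
        context.Orders.Add(order);
        context.SaveChanges();
        transaction.Commit();
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}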
I really have no luck with EF every time. This time it looks like this:
The first time I created a context I had an error like:
Invalid object name 'dbo.TableName1'
After this, setting the Database Initializer to CreateDatabaseIfNotExists<T> did the trick.
Next I created a different context, which was changed at some point during development. So this time I keep getting this error:
The model backing the 'NewContext' context has changed since the database was created.
I found a solution that sets the Database Initializer to null, but after this I keep getting the first error:
Invalid object name 'dbo.TableName2'
I also tried to set the initializer to DropCreateDatabaseIfModelChanges and DropCreateDatabaseAlways, and these settings throw the exception:
Cannot drop database "DatabaseName" because it is currently in use
I think I have already tried everything I found on the Web (there are many topics of this kind), but none of it helped.
Dropping the database didn't help, nor did changing from a Local File Database to SQL Server 2014 Express. The same exception is thrown. Any ideas?
Edit1:
Working context:
public class ProfilesContext : DbContext
{
    public ProfilesContext() : base("DefaultConnection")
    {
    }

    public DbSet<Profile> Profiles { get; set; }
    public DbSet<PrevilegedContact> PrevilegedContacts { get; set; }
}
Failing context:
public class PlacesContext : DbContext
{
    public PlacesContext()
        : base("DefaultConnection")
    {
    }

    public DbSet<Place> Places { get; set; }
    public DbSet<Price> Prices { get; set; }
    public DbSet<Photo> Photos { get; set; }
    public DbSet<Comment> Comments { get; set; }
}
For what you are doing, it seems you have 2 options for your initializer. There is Migrations and there is DropCreateDatabaseIfModelChanges.
Migrations will give control over updating the database tables when your models change and allow you to preserve data. You can configure it to allow data loss or not. Very useful during development time if you already have test data in there.
DropCreateDatabaseIfModelChanges will simply drop the database each time you make model changes and recreate it with the new schemas.
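For reference, the initializer is switched with Database.SetInitializer; a sketch using the PlacesContext from the question (typically placed in Application_Start or a static constructor):
// requires: using System.Data.Entity;
Database.SetInitializer(new DropCreateDatabaseIfModelChanges<PlacesContext>());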
You are getting the error message
Cannot drop database "DatabaseName" because it is currently in use
because you have more than likely browsed the table in Server Explorer and have an open connection to the database. You can close the connection using the right-click context menu in Server Explorer.
If you want a video overview of Migrations, there is a free video by Scott Allen & Pluralsight. It covers MVC4, but the Entity Framework section does cover Initializers. If you want an updated one for MVC5 that includes multiple Contexts etc., it does exist, but you would need to take the Pluralsight free trial to get access to it.
I agree with James. Good point about checking Server Explorer. Right-click the database and choose Close Connection, or at least check that there is a red X indicating the connection is closed. The link below is a great tutorial on the basics of Code First Migrations; it points directly to the page within the tutorial that talks about making model changes and how to make EF happy again: http://www.asp.net/mvc/tutorials/mvc-5/introduction/adding-a-new-field. I prefer using Migrations. You would use the Package Manager Console to 1) Enable Migrations, 2) Add a migration, 3) Update the database. If in development, use LocalDb if possible, and use the Seed method in the Configuration class to prepopulate the database.
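A rough sketch of that setup, assuming the PlacesContext from the question and a Place entity with a Name property (both just illustrative here):
// Package Manager Console:
//   Enable-Migrations
//   Add-Migration InitialCreate
//   Update-Database

internal sealed class Configuration : DbMigrationsConfiguration<PlacesContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
    }

    protected override void Seed(PlacesContext context)
    {
        // AddOrUpdate keeps repeated Update-Database runs from inserting duplicate seed rows.
        context.Places.AddOrUpdate(p => p.Name, new Place { Name = "Sample place" });
    }
}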
I have mapped classes with custom SQL (insert, delete, update) through procedure calls. But I noticed that when my insert procedure fails by raising an exception, the GenericAdoException from NHibernate doesn't contain the message raised by the procedure.
All exceptions raised from the delete and update procedures are caught fine; only the insert procedure's exception message is not.
Is that a limitation or a bug of NHibernate 3.2.4 when we use the "native" generator for ids combined with custom SQL?
I'm also looking for ways to get some out parameters from those procedures, like a timestamp for each event (insert, delete and update); the timestamp is generated inside the procedures.
EDIT: OUT PARAMS - I found the "generated" option on property mappings, with which we can ask NHibernate to fetch values generated by the procedures. This means that these properties have generated values. So I tried to use generated="always" and it works for insert, update and delete operations. Example: <property name="MyProp" generated="always"/>
I found that the SQL Server driver doesn't put the messages raised by stored procedures into the SqlException when you run those stored procedures with ExecuteReader(). On the other hand, NHibernate executes the custom sql-insert with ExecuteReader() (I debugged its source code), and I guess that is right and necessary to get the key when it's mapped with native (or identity), as in my case.
Well, and now what to do? I also found (it was hard to find) that SqlConnection has an event called "InfoMessage" through which you can receive (catch) all messages sent from your stored procedures (raiserror). So it is possible to "catch" these messages, but how do we make them cross the NHibernate core and reach our application when we insert something with session.Save()?
Although we have access to the session, and thus to the connection (SqlConnection), the messages are already lost by then, because they are only received by a delegate that was assigned to the SqlConnection.InfoMessage event before they occurred.
To solve this, I tried two approaches:
In the first, I planned to register the delegate inside DriverConnectionProvider.GetConnection(); this delegate would store the messages in the thread context, associating them with the connection, so they could be retrieved later.
In the second, the one I chose, I implemented IDbConnection and IDbCommand, wrapping the SqlConnection and SqlCommand inside them (but I think NHibernate has a bug, because in some places it references DbConnection instead of IDbConnection - like in ManagedProviderConnectionHelper - so I had to extend DbConnection and DbCommand instead).
Inside my CustomSqlConnection I register the delegate and store the messages for later use.
This is working! It works both as a standalone ADO driver and as an NHibernate driver.
The idea is:
public class CustomSqlConnection : DbConnection, IDbConnection {
    private SqlConnection con;
    private StringBuilder str = new StringBuilder(0);

    public CustomSqlConnection() {
        con = new SqlConnection();
        con.InfoMessage += OnInfoMessage;
    }

    private void OnInfoMessage(object sender, SqlInfoMessageEventArgs e) {
        if (str.Length > 0) {
            str.Append("\n");
        }
        str.Append(e.Message);
    }

    public string FetchMessage() {
        string msg = str.ToString();
        str.Clear();
        return msg;
    }

    ...
    ...
}
EDIT: The hard part is implementing all the operations from DbConnection and DbCommand, forwarding the calls to the wrapped SqlConnection/SqlCommand instance (see the field con above), like so:
...
public override void Open() {
    con.Open();
}
...
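For example, CreateDbCommand would delegate to the wrapped connection and wrap the result. A sketch, assuming a CustomSqlCommand wrapper is implemented the same way (not the author's exact code):
protected override DbCommand CreateDbCommand() {
    // Wrap the real SqlCommand so NHibernate keeps going through the proxy types.
    return new CustomSqlCommand(con.CreateCommand(), this);
}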