EF6 (6.1.3/ net45) Logging feature does not seem to work - sql-server

When I attempt to run the line:
MyDBContext.Database.Log = Console.Write
The compiler smiles and tells me I don't know what I am doing...
The app won't compile because of the line and the error on that line is:
Overload resolution failed because no accessible Write accepts this number of arguments.
That makes sense. 'Console.Write' returns nothing and I am setting it equal to a System.Action(Of String).
This just seems kind of half-baked.
I tried numerous ways to fix it, including delegates and some of the other 'new possibilities' that moving this off the Context is supposed to offer, but still no dice.
What am I missing? Is it something that was changed at the last minute?
I have two large edmx files (one connects to SQL Server and the other to Oracle) in the solution and all of that is working great.
Here are my version numbers if that can help.
EntityFramework 6.0.0.0 (folder is ...\EntityFramework.6.1.3\lib\net45\EntityFramework.dll)
EntityFramework.SqlServer 6.0.0.0 (folder is ...\EntityFramework.6.1.3\lib\net45\EntityFramework.dll)
Oracle.ManagedDataAccess.EntityFramework 6.121.2.0
I have a tool I created that lets me paste the output of the L2S 'mycontext.log' into it; it then parses it and creates SSMS-ready SQL with variables... it has been incredibly useful. This has been one of my favorite features of L2S.
Please help me understand why this isn't working.
Thanks in advance.

This technique works for me:
public override int SaveChanges()
{
    SetIStateInfo();
#if DEBUG
    Database.Log = s => Debug.WriteLine(s);
#endif
    return base.SaveChanges();
}
http://blogs.msdn.com/b/mpeder/archive/2014/06/16/how-to-see-the-actual-sql-query-generated-by-entity-framework.aspx

Well, the answer was to research the Action(T) Delegate, which showed me how to do it.
#If DEBUG Then
    myctx.Database.Log = AddressOf Console.Write
#End If
Just needed the AddressOf and I was back in business.

Related

How to Execute (Vector) Storage.getInstance().readObject(filePath); in Java 8 Swing Package

Windows 10 Pro
Latest Simulator
Java Swing Project
I would like to execute "Vector a1 = (Vector) Storage.getInstance().readObject(filePath);"
In a Java Swing application running on the Windows 10 platform, I tried importing CodenameOne.jar into the Swing package; however, when executing the above code, I get a NullPointerException in Storage.getInstance().
Is there a way to execute this in Swing?
Thoughts?
Best Regards.
Thanks, I did not init the Display; however, "Display.init(Object m)" requires an Object argument and the init method is deprecated.
Can you please provide me with the Codename One Display dependencies?
And perhaps a Java Swing snippet of code to initialize Display in order to execute Storage.getInstance().readObject(filePath)?
Thoughts?
Best Regards
Thanks, passing init(working directory) solved the exception thrown.
Here is the code snippet used to allow me to execute
Storage.getInstance().readObject(filePath):
String filePath = incSrv.Pwd(); // gets working directory
try {
    javax.swing.SwingUtilities.invokeLater(new Runnable() {
        public void run() {
            Display.init(filePath);
            String fileName = "A1-MMA.properties";
            Vector a1 = (Vector) Storage.getInstance().readObject(filePath);
        }
    });
} catch (Exception e) {
}
And it does appear to work;
however, I am left with a blacked-out form that appears modal.
How can I avoid this or dispose of it?
FYI: What I am creating here is a workaround for serializing a Vector in Codename One. I save the Vector to a file using "Storage.getInstance().writeObject(Path, Vector)".
I convert the file to bytes and write it to the Swing server via socket.
Using Storage.getInstance().readObject(file) on the Swing server, I deserialize the object into the Vector from my app.
This appears to work well and is more efficient than the current method I use to deliver complex Vectors from the app to the Swing server.
Can you please let me know if you see a red flag with this workaround?
For example, could the ability to Storage.getInstance().readObject(file) on the Swing server go away?
This method will save a lot of time in moving Vector data to and from the app/server.
Thoughts? Best Regards
Storage.getInstance().readObject(file) // (A1ServiceSrv.java:571)
Caused this Exception:
java.lang.NullPointerException
at com.codename1.io.Storage.init(Storage.java:89)
at com.codename1.io.Storage.getInstance(Storage.java:112)
at Main.A1ServiceSrv.loadVectorFromFile(A1ServiceSrv.java:571)
Regards
12/11/2021:
Thanks Shai,
I am including in my classpath the CodenameOne.jar with an update date of 12/11/2021, after the CN1 refresh.
Getting the same null pointer exception.
Passing in the path "C:\Src1\A1-Arms\A1-Server\A1-MMA.properties" (absolute path).
Also tried "A1-MMA.properties", however I don't think Codename One knows where my home path is, since we are not initializing it as we did with
Display.init("Current Working Directory where files reside");
This is the fresh stack trace without calling Display.init (12-20-2021):
java.lang.NullPointerException
at com.codename1.ui.Display.getResourceAsStream(Display.java:3086)
at com.codename1.io.Log.print(Log.java:327)
at com.codename1.io.Log.logThrowable(Log.java:299)
at com.codename1.io.Log.e(Log.java:285)
at com.codename1.io.Storage.readObject(Storage.java:271)
at Main.A1ServiceSrv.loadVectorFromFile(A1ServiceSrv.java:596)
vector = (Vector) Storage.getInstance().readObject(filePath); // (A1ServiceSrv.java:596)
This is hopefully fixed by this commit: https://github.com/codenameone/CodenameOne/commit/72bf283bdaaefe5207bb9fd6787578e3ef61522c; if not, let me know with a fresh stack.

Is there any way to trace/log the SQL using Dapper?

Is there a way to dump the generated sql to the Debug log or something? I'm using it in a winforms solution so the mini-profiler idea won't work for me.
I had the same issue and implemented some code after searching around and finding no ready-to-use solution. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it supports other database servers; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are the steps to work with SQL Server:
1. Instantiate the connection:
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
    // your code
}
2. After all the work is done, write all commands to a file if you want:
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());
Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is: steal the connection wrapper code from mini-profiler, and replace the profiler-context stuff with just: Debug.WriteLine etc.
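If pulling in the whole wrapper is more than you need, a lighter variation on the same Debug.WriteLine idea is to log at the call site. Here is a minimal sketch; the class and method names (LoggedDapperExtensions, QueryLogged) are made up for illustration and are not part of Dapper or mini-profiler:

using System.Collections.Generic;
using System.Data;
using System.Diagnostics;
using Dapper;

public static class LoggedDapperExtensions
{
    // Thin wrapper around Dapper's Query<T> that writes the statement
    // to the debug output before executing it.
    public static IEnumerable<T> QueryLogged<T>(this IDbConnection connection, string sql, object param = null)
    {
        Debug.WriteLine("Dapper SQL: " + sql);
        return connection.Query<T>(sql, param);
    }
}

The obvious trade-off is that you only capture calls that go through your own extension method, whereas the decorator approach captures everything on that connection.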
You should consider using SQL Server Profiler, found in SQL Server Management Studio under Tools → SQL Server Profiler (no Dapper extensions needed; this may work with other RDBMSs too, when they have a SQL profiler tool).
Then start a new session.
You'll get something like this for example (you see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'
Try Dapper.Logging.
You can get it from NuGet. The way it works is you pass your code that creates your actual database connection into a factory that creates wrapped connections. Whenever a wrapped connection is opened or closed or you run a query against it, it will be logged. You can configure the logging message templates and other settings like whether SQL parameters are saved. Elapsed time is also saved.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and
DbCommand which track the execution time and write messages to the
ILogger<T>. The ILogger<T> can be handled by any logging framework
(e.g. Serilog). The result is similar to the default EF Core logging
behavior.
The lib declares a helper method for registering the
IDbConnectionFactory in the IoC container. The connection factory is
SQL Provider agnostic. That's why you have to specify the real factory
method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into
classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;

public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
    _connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated
version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
    //...
}
This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and you want to initialize your parameters, it's useful for basic debugging. Set up this extension method, then call it anywhere as desired.
public static class DapperExtensions
{
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));
        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();
            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME ='{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the immediate or watch windows to grab the SQL.
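For example, something along these lines (table and parameter names are purely illustrative) produces a script you can paste straight into SSMS:

var args = new DynamicParameters();
args.Add("ProductId", 42);
string sql = "SELECT * FROM Products WHERE Product_ID = @ProductId";
// Prepend the generated DECLARE statements to get a runnable script.
string runnable = args.ArgsAsSql() + sql;
Debug.WriteLine(runnable);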
Just to add an update here since I see this question still gets quite a few hits: these days I use either Glimpse (it seems it's dead now) or Stackify Prefix, both of which have SQL command trace capabilities.
It's not exactly what I was looking for when I asked the original question, but they solve the same problem.

Visual Studio 2012 - Debugger Jumps around randomly

I've been having this issue for a while now and I just can't work past it anymore. I used to just recreate the project, but now that solution no longer works for me either, and I've had no luck with Google at all.
I have a Silverlight 5 website with a WCF service. Basically, what happens is I'll have a piece of code I need to debug, e.g.:
using (var connection = new SqlConnection(AccessCardConnection))
{
    connection.Open();
    var command = connection.CreateCommand();
    command.CommandType = CommandType.StoredProcedure;
    command.CommandText = "GetAccessCardTimes";
    command.CommandTimeout = 300;
    command.Parameters.AddWithValue("@EmployeeName", fullName);
    command.Parameters.AddWithValue("@FirstDate", firstDate);
    command.Parameters.AddWithValue("@LastDate", lastDate);
    var accessCardReader = command.ExecuteReader();
    while (accessCardReader.Read())
    {
        var time = TimeSpan.Parse(accessCardReader["TotalOnSiteTime"].ToString());
        duration += time;
    }
    return (duration.TotalHours);
}
However, once the first breakpoint hits, it's anybody's guess which line it will jump to. It doesn't follow any kind of logical order; if I set a breakpoint on connection.Open(), for example, it may jump back into the connection initialization, then back down again, and jump any way it pleases: up, down, staying on the same line, etc., jumping in and out of methods as if it has no regard for what it's doing.
None of the solutions I've found online have helped:
Why do my debugger randomly jump out of debug mode? (fix is for Visual Studio 2008)
Visual Studio 2010 Debugger jumps around (deleting the /bin and /obj files to regenerate the symbols didn't help)
Please can someone give me a hint as to what it may be? I can't work like this :(
Thanks for your assistance!
Sounds like you have multiple calls hitting the service. WCF spawns a new worker thread for each request, so what you are seeing is the debugger "jumping" from thread to thread.
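If you need stepping to be predictable while you debug, one common temporary workaround (not required for production, and the type names below are placeholders) is to force the service to use a single instance that handles one request at a time:

using System.ServiceModel;

// Temporary debugging aid: one service instance, one request at a time,
// so the debugger stays on a single thread while you step.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Single)]
public class AccessCardService : IAccessCardService // IAccessCardService = your service contract
{
    // ... service operations ...
}

Alternatively, freeze the other threads in the Threads window or filter the breakpoint to a single thread, which avoids touching the service configuration at all.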

Connections with Entity Framework and Transient Fault Handling Block?

We're migrating SQL to Azure. Our DAL is Entity Framework 4.x based. We're wanting to use the Transient Fault Handling Block to add retry logic for SQL Azure.
Overall, we're looking for the best 80/20 rule (or maybe more of a 95/5, but you get the point): we're not looking to spend weeks refactoring/rewriting code (there's a LOT of it). I'm fine re-implementing our DAL's framework, but not all of the code written and generated against it, any more than we have to, since this is here only to address a minority case. Mitigation >>> elimination of this edge case for us.
Looking at the possible options explained here at MSDN, it seems Case #3 there is the "quickest" to implement, but only at first glance. Upon pondering this solution a bit, it struck me that we might have problems with connection management, since this circumvents Entity Framework's built-in processes for managing connections (i.e. always closing them). It seems to me that the "solution" is to make sure 100% of the Contexts we instantiate use Using blocks, but with our architecture, this would be difficult.
So my question: Going with Case #3 from that link, are hanging connections a problem or is there some magic somewhere that's going on that I don't know about?
I've done some experimenting and it turns out that this brings us back to the old "managing connections" situation we're used to from the past, only this time the connections are abstracted away from us a bit and we must now "manage Contexts" similarly.
Let's say we have the following OnContextCreated implementation:
private void OnContextCreated()
{
    const int maxRetries = 4;
    const int initialDelayInMilliseconds = 100;
    const int maxDelayInMilliseconds = 5000;
    const int deltaBackoffInMilliseconds = initialDelayInMilliseconds;

    var policy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(maxRetries,
        TimeSpan.FromMilliseconds(initialDelayInMilliseconds),
        TimeSpan.FromMilliseconds(maxDelayInMilliseconds),
        TimeSpan.FromMilliseconds(deltaBackoffInMilliseconds));

    policy.ExecuteAction(() =>
    {
        try
        {
            Connection.Open();
            var storeConnection = (SqlConnection)((EntityConnection)Connection).StoreConnection;
            new SqlCommand("declare @i int", storeConnection).ExecuteNonQuery();
            //Connection.Close();
            // throw new ApplicationException("Test only");
        }
        catch (Exception e)
        {
            Connection.Close();
            Trace.TraceWarning("Attempted to open connection but failed: " + e.Message);
            throw;
        }
    });
}
In this scenario, we forcibly open the Connection (which was the goal here). Because of this, the Context keeps it open across many calls. Because of that, we must tell the Context when to close the connection. Our primary mechanism for doing that is calling the Dispose method on the Context. So if we just allow garbage collection to clean up our contexts, then we allow connections to remain hanging open.
I tested this by toggling the comments on the Connection.Close() in the try block and running a bunch of unit tests against our database. Without calling Close, we jumped up to ~275-300 active connections (from SQL Server's perspective). By calling Close, that number hovered at ~12. I then reproduced with a small number of unit tests both with and without a using block for the Context and reproduced the same result (different numbers - I forget what they were).
I was using the following query to count my connections:
SELECT s.session_id, s.login_name, e.connection_id,
s.last_request_end_time, s.cpu_time,
e.connect_time
FROM sys.dm_exec_sessions AS s
INNER JOIN sys.dm_exec_connections AS e
ON s.session_id = e.session_id
WHERE login_name='myuser'
ORDER BY s.login_name
Conclusion: If you call Connection.Open() with this workaround to enable the Transient Fault Handling Block, then you MUST use using blocks for all contexts you work with; otherwise you will have problems (which, with SQL Azure, will cause your database to be "throttled" and ultimately taken offline for hours!).
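A minimal sketch of that pattern, where MyEntities and Customers stand in for whatever your EDMX actually generates:

using (var context = new MyEntities())
{
    var activeCustomers = context.Customers.Where(c => c.IsActive).ToList();
    // ... work with activeCustomers ...
} // Dispose() here is what closes the connection that OnContextCreated opened.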
The problem with this approach is it only takes care of connection retries and not command retries.
If you use Entity Framework 6 (currently in alpha) then there is some new in-built support for transient retries with Azure SQL Database (with a little bit of configuration): http://entityframework.codeplex.com/wikipage?title=Connection%20Resiliency%20Spec
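For reference, in the released EF6 bits the configuration ends up looking roughly like this. This is a sketch (the class name MyDbConfiguration is arbitrary), and the API was still subject to change while EF6 was in alpha:

using System.Data.Entity;
using System.Data.Entity.SqlServer;

public class MyDbConfiguration : DbConfiguration
{
    public MyDbConfiguration()
    {
        // Retry transient SQL Azure failures for both connections and commands.
        SetExecutionStrategy("System.Data.SqlClient", () => new SqlAzureExecutionStrategy());
    }
}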
I've created a library which allows you to configure Entity Framework to retry using the Fault Handling block without needing to change every database call - generally you will only need to change your config file and possibly one or two lines of code.
This allows you to use it with Entity Framework or LINQ to SQL.
https://github.com/robdmoore/ReliableDbProvider

Creating Windows service without Visual Studio

So creating a Windows service using Visual Studio is fairly trivial. My question goes a bit deeper: what actually makes an executable installable as a service, and how do you write a service as a straight C application? I couldn't find a lot of references on this, but I'm presuming there has to be some interface I can implement so my .exe can be installed as a service.
Setting up your executable as a service is part of it, but realistically it's usually handled by whatever installation software you're using. You can use the command line SC tool while testing (or if you don't need an installer).
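For example, while testing, something like the following registers and starts the binary (the service name and path are placeholders; note that the space after binPath= is required):

sc create MyService binPath= "C:\path\to\MyService.exe"
sc start MyService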
The important thing is that your program has to call StartServiceCtrlDispatcher() upon startup. This connects your service to the service control manager and sets up a ServiceMain routine which is your services main entry point.
ServiceMain (you can call it whatever you like actually, but it always seems to be ServiceMain) should then call RegisterServiceCtrlHandlerEx() to define a callback routine so that the OS can notify your service when certain events occur.
Here are some snippets from a service I wrote a few years ago:
set up as service:
SERVICE_TABLE_ENTRY ServiceStartTable[] =
{
    { "ServiceName", ServiceMain },
    { 0, 0 }
};

if (!StartServiceCtrlDispatcher(ServiceStartTable))
{
    DWORD err = GetLastError();
    if (err == ERROR_FAILED_SERVICE_CONTROLLER_CONNECT)
        return false;
}
ServiceMain:
void WINAPI ServiceMain(DWORD, LPTSTR*)
{
    hServiceStatus = RegisterServiceCtrlHandlerEx("ServiceName", ServiceHandlerProc, 0);
    // ... report SERVICE_RUNNING and do the service's actual work ...
}
service handler:
DWORD WINAPI ServiceHandlerProc(DWORD ControlCode, DWORD, void*, void*)
{
    switch (ControlCode)
    {
        case SERVICE_CONTROL_INTERROGATE:
            // update OS about our status
            break;
        case SERVICE_CONTROL_STOP:
            // shut down service
            break;
    }
    return 0;
}
Hope this helps:
http://support.microsoft.com/kb/251192
It would seem that you simply need to run this exe against a binary executable to register it as a service.
Basically there are some registry settings you have to set as well as some interfaces to implement.
Check out this: http://msdn.microsoft.com/en-us/library/ms685141.aspx
You are interested in the SCM (Service Control Manager).
I know I'm a bit late to the party, but I've recently had this same question, and had to struggle through the interwebs looking for answers.
I managed to find this article on MSDN that does in fact lay the groundwork. I ended up combining many of the files there into a single exe that contains all of the commands I need, and added in my own "void run()" method that loops for the entire life of the service for my own needs.
This would be a great start to someone else with exactly this question, so for future searchers out there, check it out:
The Complete Service Sample
http://msdn.microsoft.com/en-us/library/bb540476(VS.85).aspx
