I am looking for a way to let my C# (4.0) app send data to a SQL Server 2008 instance, asynchronously. I kind of like what I saw at http://nayyeri.net/asynchronous-command-execution-in-net-2-0 but that is not quite what I am looking for.
I have code like this:
// myDataTable is a .NET DataTable object
SqlCommand sc = new SqlCommand("dbo.ExtBegin", conn);
sc.CommandType = CommandType.StoredProcedure; // required so the parameters bind to the proc
SqlParameter param1 = sc.Parameters.AddWithValue("@param1", "a");
SqlParameter param2 = sc.Parameters.AddWithValue("@param2", "b");
SqlParameter param3 = sc.Parameters.AddWithValue("@param3", myDataTable);
param3.SqlDbType = SqlDbType.Structured;
param3.TypeName = "dbo.MyTableType";
int execState = sc.ExecuteNonQuery();
Because myDataTable is potentially large, I don't want the console app to hang while it sends the data to the server. If there are six big loads, I want them all going at the same time rather than blocking the console app; I don't want to send them serially.
All ideas appreciated, thanks!
Set the Asynchronous Processing keyword in the connection string to true, and use BeginExecuteNonQuery.
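For example, a minimal sketch of that approach, assuming "Asynchronous Processing=true" is in the connection string and that each load gets its own connection (a SqlConnection can't run concurrent commands); SendLoadAsync is just an illustrative name:

using System;
using System.Data;
using System.Data.SqlClient;

static void SendLoadAsync(string connectionString, DataTable table)
{
    // Each concurrent load needs its own connection.
    var conn = new SqlConnection(connectionString);
    conn.Open();

    var cmd = new SqlCommand("dbo.ExtBegin", conn) { CommandType = CommandType.StoredProcedure };
    cmd.Parameters.AddWithValue("@param1", "a");
    cmd.Parameters.AddWithValue("@param2", "b");
    var tvp = cmd.Parameters.AddWithValue("@param3", table);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.MyTableType";

    // Returns immediately; the callback fires on a worker thread when the
    // server finishes, so the console app is free to start the next load.
    cmd.BeginExecuteNonQuery(ar =>
    {
        try { cmd.EndExecuteNonQuery(ar); }
        finally { conn.Dispose(); }
    }, null);
}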
But what is dbo.ExtBegin doing? Everything depends on that: the calls may well serialize on locks in the database (at best), or, at worst, you may get incorrect results if the procedure is not designed for concurrency.
Create a thread and execute the query within it; make sure there are no subsequent database calls that would cause race conditions.
My first thought would be to spawn a new thread for the inserts, and have the main thread check the spawned thread's execution with AutoResetEvent, TimerCallback, and Timer objects.
I do it in Silverlight all the time.
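A rough sketch of that idea: each load runs on its own thread and signals an AutoResetEvent when done, while the main thread polls instead of blocking (SendLoad is a hypothetical method that does the actual ExecuteNonQuery work; the Timer-based checking would be similar):

using System;
using System.Threading;

class LoadRunner
{
    static void Main()
    {
        var done = new AutoResetEvent(false);

        var worker = new Thread(() =>
        {
            SendLoad();  // hypothetical: opens its own connection, runs dbo.ExtBegin
            done.Set();  // signal the main thread that this load finished
        });
        worker.Start();

        // Main thread: poll in short intervals so it stays responsive.
        while (!done.WaitOne(TimeSpan.FromMilliseconds(250)))
        {
            Console.Write("."); // still loading
        }
    }

    static void SendLoad() { /* ExecuteNonQuery work here */ }
}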
Take a look at using Service Broker Activation. This will allow you to call a stored proc and have it run on its own thread while you continue on the current thread.
Here is an excellent article that goes over how to do this.
I want to store data in a database every minute. What should I use for this: a Service, an AsyncTask, or something else? I have gone through various links, which only made me more confused.
I read the developer guide and came across getWritableDatabase:
Database upgrade may take a long time, you should not call this method from the application main thread,
At first I thought I would use AsyncTask, but then I read this:
AsyncTasks should ideally be used for short operations (a few seconds at the most.)
After that I thought I could use a Service, but then I read this about Services:
A Service is not a thread. It is not a means itself to do work off of the main thread (to avoid Application Not Responding errors).
So I am not able to understand what I should use to store data in the database periodically. Please help me here, as I am badly stuck.
Thanks in advance
You can't do a lot of work on the UI thread, so for database operations you can choose from different approaches; a few that I prefer to use are listed below.
Create a thread pool and execute each database operation on a pooled thread. This keeps the load off the UI thread and avoids spinning up a large number of threads.
You can use a Service for the database operations. Since a Service runs on the UI thread, you can't do the work directly in the Service; you have to create a separate thread inside the service method. Alternatively, you can use an IntentService directly, since it does not do its work on the UI thread.
Here is the developer documentation on thread pools in Android, and this is the documentation for IntentService.
UPDATE
This will send an intent to your service every minute, without using any processor time in your activity in between:
Intent myIntent = new Intent(context, MyServiceReceiver.class);
PendingIntent pendingIntent = PendingIntent.getBroadcast(context, 0, myIntent, 0);
AlarmManager alarmManager = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);

Calendar calendar = Calendar.getInstance();
calendar.setTimeInMillis(System.currentTimeMillis());
calendar.add(Calendar.SECOND, 60); // first run in 60 seconds

long frequency = 60 * 1000; // repeat interval in ms
alarmManager.setRepeating(AlarmManager.RTC_WAKEUP, calendar.getTimeInMillis(), frequency, pendingIntent);
Before doing that, check whether you really need a new service started every minute, or whether you can have one service that checks for data changes every minute itself; starting a new service each time may well consume more resources than checking within one.
UPDATE 2
private android.os.Handler mHandler;

private void ping() {
    // periodic action here
    scheduleNext();
}

private void scheduleNext() {
    mHandler.postDelayed(new Runnable() {
        public void run() { ping(); }
    }, 60000); // run again in 60 seconds
}

@Override
public int onStartCommand(Intent intent, int flags, int startId) {
    mHandler = new android.os.Handler();
    ping();
    return START_STICKY;
}
This is a simple example of how you can do it.
Is there a way to dump the generated SQL to the Debug log or something? I'm using it in a WinForms solution, so the mini-profiler idea won't work for me.
I had the same issue and, after searching and finding nothing ready to use, implemented some code myself. There is a package on NuGet, MiniProfiler.Integrations, that I would like to share.
Update V2: it supports working with other database servers; for MySQL it requires MiniProfiler.Integrations.MySql.
Below are steps to work with SQL Server:
1. Instantiate the connection:
var factory = new SqlServerDbConnectionFactory(_connectionString);
using (var connection = ProfiledDbConnectionFactory.New(factory, CustomDbProfiler.Current))
{
    // your code
}
2. After all the work is done, write all commands to a file if you want:
File.WriteAllText("SqlScripts.txt", CustomDbProfiler.Current.ProfilerContext.BuildCommands());
Dapper does not currently have an instrumentation point here. This is perhaps due, as you note, to the fact that we (as the authors) use mini-profiler to handle this. However, if it helps, the core parts of mini-profiler are actually designed to be architecture neutral, and I know of other people using it with winforms, wpf, wcf, etc - which would give you access to the profiling / tracing connection wrapper.
In theory, it would be perfectly possible to add some blanket capture-point, but I'm concerned about two things:
(primarily) security: since dapper doesn't have a concept of a context, it would be really really easy for malign code to attach quietly to sniff all sql traffic that goes via dapper; I really don't like the sound of that (this isn't an issue with the "decorator" approach, as the caller owns the connection, hence the logging context)
(secondary) performance: but... in truth, it is hard to say that a simple delegate-check (which would presumably be null in most cases) would have much impact
Of course, the other thing you could do is steal the connection wrapper code from mini-profiler and replace the profiler-context stuff with just Debug.WriteLine etc.
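For basic cases you can get a similar effect with a simple call-site shim rather than a full connection wrapper; this is just a sketch of the Debug.WriteLine idea, not mini-profiler's actual code:

using System.Data;
using System.Diagnostics;
using Dapper;

public static class LoggedSql
{
    // Logs the statement before delegating to Dapper; extend with parameter
    // dumping if you need it.
    public static int LoggedExecute(this IDbConnection conn, string sql, object param = null)
    {
        Debug.WriteLine("dapper executing: " + sql);
        return conn.Execute(sql, param);
    }
}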
You should consider using SQL Server Profiler, found in SQL Server Management Studio under Tools → SQL Server Profiler (no Dapper extensions needed; this may work with other RDBMSs too, when they have a SQL profiler tool).
Then, start a new session.
You'll get something like this for example (you see all parameters and the complete SQL string):
exec sp_executesql N'SELECT * FROM Updates WHERE CAST(Product_ID as VARCHAR(50)) = @appId AND (Blocked IS NULL OR Blocked = 0)
AND (Beta IS NULL OR Beta = 0 OR @includeBeta = 1) AND (LangCode IS NULL OR LangCode IN (SELECT * FROM STRING_SPLIT(@langCode, '','')))',N'@appId nvarchar(4000),@includeBeta bit,@langCode nvarchar(4000)',@appId=N'fea5b0a7-1da6-4394-b8c8-05e7cb979161',@includeBeta=0,@langCode=N'de'
Try Dapper.Logging.
You can get it from NuGet. The way it works is that you pass the code that creates your actual database connection into a factory that produces wrapped connections. Whenever a wrapped connection is opened or closed, or you run a query against it, it will be logged. You can configure the logging message templates and other settings, like whether SQL parameters are saved. Elapsed time is also recorded.
In my opinion, the only downside is that the documentation is sparse, but I think that's just because it's a new project (as of this writing). I had to dig through the repo for a bit to understand it and to get it configured to my liking, but now it's working great.
From the documentation:
The tool consists of simple decorators for the DbConnection and DbCommand which track the execution time and write messages to the ILogger<T>. The ILogger<T> can be handled by any logging framework (e.g. Serilog). The result is similar to the default EF Core logging behavior.
The lib declares a helper method for registering the IDbConnectionFactory in the IoC container. The connection factory is SQL Provider agnostic. That's why you have to specify the real factory method:
services.AddDbConnectionFactory(prv => new SqlConnection(conStr));
After registration, the IDbConnectionFactory can be injected into classes that need a SQL connection.
private readonly IDbConnectionFactory _connectionFactory;

public GetProductsHandler(IDbConnectionFactory connectionFactory)
{
    _connectionFactory = connectionFactory;
}
The IDbConnectionFactory.CreateConnection will return a decorated version that logs the activity.
using (DbConnection db = _connectionFactory.CreateConnection())
{
    //...
}
This is not exhaustive and is essentially a bit of a hack, but if you have your SQL and you want to initialize its parameters, it's useful for basic debugging. Set up this extension method, then call it anywhere as desired.
using System;
using System.Collections.Generic;
using System.Text;
using Dapper;

public static class DapperExtensions
{
    public static string ArgsAsSql(this DynamicParameters args)
    {
        if (args is null) throw new ArgumentNullException(nameof(args));

        var sb = new StringBuilder();
        foreach (var name in args.ParameterNames)
        {
            var pValue = args.Get<dynamic>(name);
            var type = pValue.GetType();
            if (type == typeof(DateTime))
                sb.AppendFormat("DECLARE @{0} DATETIME = '{1}'\n", name, pValue.ToString("yyyy-MM-dd HH:mm:ss.fff"));
            else if (type == typeof(bool))
                sb.AppendFormat("DECLARE @{0} BIT = {1}\n", name, (bool)pValue ? 1 : 0);
            else if (type == typeof(int))
                sb.AppendFormat("DECLARE @{0} INT = {1}\n", name, pValue);
            else if (type == typeof(List<int>))
                sb.AppendFormat("-- REPLACE @{0} IN SQL: ({1})\n", name, string.Join(",", (List<int>)pValue));
            else
                sb.AppendFormat("DECLARE @{0} NVARCHAR(MAX) = '{1}'\n", name, pValue.ToString());
        }
        return sb.ToString();
    }
}
You can then just use this in the Immediate or Watch windows to grab the SQL.
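For example (hypothetical parameters; sql holds the statement you are about to hand to Dapper):

var args = new DynamicParameters();
args.Add("UserId", 42);
args.Add("CreatedAfter", new DateTime(2020, 1, 1));

// Prepend the generated DECLAREs to the statement for pasting into SSMS.
// Debug is System.Diagnostics.Debug.
Debug.WriteLine(args.ArgsAsSql() + sql);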
Just to add an update here, since I see this question still gets quite a few hits: these days I use either Glimpse (it seems to be dead now) or Stackify Prefix, which both have SQL command trace capabilities. It's not exactly what I was looking for when I asked the original question, but they solve the same problem.
We're migrating SQL to Azure. Our DAL is Entity Framework 4.x based. We're wanting to use the Transient Fault Handling Block to add retry logic for SQL Azure.
Overall, we're looking for the best 80/20 rule (or maybe more of a 95/5, but you get the point) - we're not looking to spend weeks refactoring/rewriting code (there's a LOT of it). I'm fine re-implementing our DAL's framework, but not reworking all of the code written and generated against it any more than we have to, since this exists only to address a minority case. Mitigation >>> elimination of this edge case for us.
Looking at the possible options explained here at MSDN, it seems Case #3 there is the "quickest" to implement, but only at first glance. Upon pondering this solution a bit, it struck me that we might have problems with connection management, since it circumvents Entity Framework's built-in processes for managing connections (i.e., always closing them). It seems to me that the "solution" is to make sure 100% of the Contexts we instantiate use using blocks, but with our architecture this would be difficult.
So my question: going with Case #3 from that link, are hanging connections a problem, or is there some magic going on somewhere that I don't know about?
I've done some experimenting and it turns out that this brings us back to the old "managing connections" situation we're used to from the past, only this time the connections are abstracted away from us a bit and we must now "manage Contexts" similarly.
Let's say we have the following OnContextCreated implementation:
private void OnContextCreated()
{
    const int maxRetries = 4;
    const int initialDelayInMilliseconds = 100;
    const int maxDelayInMilliseconds = 5000;
    const int deltaBackoffInMilliseconds = initialDelayInMilliseconds;

    var policy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(maxRetries,
        TimeSpan.FromMilliseconds(initialDelayInMilliseconds),
        TimeSpan.FromMilliseconds(maxDelayInMilliseconds),
        TimeSpan.FromMilliseconds(deltaBackoffInMilliseconds));

    policy.ExecuteAction(() =>
    {
        try
        {
            Connection.Open();
            var storeConnection = (SqlConnection)((EntityConnection)Connection).StoreConnection;
            new SqlCommand("declare @i int", storeConnection).ExecuteNonQuery();
            //Connection.Close();
            // throw new ApplicationException("Test only");
        }
        catch (Exception e)
        {
            Connection.Close();
            Trace.TraceWarning("Attempted to open connection but failed: " + e.Message);
            throw;
        }
    });
}
In this scenario, we forcibly open the Connection (which was the goal here). Because of this, the Context keeps it open across many calls. Because of that, we must tell the Context when to close the connection. Our primary mechanism for doing that is calling the Dispose method on the Context. So if we just allow garbage collection to clean up our contexts, then we allow connections to remain hanging open.
I tested this by toggling the comment on the Connection.Close() in the try block and running a bunch of unit tests against our database. Without calling Close, we jumped up to ~275-300 active connections (from SQL Server's perspective). By calling Close, that number hovered at ~12. I then repeated this with a small number of unit tests, both with and without a using block for the Context, and got the same result (different numbers; I forget what they were).
I was using the following query to count my connections:
SELECT s.session_id, s.login_name, e.connection_id,
s.last_request_end_time, s.cpu_time,
e.connect_time
FROM sys.dm_exec_sessions AS s
INNER JOIN sys.dm_exec_connections AS e
ON s.session_id = e.session_id
WHERE login_name='myuser'
ORDER BY s.login_name
Conclusion: if you call Connection.Open() with this work-around to enable the Transient Fault Handling Block, then you MUST use using blocks for all the contexts you work with; otherwise you will have problems (which, with SQL Azure, will cause your database to be throttled and ultimately taken offline for hours!).
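Concretely, that rule looks like this (MyEntities is a placeholder context name):

// Always wrap contexts in using blocks so Dispose closes the connection
// that OnContextCreated explicitly opened.
using (var context = new MyEntities())
{
    var products = context.Products.ToList();
} // Dispose runs here and closes the explicitly opened connection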
The problem with this approach is it only takes care of connection retries and not command retries.
If you use Entity Framework 6 (currently in alpha) then there is some new in-built support for transient retries with Azure SQL Database (with a little bit of configuration): http://entityframework.codeplex.com/wikipage?title=Connection%20Resiliency%20Spec
I've created a library which allows you to configure Entity Framework to retry using the Fault Handling block without needing to change every database call - generally you will only need to change your config file and possibly one or two lines of code.
This allows you to use it for Entity Framework or Linq To Sql.
https://github.com/robdmoore/ReliableDbProvider
I am building a WPF app that has a button which executes a SQL query against SQL Server (the query could take a long time to run).
I want to use TPL for doing that.
This code:
var result = Task.Factory.StartNew(() => { command.ExecuteNonQuery(); });
gives this exception:
ExecuteNonQuery requires an open and available Connection. The connection's current state is closed.
I guess this is because the query runs on a different thread that is not aware of the open connection.
I have two questions:
1. How do I make the new thread aware of this open connection?
2. After solving that, how do I keep the window from freezing during this query?
Thanks
You will have to create and open the connection for this command within the Task's body. Either that or don't close the connection outside the Task, which I assume is what you're doing here, but can't tell from the one line of code you pasted.
I would personally do it all inside the Task body. Why should the user have to wait for you to even get the connection/command set up if they don't have to? Also, there's the chance that your connection is a shared instance, and that won't work across threads.
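A minimal sketch of that (connectionString and the procedure name are placeholders):

Task dbTask = Task.Factory.StartNew(() =>
{
    // Create, open, and dispose the connection entirely inside the task body
    // so no thread-affinity or shared-instance issues arise.
    using (var conn = new SqlConnection(connectionString))      // assumed setting
    using (var command = new SqlCommand("dbo.LongQuery", conn)) // hypothetical proc
    {
        command.CommandType = CommandType.StoredProcedure;
        conn.Open();
        command.ExecuteNonQuery();
    }
});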
Once you get the DB work into a Task, it will be executed on a thread pool thread by default, which frees the WPF dispatcher thread to go back to processing UI events and prevents the "freezing". Most likely you will want to update the UI after the DB task has completed; to do that you would just add a continuation task. In order to manipulate the UI from that continuation, though, you need to make sure it's explicitly scheduled to run on the dispatcher thread, which is done by specifying a TaskScheduler for the current synchronization context while scheduling the continuation. That would look something like this:
Task backgroundDBTask = Task.Factory.StartNew(() =>
{
    // ... DB work here ...
});

backgroundDBTask.ContinueWith(t =>
{
    // ... UI update work here ...
}, TaskScheduler.FromCurrentSynchronizationContext());
The magic here is the use of the TaskScheduler::FromCurrentSynchronizationContext method which will schedule the continuation to be executed on the Dispatcher thread of the current call.
In addition to @Drew Marsh's answer, to avoid the exception:
The current SynchronizationContext may not be used as a TaskScheduler
you can first check whether a synchronization context exists:
private static TaskScheduler GetSynchronizationContext() =>
    SynchronizationContext.Current != null
        ? TaskScheduler.FromCurrentSynchronizationContext()
        : TaskScheduler.Current;
And use it instead:
Task backgroundDBTask = Task.Factory.StartNew(() =>
{
    //... DB work here ...
});

backgroundDBTask.ContinueWith(t =>
{
    //... UI update work here ...
}, GetSynchronizationContext());
I have an application that should use an application role from the database.
I'm trying to make this work with queries that are actually run using Subsonic (2).
To do this, I created my own DataProvider, which inherits from Subsonic's SqlDataProvider.
It overrides the CreateConnection function and calls sp_setapprole to set the application role after the connection is created.
This part works fine, and I'm able to get data using the application role.
The problem comes when I try to unset the application role. I couldn't find any place in the code where my provider is called after the query is done, so I tried to add my own hook by changing SubSonic code. The problem is that SubSonic uses a data reader: it loads data from the reader and then closes it.
If I unset the application role before the data reader is closed, I get an error saying: There is already an open DataReader associated with this Command which must be closed first.
If I unset the application role after the data reader is closed, I get an error saying ExecuteNonQuery requires an open and available Connection. The connection's current state is closed.
I can't seem to find a way to close the data reader without closing the connection.
Do you have to use the role for every query?
If not, you can use a SharedDbConnectionScope():
using (var scope = new SharedDbConnectionScope())
{
    // Within this using block you have a single connection that isn't
    // closed until scope.Dispose() is called (which happens automatically
    // on leaving the block), and you have access to scope.CurrentConnection.

    // Do your init stuff
    SetRole(scope.CurrentConnection);

    var product = new Product();
    product.Code = "12345";
    product.Save();

    // Revert to normal
    UnsetRole(scope.CurrentConnection);
}
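SetRole and UnsetRole aren't SubSonic methods; you'd write them yourself. A possible sketch is below. Note that sp_unsetapprole requires the cookie produced by sp_setapprole, so the implementation has to carry it between the two calls (the role name and password here are placeholders):

using System.Data;

static byte[] _appRoleCookie; // carried between SetRole and UnsetRole

static void SetRole(IDbConnection connection)
{
    using (var cmd = connection.CreateCommand())
    {
        // Ask for a cookie so the role can be unset later on this connection.
        cmd.CommandText =
            "DECLARE @cookie varbinary(8000); " +
            "EXEC sp_setapprole 'application', 'password', @fCreateCookie = true, @cookie = @cookie OUTPUT; " +
            "SELECT @cookie;";
        _appRoleCookie = (byte[])cmd.ExecuteScalar();
    }
}

static void UnsetRole(IDbConnection connection)
{
    using (var cmd = connection.CreateCommand())
    {
        cmd.CommandText = "EXEC sp_unsetapprole @cookie";
        var p = cmd.CreateParameter();
        p.ParameterName = "@cookie";
        p.Value = _appRoleCookie;
        cmd.Parameters.Add(p);
        cmd.ExecuteNonQuery();
    }
}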
The problem is that SubSonic executes its reader with CommandBehavior.CloseConnection.
If I make it not close the connection, I can unset the application role after the reader is closed.
Maybe you can subscribe to the connection's StateChange event, like this:
Connection.StateChange += new System.Data.StateChangeEventHandler(Connection_StateChange);
And then do some actions according to the new state of this connection:
if (e.CurrentState == System.Data.ConnectionState.Open)
    dbworker.ExecuteCommand("EXEC sp_setapprole application, 'password'");