I'm exploring query notifications with the SqlDependency class. Building a simple working example is easy, but I feel like I'm missing something. Once I step past a simple one-table/one-dependency example, I'm left wondering how I can figure out which dependency triggered my callback.
I'm having a bit of trouble explaining, so I included the simple example below. When AChange() is called I cannot look at the SQL inside the dependency, and I don't have a reference to the associated cache object.
So what's a boy to do?
Option 1 - create a distinct function for each object I want to track and hard-code the cache key (or relevant information) in the callback. This feels dirty and eliminates the possibility of adding new cache items without deploying new code--ewww.
Option 2 - use the dependency's Id property and a parallel tracking structure, something like the sketch below.
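For illustration, Option 2 might look something like this sketch (the dictionary and handler names are hypothetical; SqlDependency.Id is a string that uniquely identifies each dependency, and memCache is the cache from my sample below):
// Hypothetical parallel tracking structure: map each dependency's Id
// to the cache key it was created for. Needs System.Collections.Concurrent.
private static readonly ConcurrentDictionary<string, string> depToKey =
    new ConcurrentDictionary<string, string>();

private void TrackDependency(SqlDependency oDep, string cacheKey) {
    depToKey[oDep.Id] = cacheKey;
    oDep.OnChange += SharedHandler;
}

private void SharedHandler(object sender, SqlNotificationEventArgs e) {
    string cacheKey;
    if (depToKey.TryRemove(((SqlDependency)sender).Id, out cacheKey)) {
        memCache.Remove(cacheKey); // invalidate; reload on next access
    }
}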
Am I just missing something? Is this a deficiency in the SqlDependency structure? I've looked at 20 different articles on the topic and all of them seem to have the same hole. Suggestions?
Code Sample
public class DependencyCache {
    public static string cacheName = "Client1";
    public static MemoryCache memCache = new MemoryCache(cacheName);

    public DependencyCache() {
        SqlDependency.Start(connString);
    }

    private static string GetSQL() {
        return "SELECT someString FROM dbo.TestTable";
    }

    public void DoTest() {
        if (memCache["TEST_KEY"] != null) {
            Debug.WriteLine("resources found in cache");
            return;
        }
        Cache_GetData();
    }

    private void Cache_GetData() {
        SqlConnection oConn;
        SqlCommand oCmd;
        SqlDependency oDep;
        SqlDataReader oRS;
        List<string> stuff = new List<string>();
        CacheItemPolicy policy = new CacheItemPolicy();

        SqlDependency.Start(connString);
        using (oConn = new SqlConnection(connString)) {
            using (oCmd = new SqlCommand(GetSQL(), oConn)) {
                oDep = new SqlDependency(oCmd);
                oConn.Open();
                oRS = oCmd.ExecuteReader();
                while (oRS.Read()) {
                    stuff.Add(oRS.GetString(0));
                }
                oDep.OnChange += new OnChangeEventHandler(AChange);
            }
        }
        memCache.Set("TEST_KEY", stuff, policy);
    }

    private void AChange(object sender, SqlNotificationEventArgs e) {
        string msg = "Dependency Change \nINFO: {0} : SOURCE {1} :TYPE: {2}";
        Debug.WriteLine(String.Format(msg, e.Info, e.Source, e.Type));
        // If multiple queries use this as a callback how can I figure
        // out WHAT QUERY TRIGGERED the change?
        // I can't figure out how to tell multiple dependency objects apart.
        ((SqlDependency)sender).OnChange -= AChange;
        Cache_GetData(); // reload data
    }
}
First and foremost: the handler has to be set up before the command is executed:
oDep = new SqlDependency(oCmd);
oConn.Open();
oDep.OnChange += new OnChangeEventHandler(AChange);
oRS = oCmd.ExecuteReader();
while (oRS.Read()) {
    stuff.Add(oRS.GetString(0));
}
Otherwise you have a window when the notification may be lost and your callback never invoked.
Now about your question: you should use a separate callback for each query. While this may seem cumbersome, it is actually trivial using a lambda. Something like the following:
oDep = new SqlDependency(oCmd);
oConn.Open();
oDep.OnChange += (sender, e) =>
{
    string msg = "Dependency Change \nINFO: {0} : SOURCE {1} :TYPE: {2}";
    Debug.WriteLine(String.Format(msg, e.Info, e.Source, e.Type));
    // The command that triggered the notification is captured in the
    // closure: it is oCmd.
    //
    // You can now call a handler passing in the relevant info:
    //
    Reload_Data(oCmd, ...);
};
oRS = oCmd.ExecuteReader();
...
And remember to always check the notification source, info, and type. Otherwise you risk spinning ad nauseam when you are notified for reasons other than a data change, such as an invalid query. As a side comment, I would add that a good cache design does not refresh the cache on invalidation; it simply invalidates the cached item and lets the next request fetch a fresh one. With your 'proactive' approach you are refreshing cached items even when not needed, possibly several times before they are ever accessed. I left error handling and proper thread synchronization (both required) out of the example.
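As a sketch, a guard at the top of the callback might look like this (invalidate-only, per the comment above; adjust to your own policy):
private void AChange(object sender, SqlNotificationEventArgs e) {
    // Only genuine data changes should invalidate the cache.
    if (e.Type != SqlNotificationType.Change || e.Source != SqlNotificationSource.Data) {
        // Invalid query, subscription failure, service restart, etc.
        // Resubscribing blindly here can make the handler spin forever.
        Debug.WriteLine("Non-data notification: " + e.Info);
        return;
    }
    memCache.Remove("TEST_KEY"); // invalidate only; the next DoTest() reloads
}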
Finally, have a look at LinqtoCache which does pretty much what you're trying to do, but for LINQ queries.
I have an object which, although it has a text representation (i.e. could be stored in a string of about 1000 printable characters), is expensive to generate. I also have a tree control which shows "summaries" of the objects. I want to drag/drop these objects not only within my own application, but also to other applications that accept CF_TEXT or CF_UNICODETEXT, at which point the textual representation is inserted into the drop target.
I've been thinking of delaying the "rendering" of the text representation of my object so that it only takes place when the object is dropped or pasted. However, it seems that WinForms eagerly calls the GetData() method at the start of the drag, which causes a painful multi-second delay.
Is there any way to ensure that GetData() happens only at drop time? Alternatively, what is the right mechanism for implementing this deferred drop mechanism in a WinForms program?
After some research, I was able to figure out how to do this without having to implement the COM interface IDataObject (with all of its FORMATETC gunk). I thought it might be of interest to others in the same quandary, so I've written up my solution. If it can be done more cleverly, I'm all eyes/ears!
The System.Windows.Forms.DataObject class has this constructor:
public DataObject(string format, object data)
I was calling it like this:
string expensive = GenerateStringVerySlowly();
var dataObject = new DataObject(
DataFormats.UnicodeText,
expensive);
DoDragDrop(dataObject, DragDropEffects.Copy);
The code above will put the string data into an HGLOBAL during the copy operation. However, you can also call the constructor like this:
string expensive = GenerateStringVerySlowly();
var dataObject = new DataObject(
DataFormats.UnicodeText,
new MemoryStream(Encoding.Unicode.GetBytes(expensive)));
DoDragDrop(dataObject, DragDropEffects.Copy);
Rather than copying the data via an HGLOBAL, this latter call has the nice effect of copying the data via a (COM) IStream. Apparently some magic is going on in the .NET interop layer that handles mapping between the COM IStream and the .NET System.IO.Stream.
All I had to do now was to write a class that deferred the creation of the stream until the very last minute (Lazy object pattern), when the drop target starts calling Length, Read etc. It looks like this: (parts edited for brevity)
public class DeferredStream : Stream
{
    private Func<string> generator;
    private Stream stm;

    public DeferredStream(Func<string> expensiveGenerator)
    {
        this.generator = expensiveGenerator;
    }

    private Stream EnsureStream()
    {
        if (stm == null)
            stm = new MemoryStream(Encoding.Unicode.GetBytes(generator()));
        return stm;
    }

    public override long Length
    {
        get { return EnsureStream().Length; }
    }

    public override long Position
    {
        get { return EnsureStream().Position; }
        set { EnsureStream().Position = value; }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        return EnsureStream().Read(buffer, offset, count);
    }

    // Remaining Stream methods elided for brevity.
}
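For completeness, the overrides elided above are mostly boilerplate delegation; under the same assumptions (a read-only, seekable stream) they might look like this:
public override bool CanRead { get { return true; } }
public override bool CanSeek { get { return true; } }
public override bool CanWrite { get { return false; } }

public override void Flush() { }

public override long Seek(long offset, SeekOrigin origin)
{
    // Delegate seeking to the lazily created inner stream.
    return EnsureStream().Seek(offset, origin);
}

public override void SetLength(long value)
{
    throw new NotSupportedException();
}

public override void Write(byte[] buffer, int offset, int count)
{
    // The stream only feeds data to the drop target; it is never written to.
    throw new NotSupportedException();
}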
Note that the expensive data is only generated when the EnsureStream method is called for the first time. This doesn't happen until the drop target starts wanting to suck down the data in the IStream. Finally, I changed the calling code to:
var dataObject = new DataObject(
DataFormats.UnicodeText,
new DeferredStream(GenerateStringVerySlowly));
DoDragDrop(dataObject, DragDropEffects.Copy);
This was exactly what I needed to make this work. However, I am relying on the good behaviour of the drop target here. A misbehaving drop target that eagerly calls, say, the Read method will cause the expensive operation to happen earlier.
In the earlier versions of Entity Framework, we were able to reach the Context out of ObjectQuery in order to read Parameters, Connection, etc. as below:
var query = (ObjectQuery<T>)source;
cmd.Connection = (SqlConnection)((EntityConnection)query.Context.Connection).StoreConnection;
cmd.Parameters.AddRange(
    query.Parameters.Select(x => new SqlParameter(
        x.Name, x.Value ?? DBNull.Value)
    ).ToArray()
);
When I look at the DbSet<T> object, I am unable to find any equivalent of this. My purpose here is to create extensions which will manipulate the query and get the result out of it.
Here is an instance: http://philsversion.com/2011/09/07/async-entity-framework-queries
Or should I write the extension for the DbContext class and work with the Set method?
Any idea?
Edit
Here is what I have so far. It's a basic implementation, certainly not ready for production. Any suggestions on this?
public static async Task<IEnumerable<T>> QueryAsync<T>(this DbContext @this, System.Linq.Expressions.Expression<Func<T, bool>> predicate = null)
    where T : class {
    var query = (predicate != null) ? @this.Set<T>().Where(predicate) : @this.Set<T>();
    var cmd = new SqlCommand();
    cmd.Connection = (SqlConnection)(@this.Database.Connection);
    cmd.CommandText = query.ToString();
    if (cmd.Connection.State == System.Data.ConnectionState.Closed) {
        cmd.Connection.ConnectionString = new SqlConnectionStringBuilder(cmd.Connection.ConnectionString) {
            AsynchronousProcessing = true
        }.ToString();
        cmd.Connection.Open();
    }
    cmd.Disposed += (o, e) => {
        cmd.Clone();
    };
    var source = ((IObjectContextAdapter)@this).ObjectContext.Translate<T>(
        await cmd.ExecuteReaderAsync()
    );
    return source;
}
This is a nice workaround, although I don't think you can make it much more generally applicable than what you already have.
A few things to keep in mind:
- Depending on the EF query, e.g. whether you are using Include or not, the columns returned in the reader might not match the properties of the type T you are passing.
- Depending on whether you have inheritance in your model, the T that you pass to Translate may not always be the right thing to materialize for every row returned.
- After the task returned by ExecuteReaderAsync completes, you still have to retrieve each row, which, depending on the execution plan for the query and the latency you are getting from the server, is potentially also a blocking operation (see the sketch below).
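For the last point, here is a sketch of draining the reader without blocking on row retrieval. It assumes .NET 4.5's DbDataReader.ReadAsync and gives up Translate-based materialization; the materialize delegate is purely illustrative:
// Hypothetical helper: retrieve rows asynchronously instead of
// blocking while the reader is enumerated. 'materialize' maps a
// row to T and must be supplied by the caller.
private static async Task<List<T>> ReadAllAsync<T>(
    System.Data.Common.DbCommand cmd,
    Func<System.Data.Common.DbDataReader, T> materialize)
{
    var results = new List<T>();
    using (var reader = await cmd.ExecuteReaderAsync())
    {
        while (await reader.ReadAsync())
        {
            results.Add(materialize(reader));
        }
    }
    return results;
}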
Async support is not coming to EF in 5.0, but we worked with other teams to make sure we have all the necessary building blocks included in .NET 4.5, and the feature is pretty high on our priority list. I encourage you to vote for it on our UserVoice site.
For the love of heaven and earth I really wish someone could help me out with this issue. It seems everyone has something to say about EF but nothing about Linq-to-SQL.
I am trying to grab some data from my table via a stored procedure, believe me, that's all.
- Added the Linq-to-SQL model (LAMP.dbml).
- Added the stored procedure (getAffectedParcel) from the Server Explorer; getAffectedParcel takes two strings as parameters.
- Built the application.
- Added a domain service class (LAMPService).
- Selected LAMPDataContext as the data context class (normally I would tick "generate metadata", but since I am not working with tables it's not enabled for ticking).
- Added the following function to LAMPService.cs:
public IEnumerable<getAffectedParcelResult> GetTheAffectedParcels(String v, String vf)
{
    return this.DataContext.getAffectedParcel(v, vf).AsEnumerable();
}
Added the following code to a Silverlight page in an attempt to consume the stored procedure:
LAMPContext db = new LAMPContext();
try
{
    var q = db.GetTheAffectedParcels("18606004005", "").Value;
    foreach (getAffectedParcelResult GAP in q)
    {
        MessageBox.Show(GAP.Owner);
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message.ToString());
}
Built and ran the application. An error occurred stating:
Object reference not set to an instance of an object.
I have tried ~1,000,000 ways to see if this thing would work, but to no avail. Please don't tell me to use Entity Framework; I want to use Linq-to-SQL. Can someone (anyone) help me out here?
//houdini
Calling a stored procedure from the Silverlight client happens in the Async world. Let's consider an example from the AdventureWorks database...
Here's what the Domain Service method looks like. It is calling the EF on a stored procedure in the database called 'BillOfMaterials'.
public IQueryable<BillOfMaterial> GetBillOfMaterials()
{
    return this.ObjectContext.BillOfMaterials;
}
Back on the client side, here is the code for setting up the call...
public GetSp()
{
    InitializeComponent();
    DomainService1 ds1 = new DomainService1();
    var lo = ds1.Load(ds1.GetBillOfMaterialsQuery());
    lo.Completed += LoCompleted;
}
First, the Domain Service is created, and then it is used to load the results of the stored procedure. In this particular case, the result of this is an instance of 'LoadOperation'. These things are async, so the LoadOperation needs to have a callback for when it is finished. The callback code looks like this...
public ObservableCollection<BillOfMaterial> MyList { get; set; }

void LoCompleted(object sender, EventArgs e)
{
    LoadOperation lo = sender as LoadOperation;
    if (lo != null)
    {
        MyList = new ObservableCollection<BillOfMaterial>();
        foreach (BillOfMaterial bi in lo.AllEntities)
        {
            MyList.Add(bi);
        }
        dataGrid1.ItemsSource = MyList;
    }
}
In this method, the 'sender' is dereferenced into the LoadOperation instance, and then all the goodies from the database can be accessed. In this trivial example, a list is built and passed to the DataGrid as the ItemsSource. It's good for understanding, but you would probably do something else in practice.
That should solve your problem. :)
The best advice I can give on Silverlight and RIA is never do ANYTHING on your own until you have tried it in AdventureWorks. You will just waste your time and beat your head against the wall.
Firstly, it seems like your DomainService code is written for Invoke() rather than Query(). You should use Query as it enables you to update data back to the server.
Solution: you should add a [Query] attribute to GetTheAffectedParcels on the domain service.
[Query]
public IQueryable<Parcel> GetTheAffectedParcels(string ParcelNumber, string LotNumber)
{
    // etc.
}
Secondly, RIA Services needs to know which property is the primary key on the Parcel class.
Solution: apply a MetadataType attribute to the Parcel class, which lets you add metadata to the Parcel class indirectly. Since the class is generated by Linq2Sql, you couldn't add annotations directly to ParcelId - they'd get wiped away.
[MetadataType(typeof(ParcelMetadata))]
public partial class Parcel
{
}

public class ParcelMetadata
{
    [System.ComponentModel.DataAnnotations.Key]
    public int ParcelId { get; set; }
}
Thirdly, modify your client code. Instead of calling the method directly, try this on the Silverlight client:
LAMPContext db = new LAMPContext();
try
{
    var q = db.GetTheAffectedParcelsQuery("18606004005", "");
    db.Load(q, (op) =>
    {
        if (op.HasError)
        {
            label1.Text = op.Error.Message;
            op.MarkErrorAsHandled();
        }
        else
        {
            foreach (var parcel in op.Entities)
            {
                // your code here
            }
        }
    });
}
catch (Exception ex)
{
    label1.Text = ex.Message;
}
Much thanks to Chui and Garry who practically kicked me in the right direction :) [thanks guys...ouch]
This is the procedure I finally undertook:
After adding the data model (LINQ2SQL) and the domain service, I created a partial class [as suggested by Chui] and included the following metadata info therein:
[MetadataTypeAttribute(typeof(getAffectedParcelResult.getAffectedParcelResultMetadata))]
public partial class getAffectedParcelResult
{
    internal sealed class getAffectedParcelResultMetadata
    {
        [Key]
        public string PENumber { get; set; }
    }
}
Then I adjusted the domain service to include the following:
[Query]
public IQueryable<getAffectedParcelResult> GetTheAffectedParcels(string v, string vf)
{
    // IEnumerable<getAffectedParcelResult> ap = this.DataContext.getAffectedParcel(v, vf);
    return this.DataContext.getAffectedParcel(v, vf).AsQueryable();
}
Then I built the app, after which the getAffectedParcelResult stored procedure appeared in the Data Sources panel. I wanted to access this via code, however, so I accessed it in the Silverlight [.xaml] page via the following:
LAMPContext db = new LAMPContext();
var q = db.GetTheAffectedParcelsQuery("18606004005", "");
db.Load(q, (op) =>
{
    if (op.HasError)
    {
        MessageBox.Show(op.Error.Message);
        op.MarkErrorAsHandled();
    }
    else
    {
        foreach (getAffectedParcelResult gap in op.Entities)
        {
            ownerTextBlock.Text = gap.Owner.ToString();
        }
    }
}, false);
This worked nicely. The thing is, my stored procedure returns a complex type, so to speak. As such, it was not possible to map it to any particular entity.
Oh and by the way this article helped out as well:
http://onmick.com/Home/tabid/154/articleType/ArticleView/articleId/2/Pulling-Data-from-Stored-Procedures-in-WCF-RIA-Services-for-Silverlight.aspx
There is tons of information on this, but even after reading for hours and hours I can't seem to get this to work the way I want.
I'm trying to update a User object by passing in a User object and generically comparing it against the User object I pull out of the database. I always end up getting a NotSupportedException when using this method:
An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported.
Here is how I am trying to do it:
public void SaveUser(User User)
{
    using (DataContext dataContext = new DataContext(WebConfigurationManager.ConnectionStrings["database"].ConnectionString))
    {
        // New user
        if (User.UserID == 0)
        {
            dataContext.Users.InsertOnSubmit(User);
        }
        // Existing user
        else
        {
            User dbUser = dataContext.Users.Single(u => u.UserID.Equals(User.UserID));
            Type t = dbUser.GetType();
            foreach (PropertyInfo p in t.GetProperties())
            {
                if (p.CanWrite && p.GetValue(dbUser, null) != p.GetValue(User, null))
                {
                    p.SetValue(dbUser, p.GetValue(User, null), null);
                }
            }
            //dataContext.Refresh(RefreshMode.KeepCurrentValues, dbUser);
        }
        dataContext.SubmitChanges();
    }
}
I tried the commented-out line uncommented too, but it was no help.
If I comment out the foreach() loop and add a line like dbUser.UserName = "Cheese"; it will update the user's name in the database fine. That leads me to believe it is something about how the foreach() loop changes the dbUser object that causes this to fail.
When I debug the dbUser object, it appears to correctly acquire all the changes from the User object that was passed as an argument.
I also did some reading on optimistic concurrency and added a timestamp column to the table, but that didn't seem to have any effect either.
What exactly am I doing wrong here?
How can I get this to generically detect what has changed and correctly persist the changes to the database?
My guess is there's a foreign key relation that you are trying to copy over that was not initially loaded (because of lazy loading). During the copying it attempts to load, but the DataContext has already been disposed.
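If that's the cause, one way to test it is to skip association properties in the reflection loop. Here is a sketch reusing the t/dbUser/User variables from your SaveUser method, and assuming the LINQ-to-SQL designer marked relations with AssociationAttribute:
// Copy only scalar columns; skipping associations avoids triggering
// a lazy load against an already-disposed context.
foreach (PropertyInfo p in t.GetProperties())
{
    bool isAssociation = p.GetCustomAttributes(
        typeof(System.Data.Linq.Mapping.AssociationAttribute), false).Length > 0;
    if (p.CanWrite && !isAssociation)
    {
        p.SetValue(dbUser, p.GetValue(User, null), null);
    }
}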
I've been working on a similar problem. I ended up using AutoMapper to handle copying the properties for me. I have configured AutoMapper to ignore the primary key field as well as any relations. Something like:
public void Update(User user)
{
    using (var db = new DataContext(...))
    {
        var userFromDb = db.Users.Where(x => x.Id == user.Id).Single();
        AutoMapper.Mapper.Map(user, userFromDb);
        db.SubmitChanges();
    }
}
My AutoMapper configuration is something like:
AutoMapper.Mapper.CreateMap<User, User>()
    .ForMember(dest => dest.Id, opt => opt.Ignore())
    .ForMember(dest => dest.SomeRelation, opt => opt.Ignore());
You can find AutoMapper here: http://automapper.codeplex.com/
I keep my repo pretty lean; its only job is to interact with the database. I build a service layer on top of the repo that does a little more work.
public class EventRepository : IEventRepository
{
    private DBDataContext dc;

    public EventRepository()
    {
        dc = new DBDataContext();
    }

    public void Create(Event @event)
    {
        dc.Events.InsertOnSubmit(@event);
    }

    public System.Linq.IQueryable<Event> Read()
    {
        var events = from e in dc.Events
                     select e;
        return events;
    }

    public void SubmitChanges()
    {
        dc.SubmitChanges();
    }
}
Then the corresponding call from the service layer looks like this
public void AddEvent(Event @event)
{
    _EventRepository.Create(@event);
}

public void SubmitChanges()
{
    _EventRepository.SubmitChanges();
}
And I call it from my controller.
// AutoMapper will allow us to map the ViewModel with the DomainModel
Mapper.CreateMap<Domain.ViewModels.EventsAddViewModel, Domain.Event>();
var @event = Mapper.Map<Domain.ViewModels.EventsAddViewModel, Domain.Event>(eventToAdd);

// Add the event to the database
EventService.AddEvent(@event);
EventService.SubmitChanges();
I have an MVVM pattern test application that is tossing a lot of
System.Windows.Data Error: 17 : Cannot get 'Item[]' value (type 'ChuteGroup') from 'Groups' (type 'ChuteGroupsModel'). BindingExpression:Path=Groups[0]; DataItem='MainViewModel' (HashCode=41802290); target element is 'ChuteView' (Name=''); target property is 'DataContext' (type 'Object') ArgumentOutOfRangeException:'System.ArgumentOutOfRangeException: Specified argument was out of the range of valid values. Parameter name: index'
This happens when I enter my "onRefresh" routine in the viewmodel. I have an observable collection called "Current", and the first thing I do in the refresh routine is clear its entries. I then get a slew of these data error 17 messages because, I think, the bindings are trying to update in the background and there isn't anything in the collection until I re-fill and re-create each entry in the observable collection.
Is there a better way of doing this? Runtime performance doesn't seem to be affected, but I don't like errors in my output window. I found that if I didn't clear the collection, it just doubled in size each time the viewmodel refreshed itself. Since this collection is used in conjunction with 54 UI elements, which are bound by index, the collection can't double in size or everything won't point to the correct UI element.
private void FetchData()
{
    ChuteGroupsModel.isDuringRefresh = true;
    DataSet sqldata = new DataSet();
    SqlConnection conn = new SqlConnection("Data Source=(local);Initial Catalog=ScratchPaper;User ID=somecacct;Password=somepassword;Connect Timeout=5");
    SqlCommand cmd = new SqlCommand("Select chuteGroup, chuteGroupDef, TotalChutes, UpPackWave, UpColorId, UpPackWaveTS, DownPackWave, DownColorId, DownPackWaveTS from ChuteGroups Order by chuteGroup asc", conn);
    SqlDataAdapter da = new SqlDataAdapter(cmd);
    try
    {
        da.Fill(sqldata);
    }
    catch (Exception Ex) { MessageBox.Show(Ex.ToString()); }

    //DataSet sqldata = this.DataLayer.getDataSet("Select * from AvailableColors Order by ID asc", CommandType.Text, null);
    foreach (DataRow row in sqldata.Tables[0].Rows)
    {
        ChuteGroup group = new ChuteGroup((int)row.ItemArray[0], (string)row.ItemArray[1], (int)row.ItemArray[2],
            (string)row.ItemArray[3], (string)row.ItemArray[4], (DateTime)row.ItemArray[5],
            (string)row.ItemArray[6], (string)row.ItemArray[7], (DateTime)row.ItemArray[8]);
        Add(group);
    }
    ChuteGroupsModel.isDuringRefresh = false;
}
private void onRefresh(object sender, System.EventArgs e)
{
    try
    {
        if (ChuteGroupsModel.isDuringRefresh)
        {
            return;
        }
        Refresh();
    }
    catch (Exception ex)
    {
        System.Diagnostics.Debug.WriteLine(ex.Message.ToString());
    }
}

public void Refresh()
{
    Current.Clear(); // Current is a static reference to my collection
    FetchData();
}
You are right in assuming that the binding errors are due to your collection being cleared. As you aren't experiencing any performance penalties, I wouldn't worry about it too much.
If you are really annoyed by them, you can create a new observable collection type that allows you to set the new values in one operation. If you crack open the ObservableCollection class in Reflector, you can copy the implementation and add one new method like so:
public class ObservableList<T> : Collection<T>, INotifyCollectionChanged, INotifyPropertyChanged
{
    // ObservableCollection implementation here
    ...

    public void SetItems(IEnumerable<T> items)
    {
        this.CheckReentrancy();
        base.ClearItems();
        int i = 0;
        foreach (var item in items)
        {
            base.InsertItem(i, item);
            ++i;
        }
        this.OnPropertyChanged("Count");
        this.OnPropertyChanged("Item[]");
        this.OnCollectionReset();
    }
}
Then, instead of clearing your data and adding it one row at a time, you'd call the SetItems method.
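For example (a sketch, assuming Current is now an ObservableList<ChuteGroup> and freshGroups holds the newly fetched rows):
// One Reset notification instead of a Clear followed by N Adds,
// so the bindings never observe an empty collection.
Current.SetItems(freshGroups);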
I know it's been a while since this question was asked, but one method that has solved this type of issue for me is using RemoveAt(0) (and/or whatever locations you need cleared) and subsequently adding your data to the collection. There seems to be some sort of timing issue with Clear(): for a few brief instants, calls from the bindings return an empty collection, hence the output messages. This may seem a bit detailed and perhaps brutish compared with a more generic clear, but it does remove the errors and noticeably improves performance.
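For example, a sketch of an in-place refresh under that approach (Current and newGroups stand for the collection and the freshly fetched data from the question; the shrinking case is omitted):
// Replace each entry in place so the bindings never observe an
// empty collection; index-bound UI elements stay valid throughout.
for (int i = 0; i < newGroups.Count; i++)
{
    if (i < Current.Count)
    {
        Current.RemoveAt(i);
        Current.Insert(i, newGroups[i]);
    }
    else
    {
        Current.Add(newGroups[i]);
    }
}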