Dapper.Contrib: How to get a row by filtering on column other than ID? - dapper

My class is like below:
[Table("tblUser")]
public class User
{
    [Key]
    public int Id { get; set; }
    public string Title { get; set; }
}
Using Dapper.Contrib, is there a way to get the User record by Title instead of Id?
The query below works, but I want to filter on the Title column, which is not a key.
await connection.GetAsync<User>(Id);

Looking at the documentation, Dapper.Contrib does not support retrieving records by criteria other than the key. In other words, its current implementation does not support any kind of predicate system.
Using GetAll, you can filter the results further with LINQ. But remember that this filtering is not executed on the RDBMS; it runs on the application side, in memory. That means the entire table is loaded first and only then filtered.
Personally, I would use plain Dapper (bypassing Contrib) for this specific scenario. The rest of the project can still use Contrib.
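For example, a parameterized plain-Dapper query on Title (the table and column names come from the attributes above; "blah" is a placeholder value) might look like:

```csharp
// Plain Dapper, no Contrib: filter on a non-key column.
// QueryFirstOrDefaultAsync returns null when no row matches.
var user = await connection.QueryFirstOrDefaultAsync<User>(
    "SELECT Id, Title FROM tblUser WHERE Title = @Title",
    new { Title = "blah" });
```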

Related

best event sourcing db strategy

I want to setup a small event sourcing lib.
I read a few tutorials online, everything understood so far.
The only problem is that these tutorials use two different database strategies, without any explanation of why they chose the one they did.
So I want to ask for your opinion, and, importantly, why you prefer the solution you chose.
Solution 1 is the db structure where you create one table for each event type.
Solution 2 is the db structure where you create only one generic table and save the events as a serialized string in one column.
In both cases, I'm not sure how they handle event changes; maybe they create a whole new event type.
Kind regards
I built my own event sourcing lib and opted for option 2. Here's why:
You query the event stream by aggregate id, not by event type.
Reproducing the events in order would be a pain if they were spread across different tables.
It would make upgrading events a pain as well.
There is an argument for storing events per aggregate, but that depends on the requirements of the project.
I do have some posts about how event streams are used that you may find helpful.
6 Code Smells With Your CQRS Events and How to Avoid Them
Aggregate Root – How to Build One for CQRS and Event Sourcing
How to Upgrade CQRS Events Without Busting Your Event Stream
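To make option 2 concrete, here is a minimal sketch of what a single generic event table and its access pattern could look like; all names and the serialization choice are assumptions, not taken from any particular library:

```csharp
using System;
using System.Collections.Generic;

// One row per event, one table for all event types (option 2).
public class StoredEvent
{
    public Guid AggregateId { get; set; }   // which stream the event belongs to
    public long Sequence { get; set; }      // ordering within that stream
    public string EventType { get; set; }   // e.g. "UserCreatedV2"
    public string Payload { get; set; }     // event body serialized to a string
}

public interface IEventStore
{
    // Append-only write to one aggregate's stream.
    void Append(Guid aggregateId, IEnumerable<StoredEvent> events);

    // Replay: all events for an aggregate, ordered by Sequence,
    // regardless of type -- the point of the single generic table.
    IReadOnlyList<StoredEvent> Load(Guid aggregateId);
}
```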
Solution 2 is the db structure where you create only one generic table, and save the events as a serialized string in one column
This is by far the best approach, as replaying events is simpler.
Now my two cents on event sourcing: it is a great pattern, but you should be careful, because not everything is as simple as it seems. In a system I worked on, we saved the stream of events per aggregate, but we still kept a set of normalized tables, because we could not accept that getting the latest state of an object required replaying all of its events (snapshots help, but are not a perfect solution).
So yes, event sourcing is a fine pattern. It gives you complete versioning of your entities and a full audit log, and it should be used for exactly that, not as a replacement for a set of normalized tables. But this is just my two cents.
I think the best solution is to go with #2. You can even save your current state together with the related event at the same time if you use a transactional db like MySQL.
I really don't like or recommend solution #1.
If your concern with #1 is event versioning/upgrading, then declare a new class for each change. Don't be too lazy, or too obsessed with reuse. Let the subscribers know about changes; give them the event version.
If your concern with #1 is querying/interpreting events, then you can easily push your events to a NoSQL db or an event store later (from the original db).
Also, the pattern I use for my event sourcing lib looks something like this:
public interface IEventModel { }

public interface IUserCreated : IEventModel
{
}

public class UserCreatedV1 : IUserCreated
{
    public string Email { get; set; }
    public string Password { get; set; }
}

public class UserCreatedV2 : IUserCreated
{
    // Full name added to user creation. Wrt issue: OA-143
    public string Email { get; set; }
    public string Password { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class EventRecord<T> where T : IEventModel
{
    public string SessionId { get; set; }     // Can be set in emitter.
    public string RequestId { get; set; }     // Can be set in emitter.
    public DateTime CreatedDate { get; set; } // Can be set in emitter.
    public string EventName { get; set; }     // Extracted from class or interface name.
    public string EventVersion { get; set; }  // Extracted from class name.
    public T EventModel { get; set; }         // Can be set in emitter.
}
So: make event versioning and upgrading explicit, both in the domain and in the codebase. Implement handling of new events in subscribers before deploying the origin of those events. And, if not required, don't allow external subscribers to consume domain events directly; put an integration layer or something similar in between.
I hope these thoughts are useful to you.
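One way to keep that versioning explicit is an upcaster that converts old events to the current shape at read time; a minimal sketch, assuming the UserCreatedV1/V2 classes above (the empty-name fallback is an illustrative choice, not a rule):

```csharp
// Hypothetical upcaster: old UserCreatedV1 events read from the store
// are upgraded to UserCreatedV2 before subscribers handle them.
public static class UserCreatedUpcaster
{
    public static UserCreatedV2 Upgrade(UserCreatedV1 old)
    {
        return new UserCreatedV2
        {
            Email = old.Email,
            Password = old.Password,
            FirstName = string.Empty, // unknown for pre-V2 events
            LastName = string.Empty
        };
    }
}
```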
I read about an event-sourcing approach that consists of having two tables: aggregate and event. Based on your use case, you either:
a. create a record in the aggregate table, generating an ID, version = 0 and an event type, and create an event in the event table; or
b. retrieve events from the aggregate table by ID or event type, apply the business rules, then update the aggregate table (version and event type) and create an event in the event table.
Although this approach updates some fields in the aggregate table, it leaves the event table append-only and improves performance, since you always have the latest version of an aggregate in the aggregate table.
I would go with #2, and if you really want an efficient way to search by event type, just add an index on that column.
Here are the two strategies for accessing the data about a subject in a case like this: 1) current state and 2) event sequencing.
With current state, we process the events but keep only the last state of the subject.
With event sequencing, we keep the events and rebuild the current state by reprocessing the events every time we need the state.
Event sequencing is more reliable, as we can trace everything that happened to produce the current state, but it is definitely not efficient. Common practice is to also keep intermediate states (snapshots), not only the last one, to avoid reprocessing all the events all the time. That way we get both reliability and performance.
Cryptocurrencies use event sequencing with local snapshots; "local" because blockchains are distributed and the data is replicated.
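The snapshot-plus-tail replay described above can be sketched like this; the store interfaces and the State/Event types are hypothetical, only to show the shape of the technique:

```csharp
using System;
using System.Collections.Generic;

public class Event { /* serialized domain event */ }

public class State
{
    // Applying an event yields the next state (left as a stub here).
    public State Apply(Event e) { return this; }
}

public interface ISnapshotStore
{
    // Latest snapshot plus the sequence number it covers.
    (State Snapshot, long Sequence) LoadLatest(Guid aggregateId);
}

public interface IEventStore
{
    // Events strictly after the given sequence, in stream order.
    IEnumerable<Event> LoadAfter(Guid aggregateId, long afterSequence);
}

public static class Rehydrator
{
    // Rebuild current state from the snapshot plus the event tail,
    // instead of reprocessing the whole stream every time.
    public static State Rehydrate(Guid id, ISnapshotStore snapshots, IEventStore events)
    {
        var (state, seq) = snapshots.LoadLatest(id);
        foreach (var e in events.LoadAfter(id, seq))
            state = state.Apply(e);
        return state;
    }
}
```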

VB6 - Populate User Defined Type Array from Stored Procedure then Find Item in Array

I am coming from more of a .NET background and need to make some changes to a very old VB6 application.
The .NET equivalent of what I'm trying to do now in VB6 is, define a (model) class with 3 properties
public class MyClass
{
    public string Ref { get; set; }
    public string OldNumber { get; set; }
    public string NewNumber { get; set; }
}
In .NET I would then call a stored procedure to return a set of results (there could be a few thousand records) and assign them to, for example, an instance of List<MyClass>.
I could then, whenever I need to, attempt to find an item within this List, where the 'Ref' property is 'blah', and use this item/its other properties (OldNumber and NewNumber).
However, in VB6, I don't know how this same process is best achieved. Can anyone please help?
If you are using ADO, you can cache results by querying into a static-cursor, client-side Recordset and then disconnecting it.
You can then use Sort, Find, Filter, etc. and move through the rows as needed. You can even speed up searches by building a local index within the Recordset, after opening and disconnecting it, using the Field object's Optimize dynamic property. See:
Optimize Property-Dynamic (ADO)
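A minimal VB6 sketch of that approach (the connection string, stored procedure name, and Ref/OldNumber/NewNumber fields are placeholders matching the question's model):

```vb
' Query into a client-side static Recordset, disconnect it, then search it.
Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset

Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI"

Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient
rs.Open "EXEC dbo.GetNumbers", cn, adOpenStatic, adLockBatchOptimistic

Set rs.ActiveConnection = Nothing   ' Disconnect; the rows stay cached locally.
cn.Close

' Optionally build a local index on Ref to speed up Find/Filter.
rs.Fields("Ref").Properties("Optimize") = True

rs.Find "Ref = 'blah'"
If Not rs.EOF Then
    Debug.Print rs!OldNumber, rs!NewNumber
End If
```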

Which nosql database for heterogeneous records?

I'm looking at different options for storing log entries for easier querying/reporting.
Currently I write scripts that parse the logs and find the data, but the data is in ever more demand, so it's becoming worth it to put the log data in a database.
Log entries are composed of key-value pairs, such as {"timestamp":"2012-04-24 12:34:56.789", "Msg":"OK"} (simplified example).
I'm sure that eventually the log format will be extended to, say, {"timestamp":"2012-04-24 12:34:56.789", "Msg":"OK", "Hostname":"Bubba"}, which means that the "schema" or "document definition" will need to change. Also, we're a Windows + .NET shop.
Hence, I was primarily looking for some NoSQL engine and found RavenDB attractive to use from .NET.
However, I have a hard time finding information about how it, and other NoSQL databases, work with heterogeneous records.
What would be a good fit in your opinion?
With RavenDB you can just store the different types of docs, and it will handle the "changes" in schema. Because it is in fact "schema-free", you can write indexes that index only the fields that are there. See this blog post for some extra info; it talks about migrations, but the same applies here.
The dynamic fields option will also help you here. Given a doc with arbitrary properties:
public class LogEntry
{
    public string Id { get; set; }
    public List<Attribute> Attributes { get; set; }
}

public class Attribute
{
    public string Name { get; set; }
    public string Value { get; set; }
}
You can write queries like this:
var logs = session.Advanced.LuceneQuery<LogEntry>("LogEntry/ByAttribute")
    .WhereEquals("Msg", "OK")
    .ToList();
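As a sketch of what the "LogEntry/ByAttribute" index could look like, using RavenDB's dynamic fields support (CreateField); the exact CreateField overloads vary by RavenDB version, so treat this as an assumption to verify against your client library:

```csharp
using System.Linq;
using Raven.Client.Indexes;

// Each attribute name becomes a queryable index field, so new keys in
// the log format require no index or schema change.
public class LogEntry_ByAttribute : AbstractIndexCreationTask<LogEntry>
{
    public LogEntry_ByAttribute()
    {
        Map = entries => from entry in entries
                         select new
                         {
                             _ = entry.Attributes.Select(a => CreateField(a.Name, a.Value))
                         };
    }
}
```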

WCF RIA Services / Linq-to-SQL: include property from foreign tables

Say you have the following related tables (Stores -> Categories -> Products)
Stores
Categories
Products
And I want to create a grid to edit Products. This is straightforward with RIA Services. But what if I also want to show StoreName from Stores and CategoryName from Categories in my Products list? The two extra columns should be readonly.
How can this be implemented?
Update: I'm trying to do this in its simplest form. That is, no ViewModel, only drag'n'drop; code (if any) will go in the code-behind. I'm using Linq2Sql and returning the default implementation for the GetProducts query.
Regards
Larsi
How do you have this set up? Are you binding to a ViewModel or just using the code-behind? Is the web service sending back a list of Product LINQ objects, or are you doing something else?
There are a variety of options, but it really depends on what you're trying to do.
The simplest way is to annotate your metadata file for the products and let the grid generate the columns for you.
For instance, your tables will probably look something like this:
Product
    int Id;
    string ProductName;
    int CategoryId;

Category
    int Id;
    string CategoryName;
    int StoreId;

Store
    int Id;
    string StoreName;
Now, when you create your service, you can include the 3 tables/entities from your domain model and have it generate the metadata file for you. In that file, annotate the objects correctly like so:
internal sealed class ProductMetadata
{
    [Key]
    [Bindable(false)]
    [Display(AutoGenerateField = false)]
    public int Id { get; set; }

    [Bindable(true, BindingDirection.TwoWay)]
    [Display(Name = "Product")]
    [StringLength(20, MinimumLength = 3)]
    public string ProductName { get; set; }

    [Bindable(false)]
    [Display(AutoGenerateField = false)]
    public Category Category { get; set; }

    [Required]
    [Bindable(false)]
    [Display(AutoGenerateField = false)]
    public int CategoryId { get; set; }
}
You can do the same to your other objects' metadata.
The only other thing you might have to do is add two extra columns to your grid and map them to Product.Category.CategoryName and Product.Category.Store.StoreName.

Fetch Tags and Tag count using HQL on SQL Server 2008

I'm implementing tagging on a particular entity, using NHibernate on SQL Server 2008. The structure I have now is, simplifying, like this:
public class Entity {
    public Guid Id { get; set; }
}

public class Tag {
    public Guid Id { get; set; }
    public string Name { get; set; }
}

public class TagAssoc {
    public Tag LinkedTag { get; set; }
    public Entity LinkedEntity { get; set; }
    //User
    //Other properties
}
Nothing exotic: an entity can be tagged multiple times with the same tag, since the association also includes data about the user that tagged the entity and other stuff.
Now, I'm trying to fetch a list of tags of a particular entity, with the counts of how many times the tag has been applied. Something like this in HQL:
select tass.LinkedTag, count(tass.LinkedTag)
from TagAssoc as tass left outer join tass.LinkedTag as t
group by tass.LinkedTag
This generates the following SQL query:
select tag1_.Id as Id0_, tag1_.Name as Name0_, tag1_.Id as x0_0_, count_big(tag1_.Id) as x1_0_
from BandTags tagassoc0_ left outer join Tags tag1_ on tagassoc0_.TagId=tag1_.Id
group by tag1_.Id
This looks correct, but it won't work on SQL Server 2008, because the Name column of Tag is not included in the GROUP BY clause. To make it work, I have to adjust the GROUP BY clause manually to include all properties of the Tag class:
select tass.LinkedTag, count(tass.LinkedTag)
from TagAssoc as tass left outer join tass.LinkedTag as t
group by tass.LinkedTag.Id, tass.LinkedTag.Name
But this depends on the properties of the Tag class and therefore would have to be updated every time the class is changed.
Is there some other way to make the first HQL query work? Perhaps some HQL syntax that automatically makes the "group by" properties explicit?
Thanks
It doesn't appear that there is any way to make NHibernate determine the GROUP BY properties automatically. The documentation even seems to imply this in the example HQL it gives for an aggregate function:
select cat, count( elements(cat.Kittens) )
from Eg.Cat cat group by cat.Id, cat.Weight, ...
There they also explicitly specify the properties of Cat.
If you want to dynamically build a query that does not need updating every time the class changes, I think you're stuck with reflection and the Criteria interface. Note that the association's properties need an alias ("t" below), and the grouped properties are projected by GroupProperty itself, so no separate Property projection is needed:
ProjectionList list = Projections.ProjectionList();
foreach (PropertyInfo prop in typeof(Tag).GetProperties())
{
    list.Add(Projections.GroupProperty("t." + prop.Name));
}
list.Add(Projections.Count("t.Id"));
session.CreateCriteria(typeof(TagAssoc))
    .CreateAlias("LinkedTag", "t")
    .SetProjection(list)
    .List();
I haven't tried this, so it may need some tweaking, but you get the idea. You might also decide the Tag class won't change often enough to be worth the trouble.
