PreparedStatement equivalent to JDBCTemplate.update(String, Object[])?

So I have been led to believe that this is the most efficient way of getting an auto-generated ID value from a database using a JdbcTemplate:
KeyHolder keyHolder = new GeneratedKeyHolder();
jdbcTemplate.update(
    new PreparedStatementCreator() {
        public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
            PreparedStatement ps =
                connection.prepareStatement(INSERT_SQL, new String[] {"ID_FIELD"});
            // Configure the PreparedStatement HERE!
            return ps;
        }
    },
    keyHolder);
My problem is that I'm often inserting a variable number of values (JdbcTemplate.update(String, Object[]) is actually exactly what I need), and PreparedStatement only lets me set one parameter at a time (setString and the like). Looping through the array seems so... inelegant.

Well, since this is a tumbleweed, I'm guessing that there are no other ways to accomplish this. I ended up creating a class to handle this so that I can get around the requirement that captured variables be final.

Related

Writing a generic InsertData method using Dapper.Contrib and InsertAsync

I am fairly new to C# and Dapper. I am trying to write a generic Insert Data method for my project. I come from a Delphi environment so I am still finding my way around C#.
Dapper seems fairly straightforward to use, but I am experiencing some challenges. I have tried every which way to get the syntax right with the following code but have been unsuccessful.
The issues seem to be around the T (I still don't quite understand what T is), and all the combinations I have tried don't work.
public async Task<int> InsertData<T>(T list)
{
    string connectionString = _config.GetConnectionString(ConnectionStringName);
    using (IDbConnection connection = new SqlConnection(connectionString))
    {
        return await connection.InsertAsync<int>(list);
    }
}
The following code does work, so where am I going wrong?
public async Task SaveData<T>(string sql, T parameters)
{
    string connectionString = _config.GetConnectionString(ConnectionStringName);
    using (IDbConnection connection = new SqlConnection(connectionString))
    {
        await connection.ExecuteAsync(sql, parameters);
    }
}
Your second snippet (await connection.ExecuteAsync(sql, parameters);) works because you are simply executing your hand-written SQL statement. The ExecuteAsync method belongs to Dapper, NOT Dapper.Contrib. The generic T there is used for the parameters, not for the object you are trying to insert.
With your first snippet (return await connection.InsertAsync<int>(list);), you are actually using Dapper.Contrib; you are not writing the SQL statement by hand. Dapper.Contrib generates it for you.
The following line seems to be the problem:
return await connection.InsertAsync<int>(list);
You are passing the generic parameter <int> to the method, which does not make sense.
I have not tested this, but changing that line to the one below should work:
return await connection.InsertAsync<T>(list);
Also, you have to make sure the generic type T is a class by adding the where T : class constraint to it.
The following generic method should serve your purpose; you will need to convert it to async to match your current code (a rough sketch of that follows the method):
public void InsertData<T>(T entity) where T : class
{
    string connectionString = ....;
    using (IDbConnection connection = new SqlConnection(connectionString))
    {
        long result = connection.Insert<T>(entity);
    }
}
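As a rough, untested sketch, the async variant could look like the following; it mirrors your existing code and assumes the same _config field and ConnectionStringName value from your snippets:
// Untested sketch: async variant of the method above, using Dapper.Contrib's InsertAsync.
public async Task<int> InsertData<T>(T entity) where T : class
{
    string connectionString = _config.GetConnectionString(ConnectionStringName);
    using (IDbConnection connection = new SqlConnection(connectionString))
    {
        // InsertAsync generates the INSERT statement for T and returns the inserted row's id.
        return await connection.InsertAsync<T>(entity);
    }
}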
I did not understand a few other parts of your code, such as InsertData<T>(T list). What is list? Is it a single object or a list of objects? If it is a list of objects, List<T> list makes more sense. If it is a single object, it is better to rename list to the actual object name, say T customer/entity/poco, etc.

Dapper: read multiple results with type array not generics

I have a 3-part query that I am reading using QueryMultiple. My problem is that on the first Read<T> I need to split the result into 12 different classes, which Dapper does not appear to support as far as I could see. Before I used QueryMultiple, my query had only one part and I was using the method from the example "Using Dapper to map more than 5 types" to get the 12 different classes. My question is: how can I split the first Read<T> into twelve classes and then continue with the GridReader? Please note I cannot create one big query.
public static IEnumerable<TReturn> Query<TReturn>(this IDbConnection cnn, string sql, Type[] types, Func<object[], TReturn> map, dynamic param = null, IDbTransaction transaction = null, bool buffered = true, string splitOn = "Id", int? commandTimeout = null, CommandType? commandType = null);
UPDATE
I tested this method, which I added to the Dapper file (shown below, with a usage example after it), and it worked, but my app only references the DLL and not the actual source file, so I am not sure how to add this without pulling in the Dapper source from GitHub. I was hoping there was built-in support for what I want and I had just missed it somewhere in the code. Thanks for any help.
public IEnumerable<TReturn> Read<TReturn>(Type[] types, Func<object[], TReturn> func, string splitOn = "id", bool buffered = true)
{
    var identity = this.identity.ForGrid(typeof(TReturn), types, gridIndex);
    try
    {
        foreach (var r in SqlMapper.MultiMapImpl<TReturn>(null, default(CommandDefinition), types, func, splitOn, reader, identity, false))
        {
            yield return r;
        }
    }
    finally
    {
        NextResult();
    }
}
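For reference, I call the new overload in the same style as the multi-type Query extension. Order, Customer, and Product here are just placeholders for my real twelve classes, and the property assignments are illustrative:
using (var grid = connection.QueryMultiple(sql))
{
    // First result set: split each row across several types and combine them.
    var orders = grid.Read<Order>(
        new[] { typeof(Order), typeof(Customer), typeof(Product) },
        objects =>
        {
            var order = (Order)objects[0];
            order.Customer = (Customer)objects[1];  // placeholder navigation property
            order.Product = (Product)objects[2];    // placeholder navigation property
            return order;
        },
        splitOn: "Id").ToList();

    // The remaining result sets continue through the GridReader as usual.
    var counts = grid.Read<int>().ToList();
}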
As I was about to open a pull request, I noticed someone was one step ahead of me. It seems this functionality is not included in Dapper right now.
https://github.com/StackExchange/dapper-dot-net/pull/308

Solr, Special Characters, and the MultiFieldQueryParser

I need to programmatically build boolean queries against multiple Solr fields. I thought that the Lucene MultiFieldQueryParser would be a good way to go. This works well except when special characters are involved.
public class QueryParserSpike {
    String userQuery = "(-)-foo";
    String escapedQuery = ClientUtils.escapeQueryChars(userQuery); // \(\-\)\-foo
    Analyzer analyzer = new WhitespaceAnalyzer(Version.LUCENE_43);
    QueryParser parser = new MultiFieldQueryParser(Version.LUCENE_43, new String[]{"a"}, analyzer);

    @Test(expected=ParseException.class)
    public void testNoEscape() throws Exception {
        parser.parse(userQuery); // Throws an exception
    }

    @Test
    public void testEscape() throws Exception {
        Query q = parser.parse(escapedQuery);
        System.out.println(q.toString()); // a:(-)-foo (This can't be parsed by Solr)
    }

    @Test
    public void testDoubleEscape() throws Exception {
        String doubleEscapedQuery = escapedQuery.replaceAll("\\\\", "\\\\\\\\");
        Query q = parser.parse(doubleEscapedQuery);
        System.out.println(q.toString()); // (a:\) (a:\-\) (a:\-foo) (This isn't the correct query)
    }
}
What I'm trying to get out of this would be a:\(\-\)\-foo. Is there a Solr class that does something similar? Or is the best option to write something to process the result of the MultiFieldQueryParser myself?
What Query.toString() returns is a best effort at a user-readable query. It is not necessarily a parsable query, as in this case. You can never rely on logic like parser.parse(query.toString()). The Lucene Query API is capable of expressing many things that cannot be expressed at all with the QueryParser syntax.
The method you use to escape the query in testEscape() should be correct and give you the query you are looking for. You could also use QueryParser.escape(userQuery), the raw Lucene equivalent.

How to figure out which SQLDependency triggered change function?

I'm exploring query notifications with the SqlDependency class. Building a simple working example is easy, but I feel like I'm missing something. Once I step past a simple one-table/one-dependency example, I'm left wondering: how can I figure out which dependency triggered my callback?
I'm having a bit of trouble explaining, so I included the simple example below. When AChange() is called I cannot look at the SQL inside the dependency, and I don't have a reference to the associated cache object.
So what's a boy to do?
Option 1 - create a distinct function for each object I want to track and hard-code the cache key (or relevant information) in the callback. This feels dirty and eliminates the possibility of adding new cache items without deploying new code. Ewww.
Option 2 - Use the Dependency Id property and a parallel tracking structure
Am I just missing something? Is this a deficiency in the SqlDependency structure? I've looked at 20 different articles on the topic and all of them seem to have the same hole. Suggestions?
Code Sample
public class DependencyCache {
    public static string cacheName = "Client1";
    public static MemoryCache memCache = new MemoryCache(cacheName);

    public DependencyCache() {
        SqlDependency.Start(connString);
    }

    private static string GetSQL() {
        return "select someString FROM dbo.TestTable";
    }

    public void DoTest() {
        if (memCache["TEST_KEY"] != null) {
            Debug.WriteLine("resources found in cache");
            return;
        }
        Cache_GetData();
    }

    private void Cache_GetData() {
        SqlConnection oConn;
        SqlCommand oCmd;
        SqlDependency oDep;
        SqlDataReader oRS;
        List<string> stuff = new List<string>();
        CacheItemPolicy policy = new CacheItemPolicy();
        SqlDependency.Start(connString);
        using (oConn = new SqlConnection(connString)) {
            using (oCmd = new SqlCommand(GetSQL(), oConn)) {
                oDep = new SqlDependency(oCmd);
                oConn.Open();
                oRS = oCmd.ExecuteReader();
                while (oRS.Read()) {
                    stuff.Add(oRS.GetString(0));
                }
                oDep.OnChange += new OnChangeEventHandler(AChange);
            }
        }
        memCache.Set("TEST_KEY", stuff, policy);
    }

    private void AChange(object sender, SqlNotificationEventArgs e) {
        string msg = "Dependency Change \nINFO: {0} : SOURCE {1} :TYPE: {2}";
        Debug.WriteLine(String.Format(msg, e.Info, e.Source, e.Type));
        // If multiple queries use this as a callback how can I figure
        // out WHAT QUERY TRIGGERED the change?
        // I can't figure out how to tell multiple dependency objects apart
        ((SqlDependency)sender).OnChange -= AChange;
        Cache_GetData(); // reload data
    }
}
First and foremost: the handler has to be set up before the command is executed:
oDep = new SqlDependency(oCmd);
oConn.Open();
oDep.OnChange += new OnChangeEventHandler(AChange);
oRS = oCmd.ExecuteReader();
while (oRS.Read()) {
    stuff.Add(oRS.GetString(0));
}
Otherwise you have a window in which the notification may be lost and your callback never invoked.
Now, about your question: you should use a separate callback for each query. While this may seem cumbersome, it is actually trivial using a lambda. Something like the following:
oDep = new SqlDependency(oCmd);
oConn.Open();
oDep.OnChange += (sender, e) =>
{
    string msg = "Dependency Change \nINFO: {0} : SOURCE {1} :TYPE: {2}";
    Debug.WriteLine(String.Format(msg, e.Info, e.Source, e.Type));
    // The command that triggered the notification is captured in the closure:
    // it is oCmd
    //
    // You can now call a handler passing in the relevant info:
    //
    Reload_Data(oCmd, ...);
};
oRS = oCmd.ExecuteReader();
...
And remember to always check the notification source, info, and type. Otherwise you run the risk of spinning ad nauseam when you are notified for reasons other than a data change, like an invalid query. As a side comment, I would add that a good cache design does not refresh the cache on invalidation, but simply invalidates the cached item and lets the next request actually fetch a fresh item (a sketch of that approach follows). With your 'proactive' approach you are refreshing cached items even when not needed, refreshing them multiple times before they are accessed, and so on. I left error handling and proper thread synchronization out of the example (both are required).
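For what it's worth, here is a minimal, untested sketch of that invalidate-on-notification approach using System.Runtime.Caching's SqlChangeMonitor; the table, cache key, and connection string are taken from your example, and the entry is simply evicted when the notification fires so the next caller reloads it:
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Runtime.Caching;

public class InvalidatingCache {
    private static readonly MemoryCache memCache = new MemoryCache("Client1");
    private readonly string connString;

    public InvalidatingCache(string connString) {
        this.connString = connString;
        SqlDependency.Start(connString);
    }

    public List<string> GetData() {
        // Cache hit: return the cached item as-is.
        var cached = memCache["TEST_KEY"] as List<string>;
        if (cached != null) {
            return cached;
        }

        // Cache miss: load fresh data and attach a SqlChangeMonitor so the entry
        // is evicted (not refreshed) when the query notification fires.
        var stuff = new List<string>();
        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand("select someString FROM dbo.TestTable", conn)) {
            var dep = new SqlDependency(cmd);
            // Register the monitor before executing the command (see the note above
            // about setting up the notification before execution).
            var policy = new CacheItemPolicy();
            policy.ChangeMonitors.Add(new SqlChangeMonitor(dep));

            conn.Open();
            using (var reader = cmd.ExecuteReader()) {
                while (reader.Read()) {
                    stuff.Add(reader.GetString(0));
                }
            }
            memCache.Set("TEST_KEY", stuff, policy);
        }
        return stuff;
    }
}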
Finally, have a look at LinqtoCache which does pretty much what you're trying to do, but for LINQ queries.

EF ObjectQuery<T> Context, Parameters, Connection properties equivalent on DbSet<T>

In earlier versions of Entity Framework, we were able to reach the Context through ObjectQuery in order to read Parameters, Connection, etc., as below:
var query = (ObjectQuery<T>)source;
cmd.Connection = (SqlConnection)((EntityConnection)query.Context.Connection).StoreConnection;
cmd.Parameters.AddRange(
    query.Parameters.Select(x => new SqlParameter(
        x.Name, x.Value ?? DBNull.Value)
    ).ToArray()
);
When I look at the DbSet<T> object, I am unable to find any equivalent of this. My purpose here is to create extensions which will manipulate the query and get the result out of it.
Here is an example: http://philsversion.com/2011/09/07/async-entity-framework-queries
Or should I write the extension for DbContext class and work with Set method?
Any idea?
Edit
Here is what I have done so far. It is a basic implementation and certainly not ready for production (a usage example follows the code). Any suggestions on this?
public static async Task<IEnumerable<T>> QueryAsync<T>(this DbContext @this, System.Linq.Expressions.Expression<Func<T, bool>> predicate = null)
    where T : class {
    var query = (predicate != null) ? @this.Set<T>().Where(predicate) : @this.Set<T>();
    var cmd = new SqlCommand();
    cmd.Connection = (SqlConnection)(@this.Database.Connection);
    cmd.CommandText = query.ToString();
    if (cmd.Connection.State == System.Data.ConnectionState.Closed) {
        cmd.Connection.ConnectionString = new SqlConnectionStringBuilder(cmd.Connection.ConnectionString) {
            AsynchronousProcessing = true
        }.ToString();
        cmd.Connection.Open();
    }
    cmd.Disposed += (o, e) => {
        cmd.Clone();
    };
    var source = ((IObjectContextAdapter)@this).ObjectContext.Translate<T>(
        await cmd.ExecuteReaderAsync()
    );
    return source;
}
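For context, I call it roughly like this; MyDbContext and Customer (with its IsActive and Name members) are just placeholder types for illustration:
// Placeholder usage: MyDbContext is assumed to expose a DbSet<Customer>.
using (var context = new MyDbContext())
{
    IEnumerable<Customer> activeCustomers = await context.QueryAsync<Customer>(c => c.IsActive);
    foreach (var customer in activeCustomers)
    {
        Console.WriteLine(customer.Name);
    }
}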
This is a nice workaround, although I don't think you can make it much more generally applicable than what you already have.
A few things to keep in mind:
- Depending on the EF query, e.g. whether you are using Include or not, the columns returned in the reader might not match the properties of the type T you are passing.
- Depending on whether you have inheritance in your model, the T that you pass to Translate may not always be the right thing to materialize for every row returned.
- After the task returned by ExecuteReaderAsync completes, you still have to retrieve each row, which, depending on the execution plan for the query and the latency to the server, is potentially also a blocking operation.
Async support is not coming to EF in 5.0, but we worked with other teams to make sure all the necessary building blocks are included in .NET 4.5, and the feature is pretty high on our priority list. I encourage you to vote for it on our UserVoice site.
