How to add parameters and execute a generic IDbCommand

Here is my problem in detail.
I have created a data access layer (DAL) class that lets me create most of the objects I need to communicate with databases (Odbc, OleDb and SqlClient). I have also created a business object handling layer that makes intensive use of reflection to handle many tasks for my business object(s). Among other things, this class generates everything I need for my DAL handling (the SQL text, lists of values and properties, value get/set handling, etc.). Take a look at the code below for further explanation:
Public Shared Function InvokeParam(Of T)(_classObject As T, _commandType As AdapterCommandType, _arguments As Object()) As Boolean
    Dim s As String = DAL.SCRFL.GetParamStatement(_classObject, _commandType, _arguments)
    'Debug.Print(s)
    Dim hT As Hashtable = DAL.SCRFL.GetProperties(_classObject)
    Using cnn As IDbConnection = DataFactory.CreateConnection()
        Dim cmd As IDbCommand = DataFactory.CreateCommand(s, cnn)
        'cmd.CommandType = CommandType.Text
        cmd.CommandText = s
        For Each k In hT
            Dim param As IDbDataParameter = cmd.CreateParameter()
            'param.DbType = DataFactory.ConvertToDbType(k.Value.GetType)
            param.Value = k.Value
            param.ParameterName = k.Key
            'param.Direction = ParameterDirection.Input
            'Debug.Print("value:={0}, name:={1}", TypeName(k.Value), TypeName(k.Key))
            Debug.Print("typeMatch:={0}, value:={1}, name:={2}", TypeName(param.Value) = TypeName(k.Value), param.Value, param.ParameterName)
            cmd.Parameters.Add(param)
        Next
        If (cmd.ExecuteNonQuery() > 0) Then
            Return True
        End If
    End Using
    Return False
End Function
So, DAL.SCRFL.GetParamStatement returns a string formatted as INSERT INTO t1 (f1, f2, f3...) VALUES (?, ?, ?...) for inserts, and the appropriate strings for update, delete and select statements. All of this is done with reflection. There is no syntax error here; I can execute the returned statements manually through provider-specific commands.
The DAL.SCRFL.GetProperties method returns a Hashtable whose key is the property (field) name and whose value is the field value.
Now, I need to create a parameter for each property, add it to my command's parameters, and then execute the command. That attempt is what you see in my code (I'm creating a parameter for each property/value pair by looping over the hashtable). However, at the end I get an exception with the description "Data type mismatch in criteria expression." I've tried adding the type property to the parameter object, the size, etc.; all were unsuccessful (I commented them out). I also tried changing param.Value = k.Value to param.Value = If(IsDBNull(k.Value), DBNull.Value, k.Value), thinking that might be the problem, although k.Value comes from my business class and I intentionally prevent null values. Nothing worked! Here is the test statement my business class returned from the DAL.SCRFL.GetParamStatement call. The test was done against an OleDb/Access database and, as you can see, I enclosed the Memo field in square brackets. My reflection methods read the class properties' attributes (which I set to be the table field names), and DAL.SCRFL.GetParamStatement builds basic SQL statements for insert, update, delete and select use (AdapterCommandType is a built-in enum for choosing among them).
INSERT INTO Clinics
(ClinicId, ClinicName, Phone, Fax, FederalId, DateContracted, Address, City, State, Zip, Inactive, [Memo], DateEntered, EnteredBy, DateModified, ModifiedBy)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
Note that I have another, similar method (InvokeSql) that executes a SQL statement and in which I thoroughly check the value types of each property to construct property=value pairs in the SQL text. Using a fully qualified SQL statement, that InvokeSql method works without a single warning (roughly: cnn As IDbConnection = CreateConnection(), cmd = CreateCommand(_cmdText, cnn), cmd.ExecuteNonQuery(), where _cmdText is the SQL statement; no parameters, as you can see). I mention this to point out that the problem arises whenever I use parameters with generic IDbCommands, even though inside my DataFactory the IDbCommand is set to a provider-specific command type (my DataFactory.CreateCommand(s, cnn) returns a generic IDbCommand).
Prior to developing this DAL, I did all of the above steps manually, with all objects (commands, connections, etc.) explicitly declared as provider-specific types. Technically speaking, I'm exercising exactly the same scenario as before, just with generic types instead of provider-specific ones, but I can't make it work; I'm probably missing something somewhere.
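Two things are worth checking with OleDb, for context: ? placeholders are bound strictly by position, and a Hashtable does not enumerate its entries in any guaranteed order, so values can end up bound to the wrong columns; Access is also strict about date and boolean values unless DbType is set explicitly. Below is a minimal C# sketch (not the posted answer) of an ordered, type-aware parameter loop; the ordered column/value list and the MapToDbType helper are hypothetical, not part of the original DAL.

// Sketch only: assumes the caller supplies column/value pairs in the same order
// as the ? placeholders in the generated INSERT, plus a hypothetical MapToDbType helper.
using System;
using System.Collections.Generic;
using System.Data;

public static class ParamSketch
{
    public static bool InvokeParam(IDbConnection cnn, string sql,
                                   IList<KeyValuePair<string, object>> orderedValues)
    {
        using (cnn)
        using (IDbCommand cmd = cnn.CreateCommand())
        {
            cmd.CommandText = sql;                 // e.g. INSERT INTO Clinics (...) VALUES (?, ?, ...)
            foreach (var kv in orderedValues)      // order must match the column list
            {
                IDbDataParameter p = cmd.CreateParameter();
                p.ParameterName = kv.Key;          // ignored by OleDb (positional), kept for readability
                p.Value = kv.Value ?? DBNull.Value;
                p.DbType = MapToDbType(kv.Value);  // hypothetical mapping, e.g. DateTime -> DbType.DateTime
                cmd.Parameters.Add(p);
            }
            cnn.Open();
            return cmd.ExecuteNonQuery() > 0;
        }
    }

    // Hypothetical CLR-type-to-DbType mapping.
    private static DbType MapToDbType(object value)
    {
        if (value is DateTime) return DbType.DateTime;
        if (value is int) return DbType.Int32;
        if (value is bool) return DbType.Boolean;
        return DbType.String;
    }
}

If the values still mismatch after the ordering is fixed, pinning down DbType explicitly (especially for the date and boolean columns) is usually the next thing to try.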

I had posted this question on codeproject and the answer is there.
http://www.codeproject.com/Questions/446516/How-to-add-parameters-and-execute-a-generic-IDbCom

Related

Invalid Object name error but table found in schema?

I'm using ADODB connections to connect to a database that none of my colleagues know how to connect to. So far I've got as far as being able to see all the available tables via two methods:
Dim ado As Object
Set ado = CreateObject("ADODB.Connection")
Call ado.Open("...")
Set rs = ado.Execute("SELECT * FROM sys.objects WHERE type='U'")
and also
Const adTable = 20
Set rstSchema = ado.OpenSchema(adTable)
Do Until rstSchema.EOF
    Debug.Print rstSchema("TABLE_NAME")
    rstSchema.MoveNext
Loop
But the part that is confusing me is selecting from the tables directly. I expected to be able to do:
select * from <<TABLENAME>>
where <<TABLENAME>> is one of the table names returned by the two methods above. However, whenever I do this I get the error in the title:
Invalid object name '<<TABLENAME>>'.
So how exactly am I meant to access the data in the tables identified by the OpenSchema() method? Is there another method I am unfamiliar with?
As discussed with itsLex in the comments:
In SQL Server terms:
select * from <<Database>>.<<Owner>>.<<TableName>>
Alternatively you can use the USE statement as follows:
USE <<Database>>;
select * from <<TableName>>;

System.NotSupportedException: Commands with multiple queries cannot have out parameters

I ran into another issue using a data reader around a sproc with multiple ref cursors coming out: I am getting a NotSupportedException. I can see where it is thrown in the Npgsql source code, but I am not sure I agree with throwing that exception. The code we have written works with Oracle (both fully managed and managed flavors) and SQL Server. Any help is appreciated in keeping the API consistent across these key flavors of DBMS.
sproc body
CREATE OR REPLACE FUNCTION public.getmultipleresultsets (
    v_organizationid integer)
  RETURNS SETOF refcursor
  LANGUAGE 'plpgsql'
AS $BODY$
declare
  cv_1 refcursor;
  cv_2 refcursor;
BEGIN
  open cv_1 for
    SELECT a.errorCategoryId, a.name, a.bitFlag
    FROM ErrorCategories a
    ORDER BY name;
  RETURN next cv_1;

  open cv_2 for
    SELECT *
    FROM StgNetworkStats;
  RETURN next cv_2;
END;
$BODY$;
Key reader code that wraps PostgreSQL (the Entlib implementation over Npgsql):
private IDataReader DoExecuteReader(DbCommand command, CommandBehavior cmdBehavior)
{
    try
    {
        var sql = new StringBuilder();
        using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            while (reader.Read())
            {
                sql.AppendLine($"FETCH ALL IN \"{reader.GetString(0)}\";");
            }
        }
        command.CommandText = sql.ToString();
        command.CommandType = CommandType.Text;
        IDataReader reader2 = command.ExecuteReader(cmdBehavior);
        return reader2;
    }
    catch (Exception)
    {
        throw;
    }
}
The command building code is shown below
Helper.InitializeCommand(cmd, 300, "getmultipleresultsets");
db.AddReturnValueParameter(cmd);
db.AddInParameter(cmd, "organizationId", DbType.Int32, ORGANIZATIONID);
db.AddCursorOutParameter(cmd, "CV_1");
db.AddCursorOutParameter(cmd, "CV_2");
The code that adds the refcursor parameter goes something like this:
public override void AddCursorOutParameter(DbCommand command, string RefCursorName)
{
    NpgsqlParameter parameter = (NpgsqlParameter)CreateParameter(RefCursorName, false);
    parameter.NpgsqlDbType = NpgsqlDbType.Refcursor;
    parameter.NpgsqlValue = DBNull.Value;
    parameter.Direction = ParameterDirection.Output;
    command.Parameters.Add(parameter);
}
Your post seems to mix up the PostgreSQL function with the .NET client code attempting to read its result.
Regardless, your function is declared to return a set of refcursors; this is not the same as two output parameters. You seem to be confusing the name of the cursor (cursors have names; ints, for example, do not) with the name of the parameter (int parameters do have names).
Please note that PostgreSQL does not actually have output parameters: a function always returns a single table, and that's it. PostgreSQL does have a function syntax with output parameters, but that is only a way to construct the schema of the output table. This is unlike SQL Server, which apparently can return both a table and a set of named output parameters. To facilitate portability, when reading results, if Npgsql sees any NpgsqlParameter with direction Output, it will attempt to find a resultset column with the name of the parameter and will simply populate the NpgsqlParameter's Value with the first row's value for that column. This practice has zero added value over simply reading the resultset yourself; it's only there for compatibility.
To sum it up, I'd suggest you read the refcursors with your reader and then fetch their results as appropriate.
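For reference, here is a minimal sketch of that approach with plain Npgsql; the connection string, parameter value and row handling are placeholders. Note that the refcursors only remain open inside the transaction in which the function was called.

// Sketch only: call the function, collect the refcursor names, then FETCH from each
// cursor while the surrounding transaction is still open.
using System;
using System.Collections.Generic;
using Npgsql;

class RefCursorSketch
{
    static void Main()
    {
        using (var conn = new NpgsqlConnection("Host=localhost;Database=mydb;Username=me;Password=secret"))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                var cursorNames = new List<string>();
                using (var cmd = new NpgsqlCommand("SELECT public.getmultipleresultsets(@org)", conn, tx))
                {
                    cmd.Parameters.AddWithValue("org", 1);
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            cursorNames.Add(reader.GetString(0));   // each row is a cursor name
                    }
                }

                foreach (var name in cursorNames)
                {
                    using (var fetch = new NpgsqlCommand($"FETCH ALL IN \"{name}\"", conn, tx))
                    using (var reader = fetch.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // process each row, e.g. reader.GetValue(0)
                        }
                    }
                }

                tx.Commit();
            }
        }
    }
}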

Working with the data from LinQ SQL query

Using VS 2013 (VB) and SQL Server 2016.
I have a LINQ query that returns two columns from a database. The query is as follows:
Dim val = (From value In db.ngc_flowTypes
           Where value.defaultValue IsNot Nothing
           Select value.flowName, value.defaultValue)
The data it returns is a set of flowName/defaultValue pairs.
I want to iterate through each row of the results and pass the values to certain variables. A For Each statement doesn't seem to work, as it only runs through once. I am sure this must be easy, but I don't quite understand it. Am I getting the data returned in the best way via my query? Can I transpose the data to a DataTable in VB so I can work with it more easily?
The end result I want is a string for each flow name with its corresponding default value (along with some other text). So something like this:
Dim strSubmission As String = flowName & " has a value of " & defaultValue
Use ToDictionary.
Dim val = (From value In db.ngc_flowTypes
           Where value.defaultValue IsNot Nothing
           Select value).ToDictionary(Function(key) key.flowName,
                                      Function(value) value.defaultValue)
This will actually execute the SQL of the LINQ query on the database (approximately Select * From ngc_flowTypes Where defaultValue Is Not NULL), turn each record into a key/value pair (flowName, defaultValue) and put it into an in-memory dictionary variable (val).
After that you can do whatever you like with the dictionary.
For Each flowName In val.Keys
    Console.WriteLine("{0} has a value of {1}", flowName, val(flowName))
Next
Edit:
This will only work as long as flowName is unique in the ngc_flowTypes table.
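If flowName may repeat, ToLookup is the usual alternative to ToDictionary, since it groups the values per key instead of throwing on duplicates. A sketch in C# syntax (the VB form is analogous):

// Sketch: same query shape as above, but duplicate flowName keys are grouped rather than rejected.
var lookup = db.ngc_flowTypes
    .Where(v => v.defaultValue != null)
    .ToLookup(v => v.flowName, v => v.defaultValue);

foreach (var group in lookup)
    foreach (var defaultValue in group)
        Console.WriteLine("{0} has a value of {1}", group.Key, defaultValue);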

Calling procedure with updates and a ResultSet in MyBatis

I want to map a SQL Server stored procedure with MyBatis, using annotations.
@Select(value = "{call sp_cen_obliczcene(" +
    "#{wytworId, mode=IN, jdbcType=NUMERIC}, " +
    "#{rodzajCenyId, mode=IN, jdbcType=NUMERIC}, " +
    "#{walutaId, mode=IN, jdbcType=NUMERIC}, " +
    "#{jmId, mode=IN, jdbcType=NUMERIC}, " +
    "#{ilosc, mode=IN, jdbcType=DECIMAL}, " +
    "#{data, mode=IN, jdbcType=DATE})}")
@Result(property = "kwota", column = "kwota", javaType = BigDecimal.class, jdbcType = JdbcType.DECIMAL)
@Options(statementType = StatementType.CALLABLE)
public DtoCena dajCene(CriteriaCena parametry);
The procedure selects one row, and I am interested in one column. I've mapped a procedure before, but that one returned multiple rows and I selected more than one column from them; everything worked perfectly fine. When I mapped the new procedure in a similar way, I got an error:
### The error occurred while setting parameters
### SQL: {call sp_cen_obliczcene(?, ?, ?, ?, ?, ?)}
### Cause: java.lang.NullPointerException
I started SQL Profiler and saw that the procedure is called properly with the given parameters. I noticed that the procedure I'm mapping executes other procedures, which perform some updates. When I changed my annotation to @Update I got another error: Integer cannot be cast to DtoCena. I changed the return type of the method to Integer and got no errors, but, as you can guess, it did not return what I was looking for.
The question is: can I map a stored procedure which updates tables AND returns a ResultSet? I can do this using JDBC, but is it possible with MyBatis? Am I doing something wrong when using the @Select annotation?
Looks like @Update returns the affected row count...
Anyway, I don't think the issue is related to calling a stored procedure; this is merely a mapping issue that would occur with a simple select.
You must use the @Result annotation inside a @Results annotation, otherwise it is ignored.
Here is a simplified, yet functional, example:
#Select("select 'hello' as h, 1 as n from dual")
#Results({
#Result(column="n")
})
Integer test();
Just add a property attribute and change the return type to retrieve the result into an object.

Execute multiple stored procedures with single trip to database

I have a lot of legacy data access code, mainly SqlCommand with stored procedure calls, that we use to execute a lot of INSERT statements into a database.
As long as the SQL Server has been on the same machine as the application, performance has been acceptable, but now we are trying to move some of the data to SQL Azure.
The problem is that our code calls an SP for every record to insert, which results in quite a few round trips to the database, and when the database is not on the same server this takes some time.
var conn = new SqlConnection("connString");
var cmd = new SqlCommand("spMyStoreProc", conn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("@a", SqlDbType.VarChar, 10);
cmd.Parameters.Add("@b", SqlDbType.Int);
using (conn)
{
    conn.Open();
    foreach (var rec in recordsToInsert)
    {
        cmd.Parameters["@a"].Value = rec.A;
        cmd.Parameters["@b"].Value = rec.B;
        cmd.ExecuteNonQuery();
    }
    conn.Close();
}
I have tried the code above both with and without transactions.
I have also tried to use a "batch" SQL statement to execute several SPs in each trip to the server.
Like this:
var cmd = new SqlCommand();
cmd.Connection = conn;
cmd.CommandText = "EXEC spMyStoreProc @a='a', @b=2; EXEC spMyStoreProc @a='b', @b=4;";
It greatly increases the performance of the operation, but since I have quite a few SPs, each with about 20-50 parameters, it gets quite tedious to write this code for all the insert commands in this data access component.
Is this the best way to achieve this, or can I somehow tell ADO.NET that I want to execute my calls as a batch (I haven't found anything suggesting it's possible, but I feel I should at least ask) to avoid the network latency between every single SP call?
If not, does anybody know a good way to achieve this without having to write it "by hand"? Since it's a legacy application, I cannot change the data layer completely.
Is there an application that can take SqlCommands with parameters and generate the T-SQL they would execute?
Thanks in advance
You should probably have one stored procedure that calls all the other stored procedures; it will probably be the least amount of work. From the code you then call only that one stored procedure, so, given that you are passing the same parameters every time (which your code seems to imply), you would basically do something like this:
CREATE PROCEDURE sp_RunBatch(@param1, @param2, etc [all the parameters you need])
AS
exec spMyStoreProc @a='a'
exec spMyStoreProc2 @b='b'
The advantages of this are many, one being that it is all centralized and you can wrap all of the calls in a transaction, so as not to do dirty inserts (given that they all depend on each other).
Also, if you don't feel like passing 20-30 parameters to each SP, you may want to create a user-defined table type for each set of parameters that you can pass. Each SP then gets one or two parameters, and the code becomes much simpler and more readable.
EDIT:
This is a good reference for the user-defined table types: http://msdn.microsoft.com/en-us/library/bb675163.aspx
And this is how to pass the table valued types to SQL server: http://msdn.microsoft.com/en-us/library/bb675163.aspx
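For illustration, here is a minimal sketch of passing a DataTable as a table-valued parameter from ADO.NET; the dbo.RecordList type, the spInsertRecords procedure and the Record class are placeholders, not from the original post.

// Sketch only: assumes a table type and procedure like
//   CREATE TYPE dbo.RecordList AS TABLE (a varchar(10), b int);
//   CREATE PROCEDURE spInsertRecords @records dbo.RecordList READONLY AS ...
// and a Record class with A and B properties, as in the question.
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class Record { public string A; public int B; }

public static class TvpExample
{
    public static void InsertAll(string connString, IEnumerable<Record> recordsToInsert)
    {
        var table = new DataTable();
        table.Columns.Add("a", typeof(string));
        table.Columns.Add("b", typeof(int));
        foreach (var rec in recordsToInsert)
            table.Rows.Add(rec.A, rec.B);

        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand("spInsertRecords", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            var p = cmd.Parameters.AddWithValue("@records", table);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.RecordList";   // must match the user-defined table type
            conn.Open();
            cmd.ExecuteNonQuery();           // one round trip for all rows
        }
    }
}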
An alternative to M.R.'s approach would be to send all your parameters as an XML document, then parse the XML document to extract your parameters. This may simplify the interface a bit.
However, I think you were on to something when you discussed the possibility of chaining all the commands into a single string. But instead of building them manually, consider writing an extension method for the SqlCommand object that returns a single string for execution, leveraging the sp_executesql syntax, and execute the entire string in a single pass.
So you would have a loop that looks like this, and you would call a new ToInlineSql extension method:
string sqlCommand = "";
foreach (var rec in recordsToInsert)
{
    cmd.Parameters["@a"].Value = rec.A;
    cmd.Parameters["@b"].Value = rec.B;
    sqlCommand += cmd.ToInlineSql();
}
// execute sqlCommand
The ToInlineSql extension method could look like this (pseudo-code: you will have to add certain things such as checking the data type, quoting string values, and so on; see the sp_executesql documentation for the exact syntax):
public static class SqlCmdExt
{
    public static string ToInlineSql(this SqlCommand cmd)
    {
        string sql = "sp_executesql " + cmd.CommandText;
        foreach (SqlParameter p in cmd.Parameters)
        {
            sql += ", " + p.ParameterName + " " + p.SqlDbType.ToString();
            sql += ", " + p.ParameterName + " = " + p.Value;
        }
        sql += ";";
        return sql;
    }
}
