Dapper with Access, update statement partially not working

I have a product class and tried to evaluate Dapper with an Access database. Select, Delete and Insert operations work fine, but I have a problem with the Update operation: it only works in one direction (code below).
When I change the Description based on ProductNumber (updateStatement2), the Description gets updated. But when I change the ProductNumber based on Description (updateStatement1), it doesn't work and the ProductNumber doesn't get updated. This seems a bit strange to me. Is it a bug, or am I missing something? My database is just a basic one with no primary keys set; I have attached a screenshot below.
(For more information see my code below.)
public class Products
{
    public string ProductNumber { get; set; }
    public string Description { get; set; }
}

static void Main(string[] args)
{
    using (var con = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=test.mdb"))
    {
        Products product2 = new Products();
        product2.ProductNumber = "P2";
        product2.Description = "TestProduct2Changed";
        var updateStatement2 = @"Update Products Set Description = @Description Where ProductNumber = @ProductNumber";
        int outp2 = con.Execute(updateStatement2, product2);

        Products product1 = new Products();
        product1.ProductNumber = "P3Changed";
        product1.Description = "TestProduct3";
        var updateStatement1 = @"Update Products Set ProductNumber = @ProductNumber Where Description = @Description";
        int outp1 = con.Execute(updateStatement1, product1);
    }
}
I am using Dapper version 1.50.2. This is my database screenshot

It looks like OleDb commands against Access require the parameters to be supplied in the same order as they appear in the SQL query.
In your original code, for the query that works, the parameters appear in the query string in alphabetical order -
Update Products Set Description = @Description Where ProductNumber = @ProductNumber
This works because the properties are taken from "product2" in alphabetical order. This may not be by design, it might just be the order in which reflection lists them.
In your query that fails, the parameters appear in reverse alphabetical order -
Update Products Set ProductNumber = @ProductNumber Where Description = @Description
... and this fails because the parameter values get mis-assigned within Access.
You should be able to confirm this by changing the order of the parameters in your dynamic-parameter alternative. I tried using dynamic parameters and it worked when the parameters were in the same order in which they appeared in the SQL query, but failed when they weren't. The database I'm using isn't quite the same as yours, but the following should illustrate what I'm talking about:
// Doesn't work (parameter order is incorrect)
con.Execute(
    "Update People Set PersonName = @PersonName Where Notes = @Notes",
    new { Notes = "NotesChanged", PersonName = "New Name" }
);

// DOES work (parameter order is correct)
con.Execute(
    "Update People Set PersonName = @PersonName Where Notes = @Notes",
    new { PersonName = "New Name", Notes = "NotesChanged" }
);
While trying to find more information about this, I came across this answer that unfortunately seems to confirm the issue: https://stackoverflow.com/a/11424444/3813189
I guess it might be possible for the custom SQL generator that you mentioned in one of your other questions to do some magic: parse the query, work out the order in which the parameters must appear, and then ensure they are provided in that order. If someone is maintaining an Access connector for DapperExtensions, it might be worth raising an issue, because at the moment I think you are correct and this is an issue with the library.
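As a workaround sketch (relying on the behaviour I saw in my tests, where dynamic parameters were bound in the order they were added), you can build the parameters explicitly in the same order in which they appear in the SQL:
// Sketch: add the parameters in the order they appear in the statement,
// because the Access/OleDb provider binds parameters by position, not by name.
var parameters = new DynamicParameters();
parameters.Add("ProductNumber", "P3Changed");  // first placeholder in the SQL below
parameters.Add("Description", "TestProduct3"); // second placeholder
int rows = con.Execute(
    "Update Products Set ProductNumber = @ProductNumber Where Description = @Description",
    parameters);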

Related

Performance issues while passing UDTT[] to postgres Function

I have created a function in Postgres that takes a UDTT[] as an input parameter, and I eventually want to insert that data into a table.
Example Udtt
create type udtt_mytype as
(
id uuid,
payload int
);
And then an example Function is something akin to
CREATE OR REPLACE FUNCTION dbo.p_dothething(p_import udtt_mytype[])
RETURNS void
LANGUAGE plpgsql
AS $function$
BEGIN
    insert into mytab select * from unnest($1);
    RETURN;
END
$function$;
My C# backend presently looks like
public class udtt_mytype
{
    [PgName("id")]
    public Guid id { get; set; }
    [PgName("payload")]
    public int payload { get; set; }
}
var payload = CreateAndFillUdttMyType();
var conn = new NpgsqlConnection();
conn.Open();
var transaction = conn.BeginTransaction();
conn.MapComposite<udtt_mytype>("udtt_mytype");

var command = new NpgsqlCommand("dbo.p_dothething", conn);
command.CommandType = CommandType.StoredProcedure;
Object[] objArray = new Object[1];
objArray[0] = new NpgsqlParameter
{
    ParameterName = "p_import",
    Value = payload,
    NpgsqlDbType = NpgsqlTypes.NpgsqlDbType.Array | NpgsqlTypes.NpgsqlDbType.Composite
};
command.Parameters.AddRange(objArray);

var result = command.ExecuteScalar();
transaction.Commit();
conn.Close();
While the above works, it is pretty non-performant compared to a similar UDTT -> SQL stored procedure. Prior to our Npgsql implementation, this took under 1 second, but now I seem to be seeing about 6 seconds per 6k rows (and the common usages for this involve much larger row counts than that).
Using some timestamping and returning from the SP, I can see that the processing of the data inside the function isn't the issue at all... it appears to be entirely the transfer time of the payload. In this particular case it's a simple array of UDTT_MYTYPEs, and with a single object execution is instantaneous, but with 6k rows it's up in the 6-7 second range. And this performance persists even if I pass it off to an empty function (removing the cost of the unnest/insert).
In reality, udtt_mytype has 12 columns of various types, but we are still talking about a relatively 'shallow' object.
I have attempted to compare this to Npgsql's documentation on bulk copy (found here: http://www.npgsql.org/doc/copy.html), and that implementation seemed to be even slower than this, which seems contradictory.
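For reference, a minimal binary COPY sketch for the simplified two-column example type above would look roughly like this (the table and column names are the example's, not a real schema):
// Binary COPY sketch per http://www.npgsql.org/doc/copy.html, shown only for comparison.
// Assumes the simplified example table mytab(id uuid, payload int) and that
// 'payload' from the code above is an enumerable of udtt_mytype rows.
using (var writer = conn.BeginBinaryImport(
    "COPY mytab (id, payload) FROM STDIN (FORMAT BINARY)"))
{
    foreach (var row in payload)
    {
        writer.StartRow();
        writer.Write(row.id, NpgsqlTypes.NpgsqlDbType.Uuid);
        writer.Write(row.payload, NpgsqlTypes.NpgsqlDbType.Integer);
    }
    writer.Complete(); // required on Npgsql 4.x+; older versions commit when disposed
}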
Is Postgres typically this much slower than MSSQL, or is there something limiting the transfer rate of the data that I'm not aware of? Obviously no one can speak for my network connectivity/hardware setup, but for anyone who has converted between the two, did you see a performance difference on this same scale?

Accessing new field value generated in SQL via dbContext

I'm using dbContext and I'm running a SQL query that is rather complex (a simplified example is shown below). To avoid having to run the query twice to get a count, I'm using COUNT(1) OVER() AS TotalRecords to return the total number of records, as per other advice on this site.
But, I haven't been able to figure out how to access the resulting property:
using (var db = new DMSContext())
{
    string queryString = "select *, COUNT(1) OVER() AS TotalRecords FROM DMSMetas";
    var Metas = db.DMSMetas.SqlQuery(queryString).ToList();
    for (int i = 0; i <= Metas.Count - 1; i++)
    {
        var Item = Metas[i];
        if (i == 0)
        {
            // Want to do this, but TotalRecords is not part of the DMSMeta class.
            // How do I access the created column?
            Console.WriteLine("Total records found: " + Item.TotalRecords);
        }
    }
}
In the sample above, the SQL query generates the extra field TotalRecords. When I run the query in Management Studio, the results are as expected. But how do I access the TotalRecords field through dbContext?
I also tried including the TotalRecords field as part of the DMSMeta class, but then the SQL query fails with the error that the TotalRecords field is specified twice. I tried creating a partial class for DMSMeta containing the TotalRecords field, but then the value remains the default value and is not updated during the query.
I also tried the following:
db.Entry(Item).Property("TotalRecords").CurrentValue
But that generated an error too. Any help would be much appreciated - I am sure I am missing something obvious! All I want is a way to access the total number of records returned by the query.
You have to create a new class (not an entity class, but a pure DAO class), e.g. DMSMetaWithCount (self-explanatory?), and then:
context.Database.SqlQuery<DMSMetaWithCount>("select *, COUNT(1) OVER() AS TotalRecords FROM DMSMetas");
Please note that:
imho, select * is ALWAYS a bad practice.
you will have no tracking on the new non-entity class.
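A minimal sketch of what that could look like (the property names other than TotalRecords are illustrative; they just have to match the columns the DMSMetas query returns):
// Pure DAO class (not part of the EF model): one settable property per column
// returned by the query, plus the extra TotalRecords column.
public class DMSMetaWithCount
{
    public int DMSMetaId { get; set; }     // illustrative column, adjust to your schema
    public string Title { get; set; }      // illustrative column, adjust to your schema
    public int TotalRecords { get; set; }  // filled from COUNT(1) OVER()
}

// Database.SqlQuery<T> maps columns by name and does not require the type
// to be in the model, so there is no change tracking on these objects.
var metas = db.Database.SqlQuery<DMSMetaWithCount>(
    "select *, COUNT(1) OVER() AS TotalRecords FROM DMSMetas").ToList();
var total = metas.Count > 0 ? metas[0].TotalRecords : 0;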

Unable to select just certain columns from table with Entity Framework - get anonymous type error

I am trying to select only certain columns from a table using EF 6.1. However, it won't let me pull back just the columns I want; I have to pull back every column from the table, which has 14,000 rows, so the query takes ~30 seconds. The column that kills the query is an NVARCHAR in the table. But with EF it's all or nothing. I am also using IEnumerable; perhaps I should be using IQueryable?
Using this query I get an anonymous type error:
Using db As Ctx = New Ctx
    Dim postcount = db.be_Posts.Count
    posts = db.be_Posts.Select(Function(S) New With {S.DateCreated, S.Description, S.PostRowID, S.Title}).OrderByDescending(Function(x) x.DateCreated)
    Return posts.ToList
End Using
Error:
Unable to cast object of type 'System.Data.Entity.Infrastructure.DbQuery`1[VB$AnonymousType_0`4[System.DateTime,System.String,System.Int32,System.String]]' to type 'System.Collections.Generic.IEnumerable`1
This works but is getting all the records and the columns I don't need:
Using db As Ctx = New Ctx
    Dim postcount = db.be_Posts.Count
    posts = db.be_Posts.OrderByDescending(Function(x) x.DateCreated).ToList
    Return posts
End Using
What I would do is:
Create PostSummaryDto class:
public class PostSummaryDto
{
    public DateTime DateCreated { get; set; }
    ...rest of fields...
}
Use PostSummaryDto class in query:
New PostSummaryDto With {.DateCreated = S.DateCreated, ...}
Define return type of function as IEnumerable<PostSummaryDto>.
I am not a fan of Visual Basic, so I am not sure whether returning anonymous types is allowed, but I believe it is good practice to define return types explicitly.
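In C# terms, a sketch of the whole approach might look like this (assuming the context and property names from the question; adjust to your model):
// Project straight into the DTO so EF only selects the needed columns;
// the large NVARCHAR column is never pulled from the database.
public IEnumerable<PostSummaryDto> GetPostSummaries(Ctx db)
{
    return db.be_Posts
        .OrderByDescending(p => p.DateCreated)
        .Select(p => new PostSummaryDto
        {
            DateCreated = p.DateCreated,
            Description = p.Description,
            PostRowID = p.PostRowID,
            Title = p.Title
        })
        .ToList();
}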

Salesforce SOQL describe table

Is there a way to fetch a list of all fields in a table in Salesforce? DESCRIBE myTable doesn't work, and SELECT * FROM myTable doesn't work.
From within Apex, you can get this by running the following Apex code snippet. If your table/object is named MyObject__c, then this will give you a Set of the API names of all fields on that object that you have access to (this is important: even as a System Administrator, if certain fields on your table/object are not visible to you through Field Level Security, they will not show up here):
// Get a map of all fields available to you on the MyObject__c table/object,
// keyed by the API name of each field
Map<String, Schema.SObjectField> myObjectFields =
    MyObject__c.SObjectType.getDescribe().fields.getMap();

// Get a Set of the field names
Set<String> myObjectFieldAPINames = myObjectFields.keySet();

// Print out the names to the debug log
String allFields = 'ALL ACCESSIBLE FIELDS on MyObject__c:\n\n';
for (String s : myObjectFieldAPINames) {
    allFields += s + '\n';
}
System.debug(allFields);
To finish this off, and achieve SELECT * FROM MYTABLE functionality, you would need to construct a dynamic SOQL query using these fields:
List<String> fieldsList = new List<String>(myObjectFieldAPINames);
String query = 'SELECT ';
// Add in all but the last field, comma-separated
for (Integer i = 0; i < fieldsList.size() - 1; i++) {
    query += fieldsList[i] + ',';
}
// Add in the final field
query += fieldsList[fieldsList.size() - 1];
// Complete the query
query += ' FROM MyObject__c';
// Perform the query (perform the SELECT *)
List<SObject> results = Database.query(query);
The describeSObject API call returns all the metadata about a given object/table, including its fields. It's available in the SOAP, REST and Apex APIs.
Try using Schema.FieldSet
Schema.DescribeSObjectResult d = Account.sObjectType.getDescribe();
Map<String, Schema.FieldSet> FsMap = d.fieldSets.getMap();
complete documentation
Have you tried DESC myTable?
For me it works fine; it's also mentioned in the underlying tips, in italics.

What if you don't need a parameter when querying with Dapper?

I have one query that does a count/group by where I don't need a parameter (there is no where clause).
What is the syntax to run a parameterless query with dapper?
var _results = _conn.Query<strongType>("Select Count(columnA) as aCount, ColumnB, ColumnC from mytable group by ColumnB, ColumnC");
does not work.
I've tried it a few different ways but I still keep getting "ArgumentNullException was unhandled by user code".
Tried to figure it out myself, searched all over and I'm giving up. Thanks in advance.
Edit: Below is the line of code from SqlMapper.cs that throws the error. It's line 1334
il.Emit(OpCodes.Newobj, typeof(T).GetConstructor(BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic, null, Type.EmptyTypes, null));
The error details: Value cannot be null. Parameter name: con
Mapping a single result back works just fine:
var a = cnn.Query<int>("select 1").Single()
// a is 1
You may face trouble if somehow your query returns no results, for example:
select count(Id) from
(
select top 0 1 as Id, 2 as Title
) as X
group by Title
This returns 0 rows, so calling Single on an empty result set is not going to work.
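If the grouped query can legitimately come back empty, a defensive sketch is to fall back to a default value instead of calling Single:
// Sketch: SingleOrDefault/FirstOrDefault return the type's default value
// (0 for int) on an empty result set instead of throwing.
var count = cnn.Query<int>("select 1 where 1 = 0").SingleOrDefault(); // count == 0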
try
var _results = _conn.Query("Select columnB, Count(columnA) C from mytable group by columnB");
int ColumnB = ((int)_results[0].ColumnB);
int C = ((int)_results[0].C);
Value cannot be null. Parameter name: con
This error is thrown by several dynamic ORMs, including Dapper, PetaPoco and Massive, but it's usually the same problem: make sure you're using the [Ignore] attribute on the properties you don't want to include. This includes properties inherited from base classes. The error message is useless, but that's what it means.
This error can occur because a property you're trying to set in your return object is get-only. Dapper, of course, requires being able to set all properties. You might consider having a separate database DTO object that then gets converted to your properly immutable domain object after reading from the database.
Change this:
public string MyProperty { get; }
to this:
public string MyProperty { get; set; }
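As a sketch of the DTO-then-domain pattern mentioned above (the Person/PersonDto names and the People table are made up for illustration):
// Dapper materializes the mutable DTO, and the immutable domain type
// is built from it after reading from the database.
public class PersonDto            // database-facing: settable properties
{
    public int Id { get; set; }
    public string Name { get; set; }
}
public class Person               // domain-facing: immutable
{
    public Person(int id, string name) { Id = id; Name = name; }
    public int Id { get; }
    public string Name { get; }
}
var people = conn.Query<PersonDto>("select Id, Name from People")
    .Select(d => new Person(d.Id, d.Name))
    .ToList();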
