Does Dapper support the .NET DataSet?

As I understand it, behind Dapper's Query there is a DataReader, and behind Dapper's Execute there is an ExecuteNonQuery call. Correct me if I am wrong.
Can we use Dapper with a DataSet that returns multiple tables?

No, there is no built-in support for DataSet, primarily because it seems largely redundant, but also because that isn't what Dapper targets. That doesn't mean it lacks an API for handling a query that selects multiple results; see QueryMultiple:
using (var multi = conn.QueryMultiple(sql, args))
{
var ids = multi.Read<int>().ToList();
var customers = multi.Read<Customer>().ToList();
dynamic someOtherRow = multi.Read().Single();
int qty = someOtherRow.Quantity, price = someOtherRow.Price;
}
Note that this API is forward-only (due to the nature of IDataReader, etc.): each Read / Read<T> call maps to the next result grid in turn.

I might be late here, but this is how I am doing the conversion of the IDataReader to a DataSet. Dapper returns an IDataReader when we use the ExecuteReaderAsync method. More information on this addition can be found here and here.
This is my attempt on this:
public async Task<DataSet> GetUserInformationOnUserId(int UserId)
{
var storedprocedure = "usp_getUserInformation";
var param = new DynamicParameters();
param.Add("#userId", UserId);
var list = await SqlMapper.ExecuteReaderAsync(_connectionFactory.GetEpaperDBConnection, storedprocedure, param, commandType: CommandType.StoredProcedure);
var dataset = ConvertDataReaderToDataSet(list);
return dataset;
}
ConvertDataReaderToDataSet takes in the IDataReader; you can use this method to convert the reader to a DataSet:
public DataSet ConvertDataReaderToDataSet(IDataReader data)
{
DataSet ds = new DataSet();
ds.EnforceConstraints = false;
int i = 0;
while (!data.IsClosed)
{
ds.Tables.Add("Table" + (i + 1));
// DataTable.Load consumes one result set per call and closes the reader after the last one, ending this loop
ds.Tables[i].Load(data);
i++;
}
return ds;
}
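As a usage sketch (the connection string, procedure name, and parameter below are placeholders, not from the original post), the same converter works with Dapper's synchronous ExecuteReader as well; it is DataTable.Load that eventually closes the reader and ends the while (!data.IsClosed) loop:
using (var conn = new SqlConnection(@"Server=.;Database=MyDb;Integrated Security=true"))
{
    // ExecuteReader is Dapper's extension method returning a plain IDataReader
    var reader = conn.ExecuteReader("usp_getUserInformation",
        new { userId = 42 }, commandType: CommandType.StoredProcedure);
    DataSet ds = ConvertDataReaderToDataSet(reader);
}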

Related

Using Dapper in C# MVVM and SQL Server 2019/Visual Studio Pro 2019

I am writing an application using C#, WPF, MVVM and SQL Server, and I am using Dapper as an ORM. Below is a snippet of the sample code I am trying to make work:
public List<MAKE THIS GENERIC> ExecuteSQLSPROC(string SPROCName, SqlParameter[] parameters, IEnumerable<MAKE THIS GENERIC> model)
{
using (IDbConnection connection = new System.Data.SqlClient.SqlConnection(DBHelper.CNNVal("MyDatabaseName")))
{
System.Data.SqlClient.SqlCommand command = new System.Data.SqlClient.SqlCommand();
command.Connection = (SqlConnection)connection;
command.CommandText = SPROCName;
DynamicParameters parms = new DynamicParameters();
for (int i = 0; i < parameters.Length; i++)
{
var parmname = parameters[i].ParameterName;
var parmvalue = parameters[i].Value;
// I KNOW I can put DIRECTION as a parameter variable but right now I don't want to!!
parms.Add(parmname, parmvalue);
}
var output = connection.Query<Make THIS GENERIC>(SPROCName, parms, commandType: CommandType.StoredProcedure).ToList();
return output; // return the recordset to the caller - a List<> of one object/model is acceptable - NOT POPULATING EXCEL spreadsheet with
// the results
}
}
I want to call this piece of code (which does not compile right now for obvious reasons) by supplying any ViewModel and be able to execute this code as if the ViewModel is generic within the routine. I need to know how to correctly code the above and also make the call to this routine with generic ViewModels (probably using a Lambda expression of some sort).
Basically, I want to be able to say "execute this SPROC, with these dynamic parameters and map the output to a ViewModel that is generic and return the results to the caller. Next time you're called it will be same logic, different SPROC, different parameters and ViewModel". I do NOT want extraneous comments that DO NOT directly answer this question. I know that I can add the parameter direction in a call the SQL SPROCs; but that knowledge is irrelevant to this question I am asking!
You can put your method in a generic class if you have other methods:
public class DbStuff<T>
{
public IList<T> ExecuteSQLSPROC(string SPROCName, SqlParameter[] parameters, IEnumerable<T> model)
{
using (IDbConnection connection = new System.Data.SqlClient.SqlConnection(DBHelper.CNNVal("MyDatabaseName")))
{
// No SqlCommand is needed here; Dapper creates and disposes its own command internally
DynamicParameters parms = new DynamicParameters();
for (int i = 0; i < parameters.Length; i++)
{
var parmname = parameters[i].ParameterName;
var parmvalue = parameters[i].Value;
// I KNOW I can put DIRECTION as a parameter variable but right now I don't want to!!
parms.Add(parmname, parmvalue);
}
var output = connection.Query<T>(SPROCName, parms, commandType: CommandType.StoredProcedure).ToList();
return output; // return the recordset to the caller - a List<> of one object/model is acceptable - NOT POPULATING EXCEL spreadsheet with
// the results
}
}
}
Then call it via:
var stuff = new DbStuff<MyViewModel>();
var results = stuff.ExecuteSQLSPROC("", blah, blah);
Or just make the method generic:
public IList<T> ExecuteSQLSPROC<T>(string SPROCName, SqlParameter[] parameters, IEnumerable<T> model)
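Put together, a minimal sketch of that generic method (DBHelper and the connection-string name are carried over from the question; the SqlCommand from the original snippet is omitted because Dapper builds its own command internally):
public IList<T> ExecuteSQLSPROC<T>(string SPROCName, SqlParameter[] parameters)
{
    using (IDbConnection connection = new System.Data.SqlClient.SqlConnection(DBHelper.CNNVal("MyDatabaseName")))
    {
        var parms = new DynamicParameters();
        foreach (var parameter in parameters)
        {
            // copy name/value pairs into Dapper's parameter bag
            parms.Add(parameter.ParameterName, parameter.Value);
        }
        return connection.Query<T>(SPROCName, parms,
            commandType: CommandType.StoredProcedure).ToList();
    }
}
Called as, for example: var results = ExecuteSQLSPROC<MyViewModel>("MySprocName", myParameters);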

Preparing statements and batching in npgsql

The Simple Preparation example in the docs (https://www.npgsql.org/doc/prepare.html#simple-preparation) shows an example where parameters are set after the command is prepared.
var cmd = new NpgsqlCommand(...);
cmd.Parameters.Add("param", NpgsqlDbType.Integer);
cmd.Prepare();
// Set parameters
cmd.ExecuteNonQuery();
// And so on
Questions
1) How are the parameters set?
2) Is it possible to use AddWithValue instead of Add, via the AddWithValue(String, NpgsqlDbType, Object) overload that specifies the NpgsqlDbType -- the docs say "setting the value isn't supported"?
3) How does this work if multiple statements exist in the same command? This answer (https://stackoverflow.com/a/53268090/10984827) shows that multiple statements in a single string can be prepared together, but it's not clear how this CommandText string is created.
Edit: I think I'm almost there, but I'm not sure how to create and execute the batched query string. Here's my naive attempt at building a batched query using a StringBuilder. This doesn't work. How do I do this correctly?
using System;
using System.Collections.Generic;
using System.Text;
using Npgsql;
using NpgsqlTypes;
class Model
{
public int value1 { get; }
public int value2 { get; }
public Model(int value1, int value2)
{
this.value1 = value1;
this.value2 = value2;
}
}
class Program
{
static void Main(string[] args)
{
var dataRows = new List<Model>();
dataRows.Add(new Model(3,2));
dataRows.Add(new Model(27,-10));
dataRows.Add(new Model(11,-11));
var connString = "Host=127.0.0.1;Port=5432;Username=postgres;Database=dbtest1";
// tabletest1
// ----------
// id SERIAL PRIMARY KEY
// , value1 INT NOT NULL
// , value2 INT NOT NULL
using (var conn = new NpgsqlConnection(connString))
{
conn.Open();
var cmd = new NpgsqlCommand();
cmd.Connection = conn;
cmd.CommandText = $"INSERT INTO tabletest1 (value1,value2) VALUES (#value1,#value2)";
var parameterValue1 = cmd.Parameters.Add("value1", NpgsqlDbType.Integer);
var parameterValue2 = cmd.Parameters.Add("value2", NpgsqlDbType.Integer);
cmd.Prepare();
var batchCommand = new StringBuilder();
foreach (var d in dataRows)
{
parameterValue1.Value = d.value1;
parameterValue2.Value = d.value2;
batchCommand.Append(cmd.CommandText);
batchCommand.Append(";");
}
Console.WriteLine(batchCommand.ToString());
// conn.ExecuteNonQuery(batchCommand.ToString());
}
}
}
1) Simply capture the NpgsqlParameter returned from Add(), and then set its Value property:
var p = cmd.Parameters.Add("p", NpgsqlDbType.Integer);
cmd.Prepare();
p.Value = 8;
cmd.ExecuteNonQuery();
2) You can use AddWithValue() in the same way, but if you're preparing the command in order to reuse it several times, that makes less sense. The idea is that you first add the parameter without a value, then prepare, then execute it several times, setting the value each time.
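For illustration, a one-off (non-reused) command could use the typed overload like this (a sketch; the value is fixed at add time, which is exactly what makes it a poor fit for prepare-and-reuse):
cmd.Parameters.AddWithValue("p", NpgsqlDbType.Integer, 8); // type and value in one call
cmd.Prepare();
cmd.ExecuteNonQuery();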
3) You can prepare a multi-statement command. As things work now, all statements in the command will share the same parameter list (which lives on NpgsqlCommand). So the same pattern holds: create your command with your SQL and parameters, prepare it, and then set parameter values and execute. Each individual statement within your command will run prepared, benefiting from the perf increase.
Here's an example with two-statement batching:
cmd.CommandText = "INSERT INTO tabletest1 (value1,value2) VALUES (#v1,#v2); INSERT INTO tabletest1 (value1, value2) VALUES (#v3,#v4)";
var v1 = cmd.Parameters.Add("v1", NpgsqlDbType.Integer);
var v2 = cmd.Parameters.Add("v2", NpgsqlDbType.Integer);
var v3 = cmd.Parameters.Add("v3", NpgsqlDbType.Integer);
var v4 = cmd.Parameters.Add("v4", NpgsqlDbType.Integer);
cmd.Prepare();
while (...) {
v1.Value = ...;
v2.Value = ...;
v3.Value = ...;
v4.Value = ...;
cmd.ExecuteNonQuery();
}
However, if the objective is to efficiently insert lots of data, consider using COPY instead - it will be faster than even batched inserts.
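To give a flavor of that (a sketch only, reusing conn and the dataRows list from the question), Npgsql's binary COPY API looks roughly like this:
using (var writer = conn.BeginBinaryImport(
    "COPY tabletest1 (value1, value2) FROM STDIN (FORMAT BINARY)"))
{
    foreach (var d in dataRows)
    {
        writer.StartRow();
        writer.Write(d.value1, NpgsqlDbType.Integer);
        writer.Write(d.value2, NpgsqlDbType.Integer);
    }
    writer.Complete(); // without Complete() the whole import is discarded
}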
Finally, to complete the picture, for INSERT statements specifically you can include more than one row in a single statement:
INSERT INTO tabletest1 (value1, value2) VALUES (1,2), (3,4)
You can also again parameterize the actual values, and prepare this command. This is similar to batching two INSERT statements, and should be faster (although still slower than COPY).
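A parameterized, preparable form of that two-row insert might look like this (the parameter names are illustrative, not prescribed by the question):
cmd.CommandText = "INSERT INTO tabletest1 (value1, value2) VALUES (@a1,@a2), (@b1,@b2)";
var a1 = cmd.Parameters.Add("a1", NpgsqlDbType.Integer);
var a2 = cmd.Parameters.Add("a2", NpgsqlDbType.Integer);
var b1 = cmd.Parameters.Add("b1", NpgsqlDbType.Integer);
var b2 = cmd.Parameters.Add("b2", NpgsqlDbType.Integer);
cmd.Prepare();
// set a1/a2 from one row and b1/b2 from the next, then call cmd.ExecuteNonQuery()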
Npgsql 6.0 added batching/pipelining.
Here is an updated example:
await using var connection = new NpgsqlConnection(connString);
await connection.OpenAsync();
var batch = new NpgsqlBatch(connection);
const int count = 10;
const string parameterName = "parameter";
for (int i = 0; i < count; i++)
{
var batchCommand = new NpgsqlBatchCommand($"SELECT @{parameterName} as value");
batchCommand.Parameters.Add(new NpgsqlParameter(parameterName, i));
batch.BatchCommands.Add(batchCommand);
}
await batch.PrepareAsync();
var results = new List<int>(count);
await using (var reader = await batch.ExecuteReaderAsync())
{
do
{
while (await reader.ReadAsync())
{
results.Add(await reader.GetFieldValueAsync<int>(reader.GetOrdinal("value")));
}
} while (await reader.NextResultAsync());
}
Console.WriteLine(string.Join(", ", results));

WebAPI EF update 30,000 rows of data is very slow

I am trying to have my ASP.NET Web API service read a .csv and update a database using Entity Framework. The .csv file is about 20,000-30,000 rows.
As of now I am using a TextFieldParser to read the .csv; for each row of the .csv file I create a new object, then add the object to the EF context.
Once it's done adding all rows to the context, I call db.SaveChanges();
Watching the console, I noticed it issues an update statement for each row... which takes a long time. Is there a better, more efficient way to accomplish this?
if (filetype == "xxx")
{
using (TextFieldParser csvReader = new TextFieldParser(downloadFolder + fileName))
{
csvReader.SetDelimiters(new string[] { "," });
csvReader.HasFieldsEnclosedInQuotes = true;
int rowCount = 1;
while (!csvReader.EndOfData)
{
string[] fieldData = csvReader.ReadFields();
//skip header row
if (rowCount != 1)
{
var t = new GMI_adatpos
{
PACCT = fieldData[3]
};
db.GMI_adatpos.Add(t);
}
rowCount++;
}
}
}
db.SaveChanges();
This issue is very common.
In your case, we can split it into two categories:
Add vs AddRange performance
Write & database round-trips
Add vs AddRange Performance
The Add method triggers change detection every time you add a new record, while AddRange does it only once. Detecting changes on every Add can take several minutes.
This issue is very easy to fix: simply create a list, add the entities to that list instead, and call AddRange with the list at the end.
List<GMI_adatpos> list = new List<GMI_adatpos>();
if (filetype == "xxx")
{
using (TextFieldParser csvReader = new TextFieldParser(downloadFolder + fileName))
{
csvReader.SetDelimiters(new string[] { "," });
csvReader.HasFieldsEnclosedInQuotes = true;
int rowCount = 1;
while (!csvReader.EndOfData)
{
string[] fieldData = csvReader.ReadFields();
//skip header row
if (rowCount != 1)
{
var t = new GMI_adatpos
{
PACCT = fieldData[3]
};
list.Add(t);
}
rowCount++;
}
}
}
db.GMI_adatpos.AddRange(list);
db.SaveChanges();
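Alternatively (a sketch assuming EF6's DbContext.Configuration API), you can keep calling Add and simply suspend automatic change detection around the loop:
db.Configuration.AutoDetectChangesEnabled = false; // Add no longer triggers DetectChanges
try
{
    foreach (var t in list)
    {
        db.GMI_adatpos.Add(t);
    }
    db.SaveChanges();
}
finally
{
    db.Configuration.AutoDetectChangesEnabled = true; // restore the default
}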
Write & Database Round-trips
Every time you save a record, you perform a database round-trip. So if you insert 30,000 records, you perform 30,000 database round-trips, which is insane!
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
You can either call BulkSaveChanges instead of SaveChanges, or create a list to insert and use BulkInsert directly for even more performance.
BulkSaveChanges Solution (way faster than SaveChanges)
db.GMI_adatpos.AddRange(list);
db.BulkSaveChanges();
BulkInsert Solution (faster than BulkSaveChanges, but does not save related entities)
db.BulkInsert(list);
Because the number of items added to the DbContext is very high, memory gradually fills up and the operation becomes very slow. It is therefore better to call SaveChanges and renew the DbContext after every few records (e.g. 100).
if (filetype == "xxx")
{
using (TextFieldParser csvReader = new TextFieldParser(downloadFolder + fileName))
{
csvReader.SetDelimiters(new string[] { "," });
csvReader.HasFieldsEnclosedInQuotes = true;
int rowCount = 1;
while (!csvReader.EndOfData)
{
if(rowCount % 100 == 0)
{
db.SaveChanges(); // flush the current batch before disposing the context
db.Dispose();
db = new AppDbContext();//Your DbContext
}
string[] fieldData = csvReader.ReadFields();
//skip header row
if (rowCount != 1)
{
var t = new GMI_adatpos
{
PACCT = fieldData[3]
};
db.GMI_adatpos.Add(t);
}
rowCount++;
}
}
}
db.SaveChanges(); // save any rows remaining from the last partial batch

Salesforce SOQL: query to fetch all the fields on an entity

I was going through the SOQL documentation, but couldn't find a query to fetch all the field data of an entity, say Account, like:
select * from Account [ SQL syntax ]
Is there a syntax like the above in SOQL to fetch all the data of Account, or is the only way to list all the fields (though there are a lot of fields to be queried)?
Create a map like this:
Map<String, Schema.SObjectField> fldObjMap = schema.SObjectType.Account.fields.getMap();
List<Schema.SObjectField> fldObjMapValues = fldObjMap.values();
Then you can iterate through fldObjMapValues to create a SOQL query string:
String theQuery = 'SELECT ';
for(Schema.SObjectField s : fldObjMapValues)
{
String theLabel = s.getDescribe().getLabel(); // Perhaps store this in another map
String theName = s.getDescribe().getName();
Schema.DisplayType theType = s.getDescribe().getType(); // Perhaps store this in another map
// Continue building your dynamic query string
theQuery += theName + ',';
}
// Trim last comma
theQuery = theQuery.substring(0, theQuery.length() - 1);
// Finalize query string
theQuery += ' FROM Account WHERE ... AND ... LIMIT ...';
// Make your dynamic call
Account[] accounts = Database.query(theQuery);
superfell is correct, there is no way to directly do a SELECT *. However, this little code recipe will work (well, I haven't tested it, but I think it looks OK). Understandably, Force.com wants a multi-tenant architecture where resources are only provisioned as explicitly needed, not eagerly by doing SELECT * when usually only a subset of fields is actually needed.
You have to specify the fields. If you want to build something dynamic, the describeSObject call returns the metadata about all the fields for an object, so you can build the query from that.
I use the Force.com Explorer: within the schema filter you can click the checkbox next to the table name and it will select all the fields and insert them into your query window. I use this as a shortcut to typing it all out; just copy and paste from the query window. Hope this helps.
In case anyone was looking for a C# approach, I was able to use reflection and come up with the following:
public IEnumerable<String> GetColumnsFor<T>()
{
return typeof(T).GetProperties(System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance)
.Where(x => !Attribute.IsDefined(x, typeof(System.Xml.Serialization.XmlIgnoreAttribute))) // Exclude the ignored properties
.Where(x => x.DeclaringType != typeof(sObject)) // & Exclude inherited sObject propert(y/ies)
.Where(x => x.PropertyType.Namespace != typeof(Account).Namespace) // & Exclude properties storing references to other objects
.Select(x => x.Name);
}
It appears to work for the objects I've tested (and matches the columns generated by the API test). From there, it's about creating the query:
/* assume: this.service = new sForceService(); */
public IEnumerable<T> QueryAll<T>(params String[] columns)
where T : sObject
{
String soql = String.Format("SELECT {0} FROM {1}",
String.Join(", ", GetColumnsFor<T>()),
typeof(T).Name
);
this.service.QueryOptionsValue = new QueryOptions
{
batchsize = 250,
batchSizeSpecified = true
};
ICollection<T> results = new HashSet<T>();
try
{
Boolean done = false;
QueryResult queryResult = this.service.queryAll(soql);
while (!done)
{
sObject[] records = queryResult.records;
foreach (sObject record in records)
{
T entity = record as T;
if (entity != null)
{
results.Add(entity);
}
}
done = queryResult.done;
if (!done)
{
queryResult = this.service.queryMore(queryResult.queryLocator);
}
}
}
catch (Exception ex)
{
throw; // your exception handling
}
return results;
}
Today was my first time with Salesforce, and I came up with this in Java:
/**
* @param o any class that extends {@link SObject}, f.ex. Opportunity.class
* @return a list of all the objects of this type
*/
@SuppressWarnings("unchecked")
public <O extends SObject> List<O> getAll(Class<O> o) throws Exception {
// get the objectName; for example "Opportunity"
String objectName = o.getSimpleName();
// this will give us all the possible fields of this type of object
DescribeSObjectResult describeSObject = connection.describeSObject(objectName);
// making the query
String query = "SELECT ";
for (Field field : describeSObject.getFields()) { // add all the fields in the SELECT
query += field.getName() + ',';
}
// trim last comma
query = query.substring(0, query.length() - 1);
query += " FROM " + objectName;
SObject[] records = connection.query(query).getRecords();
List<O> result = new ArrayList<O>();
for (SObject record : records) {
result.add((O) record);
}
return result;
}
I used the following to get complete records:
query_all("Select Id, Name From User_Profile__c")
To get all fields of a record, you have to list those fields, as described here:
https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql_select.htm
Hope this helps!

Is it possible to use `SqlDbType.Structured` to pass Table-Valued Parameters in NHibernate?

I want to pass a collection of ids to a stored procedure that will be mapped using NHibernate. This technique was introduced in SQL Server 2008 (more info here => Table-Valued Parameters). I just don't want to pass multiple ids within an nvarchar parameter and then chop its value up on the SQL Server side.
My first, ad hoc, idea was to implement my own IType.
public class Sql2008Structured : IType {
private static readonly SqlType[] x = new[] { new SqlType(DbType.Object) };
public SqlType[] SqlTypes(NHibernate.Engine.IMapping mapping) {
return x;
}
public bool IsCollectionType {
get { return true; }
}
public int GetColumnSpan(NHibernate.Engine.IMapping mapping) {
return 1;
}
public void NullSafeSet(DbCommand st, object value, int index, NHibernate.Engine.ISessionImplementor session) {
var s = st as SqlCommand;
if (s != null) {
s.Parameters[index].SqlDbType = SqlDbType.Structured;
s.Parameters[index].TypeName = "IntTable";
s.Parameters[index].Value = value;
}
else {
throw new NotImplementedException();
}
}
#region IType Members...
#region ICacheAssembler Members...
}
No other methods are implemented; all the rest simply contain a throw new NotImplementedException(). Next, I created a simple extension for IQuery.
public static class StructuredExtensions {
private static readonly Sql2008Structured structured = new Sql2008Structured();
public static IQuery SetStructured(this IQuery query, string name, DataTable dt) {
return query.SetParameter(name, dt, structured);
}
}
Typical usage for me is:
DataTable dt = ...;
ISession s = ...;
var l = s.CreateSQLQuery("EXEC some_sp @id = :id, @par1 = :par1")
.SetStructured("id", dt)
.SetParameter("par1", ...)
.SetResultTransformer(Transformers.AliasToBean<SomeEntity>())
.List<SomeEntity>();
Ok, but what is an "IntTable"? It's the name of SQL type created to pass table value arguments.
CREATE TYPE IntTable AS TABLE
(
ID INT
);
And some_sp could look like:
CREATE PROCEDURE some_sp
@id IntTable READONLY,
@par1 ...
AS
BEGIN
...
END
It only works with SQL Server 2008 or later, of course, and in this particular implementation only with a single-column DataTable.
var dt = new DataTable();
dt.Columns.Add("ID", typeof(int));
It's a POC only, not a complete solution, but it works and might be useful when customized. If someone knows a better/shorter solution, let us know.
A simpler solution than the accepted answer would be to use ADO.NET. NHibernate allows users to enlist IDbCommands into NHibernate transactions.
DataTable myIntsDataTable = new DataTable();
myIntsDataTable.Columns.Add("ID", typeof(int));
// ... Add rows to DataTable
ISession session = sessionFactory.OpenSession();
using(ITransaction transaction = session.BeginTransaction())
{
IDbCommand command = new SqlCommand("StoredProcedureName");
command.Connection = session.Connection;
command.CommandType = CommandType.StoredProcedure;
var parameter = new SqlParameter();
parameter.ParameterName = "IntTable";
parameter.SqlDbType = SqlDbType.Structured;
parameter.Value = myIntsDataTable;
command.Parameters.Add(parameter);
session.Transaction.Enlist(command);
command.ExecuteNonQuery();
transaction.Commit(); // commit explicitly; otherwise the transaction is rolled back on dispose
}
In my case, the stored procedure needs to be called in the middle of an open transaction.
If there is an open transaction, this code works because it automatically reuses the existing transaction of the NHibernate session:
NHibernateSession.GetNamedQuery("SaveStoredProc")
.SetInt64("spData", 500)
.ExecuteUpdate();
However, for my new stored procedure, the parameter is not as simple as an Int64. It's a table-valued parameter (a user-defined table type).
My problem is that I cannot find the proper Set function.
I tried SetParameter("spData", tvpObj), but it returns this error:
Could not determine a type for class: …
Anyway, after some trial and error, the approach below seems to work.
The Enlist() function is the key in this approach. It basically tells the SqlCommand to use the existing transaction. Without it, there will be an error saying:
ExecuteNonQuery requires the command to have a transaction when the connection assigned to the command is in a pending local transaction…
using (SqlCommand cmd = NHibernateSession.Connection.CreateCommand() as SqlCommand)
{
cmd.CommandText = "MyStoredProc";
NHibernateSession.Transaction.Enlist(cmd); // Because there is a pending transaction
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add(new SqlParameter("@wiData", SqlDbType.Structured) { Value = wiSnSqlList });
int affected = cmd.ExecuteNonQuery();
}
Since I am using the SqlParameter class with this approach, SqlDbType.Structured is available.
This is the function that transforms wiSnList into the stream of SqlDataRecords (the wiSnSqlList passed above):
private IEnumerable<SqlDataRecord> TransformWiSnListToSql(IList<SHWorkInstructionSnapshot> wiSnList)
{
if (wiSnList == null)
{
yield break;
}
var schema = new[]
{
new SqlMetaData("OriginalId", SqlDbType.BigInt), //0
new SqlMetaData("ReportId", SqlDbType.BigInt), //1
new SqlMetaData("Description", SqlDbType.DateTime), //2
};
SqlDataRecord row = new SqlDataRecord(schema);
foreach (var wi in wiSnList)
{
row.SetSqlInt64(0, wi.OriginalId);
row.SetSqlInt64(1, wi.ShiftHandoverReportId);
if (wi.Description == null)
{
row.SetDBNull(2);
}
else
{
row.SetSqlString(2, wi.Description);
}
yield return row;
}
}
You can pass collections of values without the hassle.
Example:
var ids = new[] {1, 2, 3};
var query = session.CreateQuery("from Foo where id in (:ids)");
query.SetParameterList("ids", ids);
NHibernate will create a parameter for each element.
