SQL Server refusing to cache plan for a fixed-length parameterized IN clause - sql-server

Using .NET 4.0, I have defined the following SqlCommand. When I execute it multiple times consecutively without making any changes, SQL Server refuses to cache the query plan.
string[] colors = new string[] { "red", "blue", "yellow", "green" };
string cmdText = "SELECT * FROM ColoredProducts WHERE Color IN ({0})";
string[] paramNames = colors.Select(
    (s, i) => "@color" + i.ToString()
).ToArray();
string inClause = string.Join(",", paramNames);
using (SqlCommand cmd = new SqlCommand(string.Format(cmdText, inClause))) {
    for (int i = 0; i < paramNames.Length; i++) {
        cmd.Parameters.AddWithValue(paramNames[i], colors[i]);
    }
    //Execute query here
}
I know it's refusing to cache the plan because the following query ran in a fraction of the time after consecutive runs:
string[] colors = new string[] { "red", "blue", "yellow", "green" };
string cmdText = "SELECT * FROM ColoredProducts WHERE Color IN ({0})";
string inClause = string.Join(",", colors);
using (SqlCommand cmd = new SqlCommand(string.Format(cmdText, inClause))) {
//Execute query here
}
In my actual test case the parameter list is fixed at a size of exactly 2000. The scenario I am attempting to optimize is selecting a specific set of 2000 records from a very large table. I would like the query to be as fast as possible, so I really want it to be cached.
Sleepy post Edit:
The question is: why wouldn't this plan get cached? And yes, I have confirmed that the query is not in the cache using sys.dm_exec_cached_plans and sys.dm_exec_sql_text.
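For reference, this is roughly how I check the cache from C# (a minimal sketch; it assumes a connectionString variable and VIEW SERVER STATE permission, and the LIKE filter is just an illustration):
// Hedged sketch: list cached plans whose text mentions the table.
// Assumes "connectionString" is defined and the login has VIEW SERVER STATE.
string dmvSql =
    "SELECT cp.usecounts, cp.objtype, st.text " +
    "FROM sys.dm_exec_cached_plans AS cp " +
    "CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st " +
    "WHERE st.text LIKE '%ColoredProducts%' " +
    "AND st.text NOT LIKE '%dm_exec_cached_plans%';";
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand check = new SqlCommand(dmvSql, conn))
{
    conn.Open();
    using (SqlDataReader rdr = check.ExecuteReader())
    {
        while (rdr.Read())
        {
            Console.WriteLine("{0} use(s), {1}", rdr.GetInt32(0), rdr.GetString(1));
        }
    }
}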

Here is an idea using a table-valued parameter. Please let us know if this approach performs better than your huge string. There are other ideas too, but this is the closest to treating your set of colors as an array.
In SQL Server:
CREATE TYPE dbo.Colors AS TABLE
(
Color VARCHAR(32) -- be precise here! Match ColoredProducts.Color
PRIMARY KEY
);
GO
CREATE PROCEDURE dbo.MatchColors
@colors AS dbo.Colors READONLY
AS
BEGIN
SET NOCOUNT ON;
SELECT cp.* -- use actual column names please!
FROM dbo.ColoredProducts AS cp -- always use schema prefix
INNER JOIN @colors AS c
ON cp.Color = c.Color;
END
GO
Now in C#:
DataTable tvp = new DataTable();
tvp.Columns.Add(new DataColumn("Color"));
tvp.Rows.Add("red");
tvp.Rows.Add("blue");
tvp.Rows.Add("yellow");
tvp.Rows.Add("green");
// ...
using (connectionObject)
{
SqlCommand cmd = new SqlCommand("dbo.MatchColors", connectionObject);
cmd.CommandType = CommandType.StoredProcedure;
SqlParameter tvparam = cmd.Parameters.AddWithValue("#colors", tvp);
tvparam.SqlDbType = SqlDbType.Structured;
// execute query here
}
I can almost guarantee this will perform better than an IN list with a large number of parameters, regardless of the length of the actual string in your C# code.
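If it helps, here is a hedged sketch of what the "execute query here" placeholder could look like end to end; connectionString and the output handling are assumptions, so adapt them to your ColoredProducts schema:
// Hypothetical completion of the placeholder above; adjust names to your schema.
using (SqlConnection connectionObject = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("dbo.MatchColors", connectionObject))
{
    cmd.CommandType = CommandType.StoredProcedure;
    SqlParameter tvparam = cmd.Parameters.AddWithValue("@colors", tvp);
    tvparam.SqlDbType = SqlDbType.Structured;

    connectionObject.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // "Color" comes from the proc's SELECT; other columns depend on your table.
            Console.WriteLine(reader["Color"]);
        }
    }
}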

Related

best solution for multiple insert update solution

I'm a beginner struggling to understand C# & Npgsql. Following the code examples:
// Insert some data
using (var cmd = new NpgsqlCommand())
{
    cmd.Connection = conn;
    cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p)";
    cmd.Parameters.AddWithValue("p", "Hello world");
    cmd.ExecuteNonQuery();
}
The syntax for more than one insert & update statement like this is clear so far:
cmd.CommandText = "INSERT INTO data (some_field) VALUES (#p);INSERT INTO data1...;INSERT into data2... and so on";
But what is the right solution for a loop which should handle one statement within.
This does not work:
// Insert some data
using (var cmd = new NpgsqlCommand())
{
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        cmd.Connection = conn;
        cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p)";
        cmd.Parameters.AddWithValue("p", s);
        cmd.ExecuteNonQuery();
    }
}
It seems the parameter values are "concatenated" or remembered. I cannot see any way to "clear" the existing cmd object.
My second idea would be to wrap the whole "using" block inside the loop, but then every cycle would create a new object. That seems ugly to me.
So what is the best solution for my problem?
To insert lots of rows efficiently, take a look at Npgsql's bulk copy feature - the API is more suitable (and more efficient) for inserting large numbers of rows than concatenating INSERT statements into a batch like you're trying to do.
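For reference, a minimal sketch of the bulk copy (binary COPY) route, assuming Npgsql 4.0 or later and the same data (some_field) table from your example:
// Hedged sketch of Npgsql binary COPY; assumes Npgsql 4.0+ (where Complete() is required).
using (var importer = conn.BeginBinaryImport("COPY data (some_field) FROM STDIN (FORMAT BINARY)"))
{
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        importer.StartRow();
        importer.Write(s, NpgsqlTypes.NpgsqlDbType.Text);
    }
    importer.Complete(); // commits the COPY; rows are discarded without it
}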
If you want to rerun the same SQL with changing parameter values, you can do the following:
using (var cmd = new NpgsqlCommand("INSERT INTO data (some_field) VALUES (#p)", conn))
{
var p = new NpgsqlParameter("p", DbType.String); // Adjust DbType according to type
cmd.Parameters.Add(p);
cmd.Prepare(); // This is optional but will optimize the statement for repeated use
foreach(var s in SomeStringCollectionOrWhatever)
{
p.Value = s;
cmd.ExecuteNonQuery();
}
}
If you need lots of rows and performance is key, then I would recommend Npgsql's bulk copy capability as @Shay mentioned. But if you are looking for a quick way to do this without bulk copy, I would recommend using Dapper.
Consider the example below.
Let's say you have a class called Event and a list of events to add.
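For reference, a minimal sketch of what that Event POCO could look like; the property names must match the @EventId / @EventName placeholders used in the SQL below.
public class Event
{
    // Property names must match the SQL placeholder names for Dapper to bind them.
    public int EventId { get; set; }
    public string EventName { get; set; }
}
And the list of events to add: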
List<Event> eventsToInsert = new List<Event>
{
new Event() { EventId = 1, EventName = "Bday1" },
new Event() { EventId = 2, EventName = "Bday2" },
new Event() { EventId = 3, EventName = "Bday3" }
};
The snippet that adds the list to the DB is shown below.
var sqlInsert = "Insert into events( eventid, eventname ) values (#EventId, #EventName)";
using (IDbConnection conn = new NpgsqlConnection(cs))
{
conn.Open();
// Execute is an extension method supplied by Dapper
// This will add all the entries in the eventsToInsert list and match the values
// based on property name. The only caveat is that the POCO's property names must
// match the placeholder names in the SQL statement.
conn.Execute(sqlInsert, eventsToInsert);
// If we want to retrieve the data back into the list
List<Event> eventsAdded;
// This Dapper extension returns an IEnumerable, so I cast it to a List.
eventsAdded = conn.Query<Event>("Select * from events").ToList();
foreach( var row in eventsAdded)
{
Console.WriteLine($"{row.EventId} {row.EventName} was added");
}
}
-HTH

SSIS - Sending different data to different email recipients

I'm new to SSIS.
I have a table with 1500 rows and I need to send emails from that table, but each recipient has 15 rows in the table.
So I need to send different data to different email addresses from the same query.
Can you guys help me, please?
Thanks in advance.
Leo
-------------------update------------------------
Guys, I was able to create an SSIS package that sends email to different recipients. The problem is, for example: two different users are each receiving one email per row they have in the database. That's terrible; if a customer has 15 rows, they get 15 emails. Can I send just one email per customer containing all of their data?
Thanks in advance...
This is going to vary somewhat based on the query and other specifications, but at a high level you're probably going to want to follow these steps for sending the emails using SSIS. This example assumes that the email addresses are stored in a column within this table. As others have pointed out, using sp_send_dbmail will likely be your best option.
Create two string variables. One will hold the email addresses and the other will be for the SQL for sp_send_dbmail (more on this below). Create an additional variable of the object type that will hold the list of emails during execution.
Modify the string variable that will hold the SQL for sp_send_dbmail to be an expression using the variable with the email names. Depending on the query, you may need to add additional variables for other parameters in this query. An example of this variable is at the end of this post.
Have an initial Execute SQL Task that queries the table and retrieves the email addresses. Make sure to get all rows for each email. Set the ResultSet property to full and on the Result Set pane, add the object variable with 0 as the Result Name.
Next add a Foreach Loop, use the Foreach ADO Enumerator type, and select the object variable from the last Execute SQL Task for the source variable. The Enumeration Mode can be left as the "Rows in the first table" option.
On the Variable Mappings pane, add the string variable (for the email addresses) and set the index to 0. This will hold the email addresses for each execution of sp_send_dbmail.
Within the Foreach Loop, add an Execute SQL Task. For this, you will need to set the SQLSourceType to variable and use a variable holding the SQL with sp_send_dbmail.
Make sure that you have Database Mail properly configured for the account and profile used, including membership in the DatabaseMailUserRole role in msdb. You may also need to use the three-part name (database.schema.table) for your table.
Example SQL Variable Expression:
Note the double-quotes in the @query parameter around the email variable, in addition to the quotes from concatenating the expression. You can either use two single quotes or precede a double-quote with a \ in the query to use a double-quote as part of the expression.
"DECLARE #Title varchar(100)
SET #Title = 'Email Title'
EXEC MSDB.DBO.SP_SEND_DBMAIL #profile_name = 'Your Profile',
#recipients = 'YourEmail#test.org',
#query = 'SELECT * FROM YourDatabase.YourSchema.YourTable WHERE EmailColumn = ""
+ #[User::VariableWithEmailAddress] + ""',
#query_result_no_padding = 1, #subject = #Title ; "
I have a package whose sole role is to send emails from my packages and record the results into a table. I use this package over and over from any package that sends mail.
It is simply a script task that takes parameters and does the work:
The script to process:
public void Main()
{
//Read variables
#region ReadVariables
string cstr = Dts.Variables["connString"].Value.ToString();
//string sender = (string)Dts.Variables["User::Sender"].Value;
string title = (string)Dts.Variables["$Package::Title"].Value;
string priority = (string)Dts.Variables["$Package::Priority"].Value;
string body = (string)Dts.Variables["$Package::Body"].Value;
string source = Dts.Variables["$Package::Source"].Value.ToString();
string directTo = Dts.Variables["$Package::DirectMail"].Value.ToString();
string groups = Dts.Variables["$Package::MailGroups"].Value.ToString();
#endregion
//Send Email
#region SendMail
MailMessage mail = new MailMessage();
//mail.From = new MailAddress(sender);
mail.Subject = title;
mail.Body = body;
mail.IsBodyHtml = true;
switch(priority.ToUpper())
{
case "HIGH":
mail.Priority= MailPriority.High;
priority = "High";
break;
default:
mail.Priority=MailPriority.Normal;
priority = "Normal";
break;
}
DataTable dt = new DataTable(); //This is going to be a full distribution list
//Fill table with group email
if (groups.Split(',').Length > 0)
{
foreach (string group in groups.Split(','))
{
string strCmd = "mail.spGetEmailAddressesByGroup";
using (OleDbConnection conn = new OleDbConnection(cstr))
{
using (OleDbCommand cmd = new OleDbCommand(strCmd, conn))
{
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.AddWithValue("A", group);
OleDbDataAdapter da = new OleDbDataAdapter(cmd);
da.Fill(dt);
}
}
}
}
//add the directs to email
if (directTo.Split(',').Length > 0)
{
foreach (string m in directTo.Split(','))
{
if (m != "")
{
DataRow dr = dt.NewRow();
dr[0] = "TO";
dr[1] = m;
dt.Rows.Add(dr);
}
}
}
//Add from and reply to defaults
DataRow dr2 = dt.NewRow();
dr2[0] = "REPLYTO";
dr2[1] = ""; //WHERE DO YOU WANT REPLIES
dt.Rows.Add(dr2);
DataRow dr3 = dt.NewRow();
dr3[0] = "FROM";
dr3[1] = ""; //ENTER WHO YOU WANT THE EMAIL TO COME FROM
dt.Rows.Add(dr3);
//Bind dt to mail
foreach (DataRow dr in dt.Rows)
{
switch (dr[0].ToString().ToUpper())
{
case "TO":
mail.To.Add(new MailAddress(dr[1].ToString()));
dr[0] = "To";
break;
case "CC":
mail.CC.Add(new MailAddress(dr[1].ToString()));
dr[0] = "Cc";
break;
case "BCC":
mail.Bcc.Add(new MailAddress(dr[1].ToString()));
dr[0] = "Bcc";
break;
case "REPLYTO":
mail.ReplyToList.Add(new MailAddress(dr[1].ToString()));
dr[0] = "ReplyTo";
break;
case "FROM":
mail.From = new MailAddress(dr[1].ToString());
dr[0] = "From";
break;
case "SENDER":
mail.Sender = new MailAddress(dr[1].ToString());
dr[0] = "Sender";
break;
default:
dr[0] = "NotSent";
break;
}
}
try
{
SmtpClient smtp = new SmtpClient();
smtp.Port = 25;
smtp.DeliveryMethod = SmtpDeliveryMethod.Network;
smtp.UseDefaultCredentials = false;
smtp.Host = ""; //ENTER YOUR IP / SERVER
smtp.Send(mail);
}
catch (Exception e)
{
}
#endregion
//Record email as sent //I WILL NOT BE PROVIDING THIS PART
//#region RecordEmailInDB
}
That's just for sending mail; I have many packages that build the emails to send. Most of the work is mapping variables to parameters on the call. The most complicated part is building the email body, and this is where your specific question comes into play.
This is a sample control flow:
There's a data flow that queries the details that need to be sent and records them into an object variable, along with a record count.
Back in the control flow, there is a precedence constraint set to rowcount > 0.
Then I have a script task to build the body, and a class that converts the ADO object into an HTML table.
public string BuildHTMLTablefromDataTable(DataTable t)
{
System.Text.StringBuilder sb = new System.Text.StringBuilder();
sb.Append("<table border='1'><tr style='background-color: #1A5276; color:#FFFFFF;'>");
foreach (DataColumn c in t.Columns)
{
sb.Append("<th align='left'>");
sb.Append(c.ColumnName);
sb.Append("</th>");
}
sb.Append("</tr>");
int rc = 0;
foreach (DataRow r in t.Rows)
{
rc++;
//every other row switches from white to gray
string OpeningTR = "<tr style='background-color: " + ((rc % 2 == 1) ? "#E5E7E9;'>" : "#FCF3CF;'>");
sb.Append(OpeningTR);
foreach (DataColumn c in t.Columns)
{
sb.Append("<td align='left'>");
sb.Append(System.Web.HttpUtility.HtmlEncode(
r[c.ColumnName] == null ? String.Empty : r[c.ColumnName].ToString()
)); //This handles any invalid characters and converts null to an empty string
sb.Append("</td>");
}
sb.Append("</tr>");
}
sb.Append("</table>");
return sb.ToString();
}
public string BuildBody(DataTable dt)
{
string body = "<P>The following are vouchers that are not in the voucher table but in the GL:</p>";
DataView v = new DataView(dt);
body += BuildHTMLTablefromDataTable(dt); //v.ToTable(true, "Name", "LastVisit", "DaysUntilTimeout", "ExpDate", "RoleName"));
return body;
}
public void Main()
{
#region Read Variables
System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
DataTable dt = new DataTable();
da.Fill(dt, Dts.Variables["User::Changes"].Value);
#endregion
string body = BuildBody(dt);
Dts.Variables["User::Body"].Value = body;
Dts.TaskResult = (int)ScriptResults.Success;
}
Finally I will call the SendMail package and pass the parameters.
For your purpose, you will need a foreach loop around this package and to adjust your WHERE clause for the person on each pass.
This is an example of an email sent (Body only):

Unsure which join to use with the following SQL code

I have 2 tables. I want to insert some values into one table. The fields I am inserting are Ingredient_Name, Ingredient_Amount and Recipe_ID.
Ingredient (Table 1)
Ingredient_Name|Ingredient_Amount|Recipe_ID
---------------|-----------------|--------- <---- Insert into here
Recipe (Table 2)
Recipe_Name|Recipe_ID
yummyRecipe|----1---- <-----Recipe_ID stored here
The form I am using has a comboBox which lists all Recipe_Names. So when I go to insert a row into Ingredient, I need to fetch the Recipe_ID from the Recipe table for the Recipe_Name I have selected in the comboBox, then use this Recipe_ID for the ID in the Ingredient table.
I am not very familiar with JOINs and unsure how to work out which one to use, or whether I need one at all. Any help or ideas?
Sorry if this is too long-winded.
Recipe ComboBox Code
SqlConnection con = new SqlConnection(#"Data Source=(LocalDB)\v11.0; AttachDbFilename=C:\Users\Donald\Documents\Visual Studio 2013\Projects\DesktopApplication\DesktopApplication\Student_CB.mdf ;Integrated Security=True");
con.Open();
try
{
SqlDataAdapter da = new SqlDataAdapter("Select * FROM Recipe", con);
DataTable dt = new DataTable();
da.Fill(dt);
for (int i = 0; i < dt.Rows.Count; i++)
{
recipeCombo.Items.Add(dt.Rows[i]["Recipe_Name"]);
}
dt.Clear();
}
catch (Exception e)
{
MessageBox.Show(e.Message);
}
con.Close();
You can set the ComboBox items directly through the DataSource property and control which field is displayed using the DisplayMember property. Together with the ValueMember property, you could write:
using(SqlConnection con = new SqlConnection(....))
{
con.Open();
try
{
SqlDataAdapter da = new SqlDataAdapter("Select * FROM Recipe", con);
DataTable dt = new DataTable();
da.Fill(dt);
recipeCombo.DataSource = dt;
recipeCombo.DisplayMember = "Recipe_Name";
recipeCombo.ValueMember = "Recipe_ID";
}
catch (Exception e)
{
MessageBox.Show(e.Message);
}
}
Now in the ComboBox_SelectedIndexChanged event (or wherever you need to know the RecipeID) you just have to write:
if(recipeCombo.SelectedItem != null)
{
int recipeID = Convert.ToInt32(recipeCombo.SelectedValue);
// ... and use your value for the insert without any JOIN
}
At whichever point you need it (for example, in a SAVE button click event), add the following INSERT:
if (recipeCombo.SelectedItem == null)
{
    // ... error message and return ...
}
else
{
    string sql = @"INSERT INTO Ingredient
        (Ingredient_Name, Ingredient_Amount, Recipe_ID)
        VALUES (@IngredientName, @IngredientAmount, @RecipeID)";
    using (var cmd = new SqlCommand(sql, con))
    {
        cmd.Parameters.Add("@IngredientName", SqlDbType.NVarChar).Value = ingredientTxt.Text;
        cmd.Parameters.Add("@IngredientAmount", SqlDbType.Int).Value = Convert.ToInt32(ingredientAmount.Text);
        cmd.Parameters.Add("@RecipeID", SqlDbType.Int).Value = Convert.ToInt32(recipeCombo.SelectedValue);
        cmd.ExecuteNonQuery();
    }
}
PS: Do not use AddWithValue - it is a shortcut with a lot of problems.
You do not need a JOIN in your case because you have only one table "Recipe" that contains the data you need to find "Recipe_ID". JOINs are used to "join" two tables.
If "Recipe_Name" is identical you can select the "Recipe_ID" where the "Recipe_Name" is equal to the selected value from the combobox then insert the new row to the "Ingredient" table.
INSERT INTO Ingredient SELECT @Ingredient_Name, @Ingredient_Amount, Recipe_ID FROM Recipe WHERE Recipe_Name = @myComboboxSelectedValue
Note: In this case Recipe_ID is redundant because you can remove it from your database and use Recipe_Name instead.
If "Recipe_Name" is not identical so you will need to fetch "Recipe_ID" with it and store it in the code-behind (if you do not want to show it to the user) and use it in your insert query.
By the way:
1. Whether using MySQL or SQL Server, the solution is the same, so "using .mdf" in the title of the question is irrelevant.
2. ".mdf" is the file extension for SQL Server databases.

Multiple Selects In Query Not Giving All Result Sets

In my code there is a loop that gives the results of an SQL query using SqlCommand.
However, for some queries that I need to run there are multiple select statements in the query. For example, this might be what the entire statement would look like:
Dim query as string = "
Select * from people
Select * from places
Select * from items
Select * from foods"
cmd = New SqlCommand(query, connect)
cmd.Connection.Open()
reader = cmd.ExecuteReader
While reader.HasRows()
    'various logic
    While reader.Read()
        'Do Logic Here
    End While
End While
When my query is run I get the results of the first two SELECTs, but since the third one has no results it kicks the application out of the loop and I never get the results of the fourth SELECT. I need the results of the fourth SELECT as well.
Edit: Union will not work for this case because I need to be able to differentiate between result sets in my logic.
static void Main(string[] args)
{
cmd = new SqlCommand("zp_multiple_results", connect);
cmd.Connection.Open();
reader = cmd.ExecuteReader();
do
{
if (reader.HasRows)
while (reader.Read())
{
Console.WriteLine(reader[0].ToString());
}
}
while (reader.NextResult());
cmd.Connection.Close();
cmd.Connection.Dispose();
cmd.Dispose();
Console.ReadLine();
}
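Since you mentioned needing to differentiate between the result sets, here is a hedged variation of the loop above that tracks which SELECT it is on; the names and their order are assumptions matching your four SELECT statements:
// Hedged sketch: track the result-set position so the logic can branch per SELECT.
string[] resultSetNames = { "people", "places", "items", "foods" }; // assumed order
int resultSetIndex = 0;
using (var cmd = new SqlCommand(query, connect)) // query/connect as in your code
{
    cmd.Connection.Open();
    using (var reader = cmd.ExecuteReader())
    {
        do
        {
            while (reader.Read())
            {
                // Branch on resultSetIndex (or resultSetNames[resultSetIndex]) here.
                Console.WriteLine("{0}: {1}", resultSetNames[resultSetIndex], reader[0]);
            }
            resultSetIndex++;
        }
        while (reader.NextResult());
    }
}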

Pass Dictionary<string,int> to Stored Procedure T-SQL

I have an MVC application. In an action I have a Dictionary<string,int>. The Key is an ID and the Value is a sortOrderNumber. I want to create a stored procedure that takes each key (ID), finds that record in the database, and saves the value from the Dictionary into the orderNumber column. I want to call the stored procedure once and pass all the data to it, instead of calling it many times to update the data.
Have you any ideas?
Thanks!
The accepted answer of using a TVP is generally correct, but needs some clarification based on the amount of data being passed in. Using a DataTable is fine (not to mention quick and easy) for smaller sets of data, but for larger sets it does not scale given that it duplicates the dataset by placing it in the DataTable simply for the means of passing it to SQL Server. So, for larger sets of data there is an option to stream the contents of any custom collection. The only real requirement is that you need to define the structure in terms of SqlDb types and iterate through the collection, both of which are fairly trivial steps.
A simplistic overview of the minimal structure is shown below, which is an adaptation of the answer I posted on How can I insert 10 million records in the shortest time possible?, which deals with importing data from a file and is hence slightly different as the data is not currently in memory. As you can see from the code below, this setup is not overly complicated yet highly flexible as well as efficient and scalable.
SQL object # 1: Define the structure
-- First: You need a User-Defined Table Type
CREATE TYPE dbo.IDsAndOrderNumbers AS TABLE
(
ID NVARCHAR(4000) NOT NULL,
SortOrderNumber INT NOT NULL
);
GO
SQL object # 2: Use the structure
-- Second: Use the UDTT as an input param to an import proc.
-- Hence "Tabled-Valued Parameter" (TVP)
CREATE PROCEDURE dbo.ImportData (
@ImportTable dbo.IDsAndOrderNumbers READONLY
)
AS
SET NOCOUNT ON;
-- maybe clear out the table first?
TRUNCATE TABLE SchemaName.TableName;
INSERT INTO SchemaName.TableName (ID, SortOrderNumber)
SELECT tmp.ID,
tmp.SortOrderNumber
FROM @ImportTable tmp;
-- OR --
-- some other T-SQL
-- optional return data
SELECT @NumUpdates AS [RowsUpdated],
       @NumInserts AS [RowsInserted];
GO
C# code, Part 1: Define the iterator/sender
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

private static IEnumerable<SqlDataRecord> SendRows(Dictionary<string,int> RowData)
{
    SqlMetaData[] _TvpSchema = new SqlMetaData[] {
        new SqlMetaData("ID", SqlDbType.NVarChar, 4000),
        new SqlMetaData("SortOrderNumber", SqlDbType.Int)
    };
    SqlDataRecord _DataRecord = new SqlDataRecord(_TvpSchema);

    // read a row, send a row
    foreach (KeyValuePair<string,int> _CurrentRow in RowData)
    {
        // You shouldn't need to call "_DataRecord = new SqlDataRecord" as
        // SQL Server already received the row when "yield return" was called.
        // Unlike BCP and BULK INSERT, you have the option here to create an
        // object, do manipulation(s) / validation(s) on the object, then pass
        // the object to the DB or discard via "continue" if invalid.
        _DataRecord.SetString(0, _CurrentRow.Key);
        _DataRecord.SetInt32(1, _CurrentRow.Value);
        yield return _DataRecord;
    }
}
C# code, Part 2: Use the iterator/sender
public static void LoadData(Dictionary<string,int> MyCollection)
{
SqlConnection _Connection = new SqlConnection("{connection string}");
SqlCommand _Command = new SqlCommand("ImportData", _Connection);
SqlDataReader _Reader = null; // only needed if getting data back from proc call
SqlParameter _TVParam = new SqlParameter();
_TVParam.ParameterName = "@ImportTable";
// _TVParam.TypeName = "IDsAndOrderNumbers"; //optional for CommandType.StoredProcedure
_TVParam.SqlDbType = SqlDbType.Structured;
_TVParam.Value = SendRows(MyCollection); // method return value is streamed data
_Command.Parameters.Add(_TVParam);
_Command.CommandType = CommandType.StoredProcedure;
try
{
_Connection.Open();
// Either send the data and move on with life:
_Command.ExecuteNonQuery();
// OR, to get data back from a SELECT or OUTPUT clause:
_Reader = _Command.ExecuteReader();
// Do something with _Reader: if using INSERT or MERGE in the stored proc, use an
// OUTPUT clause to return INSERTED.[RowNum], INSERTED.[ID] (where [RowNum] is an
// IDENTITY), then fill a new Dictionary<string, int>(ID, RowNumber) from
// _Reader.GetString(0) and _Reader.GetInt32(1). Return that instead of void.
}
finally
{
if (_Reader != null) _Reader.Dispose(); // only needed if getting data back from proc call
_Command.Dispose();
_Connection.Dispose();
}
}
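Calling it is then just a matter of handing over the dictionary (a hypothetical usage sketch):
// Hypothetical usage of the streaming TVP loader above.
Dictionary<string, int> sortOrders = new Dictionary<string, int>
{
    { "ITEM-001", 1 },
    { "ITEM-002", 2 }
};
LoadData(sortOrders);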
Using Table Valued parameters is really not that complex.
given this SQL:
CREATE TYPE MyTableType as TABLE (ID nvarchar(25),OrderNumber int)
CREATE PROCEDURE MyTableProc (@myTable MyTableType READONLY)
AS
BEGIN
SELECT * from @myTable
END
This shows how relatively easy it is; it just selects out the values you sent in, for demo purposes. I am sure you can easily adapt this to your case.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
namespace TVPSample
{
class Program
{
static void Main(string[] args)
{
//setup some data
var dict = new Dictionary<string, int>();
for (int x = 0; x < 10; x++)
{
dict.Add(x.ToString(),x+100);
}
//convert to DataTable
var dt = ConvertToDataTable(dict);
using (SqlConnection conn = new SqlConnection("[Your Connection String here]"))
{
conn.Open();
using (SqlCommand comm = new SqlCommand("MyTableProc",conn))
{
comm.CommandType=CommandType.StoredProcedure;
var param = comm.Parameters.AddWithValue("@myTable", dt);
//this is the most important part:
param.SqlDbType = SqlDbType.Structured;
var reader = comm.ExecuteReader(); //or NonQuery, etc.
while (reader.Read())
{
Console.WriteLine("{0} {1}", reader["ID"], reader["OrderNumber"]);
}
}
}
}
//I am sure there is a more elegant way of doing this.
private static DataTable ConvertToDataTable(Dictionary<string, int> dict)
{
var dt = new DataTable();
dt.Columns.Add("ID",typeof(string));
dt.Columns.Add("OrderNumber", typeof(Int32));
foreach (var pair in dict)
{
var row = dt.NewRow();
row["ID"] = pair.Key;
row["OrderNumber"] = pair.Value;
dt.Rows.Add(row);
}
return dt;
}
}
}
Produces
0 100
1 101
2 102
3 103
4 104
5 105
6 106
7 107
8 108
9 109
Stored procedures do not support arrays as inputs. Googling gives a couple of hacks using XML or comma separated strings, but those are hacks.
A more SQLish way to do this is to create a temporary table (named e.g. #Orders) and insert all the data into that one. Then you can call the SP using the same open SqlConnection, and inside the SP use the #Orders table to read the values.
Another solution is to use Table-Valued Parameters, but that requires some more SQL to set up, so I think it is probably easier to use the temp-table approach.
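For what it's worth, a hedged sketch of that temp-table approach; #Orders, dbo.UpdateOrders, connectionString and myDictionary are all placeholder names, and the key point is that the temp table and the procedure call must share the same open connection:
// Hypothetical temp-table variant; #Orders and dbo.UpdateOrders are placeholder names.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1. Create the temp table for this session.
    using (var create = new SqlCommand(
        "CREATE TABLE #Orders (ID nvarchar(50) NOT NULL, SortOrderNumber int NOT NULL);", conn))
    {
        create.ExecuteNonQuery();
    }

    // 2. Bulk load the dictionary into it.
    var table = new DataTable();
    table.Columns.Add("ID", typeof(string));
    table.Columns.Add("SortOrderNumber", typeof(int));
    foreach (var pair in myDictionary)
    {
        table.Rows.Add(pair.Key, pair.Value);
    }
    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "#Orders" })
    {
        bulk.WriteToServer(table);
    }

    // 3. Call the proc on the SAME connection so it can see #Orders.
    using (var proc = new SqlCommand("dbo.UpdateOrders", conn))
    {
        proc.CommandType = CommandType.StoredProcedure;
        proc.ExecuteNonQuery();
    }
}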
