FOR statement to query a SQL Server

I want to create a query against a database with a for statement (in C#)
something like this:
List<object> data = new List<object>();
for (int i = 0; i < executeScalar("SELECT COUNT(*) FROM mytable"); i++)
{
    List[i] = executeRead("SELECT rownumber(i) From mytable");
    // or
    executeUpdate("UPDATE mytable SET ... inrownumber(i)", List[i])
}
and the question is: is there any function to use for this "rownumber(i)" and "inrownumber(i)"?
I know I can do it like this
List[i] = executeRead("SELECT * From mytable WHERE ROW_NUMBER() = " + i);
and
executeUpdate("UPDATE mytable SET ... WHERE ROW_NUMBER() = " + i,List[i])
but if I do that, the database will scan the whole table each time to find one item, so if I have 100 items the database will pass over 10,000 rows. I want the database to go directly to the row each time, so it only passes over 100 rows across the whole for statement.
Do you know any way to do it?
(I need it because in my program the developer assumed that all the data is in the list: he takes items with a for statement and by index, and calls "Add", "Insert" and so on, and I don't want to change the whole program.)
Thanks

Assuming you have your data stored in a generic list called places, then:
using (SqlConnection cn = GetMyDbConnectionHere())
{
    using (SqlCommand cmd = new SqlCommand("dbo.UpdatePlace", cn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // Create your parameters for the command here - e.g. p_PrimaryKey, p_PlaceName, ...
        cn.Open();

        foreach (Place place in places)
        {
            if (place.HasChanged)
            {
                p_PrimaryKey.Value = place.primaryKey;
                p_PlaceName.Value = place.placeName;
                p_PlaceLat.Value = place.lat;
                // And so on and so forth
                cmd.ExecuteNonQuery();
            }
        }
    }
}
All this code is straight off the top of my head and typed directly into SO on the web page - so I make no guarantee as to it being fully functional - but it should at least get you going... In addition there's zero error handling here - also a major no-no.
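On the "rownumber(i)" part of the question: SQL Server has no built-in rownumber(i) function, and ROW_NUMBER() cannot be used directly in a WHERE clause; it has to be given an OVER (ORDER BY ...) clause and wrapped in a CTE or derived table. A rough, untested sketch of that approach (it assumes a sortable key column called Id and reuses the hypothetical GetMyDbConnectionHere() helper from above):
string sql = @"
    WITH numbered AS
    (
        SELECT *, ROW_NUMBER() OVER (ORDER BY Id) AS rn
        FROM mytable
    )
    SELECT * FROM numbered WHERE rn = @rn;";

using (SqlConnection cn = GetMyDbConnectionHere())
using (SqlCommand cmd = new SqlCommand(sql, cn))
{
    cmd.Parameters.Add("@rn", SqlDbType.BigInt);
    cn.Open();

    // rowCount would come from the SELECT COUNT(*) query in the question
    for (int i = 1; i <= rowCount; i++)
    {
        cmd.Parameters["@rn"].Value = (long)i;
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            // read the single row at position i here
        }
    }
}
Note that this still runs one query per row, and SQL Server still has to number the rows on every call. If the real goal is simply index-based access in the program, reading the whole table once (ordered) into the list and then updating by primary key, as in the code above, is usually the better fit.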

Related

SSIS - Various number of columns to output to flat file

I am currently creating an SSIS package that will gather data from a database and output it to a single comma-delimited flat file. The file will contain order details. The format of the file is:
Order#1 details (51 columns)
Order#1 header (62 columns)
Order#2 details (51 columns)
Order#2 header (62 columns)
etc...
The order header has 62 columns and the order details have 51 columns. I need to output this to a flat file, and I am running into an issue because SSIS does not handle a varying number of columns. Given that my source is an OLE DB source with a query, how do I create a script component to output to a file?
The current package looks like the following:
Get a list of all orders. Pass orderid as a variable.
A For Loop container goes through each orderid and runs a data flow task to get the order details for that order, then another data flow task to get the order header.
I am just running into an issue outputting each line to the flat file.
If anyone can help, that would be immensely appreciated. I have been struggling with this for a week now. If anyone can start me off with what the script component code should look like, that would be a huge help.
I have added what I have so far:
http://imgur.com/a/yTxfH
This is what my script looks like:
public void Main()
{
    // TODO: Add your code here
    DataTable RecordType300 = new DataTable();
    DataTable RecordType210 = new DataTable();
    DataTable RecordType220 = new DataTable();
    DataTable RecordType200 = new DataTable();
    OleDbDataAdapter adapter = new OleDbDataAdapter();
    adapter.Fill(RecordType300, Dts.Variables["User:rec_type300"].Value);
    adapter.Fill(RecordType210, Dts.Variables["User::rec_type_210"].Value);
    adapter.Fill(RecordType220, Dts.Variables["User::rec_type_220"].Value);
    adapter.Fill(RecordType200, Dts.Variables["User::rec_type200"].Value);
    using (StreamWriter outfile = new StreamWriter("C:\\myoutput.csv"))
    {
        for (var i = 0; i < RecordType300.Rows.Count; i++)
        {
            var detailFields = RecordType300.Rows[i].ItemArray.Select(field => field.ToString()).ToArray();
            // var poBillFields = RecordType210.Rows[i].ItemArray.Select(field => field.ToString()).ToArray();
            // var poShipFields = RecordType220.Rows[i].ItemArray.Select(field => field.ToString()).ToArray();
            // var poHeaderFields = RecordType200.Rows[i].ItemArray.Select(field => field.ToString()).ToArray();
            outfile.WriteLine(String.Join(",", detailFields));
            // outfile.WriteLine(string.Join(",", poBillFields));
            // outfile.WriteLine(string.Join(",", poShipFields));
            // outfile.WriteLine(string.Join(",", poHeaderFields));
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
But every time I run it, it errors out. Am I missing something here? Also, how would I create the file only once per run? In other words, each time the package runs it should create a new file with a date stamp and then append each order's details to that file, based on the order number.
This code/method has not been tested but should give you a good idea of what to do.
Create 2 SSIS variables of type object, one for the headers and one for the detail.
Create 2 Execute SQL tasks and 1 Script Task as outlined here:
Set up your tasks to handle a full result set, similar to these pics (the Detail version is shown; do the same for Header, but map the results to the Header object and change your query to point at the header table):
Edit your script task and allow Detail and Header as read-only variables:
Now edit your actual script along these lines (this assumes you have exactly one detail row for each header row):
using System.Data;
using System.Data.OleDb;
using System.IO;
using System.Linq;
// following to be inserted into the Main() function
DataTable detailData = new DataTable();
DataTable headerData = new DataTable();
OleDbDataAdapter adapter = new OleDbDataAdapter();
adapter.Fill(detailData, Dts.Variables["User::Detail"].Value);
adapter.Fill(headerData, Dts.Variables["User::Header"].Value);
using (StreamWriter outfile = new StreamWriter("myoutput.csv"))
{
    // we are making the assumption that there is exactly one header row for each detail row
    for (var i = 0; i < detailData.Rows.Count; i++)
    {
        var detailFields = detailData.Rows[i].ItemArray.Select(field => field.ToString()).ToArray();
        var headerFields = headerData.Rows[i].ItemArray.Select(field => field.ToString()).ToArray();
        outfile.WriteLine(string.Join(",", detailFields));
        outfile.WriteLine(string.Join(",", headerFields));
    }
}
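On the file-naming part of the question (not something I have tested against your package): build the path once per run with a date stamp and open the StreamWriter in append mode, so every order processed in that run lands in the same file and the next day's run starts a new one. A rough sketch, with a made-up folder and file prefix:
// build a date-stamped file name once per package run (hypothetical folder/prefix)
string outputDir = @"C:\exports";
string fileName = Path.Combine(outputDir, "orders_" + DateTime.Now.ToString("yyyyMMdd") + ".csv");

// append: true creates the file if it does not exist and appends otherwise,
// so repeated runs on the same day keep writing to the same file
using (StreamWriter outfile = new StreamWriter(fileName, append: true))
{
    // ... same writing loop as above ...
}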
Not a complete answer, just something to put you on the track of an alternative approach
SELECT Type, OrderBy, Col
FROM
(
    SELECT 'D' As Type, Ord as OrderBy,
           Col1 + ',' + CAST(Col2 AS VARCHAR(50)) + ',' + Col3 As Col
    FROM Details
    UNION ALL
    SELECT 'H' As Type, Ord as OrderBy,
           Col1 + ',' + CAST(Col2 AS VARCHAR(50)) + ',' + Col3 + ',' + Col4 As Col
    FROM Header
) S
ORDER BY OrderBy, Type
It's ugly, but it works as long as you cast all the data types to varchar.
You can wrap this up in a view or a stored procedure and test it from the database (before you get to the SSIS part). You can even export it using BCP.EXE rather than SSIS.
What you have here is one column which happens to contain this kind of data:
A,B,C
D,E,F,G
From a metadata perspective there is consistently one column.
From a CSV perspective there are a variable number of columns.
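If you go this route and want to bypass SSIS entirely, the single Col column can also be streamed straight to a file from a small piece of C#. A sketch, assuming the UNION ALL query above has been wrapped in a view with the made-up name dbo.OrderExport and that connectionString is your own connection string:
using System;
using System.Data.SqlClient;
using System.IO;

// ...

string query = "SELECT Col FROM dbo.OrderExport ORDER BY OrderBy, Type";

using (SqlConnection cn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(query, cn))
using (StreamWriter outfile = new StreamWriter(@"C:\myoutput.csv"))
{
    cn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // each row already contains a fully built CSV line, so just write it out
        while (reader.Read())
        {
            outfile.WriteLine(Convert.ToString(reader[0]));
        }
    }
}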

Dapper SqlBuilder OrWhere using AND instead of OR

I was trying to use the Where and OrWhere methods of SqlBuilder for Dapper, but it is not behaving the way I would expect.
The edited portion of this question is basically what I ran into. Since it didn't receive a response, I'll ask it here.
var builder = new SqlBuilder();
var sql = builder.AddTemplate("select * from table /**where**/ ");
builder.Where("a = @a", new { a = 1 })
       .OrWhere("b = @b", new { b = 2 });
I expected select * from table WHERE a = @a OR b = @b
but I got select * from table WHERE a = @a AND b = @b
Is there any way to add an OR to the where clause using the SqlBuilder?
I think it's just a matter of changing the following in the SqlBuilder class to say OR instead of AND, but I wanted to confirm.
public SqlBuilder OrWhere(string sql, dynamic parameters = null)
{
    AddClause("where", sql, parameters, " AND ", prefix: "WHERE ", postfix: "\n", IsInclusive: true);
    return this;
}
Never mind. I looked through the SqlBuilder code and found that if there is a mixture of Where and OrWhere, it will do the following:
Join all the AND clauses
Join all the OR clauses separately
Attach the OR clauses at the end of the AND clauses with an AND
If you don't have more than 1 OrWhere, then you won't see any OR.
I'll modify my query logic to take this into account
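For example, based on the behaviour described above (the rendered SQL in the comment is approximate, not captured output):
var builder = new SqlBuilder();
var sql = builder.AddTemplate("select * from table /**where**/ ");

builder.Where("a = @a", new { a = 1 })      // AND clause
       .OrWhere("b = @b", new { b = 2 })    // OR clause
       .OrWhere("c = @c", new { c = 3 });   // OR clause

// the OR clauses are grouped together and attached to the AND clauses with an AND,
// so the rendered SQL comes out roughly as:
//   select * from table WHERE a = @a AND ( b = @b OR c = @c )
Console.WriteLine(sql.RawSql);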
You have to change your query into:
var builder = new SqlBuilder();
var sql = builder.AddTemplate("select * from table /**where**/ ");
builder.OrWhere("a = @a", new { a = 1 })
       .OrWhere("b = @b", new { b = 2 });
In case you want to try another alternative, DapperQueryBuilder may be easier to understand:
var query = cn.QueryBuilder($#"
SELECT *
FROM table
/**where**/
");
// by default multiple filters are combined with AND
query.FiltersType = Filters.FiltersType.OR;
int a = 1;
int b = 2;
query.Where($"a = {a}");
query.Where($"b = {b}");
var results = query.Query<YourPOCO>();
The output is fully parametrized SQL (WHERE a = @p0 OR b = @p1).
You don't have to manually manage the dictionary of parameters.
Disclaimer: I'm one of the authors of this library

Using LINQ to find Excel columns that don't exist in array?

I have a solution that works for what I want, but I'm hoping to get some slick LINQ types to help me improve what I have, and learn something new in the process.
The code below is used to verify that certain column names exist in a spreadsheet. I was torn between using column index values or column names to find them. They both have good and bad points, but I decided to go with column names. They'll always exist, though sometimes in a different order; I'm working on that.
Details:
GetData() method returns a DataTable from the Excel spreadsheet. I cycle through all the required field names from my array, looking to see if it matches with something in the column collection on the spreadsheet. If not, then I append the missing column name to an output parameter from the method. I need both the boolean value and the missing fields variable, and I wasn't sure of a better way than using the output parameter. I then remove the last comma from the appended string for the display on the UI. If the StringBuilder object isn't null (I could have used the missingFieldCounter too) then I know there's at least one missing field, bool will be false. Otherwise, I just return output param as empty, and method as true.
So, Is there a more slick, all-in-one way to check if fields are missing, and somehow report on them?
private bool ValidateFile(out string errorFields)
{
    data = GetData();
    List<string> requiredNames = new[] { "Site AB#", "Site#", "Site Name", "Address", "City", "St", "Zip" }.ToList();
    StringBuilder missingFields = null;
    var missingFieldCounter = 0;
    foreach (var name in requiredNames)
    {
        var foundColumn = from DataColumn c in data.Columns
                          where c.ColumnName == name
                          select c;
        if (!foundColumn.Any())
        {
            if (missingFields == null)
                missingFields = new StringBuilder();
            missingFieldCounter++;
            missingFields.Append(name + ",");
        }
    }
    if (missingFields != null)
    {
        errorFields = missingFields.ToString().Substring(0, (missingFields.ToString().Length - 1));
        return false;
    }
    errorFields = string.Empty;
    return true;
}
Here is a LINQ solution that does the same thing. I call the ToArray() function to execute the LINQ statement:
StringBuilder missingFields = new StringBuilder();
(from col in requiredNames.Except(
     from DataColumn dataCol in data.Columns
     select dataCol.ColumnName
 )
 select missingFields.Append(col + ", ")
).ToArray();
errorFields = missingFields.ToString();
Console.WriteLine(errorFields);
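Another option along the same lines (a sketch, not tested against your spreadsheet): let Except compute the missing names and string.Join build the message, which removes the need for the StringBuilder and the trailing-comma trimming:
private bool ValidateFile(out string errorFields)
{
    var data = GetData();
    var requiredNames = new[] { "Site AB#", "Site#", "Site Name", "Address", "City", "St", "Zip" };

    // column names actually present on the spreadsheet
    var existingNames = data.Columns.Cast<DataColumn>().Select(c => c.ColumnName);

    // required names with no matching column
    var missing = requiredNames.Except(existingNames).ToList();

    errorFields = string.Join(",", missing);
    return missing.Count == 0;
}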

Composite C1 How would I rewrite this Sql update statement to work in c#?

I have a piece of code in a sortable image grid which sends back a string of integers representing the user's new sort order, along with the property id ('propid'):
{ 'imgid': '4,2,3,5,6,7,8,9,1','propid':'391' }
The above shows 9 images on the screen. The db image table has both an image id (imgid) field and a sort sequence field (orderseq). I am using a custom namespace datatype:
connection.Get<ALocal.propimage>()
like all datatype connections in C1.
In direct SQL I would write this:
string[] q = imgid.Split(',');
string qry = "";
for (int i = 0; i < q.Length; i++)
{
    qry += "update ALocal_propimage set propimage_orderseq=" + (i + 1) + " where prop_id=" + propid + " and propimage_id=" + q[i] + " ;";
}
sqlHelper obj = new sqlHelper();
obj.ExecuteNonQuery(qry);
return "Record Updated";
How do I convert this to C# using Composite C1's 'Updating Multiple Data' approach? I keep failing at it.
The rudimentary example from the C1 site for 'Updating Multiple Data' is:
using (DataConnection connection = new DataConnection())
{
    var myUsers = connection.Get<Demo.Users>().Where(d => d.Number < 10).ToList();
    foreach (Demo.Users myUser in myUsers)
    {
        myUser.Number += 10;
    }
    connection.Update<Demo.Users>(myUsers);
}
Any help would be really appreciated.
You would need to split your update code into a get and an update, to let C1 know exactly which entities you would like to update. So something like this:
for (int i = 0; i < q.Length; i++)
{
    var propimages = connection.Get<ALocal.propimage>().Where(o => o.PropId == propid && o.PropImageId == q[i]);
    foreach (var o in propimages)
    {
        o.OrderSeq = i + 1;
    }
    connection.Update(propimages);
}
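If the query-per-image round trips bother you, you can also pull all of the images for the property once and update them in a single call, closer to the 'Updating Multiple Data' example you quoted. An untested sketch along those lines, assuming the same PropId / PropImageId / OrderSeq property names as above and that the ids compare as strings (adjust the comparisons or add parsing to match your actual datatype):
string[] q = imgid.Split(',');

using (DataConnection connection = new DataConnection())
{
    // one query for all images belonging to this property
    var propimages = connection.Get<ALocal.propimage>()
                               .Where(o => o.PropId == propid)
                               .ToList();

    for (int i = 0; i < q.Length; i++)
    {
        // find the image whose id matches the i-th entry of the new sort order
        var image = propimages.FirstOrDefault(o => o.PropImageId == q[i]);
        if (image != null)
        {
            image.OrderSeq = i + 1;
        }
    }

    // persist all of the changed rows in one call
    connection.Update<ALocal.propimage>(propimages);
}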

How to Add a command/SQL statement to a strongly typed TableAdapter's Update/Insert command?

See this question. I have the following code that executes against a SQLite database using a strongly typed dataset.
messagesAdapter.Update(messages);//messages is a DataTable
var connection = messagesAdapter.Connection;
var retrieveIndexCommand= connection.CreateCommand();
retrieveIndexCommand.CommandText = @"Select last_insert_rowid()";
connection.Open();
var index = retrieveIndexCommand.ExecuteScalar();
connection.Close();
This does not work, as last_insert_rowid() always returns zero. This is caused by the fact that it needs to be called on the same connection that is used by the TableAdapter's Update command. How can I change the TableAdapter's Insert or Update command so that it returns the index?
If you are inserting a single row, you can use this:
// cast if necessary
using (var insert = (SQLiteCommand)this.Adapter.InsertCommand.Clone())
{
    insert.CommandText += "; SELECT last_insert_rowid()";
    // 'row' is the DataRow being inserted
    foreach (SQLiteParameter parameter in insert.Parameters)
    {
        parameter.Value = row[parameter.SourceColumn];
    }
    var index = Convert.ToInt32(insert.ExecuteScalar());
}
You can also use it to insert multiple rows and assign your id to each one:
using (var insert = (SQLiteCommand)this.Adapter.InsertCommand.Clone())
{
    insert.CommandText += "; SELECT last_insert_rowid()";
    // this filters to only the added rows
    foreach (MyDataSet.MessageRow row in messages.GetChanges(DataRowState.Added).Rows)
    {
        foreach (SQLiteParameter parameter in insert.Parameters)
        {
            parameter.Value = row[parameter.SourceColumn];
        }
        // use the name of your rowid column
        row.ID = Convert.ToInt32(insert.ExecuteScalar());
        row.AcceptChanges();
    }
}
// then you can perform the other updates
messagesAdapter.Update(messages);
Note: Be sure to open / close your connection
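For example (untested), the multi-row version above could be wrapped roughly like this; if the cloned command does not inherit the adapter's connection, assign it explicitly before executing:
var connection = messagesAdapter.Connection;
connection.Open();
try
{
    // run the cloned insert command and assign the ids here, as in the snippet above;
    // if needed: insert.Connection = connection;
}
finally
{
    connection.Close();
}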
