Bizarre behavior after renaming a table in SQL Server 2008 - sql-server

I renamed one of my tables, then entered some data and pulled it with a view; to my surprise the data was not showing. I tried renaming the table back to its original name with no luck; the same thing kept happening.
Finally I tried retyping the data in one of the columns and then executed the view, and there the data finally showed. Now the problem is that I need to re-enter the data in that column every time a row is inserted, which is obviously not a good thing to do.
Here is the code showing how I add the data:
// tblcsv holds the parsed CSV rows; forex (the CSV file path) and
// link (the linkid value) are defined elsewhere in the original code.
DataTable tblcsv = new DataTable();
tblcsv.Columns.AddRange(new DataColumn[7] {
    new DataColumn("unit_name", typeof(string)),
    new DataColumn("unit", typeof(string)),
    new DataColumn("adrress", typeof(string)),
    new DataColumn("latitude", typeof(string)),
    new DataColumn("longitude", typeof(string)),
    new DataColumn("region", typeof(string)),
    new DataColumn("linkid", typeof(string))
});

string ReadCSV = File.ReadAllText(forex);
foreach (string csvRow in ReadCSV.Split('\n'))
{
    if (!string.IsNullOrEmpty(csvRow))
    {
        // Adding each row into the DataTable
        tblcsv.Rows.Add();
        int count = 0;
        foreach (string FileRec in csvRow.Split(','))
        {
            tblcsv.Rows[tblcsv.Rows.Count - 1][count] = FileRec;
            if (count == 5)
            {
                tblcsv.Rows[tblcsv.Rows.Count - 1][6] = link;
            }
            count++;
        }
    }
}
string consString = ConfigurationManager.ConnectionStrings["diposlConnectionString"].ConnectionString;
using (SqlConnection con = new SqlConnection(consString))
{
    using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(con))
    {
        // Set the database table name
        sqlBulkCopy.DestinationTableName = "dbo.FRIENDLY_FORCES";
        // [OPTIONAL]: Map the CSV columns to those of the database table
        sqlBulkCopy.ColumnMappings.Add("unit_name", "unit_name");
        sqlBulkCopy.ColumnMappings.Add("unit", "unit");
        sqlBulkCopy.ColumnMappings.Add("adrress", "adrress");
        sqlBulkCopy.ColumnMappings.Add("latitude", "latitude");
        sqlBulkCopy.ColumnMappings.Add("longitude", "longitude");
        sqlBulkCopy.ColumnMappings.Add("region", "region");
        sqlBulkCopy.ColumnMappings.Add("linkid", "linkid");
        con.Open();
        sqlBulkCopy.WriteToServer(tblcsv);
        con.Close();
    }
}
The column region is the one where I manually edited the data.
Did renaming the table do something to my data?
Or am I just missing something?
Thank you.

Related

Best solution for multiple insert/update statements

I'm struggling to understand C# & Npgsql as a beginner. I am following code examples like this:
// Insert some data
using (var cmd = new NpgsqlCommand())
{
    cmd.Connection = conn;
    cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p)";
    cmd.Parameters.AddWithValue("p", "Hello world");
    cmd.ExecuteNonQuery();
}
The syntax for more than one insert & update statement like this is clear so far:
cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p);INSERT INTO data1 ...;INSERT INTO data2 ... and so on";
But what is the right solution for a loop that should run one statement per iteration?
This does not work:
// Insert some data
using (var cmd = new NpgsqlCommand())
{
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        cmd.Connection = conn;
        cmd.CommandText = "INSERT INTO data (some_field) VALUES (@p)";
        cmd.Parameters.AddWithValue("p", s);
        cmd.ExecuteNonQuery();
    }
}
It seems the parameter values get "concatenated" or remembered across iterations, and I cannot see any way to "clear" the existing cmd object.
My second idea would be to wrap the whole using block inside the loop, but then every cycle would create a new object, which seems ugly to me.
So what is the best solution for my problem?
To insert lots of rows efficiently, take a look at Npgsql's bulk copy feature - the API is more suitable (and more efficient) for inserting large numbers of rows than concatenating INSERT statements into a batch like you're trying to do.
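For reference, here is a minimal sketch of that bulk copy route, reusing the table and collection names from this question (note the exact importer API has shifted between Npgsql major versions, e.g. Complete() is required on 4.0+):
// Bulk-import rows via the PostgreSQL COPY protocol using Npgsql's binary importer.
// Sketch only: "data"/"some_field" and SomeStringCollectionOrWhatever come from the question.
using (var writer = conn.BeginBinaryImport("COPY data (some_field) FROM STDIN (FORMAT BINARY)"))
{
    foreach (var s in SomeStringCollectionOrWhatever)
    {
        writer.StartRow();
        writer.Write(s, NpgsqlTypes.NpgsqlDbType.Text);
    }
    writer.Complete(); // nothing is persisted unless Complete() is called
}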
If you want to rerun the same SQL with changing parameter values, you can do the following:
using (var cmd = new NpgsqlCommand("INSERT INTO data (some_field) VALUES (#p)", conn))
{
var p = new NpgsqlParameter("p", DbType.String); // Adjust DbType according to type
cmd.Parameters.Add(p);
cmd.Prepare(); // This is optional but will optimize the statement for repeated use
foreach(var s in SomeStringCollectionOrWhatever)
{
p.Value = s;
cmd.ExecuteNonQuery();
}
}
If you need lots of rows and performance is key, then I would recommend Npgsql's bulk copy capability, as @Shay mentioned. But if you are looking for a quick way to do this without bulk copy, I would recommend using Dapper.
Consider the example below.
Let's say you have a class called Event and a list of events to add.
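A minimal sketch of that POCO, assuming only the two properties the example uses:
// Hypothetical POCO; the property names must match the SQL placeholders below.
public class Event
{
    public int EventId { get; set; }
    public string EventName { get; set; }
}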
List<Event> eventsToInsert = new List<Event>
{
    new Event() { EventId = 1, EventName = "Bday1" },
    new Event() { EventId = 2, EventName = "Bday2" },
    new Event() { EventId = 3, EventName = "Bday3" }
};
The snippet that adds the list to the DB is shown below.
var sqlInsert = "Insert into events( eventid, eventname ) values (@EventId, @EventName)";
using (IDbConnection conn = new NpgsqlConnection(cs))
{
    conn.Open();
    // Execute is an extension method supplied by Dapper.
    // It adds all the entries in the eventsToInsert list, matching values by property name.
    // The only caveat is that the POCO's property names must match the placeholder names in the SQL statement.
    conn.Execute(sqlInsert, eventsToInsert);
    // If we want to retrieve the data back into a list:
    List<Event> eventsAdded;
    // This Dapper extension returns an IEnumerable, so I cast it to a List.
    eventsAdded = conn.Query<Event>("Select * from events").ToList();
    foreach (var row in eventsAdded)
    {
        Console.WriteLine($"{row.EventId} {row.EventName} was added");
    }
}
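One caveat on the design: passing a list to Dapper's Execute is a convenience that still runs the INSERT once per element under the hood, so for very large row counts the bulk copy route mentioned above remains the faster option.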
-HTH

WebAPI EF update 30,000 rows of data is very slow

I am trying to have my ASP.NET WebAPI web service read a .csv and update a database using Entity Framework. The .csv file is about 20,000-30,000 rows.
As of now I am using a TextFieldParser to read the .csv; for each row of the .csv file I create a new object, then add the object to the EF context.
Once it's done adding all rows to the context, I call db.SaveChanges();
Watching the console, I noticed it calls an update statement for each row... which takes a long time. Is there a better, more efficient way to accomplish this?
if (filetype == "xxx")
{
    using (TextFieldParser csvReader = new TextFieldParser(downloadFolder + fileName))
    {
        csvReader.SetDelimiters(new string[] { "," });
        csvReader.HasFieldsEnclosedInQuotes = true;
        int rowCount = 1;
        while (!csvReader.EndOfData)
        {
            string[] fieldData = csvReader.ReadFields();
            // skip header row
            if (rowCount != 1)
            {
                var t = new GMI_adatpos
                {
                    PACCT = fieldData[3]
                };
                db.GMI_adatpos.Add(t);
            }
            rowCount++;
        }
    }
}
db.SaveChanges();
This issue is very common.
In your case, we can split it into two categories:
Add vs AddRange performance
Write & database round-trips
Add vs AddRange Performance
The Add method tries to detect changes every time you add a new record, while AddRange only does it once. Detecting changes on every Add can take several minutes.
This issue is very easy to fix: simply create a list, add the entities to this list instead, and use AddRange with the list at the end.
List<GMI_adatpos> list = new List<GMI_adatpos>();
if (filetype == "xxx")
{
    using (TextFieldParser csvReader = new TextFieldParser(downloadFolder + fileName))
    {
        csvReader.SetDelimiters(new string[] { "," });
        csvReader.HasFieldsEnclosedInQuotes = true;
        int rowCount = 1;
        while (!csvReader.EndOfData)
        {
            string[] fieldData = csvReader.ReadFields();
            // skip header row
            if (rowCount != 1)
            {
                var t = new GMI_adatpos
                {
                    PACCT = fieldData[3]
                };
                list.Add(t);
            }
            rowCount++;
        }
    }
}
db.GMI_adatpos.AddRange(list);
db.SaveChanges();
Write & Database Round-trips
Every time you save a record, you perform a database round-trip. So if you insert on average 30,000 records, you perform 30,000 database round-trips, which is insane!
Disclaimer: I'm the owner of the project Entity Framework Extensions
This library allows you to perform:
BulkSaveChanges
BulkInsert
BulkUpdate
BulkDelete
BulkMerge
You can either call BulkSaveChanges instead of SaveChanges, or create a list to insert and use BulkInsert directly for even more performance.
BulkSaveChanges Solution (way faster than SaveChanges)
db.GMI_adatpos.AddRange(list);
db.BulkSaveChanges();
BulkInsert Solution (faster than BulkSaveChanges, but does not save related entities)
db.BulkInsert(list);
Because the number of items added to the DbContext is very high, RAM gradually fills up and the operation becomes very slow. It is therefore better to call SaveChanges and renew the DbContext after every batch of records (e.g. 100).
if (filetype == "xxx")
{
    using (TextFieldParser csvReader = new TextFieldParser(downloadFolder + fileName))
    {
        csvReader.SetDelimiters(new string[] { "," });
        csvReader.HasFieldsEnclosedInQuotes = true;
        int rowCount = 1;
        while (!csvReader.EndOfData)
        {
            if (rowCount % 100 == 0)
            {
                db.SaveChanges();        // flush the current batch first
                db.Dispose();            // then release the old context
                db = new AppDbContext(); // your DbContext
            }
            string[] fieldData = csvReader.ReadFields();
            // skip header row
            if (rowCount != 1)
            {
                var t = new GMI_adatpos
                {
                    PACCT = fieldData[3]
                };
                db.GMI_adatpos.Add(t);
            }
            rowCount++;
        }
    }
}
db.SaveChanges(); // save any remaining records from the last partial batch

Unsure which join to use with the following SQL code

I have 2 tables. I want to insert some values into one table. The fields I am inserting are Ingredient_Name, Ingredient_Amount and Recipe_ID.
Ingredient (Table 1)
Ingredient_Name | Ingredient_Amount | Recipe_ID
----------------|-------------------|---------- <---- insert into here
Recipe (Table 2)
Recipe_Name | Recipe_ID
yummyRecipe |     1     <---- Recipe_ID stored here
The form I am using has a ComboBox which lists all Recipe_Names. So when I go to insert a row into Ingredient, I need to fetch the Recipe_ID from the Recipe table for the Recipe_Name selected in the ComboBox, then use this Recipe_ID for the ID in the Ingredient table.
I am not very familiar with JOINs and unsure how to work out which one to use, or whether I need one at all. Any help or ideas?
Sorry if this is too long-winded.
Recipe ComboBox Code
SqlConnection con = new SqlConnection(@"Data Source=(LocalDB)\v11.0; AttachDbFilename=C:\Users\Donald\Documents\Visual Studio 2013\Projects\DesktopApplication\DesktopApplication\Student_CB.mdf; Integrated Security=True");
con.Open();
try
{
    SqlDataAdapter da = new SqlDataAdapter("Select * FROM Recipe", con);
    DataTable dt = new DataTable();
    da.Fill(dt);
    for (int i = 0; i < dt.Rows.Count; i++)
    {
        recipeCombo.Items.Add(dt.Rows[i]["Recipe_Name"]);
    }
    dt.Clear();
}
catch (Exception e)
{
    MessageBox.Show(e.Message);
}
con.Close();
You can set the ComboBox items directly through the DataSource and control which field is displayed with the DisplayMember property. Together with the ValueMember property, you could write:
using (SqlConnection con = new SqlConnection(....))
{
    con.Open();
    try
    {
        SqlDataAdapter da = new SqlDataAdapter("Select * FROM Recipe", con);
        DataTable dt = new DataTable();
        da.Fill(dt);
        recipeCombo.DataSource = dt;
        recipeCombo.DisplayMember = "Recipe_Name";
        recipeCombo.ValueMember = "Recipe_ID";
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}
Now in the ComboBox_SelectedIndexChanged event (or wherever you need to know the Recipe_ID) you just have to write:
if (recipeCombo.SelectedItem != null)
{
    int recipeID = Convert.ToInt32(recipeCombo.SelectedValue);
    // ... and use your value for the insert, without any JOIN
}
At whichever point you need it (for example in a SAVE button click event), add the following INSERT:
if (recipeCombo.SelectedItem == null)
{
    // ... error message and return ...
}
else
{
    string sql = @"INSERT INTO Ingredient
                   (Ingredient_Name, Ingredient_Amount, Recipe_ID)
                   VALUES (@IngredientName, @IngredientAmount, @RecipeID)";
    using (var cmd = new SqlCommand(sql, con))
    {
        cmd.Parameters.Add("@IngredientName", SqlDbType.NVarChar).Value = ingredientTxt.Text;
        cmd.Parameters.Add("@IngredientAmount", SqlDbType.Int).Value = Convert.ToInt32(ingredientAmount.Text);
        cmd.Parameters.Add("@RecipeID", SqlDbType.Int).Value = Convert.ToInt32(recipeCombo.SelectedValue);
        cmd.ExecuteNonQuery();
    }
}
PS: Do not use AddWithValue; it is a shortcut with a lot of problems.
You do not need a JOIN in your case, because only one table, Recipe, contains the data you need to find Recipe_ID; JOINs are used to join two tables.
If Recipe_Name is unique, you can select the Recipe_ID where the Recipe_Name equals the selected value from the ComboBox, and insert the new row into the Ingredient table in one statement:
INSERT INTO Ingredient SELECT @Ingredient_Name, @Ingredient_Amount, Recipe_ID FROM Recipe WHERE Recipe_Name = @myComboboxSelectedValue
Note: in this case Recipe_ID is redundant, because you could remove it from your database and use Recipe_Name instead.
If Recipe_Name is not unique, you will need to fetch the Recipe_ID along with it, store it in the code-behind (if you do not want to show it to the user), and use it in your insert query.
By the way:
1. Whether using MySQL or SQL Server, the solution is the same, so "using .mdf" in the title of the question is irrelevant.
2. ".mdf" is the file extension for SQL Server databases.

How to Bulk Insert csv with double quotes around all values?

I am trying to insert a .csv file into SQL Server 2008 R2.
The .csv is 300+MB, from http://ipinfodb.com/ip_database.php, Complete (City), 4.0M records.
Here are the top 5 lines, with the 1st line = column headers:
"ip_start";"country_code";"country_name";"region_code";"region_name";"city";"zipcode";"latitude";"longitude";"metrocode"
"0";"RD";"Reserved";;;;;"0";"0";
"16777216";"AU";"Australia";;;;;"-27";"133";
"17367040";"MY";"Malaysia";;;;;"2.5";"112.5";
"17435136";"AU";"Australia";;;;;"-27";"133";
I tried Import and Export Data, and BULK INSERT, but haven't been able to import the data correctly yet.
Shall I resort to using bcp? Can it handle stripping the ""? How?
Thank you very much.
Got it; I had forgotten to set the Text Qualifier to the quotation mark (").
Your data looks pretty inconsistent since NULL values don't also carry a quotation enclosure.
I believe you can create a format file in SQL Server to customize the import to your particular csv file and its particular terminators.
See more here:
http://lanestechblog.blogspot.com/2008/08/sql-server-bulk-insert-using-format.html
Is this a single import or are you wanting to schedule a recurring import? If this is a one-time task, you should be able to use the Import and Export Wizard. The text qualifier will be the quotation mark ("), be sure to select column names in the first data row, and you'll want to convey that the field delimiter is the semicolon (;).
I'm not certain the file is properly formatted - the last semicolon following each of the data rows might be a problem. If you hit any errors, simply add a new column header to the file.
EDIT: I just did a quick test, the semicolons at the end will be treated as part of the final value in that row. I would suggest adding a ;"tempheader" at the end of your header (first) row - that will cause SQL to treat the final semicolon as a delimiter and you can delete that extra column once the import is complete.
In C# you can use the following code; it works for me.
public bool CSVFileRead(string fullPathWithFileName, string fileNameModified, string tableName)
{
    SqlConnection con = new SqlConnection(ConfigurationManager.AppSettings["dbConnectionString"]);
    string filepath = fullPathWithFileName;
    StreamReader sr = new StreamReader(filepath);
    // First line: column headers
    string line = sr.ReadLine();
    string[] value = line.Split(',');
    DataTable dt = new DataTable();
    DataRow row;
    foreach (string dc in value)
    {
        dt.Columns.Add(new DataColumn(dc));
    }
    while (!sr.EndOfStream)
    {
        //string[] stud = sr.ReadLine().Split(',');
        //for (int i = 0; i < stud.Length; i++)
        //{
        //    stud[i] = stud[i].Replace("\"", "");
        //}
        //value = stud;
        value = sr.ReadLine().Split(',');
        if (value.Length == dt.Columns.Count)
        {
            row = dt.NewRow();
            row.ItemArray = value;
            dt.Rows.Add(row);
        }
    }
    SqlBulkCopy bc = new SqlBulkCopy(con.ConnectionString, SqlBulkCopyOptions.TableLock);
    bc.DestinationTableName = tableName;
    bc.BatchSize = dt.Rows.Count;
    con.Open();
    bc.WriteToServer(dt);
    bc.Close();
    con.Close();
    return true;
}
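For the file in this question, the inner read would need to split on the semicolon and strip the text qualifiers, much like the commented-out block above; a sketch of that adaptation:
// Adaptation sketch for this question's file: semicolon-delimited, quote-qualified values.
value = sr.ReadLine().Split(';');
for (int i = 0; i < value.Length; i++)
{
    value[i] = value[i].Trim('"'); // strip the surrounding double quotes
}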

Problem updating a strongly-typed dataset back to the database in C#, .NET Framework 3.5

I want to remove some special characters in a table of my database. I've used a strongly-typed table to do it. I filled a dataset from the database and modified the data, then called the data adapter's Update() method to write the dataset back to the database, but it doesn't work.
Below is my code:
DsTel tel = new DsTel();
DsTelTableAdapters.telephone_bkTableAdapter adapter = new DsTelTableAdapters.telephone_bkTableAdapter();
adapter.Connection = new SqlConnection(ConfigurationManager.AppSettings["SiteSqlServer"].ToString());
adapter.Fill(tel.telephone_bk);
foreach (DsTel.telephone_bkRow row in tel.telephone_bk.Rows)
{
    row.telephoneNo = RemoveWhiteSpace(row.telephoneNo.ToString());
    row.AcceptChanges();
}
tel.AcceptChanges();
adapter.Update(tel.telephone_bk);
Please give me some ideas.
Thanks in advance.
I found the solution to this problem by using the TableAdapterManager.
Below is my code:
DsTel tel = new DsTel();
DsTelTableAdapters.telephone_bkTableAdapter adapter = new DsTelTableAdapters.telephone_bkTableAdapter();
adapter.Connection = new SqlConnection(ConfigurationManager.AppSettings["SiteSqlServer"].ToString());
adapter.Fill(tel.telephone_bk);
foreach (DsTel.telephone_bkRow row in tel.telephone_bk.Rows)
{
    if (!row.IstelephoneNoNull())
    {
        row.telephoneNo = RemoveWhiteSpace(row.telephoneNo.ToString());
    }
}
DsTelTableAdapters.TableAdapterManager mrg = new DsTelTableAdapters.TableAdapterManager();
mrg.telephone_bkTableAdapter = adapter;
mrg.BackupDataSetBeforeUpdate = true;
mrg.UpdateAll((DsTel)tel.GetChanges());
You've called AcceptChanges before the update. This marks every row as unchanged, so the dataset no longer has any pending changes and there is nothing to send to the database.
Remove all of the calls to AcceptChanges and add a single one AFTER the update, i.e.:
DsTel tel = new DsTel();
DsTelTableAdapters.telephone_bkTableAdapter adapter = new DsTelTableAdapters.telephone_bkTableAdapter();
adapter.Connection = new SqlConnection(ConfigurationManager.AppSettings["SiteSqlServer"].ToString());
adapter.Fill(tel.telephone_bk);
foreach (DsTel.telephone_bkRow row in tel.telephone_bk.Rows)
{
    row.telephoneNo = RemoveWhiteSpace(row.telephoneNo.ToString());
}
adapter.Update(tel.telephone_bk);
tel.telephone_bk.AcceptChanges();
