I read an XML file with LINQ and create a list of objects:
StringReader stream = new StringReader(xml);
XmlTextReader reader = new XmlTextReader(stream);
XElement req = XElement.Load(reader);
var users = req.Descendants("Report")
    .Select(e => new
    {
        Fname = e.Descendants("firstName").FirstOrDefault().Value,
        Lname = e.Descendants("lastName").FirstOrDefault().Value,
        personalId = e.Descendants("id").FirstOrDefault().Value,
    })
    .ToList();
The users list contains 100,000 objects. I want to bulk insert these objects into a database table.
public static void saveData<T>(ref List<T> list, string destinationTableName, int batchSize)
{
    // EntityDataReader exposes the list as an IDataReader that SqlBulkCopy can consume
    using (EntityDataReader<T> reader = new EntityDataReader<T>(list))
    using (System.Data.SqlClient.SqlBulkCopy sbc = new System.Data.SqlClient.SqlBulkCopy("your connection string"))
    {
        // Map each source column to the destination column of the same name
        for (int i = 0; i < reader.FieldCount; i++)
        {
            string colName = reader.GetName(i);
            sbc.ColumnMappings.Add(colName, colName);
        }
        sbc.BatchSize = batchSize;
        sbc.DestinationTableName = destinationTableName;
        sbc.WriteToServer(reader);
    }
}
I'm using this code to insert very large lists of items; T should be a known entity type.
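For reference, a minimal usage sketch: the anonymous type from the LINQ projection above can't be passed as T, so project into a named class first. The User class and the dbo.Users table name here are hypothetical; the property names are assumed to match the destination columns.

// Hypothetical POCO whose property names match the destination table's columns
public class User
{
    public string Fname { get; set; }
    public string Lname { get; set; }
    public string personalId { get; set; }
}

// Project into the concrete type instead of an anonymous type, then bulk insert
List<User> users = req.Descendants("Report")
    .Select(e => new User
    {
        Fname = (string)e.Descendants("firstName").FirstOrDefault(),
        Lname = (string)e.Descendants("lastName").FirstOrDefault(),
        personalId = (string)e.Descendants("id").FirstOrDefault()
    })
    .ToList();

saveData(ref users, "dbo.Users", batchSize: 5000);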
I use the Dapper ORM with a unit of work, managing the IDbConnection/IDbTransaction myself (Close/Commit/Rollback).
public IDbTransaction BeginTransaction(string connectionName = "", bool useWadminUser = false, IsolationLevel isolationLevel = IsolationLevel.ReadCommitted)
{
    using (DbConnection db = GetDbconnection(connectionName, useWadminUser))
    {
        db.Open();
        _dbTransaction = db.BeginTransaction();
    }
    return this._dbTransaction;
}
DbConnection GetDbconnection(string connectionName = "", bool useWadminUser = false)
{
    UserInfoHelper userInfoHelper = _iSecurityAuthorizService.GetCookieProfileUser();
    return new SqlConnection(
        string.Format(_config.GetConnectionString(string.IsNullOrEmpty(connectionName) ? SqlHelper.DefaultConnection : connectionName),
            SqlHelper.WadminUserName, SqlHelper.WadminPassword));
}
I call the service with var tran = _dapper.BeginTransaction(); but tran is null. Please see pic1.
public IDbTransaction BeginTransaction(string connectionName = "", bool useWadminUser = false, IsolationLevel isolationLevel = IsolationLevel.ReadCommitted)
{
    using (DbConnection db = GetDbconnection(connectionName, useWadminUser))
    {
        db.Open();
        _dbTransaction = db.BeginTransaction();
    } // <-- Here
    return this._dbTransaction;
}
At the point marked above, the connection is closed and disposed, and the transaction with it. You cannot use transactions like that.
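A minimal sketch of one way to fix it, assuming a _dbConnection field (not shown in the original code) that the unit of work disposes together with _dbTransaction when it commits or rolls back:

public IDbTransaction BeginTransaction(string connectionName = "", bool useWadminUser = false, IsolationLevel isolationLevel = IsolationLevel.ReadCommitted)
{
    // No using block: the connection must stay open for the lifetime of the transaction.
    // _dbConnection is an assumed field, disposed later in Commit/Rollback/Dispose.
    _dbConnection = GetDbconnection(connectionName, useWadminUser);
    _dbConnection.Open();
    _dbTransaction = _dbConnection.BeginTransaction(isolationLevel);
    return _dbTransaction;
}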
This question already has answers here: WPF How to convert from DataGrid to DataTable? (4 answers)
I need to put the contents of a WPF DataGrid into a DataTable after loading and filling the DataGrid. After searching Google, I came across code saying the DataGrid's ItemsSource should be cast to a DataView. This code loads the DataGrid:
private void LoadDataGrid()
{
    using (famloanEntities db = new famloanEntities())
    {
        var ash = db.Ashkhas;
        DataGrid1.ItemsSource = ash.ToList();
    }
}
I use this code to convert the DataGrid to a DataTable:
DataTable dt = ((DataView)DataGrid1.ItemsSource).ToTable();
An error occurs when this line executes. Please advise where the problem is.
I changed the code in the link as follows to ignore foreign keys (collection navigation properties) when checking properties.
public static DataTable ToDataTable<T>(List<T> items)
{
    DataTable dataTable = new DataTable(typeof(T).Name);
    // Get all the public instance properties
    PropertyInfo[] Props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
    foreach (PropertyInfo prop in Props)
    {
        // Setting column names as property names
        dataTable.Columns.Add(prop.Name);
    }
    foreach (T item in items)
    {
        var values = new object[Props.Length];
        for (int i = 0; i < Props.Length; i++)
        {
            // Skip collection navigation properties (foreign-key collections)
            if (!Props[i].PropertyType.Name.ToLower().Contains("collection"))
            {
                // Inserting property values into DataTable rows
                values[i] = Props[i].GetValue(item, null);
            }
        }
        dataTable.Rows.Add(values);
    }
    // Put a breakpoint here and check the DataTable
    return dataTable;
}
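A usage sketch for the grid in the question: since LoadDataGrid assigns ItemsSource from ash.ToList(), the source is a List<Ashkhas> rather than a DataView, which is why the DataView cast fails. Cast back to the list and convert it instead:

// ItemsSource was assigned from ash.ToList(), so cast it back to the list type
var items = (List<Ashkhas>)DataGrid1.ItemsSource;
DataTable dt = ToDataTable(items);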
I am using dtSearch in combination with a SQL database and would like to maintain a table that includes all DocIds and their related file names. From there, I will add a column with my foreign key to allow me to combine text and database searches.
I have code that simply returns all the records in the index and adds them one by one to the DB. This, however, takes forever, and it doesn't address how to append new records as they are added to the index. But just in case it helps:
MyDatabaseContext db = new StateScapeEntities();
IndexJob ij = new dtSearch.Engine.IndexJob();
ij.IndexPath = @"d:\myindex";
IndexInfo indexInfo = dtSearch.Engine.IndexJob.GetIndexInfo(@"d:\myindex");
bool jobDone = ij.Execute();
SearchResults sr = new SearchResults();
uint n = indexInfo.DocCount;
for (int i = 1; i <= n; i++)
{
    sr.AddDoc(ij.IndexPath, i, null);
}
for (int i = 1; i <= n; i++)
{
    sr.GetNthDoc(i - 1);
    // IndexDocument is defined elsewhere
    IndexDocument id = new IndexDocument();
    id.DocId = sr.CurrentItem.DocId;
    id.FilePath = sr.CurrentItem.Filename;
    if (id.FilePath != null)
    {
        db.IndexDocuments.Add(id);
        db.SaveChanges();
    }
}
To keep the DocIds in the index stable, you must set the dtsIndexKeepExistingDocIds flag in the IndexJob.
You can also consult the dtSearch Text Retrieval Engine Programmer's Reference for when a DocId changes:
When a document is added to an index, it is assigned a DocId, and DocIds are always numbered sequentially.
When a document is reindexed, the old DocId is cancelled and a new DocId is assigned.
When an index is compressed, all DocIds in the index are renumbered to remove the cancelled DocIds unless the dtsIndexKeepExistingDocIds flag is set in IndexJob.
When an index is merged into another index, DocIds in the target index are never changed. The documents merged into the target index will all be assigned new, sequentially-numbered DocIds, unless (a) the dtsIndexKeepExistingDocIds flag is set in IndexJob and (b) the indexes have non-overlapping ranges of doc ids.
To improve speed, you can search for the word "xfirstword" to get all documents in an index.
You can also refer to the FAQ "How to retrieve all documents in an index".
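A minimal sketch of setting that flag during an index update, using the same IndexJob members that appear in the fuller solution below:

using (IndexJob ij = new dtSearch.Engine.IndexJob())
{
    ij.IndexPath = @"d:\myindex";
    // Keep existing DocIds stable when documents are added or the index is compressed
    ij.IndexingFlags = IndexingFlags.dtsIndexKeepExistingDocIds;
    ij.ActionAdd = true;
    bool ok = ij.Execute();
}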
So, I used part of user2172986's response, but combined it with some additional code to get the solution to my question. I did indeed have to set the dtsIndexKeepExistingDocIds flag in my index update routine.
From there, I only wanted to add the newly created DocIds to my SQL database. For that, I used the following code:
string indexPath = @"d:\myindex";
using (IndexJob ij = new dtSearch.Engine.IndexJob())
{
    // Make sure the updated index doesn't change DocIds
    ij.IndexingFlags = IndexingFlags.dtsIndexKeepExistingDocIds;
    ij.IndexPath = indexPath;
    ij.ActionAdd = true;
    ij.FoldersToIndex.Add(indexPath + "<+>");
    ij.IncludeFilters.Add("*");
    bool jobDone = ij.Execute();
}
// Create a DataTable to hold results
DataTable newIndexDoc = MakeTempIndexDocTable(); // custom method not included in this example; just creates a DataTable with the appropriate columns
// Connect to the DB
MyDataBase db = new MyDataBase(); // again, custom code not included - link to Entity Framework entity
// Get the last DocId in the DB
int lastDbDocId = db.IndexDocuments.OrderByDescending(i => i.DocId).FirstOrDefault().DocId;
// Get the last DocId in the index
IndexInfo indexInfo = dtSearch.Engine.IndexJob.GetIndexInfo(indexPath);
uint latestIndexDocId = indexInfo.LastDocId;
// Create a search filter
dtSearch.Engine.SearchFilter sf = new SearchFilter();
int indexId = sf.AddIndex(indexPath);
// Only select new records (from one greater than the last DocId in the DB to the last DocId in the index itself)
sf.SelectItems(indexId, lastDbDocId + 1, int.Parse(latestIndexDocId.ToString()), true);
using (SearchJob sj = new dtSearch.Engine.SearchJob())
{
    sj.SetFilter(sf);
    // Return every document in the specified range (using xfirstword)
    sj.Request = "xfirstword";
    // Specify the path to the index to search here
    sj.IndexesToSearch.Add(indexPath);
    // Additional flags and limits redacted for clarity
    sj.Execute();
    // Store the error message in the status
    // Redacted for clarity
    SearchResults results = sj.Results;
    int startIdx = 0;
    int endIdx = results.Count;
    if (startIdx == endIdx)
        return;
    for (int i = startIdx; i < endIdx; i++)
    {
        results.GetNthDoc(i);
        IndexDocument id = new IndexDocument();
        id.DocId = results.CurrentItem.DocId;
        id.FileName = results.CurrentItem.Filename;
        if (id.FileName != null)
        {
            DataRow row = newIndexDoc.NewRow();
            row["DocId"] = id.DocId;
            row["FileName"] = id.FileName;
            newIndexDoc.Rows.Add(row);
        }
    }
    newIndexDoc.AcceptChanges();
    // SqlBulkCopy
    using (SqlConnection connection = new SqlConnection(db.Database.Connection.ConnectionString))
    {
        connection.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.IndexDocument";
            try
            {
                // Write from the source to the destination
                bulkCopy.WriteToServer(newIndexDoc);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
    newIndexDoc.Clear();
    db.UpdateIndexDocument();
}
Here is my new solution, using the AddDoc method from the SearchResults interface:
First get the StartingDocId and the LastDocId from the IndexInfo and walk the loop like this:
function GetFilename(paDocID: Integer): String;
var
  lCOMSearchResults: ISearchResults;
  lSearchResults_Count: Integer;
begin
  if Assigned(prCOMServer) then
  begin
    lCOMSearchResults := prCOMServer.NewSearchResults as ISearchResults;
    lCOMSearchResults.AddDoc(GetIndexPath(prIndexContent), paDocID, 0);
    lSearchResults_Count := lCOMSearchResults.Count;
    if lSearchResults_Count = 1 then
    begin
      lCOMSearchResults.GetNthDoc(0);
      Result := lCOMSearchResults.DocDetailItem['_Filename'];
    end;
  end;
end;
I am developing an MVC3 application. I want to call a stored procedure from one of the application's controllers. I have already saved the stored procedure in the database the application uses.
The query is:
Create Procedure ConvertLeadToCustomer1
    @CompanyID int
as
begin
    update Companies set __Disc__ = 'Customer' where CompanyID = @CompanyID
end
Now, I want to call this procedure from the controller...
namespace CRMWeb.Controllers
{
    public class LeadController : Controller
    {
        private CRMWebContainer db = new CRMWebContainer();

        //
        // GET: /Lead/
        public ViewResult Index()
        {
            //return View(db.Companies.ToList());
            return View(db.Companies.OfType<Lead>().ToList());
        }

        public ActionResult Convert(int id)
        {
            // I want to write code here to call the stored procedure...
        }
    }
}
How do I call it?
It's no different in MVC. If you're using ADO.NET, the code below calls the stored procedure:
public ActionResult Convert(int id)
{
    using (var connection = new SqlConnection("YOUR CONNECTION STRING"))
    using (var command = new SqlCommand("ConvertLeadToCustomer1", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@CompanyID", id);
        connection.Open();
        command.ExecuteNonQuery();
    }
    // Return something so the action compiles; redirect back to the list
    return RedirectToAction("Index");
}
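Since the controller already holds a CRMWebContainer, an alternative sketch (assuming CRMWebContainer is an Entity Framework ObjectContext, as the EDMX-generated name suggests) is to run the procedure through the context instead of opening a separate connection:

public ActionResult Convert(int id)
{
    // ExecuteStoreCommand issues a parameterized command over the context's connection
    // (SqlParameter requires System.Data.SqlClient)
    db.ExecuteStoreCommand(
        "EXEC ConvertLeadToCustomer1 @CompanyID",
        new SqlParameter("@CompanyID", id));
    return RedirectToAction("Index");
}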
I have created an ADO.NET Entity Data Model and am using LINQ to update/edit my Oracle database.
using (Entities ent = new Entities())
{
    RUSHPRIORITYRATE rp = new RUSHPRIORITYRATE();
    rp.RATE = rate;
    var query = from j in ent.RUSHPRIORITYRATEs
                select j;
    List<RUSHPRIORITYRATE> list = query.ToList();
    if (list.Count == 0)
    {
        ent.AddToRUSHPRIORITYRATEs(rp);
        ent.SaveChanges();
    }
    else
    {
        foreach (RUSHPRIORITYRATE r in query)
        {
            r.RATE = rp.RATE;
        }
        ent.SaveChanges();
    }
}
I have a method that either adds to or updates a table that will always have one record; the record's value is only updated once that one record is in place. Adding to the table is no problem, but I've looked up how to update records through MSDN, and "ent" does not have the "SubmitChanges" method that the solution requires. Running this, I get the error: "The property 'RATE' is part of the object's key information and cannot be modified."