This may have a simple answer, but I can't see how to execute a stored procedure with EF CTP5.
In Entity Framework 4.0, we did this:
ExecuteFunction("ContainerName.StoredProcName", new ObjectParameter("Id", id)).
Which is a method on the ObjectContext.
But DbContext has no such method.
How do we call a stored proc? Is it not supported in EF CTP5?
EDIT:
I found this thread, which states you need to do this:
var people = context.People.SqlQuery("EXECUTE [dbo].[GetAllPeople]");
This raises some concerns:
1) You are now calling a stored procedure on the set, not the context. Stored procedures should be available context-wide, not tied to a particular entity set, just as they live under "Database" in SQL Server and not under a "Table".
2) What about complex types? I previously had a complex type being returned from a stored procedure. But now, it looks as though you have to map directly to an entity? That doesn't make any sense. I have many stored procs that return a type not directly represented by an ObjectSet/DbSet, and I can't see how to pull those over.
Hope someone can clear this up for me, because from what I understand so far, I won't be able to upgrade to CTP5.
You can execute database-wide SQL statements like this:
using(var context = new MyContext())
{
// custom SQL statement
var c = context.Database.SqlQuery<int>("SELECT COUNT(*) FROM Employees");
// returned entity type doesn't have to be represented by ObjectSet/DBSet
var e = context.Database.SqlQuery<Employee>("SELECT * FROM Employees");
// stored procedure
var q = context.Database.SqlQuery<Employee>("GetEmployees");
}
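If the stored procedure takes parameters, they can be passed the same way, and the result type only needs property names that match the result columns; it does not have to be a mapped entity. A minimal sketch along those lines, assuming a hypothetical GetEmployeesByDepartment procedure and a plain EmployeeSummary class (requires using System.Data.SqlClient;):
// Plain class, not a DbSet; property names match the proc's result columns.
public class EmployeeSummary
{
    public int EmployeeId { get; set; }
    public string Name { get; set; }
}

using (var context = new MyContext())
{
    // Hypothetical procedure and parameter names, for illustration only.
    var summaries = context.Database
        .SqlQuery<EmployeeSummary>(
            "EXEC GetEmployeesByDepartment @DepartmentId",
            new SqlParameter("@DepartmentId", 42))
        .ToList();
}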
Related
Code Migration due to Performance Issues :-
SQL Server LIKE Condition ( BEFORE )
SQL Server Full Text Search --> CONTAINS ( BEFORE )
Elastic Search ( CURRENTLY )
Achieved So Far :-
We have a web page created in ASP.Net Core which has an Auto Complete drop-down of 2.5+ million companies indexed in Elastic Search: https://www.99corporates.com/
Due to performance issues we have successfully shifted our code from SQL Server Full Text Search to Elastic Search, and we are using NEST v7.2.1 and Elasticsearch.Net v7.2.1 in our .NET code.
Still looking for a solution :-
If the user does not select a company from the Auto Complete list and simply enters a few characters and clicks Go, then a list should be displayed; we had done this earlier using SQL Server Full Text Search --> CONTAINS.
Can we call the ASP.Net web service which we have created using SQL CLR and code like SELECT * FROM dbo.Table WHERE Name IN( dbo.SQLWebRequest('') )?
[System.Web.Script.Services.ScriptMethod()]
[System.Web.Services.WebMethod]
public static List<string> SearchCompany(string prefixText, int count)
{
}
Any better or alternate option?
While that solution (i.e. the SQL-APIConsumer SQLCLR project) "works", it is not scalable. It also requires setting the database to TRUSTWORTHY ON (a security risk), and it loads a few assemblies as UNSAFE, such as Json.NET. That is risky if any of them use static variables for caching, expecting each caller to be isolated / have its own App Domain, because SQLCLR is a single, shared App Domain: static variables are shared across all callers, and multiple concurrent threads can cause race conditions. (This is not to say that this is definitely happening, since I haven't seen the code, but if you haven't reviewed the code or tested with multiple concurrent threads either, then it's a gamble with regards to stability and ensuring predictable, expected behavior.)
To a slight degree I am biased, given that I sell a SQLCLR library, SQL#, whose Full version contains a stored procedure that also does this, but: a) it handles security properly via signatures (it does not enable TRUSTWORTHY), b) it allows for handling scalability, c) it does not require any UNSAFE assemblies, and d) it handles more scenarios (better header handling, etc.). It doesn't handle any JSON; it just returns the web service response, and you can unpack that using OPENJSON or something else if you prefer. (Yes, there is a Free version of SQL#, but it does not contain INET_GetWebPages.)
HOWEVER, I don't think SQLCLR is a good fit for this scenario in the first place. In your first two versions of this project (using LIKE and then CONTAINS) it made sense to send the user input directly into the query. But now that you are using a web service to get a list of matching values from that user input, you are no longer confined to that approach. You can, and should, handle the web service / Elastic Search portion of this separately, in the app layer.
Rather than passing the user input into the query, only to have the query pause to get that list of 0 or more matching values, you should do the following:
Before executing any query, get the list of matching values directly in the app layer.
If no matching values are returned, you can skip the database call entirely as you already have your answer, and respond immediately to the user (much faster response time when no matches return)
If there are matches, then execute the search stored procedure, sending that list of matches as-is via Table-Valued Parameter (TVP) which becomes a table variable in the stored procedure. Use that table variable to INNER JOIN against the table rather than doing an IN list since IN lists do not scale well. Also, be sure to send the TVP values to SQL Server using the IEnumerable<SqlDataRecord> method, not the DataTable approach as that merely wastes CPU / time and memory.
For example code on how to accomplish this correctly, please see my answer to Pass Dictionary to Stored Procedure T-SQL
In C#-style pseudo-code, this would be something along the lines of the following:
List<string> companies = SearchCompany(PrefixText, Count);
if (companies.Count == 0)
{
Response.Write("Nope");
}
else
{
using(SqlConnection db = new SqlConnection(connectionString))
{
using(SqlCommand batch = db.CreateCommand())
{
batch.CommandType = CommandType.StoredProcedure;
batch.CommandText = "ProcName";
SqlParameter tvp = new SqlParameter("ParamName", SqlDbType.Structured);
tvp.Value = MethodThatYieldReturnsList(companies);
batch.Parameters.Add(tvp);
db.Open();
using(SqlDataReader results = batch.ExecuteReader())
{
if (results.HasRows)
{
// deal with results
Response.Write(results....);
}
}
}
}
}
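The MethodThatYieldReturnsList call in that pseudo-code is where the IEnumerable<SqlDataRecord> streaming happens. A minimal sketch of what it might look like, assuming the user-defined table type on the SQL Server side has a single NVARCHAR(200) column named CompanyName (adjust the name and size to your actual TVP type); it needs using System.Collections.Generic;, using System.Data;, and using Microsoft.SqlServer.Server;:
private static IEnumerable<SqlDataRecord> MethodThatYieldReturnsList(List<string> companies)
{
    // Must match the user-defined table type on the server
    // (hypothetical single NVARCHAR(200) column named "CompanyName").
    SqlMetaData[] schema = { new SqlMetaData("CompanyName", SqlDbType.NVarChar, 200) };

    foreach (string company in companies)
    {
        SqlDataRecord record = new SqlDataRecord(schema);
        record.SetString(0, company);
        yield return record;   // rows are streamed to SQL Server one at a time, no DataTable needed
    }
}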
Done. Got the solution.
Used SQL CLR https://github.com/geral2/SQL-APIConsumer
exec [dbo].[APICaller_POST]
@URL = 'https://www.-----/SearchCompany'
,@JsonBody = '{"searchText":"GOOG","count":10}'
Let me know if there are any other / better options to achieve this.
When I have a query generated like this:
var query = from x in Entities.SomeTable
select x;
I can set a breakpoint and, after hovering the cursor over query, I can see what SQL command will be sent to the database. Unfortunately I cannot do it when I use Count:
var query = (from x in Entities.SomeTable
select x).Count();
Of course I could see what comes to SqlServer using profiler but maybe someone has any idea how to do it (if it is possible) in VS.
You can use ToTraceString():
// Count() executes immediately and returns an int, so cast the underlying
// query to ObjectQuery<T> and call ToTraceString() on that:
ObjectQuery<SomeTable> query = (ObjectQuery<SomeTable>)(from x in Entities.SomeTable select x);
Console.WriteLine(query.ToTraceString());
You can use Database.Log to log any query made, like this:
using (var context = new MyContext())
{
context.Database.Log = Console.Write;
// Your code here...
}
Usually, in my context's constructor, I set that to my logger (whether it is NLog, Log4Net, or the stock .NET loggers) rather than the console, but the actual logging tool is irrelevant.
For more information
In EF6 and above, you can use the following before your query:
context.Database.Log = s => System.Diagnostics.Debug.WriteLine(s);
I've found this to be quicker than pulling up SQL Profiler and running a trace.
Also, this post talks more about this topic:
How do I view the SQL generated by the Entity Framework?
I am trying to start using EF6 for a project. My database is already filled with millions of records.
I can't find a clear explanation of how EF sends T-SQL to SQL Server. I am afraid that I am going to download a bunch of data to the client for no reason.
In the code below I have found three ways to get my data into a List<>, but I am not sure which is the right way to have the WHERE clause run in SQL.
I do not want to send millions of records to the client and then query (filter) that data there.
using (rgtBaza baza = new rgtBaza())
{
var t = baza.Database.SqlQuery<CJE_DOC>("select * from cje_doc where datum between @od and @do", new SqlParameter("od", this.dateTimePickerOD.Value.Date), new SqlParameter("do", this.dateTimePickerDO.Value.Date)).ToList();
var t = baza.CJE_DOC.Where(s => s.DATUM.Value >= this.dateTimePickerOD.Value.Date && s.DATUM.Value <= this.dateTimePickerDO.Value.Date).ToList();
var query = from b in baza.CJE_DOC
where b.DATUM >= this.dateTimePickerOD.Value.Date && b.DATUM.Value <= this.dateTimePickerDO.Value.Date
select b;
var t = query.ToList();
this.dataGridViewCJENICI.DataSource = t;
}
In all 3 cases, the filtering (the WHERE clause) will happen on the database side, not on the client side.
If you want to verify that this is true, especially for your last 2 options, add some logging so that you can see the generated SQL:
baza.Database.Log = s => Console.WriteLine(s);
In this case, since you are using EF already, choose the 2nd or 3rd options, they are both equivalent with different syntax. Pick your favorite syntax.
In all of those examples, EF6 will generate a SQL query including the where clause - it won't perform the where clause on the client.
It won't actually retrieve any data from the database until you iterate through the results, which in the examples above, is when you call .ToList().
EF6 would only run the filter on the client if you called something like:
baza.CJE_DOC.ToList().Where(x => x.Field == value)
In this instance, it would retrieve the entire table when you called ToList(), and then use a client-side Linq query to filter the results in the where clause.
Any of the 3 will run the query on the SQL Server.
EF relies on LINQ's deferred execution model to build up an expression tree. Once you take an action that causes the expression to be enumerated (e.g. calling ToList(), ToArray(), or any of the other To*() methods), it will convert the expression tree to SQL, send the query to the server, and then start returning the results.
One of the side effects of this is that when using the query or lambda syntax, expressions that EF does not understand how to convert to SQL will cause an exception.
If you absolutely need to use some code that EF can't handle, you can break your code into multiple segments -- filtering things down as far as possible via code that can be converted to SQL, using the AsEnumerable() method to "close off" the EF expression, and doing your remaining filtering or transformations using Linq to Objects.
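As a rough illustration of that pattern, using the entities from the question above (the date variables and the month check are just illustrative): everything before AsEnumerable() is translated to SQL and runs on the server, everything after it runs as LINQ to Objects on the already-reduced result set:
var fromDate = this.dateTimePickerOD.Value.Date;
var toDate = this.dateTimePickerDO.Value.Date;

var results = baza.CJE_DOC
    .Where(d => d.DATUM >= fromDate && d.DATUM <= toDate)        // translated to SQL, runs on the server
    .AsEnumerable()                                              // switch to LINQ to Objects here
    .Where(d => d.DATUM.Value.ToString("yyyy-MM") == "2015-01")  // EF cannot translate this; runs in memory
    .ToList();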
I'm in the process of migrating from a legacy system. The database cannot be modified - including adding/modifying stored procedures.
I've added a stored procedure to an EDMX model successfully, it generated the following code:
public virtual ObjectResult<sp_GetUserInfoByUID_Result> sp_GetUserInfoByUID(Nullable<System.Guid> sessionID, Nullable<System.Guid> userUID)
{
var sessionIDParameter = sessionID.HasValue ?
new ObjectParameter("SessionID", sessionID) :
new ObjectParameter("SessionID", typeof(System.Guid));
var userUIDParameter = userUID.HasValue ?
new ObjectParameter("userUID", userUID) :
new ObjectParameter("userUID", typeof(System.Guid));
return ((IObjectContextAdapter)this).ObjectContext.ExecuteFunction<sp_GetUserInfoByUID_Result>("sp_GetUserInfoByUID", sessionIDParameter, userUIDParameter);
}
However, I get the following runtime error:
The data reader is incompatible with the specified 'MyApp.Repository.sp_GetUserInfoByUID_Result'. A member of the type, 'useraccount_uid1', does not have a corresponding column in the data reader with the same name.
So it looks like EF generated two mappings: useraccount_uid and useraccount_uid1. This is because the stored procedure returns a table with two columns named useraccount_uid.
Is there a way to get round this in the EF model?
Turns out the solution was really simple, I'd just overlooked how EF modeled stored procedures. When you add a stored procedure to the Model, by default it actually adds a couple of references:
1) A mapping from the model to the basic function in the DB - you cannot edit these mappings.
2) A "Function Import" - this is the part which maps result sets to code models.
So all I had to do was look for the Function Imports folder in the EDMX Model Browser. In there the stored procedure was listed. If you right-click on the function you'll see the "Function Import Mapping" option. This will open the Mapping Details window. Here I could simply correct the column naming.
from f in CUSTOMERS
where depts.Contains(f.DEPT_ID)
select f.NAME
depts is a list (IEnumerable<int>) of department ids
This query works fine until you pass a large list (say around 3000 dept ids) .. then I get this error:
The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Too many parameters were provided in this RPC request. The maximum is 2100.
I changed my query to:
var dept_ids = string.Join(" ", depts.ToStringArray());
from f in CUSTOMERS
where dept_ids.IndexOf(Convert.ToString(f.DEPT_id)) != -1
select f.NAME
Using IndexOf() fixed the error but made the query slow. Is there any other way to solve this? Thanks so much.
My solution (Guids is a list of ids you would like to filter by):
List<MyTestEntity> result = new List<MyTestEntity>();
for(int i = 0; i < Math.Ceiling((double)Guids.Count / 2000); i++)
{
var nextGuids = Guids.Skip(i * 2000).Take(2000);
result.AddRange(db.Tests.Where(x => nextGuids.Contains(x.Id)));
}
this.DataContext = result;
Why not write the query in SQL and attach your entity?
It's been a while since I worked in LINQ, but here goes:
IQuery q = Session.CreateQuery(#"
select *
from customerTable f
where f.DEPT_id in (" + string.Join(",", depts.ToStringArray()) + ")");
q.AttachEntity(CUSTOMER);
Of course, you will need to protect against injection, but that shouldn't be too hard.
You will want to check out the LINQKit project, since somewhere in there is a technique for batching up such statements to solve this issue. I believe the idea is to use the PredicateBuilder to break the local collection into smaller chunks, but I haven't reviewed the solution in detail because I've instead been looking for a more natural way to handle this.
Unfortunately it appears from Microsoft's response to my suggestion to fix this behavior that there are no plans set to have this addressed for .NET Framework 4.0 or even subsequent service packs.
https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=475984
UPDATE:
I've opened up some discussion regarding whether this was going to be fixed for LINQ to SQL or the ADO.NET Entity Framework on the MSDN forums. Please see these posts for more information regarding these topics and to see the temporary workaround that I've come up with using XML and a SQL UDF.
I had a similar problem, and I found two ways to fix it:
Intersect method
join on IDs
To get values that are NOT in the list, I used the Except method or a left join.
Update
EntityFramework 6.2 runs the following query successfully:
var employeeIDs = Enumerable.Range(3, 5000);
var orders =
from order in Orders
where employeeIDs.Contains((int)order.EmployeeID)
select order;
Your post was from a while ago, but perhaps someone will benefit from this. Entity Framework does a lot of query caching; every time you send in a different parameter count, that gets added to the cache. Using a "Contains" call will cause SQL to generate a clause like "WHERE x IN (@p1, @p2, ... @pn)", and bloat the EF cache.
Recently I looked for a new way to handle this, and I found that you can create an entire table of data as a parameter. Here's how to do it:
First, you'll need to create a custom table type, so run this in SQL Server (in my case I called the custom type "TableId"):
CREATE TYPE [dbo].[TableId] AS TABLE(
Id [int] PRIMARY KEY
)
Then, in C#, you can create a DataTable and load it into a structured parameter that matches the type. You can add as many data rows as you want:
DataTable dt = new DataTable();
dt.Columns.Add("id", typeof(int));
This is an arbitrary list of IDs to search on. You can make the list as large as you want:
dt.Rows.Add(24262);
dt.Rows.Add(24267);
dt.Rows.Add(24264);
Create an SqlParameter using the custom table type and your data table:
SqlParameter tableParameter = new SqlParameter("@id", SqlDbType.Structured);
tableParameter.TypeName = "dbo.TableId";
tableParameter.Value = dt;
Then you can call a bit of SQL from your context that joins your existing table to the values from your table parameter. This will give you all records that match your ID list:
var items = context.Dailies.FromSqlRaw<Dailies>("SELECT * FROM dbo.Dailies d INNER JOIN @id id ON d.Daily_ID = id.id", tableParameter).AsNoTracking().ToList();
You could always partition your list of depts into smaller sets before you pass them as parameters to the IN statement generated by LINQ. See here:
Divide a large IEnumerable into smaller IEnumerable of a fix amount of item
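If you don't want to pull in a library just for that, a small batching helper is enough. A minimal sketch (the Batch name and extension class are hypothetical), which you would combine with one Contains query per batch and then concatenate the results, as in the chunking answer above:
public static class EnumerableBatchExtensions
{
    // Splits a sequence into consecutive lists of at most 'size' items.
    public static IEnumerable<List<T>> Batch<T>(this IEnumerable<T> source, int size)
    {
        var bucket = new List<T>(size);
        foreach (var item in source)
        {
            bucket.Add(item);
            if (bucket.Count == size)
            {
                yield return bucket;
                bucket = new List<T>(size);
            }
        }
        if (bucket.Count > 0)
            yield return bucket;
    }
}
Usage would then be along the lines of foreach (var chunk in depts.Batch(2000)) { /* run the Contains query for this chunk and accumulate the results */ }.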