I have a table called Sanad in Microsoft SQL Server which has two columns, bedeh and bestan. I would like to fetch all codes together with the sums of their bedeh and bestan in Laravel, provided that the sum of bedeh is higher than the sum of bestan. In addition, both bedeh and bestan are of the money data type. Here is the code I have written for this purpose in Laravel 6:
DB::table('t1')
    ->fromSub(function ($query) {
        return $query->selectRaw('Code, sum(bedeh) bed, sum(bestan) bes')
            ->from('Sanad')
            ->groupBy('Code');
    }, 't1')
    ->where('bed', '>', 'bes')
    ->get();
However, when the code is executed, I run into the following problem:
[Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Cannot convert a char value to money. The char value has incorrect syntax. (SQL: select * from (select Code, sum(bedeh) bed, sum(bestan) bes
What is interesting is that when I copy the resulting query and run it in Management Studio, it fetches the codes flawlessly. Does anyone know what the problem is?
Thanks,
Habib
Did you try to cast the values in your query, i.e.
sum(isnull(cast(bedeh as float), 0)) as bed,
sum(isnull(cast(bestan as float), 0)) as bes
I finally changed my approach: instead of writing a single query for the whole process, I divided it into several parts, each of which works with Eloquent. To be specific, I selected Code, sum(bedeh), and sum(bestan) with this code:
$codes = Sanad::selectRaw('Code, sum(bedeh) bed, sum(bestan) bes')
    ->groupBy('Code')
    ->get();
Then, I fetched the rows with the higher sum of bedeh in Laravel instead of SQL, like this:
$debtors = array();
foreach ($codes as $c) {
    if ($c->bes < $c->bed) {
        $debtors[] = $c->Code;
    }
}
Now, we can use another select to get Eloquent instances like this:
Person::select(....)->whereIn('Code', $debtors)....->get();
The performance may decrease in this scenario; however, I can benefit from Eloquent facilities such as pagination.
Habib
Related
I am trying to start using EF6 for a project. My database is already filled with millions of records.
I can't find a clear explanation of how EF sends T-SQL to SQL Server, and I am afraid that I am going to download a bunch of data to the client for no reason.
In the code below I have found three ways to get my data into a List<>, but I am not sure which is the right way to have the WHERE clause executed in SQL.
I do not want to send millions of records to the client and then query (filter) that data on the client.
using (rgtBaza baza = new rgtBaza())
{
    // Option 1: raw SQL with parameters
    var t = baza.Database.SqlQuery<CJE_DOC>("select * from cje_doc where datum between @od and @do", new SqlParameter("od", this.dateTimePickerOD.Value.Date), new SqlParameter("do", this.dateTimePickerDO.Value.Date)).ToList();

    // Option 2: LINQ method syntax
    var t = baza.CJE_DOC.Where(s => s.DATUM.Value >= this.dateTimePickerOD.Value.Date && s.DATUM.Value <= this.dateTimePickerDO.Value.Date).ToList();

    // Option 3: LINQ query syntax
    var query = from b in baza.CJE_DOC
                where b.DATUM.Value >= this.dateTimePickerOD.Value.Date && b.DATUM.Value <= this.dateTimePickerDO.Value.Date
                select b;
    var t = query.ToList();

    this.dataGridViewCJENICI.DataSource = t;
}
In all 3 cases, the filtering will happen on the database side; the WHERE clause will not be applied on the client side.
If you want to verify that this is true, especially for your last 2 options, add some logging so that you can see the generated SQL:
baza.Database.Log = s => Console.WriteLine(s);
In this case, since you are using EF already, choose the 2nd or 3rd option; they are equivalent, just with different syntax. Pick whichever syntax you prefer.
In all of those examples, EF6 will generate a SQL query including the where clause - it won't perform the where clause on the client.
It won't actually retrieve any data from the database until you iterate through the results, which in the examples above, is when you call .ToList().
EF6 would only run the filter on the client if you called something like:
baza.CJE_DOC.ToList().Where(x => x.Field == value)
In this instance, it would retrieve the entire table when you called ToList(), and then use a client-side Linq query to filter the results in the where clause.
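To make the deferred execution concrete, here is a minimal sketch, reusing the rgtBaza context and CJE_DOC entity from the question above; the OrderBy step is just an illustrative extra composition, not something from the original code:
using (var baza = new rgtBaza())
{
    DateTime fromDate = this.dateTimePickerOD.Value.Date;
    DateTime toDate = this.dateTimePickerDO.Value.Date;

    // Nothing is sent to SQL Server yet: this only builds up an expression tree.
    IQueryable<CJE_DOC> query = baza.CJE_DOC
        .Where(s => s.DATUM.Value >= fromDate && s.DATUM.Value <= toDate);

    // Further composition still doesn't touch the database.
    query = query.OrderBy(s => s.DATUM);

    // Only here is the SQL (WHERE and ORDER BY included) generated and executed.
    List<CJE_DOC> rows = query.ToList();
}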
Any of the 3 will run the query on the SQL Server.
EF relies on LINQ's deferred execution model to build up an expression tree. Once you take an action that causes the expression to be enumerated (e.g. calling ToList(), ToArray(), or any of the other To*() methods), it will convert the expression tree to SQL, send the query to the server, and then start returning the results.
One of the side effects of this is that when using the query or lambda syntax, expressions that EF does not understand how to convert to SQL will cause an exception.
If you absolutely need to use some code that EF can't handle, you can break your query into multiple segments: filter things down as far as possible with code that can be converted to SQL, use the AsEnumerable() method to "close off" the EF expression, and do your remaining filtering or transformations using LINQ to Objects.
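As a hedged sketch of that splitting idea (again reusing the rgtBaza context and CJE_DOC entity from the question above; the DayOfWeek filter is only an illustrative example of an expression EF6 cannot translate to SQL):
using (var baza = new rgtBaza())
{
    DateTime fromDate = this.dateTimePickerOD.Value.Date;
    DateTime toDate = this.dateTimePickerDO.Value.Date;

    var docs = baza.CJE_DOC
        // This part can be converted to SQL and runs on the server:
        .Where(d => d.DATUM.Value >= fromDate && d.DATUM.Value <= toDate)
        // AsEnumerable() "closes off" the EF expression; everything below runs
        // as LINQ to Objects on the client, against the already-filtered rows:
        .AsEnumerable()
        // DateTime.DayOfWeek is a typical example of something EF6 cannot translate:
        .Where(d => d.DATUM.Value.DayOfWeek == DayOfWeek.Friday)
        .ToList();
}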
My data model has an entity Person with 3 related (1:N) entities Jobs, Tasks and Dates.
My query looks like
var persons = (from x in context.Persons
               select new {
                   PersonId = x.Id,
                   JobNames = x.Jobs.Select(y => y.Name),
                   TaskDates = x.Tasks.Select(y => y.Date),
                   DateInfos = x.Dates.Select(y => y.Info)
               }).ToList();
Everything seems to work fine, but the lists JobNames, TaskDates and DateInfos are not all filled.
For example, TaskDates and DateInfos have the correct values, but JobNames stays empty. But when I remove TaskDates from the query, then JobNames is correctly filled.
So it seems that EF can only handle a limited number of these "subqueries"? Is this correct? If so, what is the maximum number of these "subqueries" in a single statement? Is there a way to work around this issue without having to make more than one call to the database?
(ps: I'm not entirely sure, but I seem to remember that this query worked in LINQ2SQL - could it be?)
UPDATE
This is driving me crazy. I tried to reproduce the issue from the ground up using a fresh, simple project (to post the entire piece of code here, not only an oversimplified example), and I found I wasn't able to reproduce it. It still happens within our existing code base (apparently there's more behind this problem, but I cannot share this closed code base, unfortunately).
After hours and hours of playing around I found the weirdest behavior:
It works great when I don't SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED; before calling the LINQ statement
It also works great (independent of the above) when I don't use a .Take() to only get the first X rows
It also works great when I add additional .Where() statements to cut down the number of rows returned from SQL Server
I didn't find any comprehensible reason why I see this behavior, but I started to look at the SQL: although EF generates the exact same SQL, the execution plan is different when I use READ UNCOMMITTED. It returns more rows on a specific index in the middle of the execution plan, which curiously ends in fewer rows returned for the entire SQL statement, which in turn results in the missing data that was the reason for my question to begin with.
This sounds very confusing and unbelievable, I know, but this is the behavior I see. I don't know what else to do, I don't even know what to google for at this point ;-).
I can fix my problem (just don't use READ UNCOMMITTED), but I have no idea why it occurs and if it is a bug or something I don't know about SQL Server. Maybe there's some "magic max number of allowed results in sub-queries" in SQL Server? At least: As far as I can see, it's not an issue with EF itself.
A little late, but does calling ToList() on each subquery produce the required effect?
var persons = (from x in context.Persons
               select new {
                   PersonId = x.Id,
                   JobNames = x.Jobs.Select(y => y.Name).ToList(),
                   TaskDates = x.Tasks.Select(y => y.Date).ToList(),
                   DateInfos = x.Dates.Select(y => y.Info).ToList()
               }).ToList();
I am using MS SQL Server 2008 with Hibernate. The question I have is how Hibernate implements setMaxResults.
Take the following simple scenario.
If I have a query that returns 100 rows and I pass 1 to setMaxResults, will this affect the result returned from SQL Server itself (as if running a select top 1 statement), or does Hibernate get all the results first (all 100 rows in this case) and pick the top one itself?
The reason I am asking is that this would become a huge performance issue as the number of rows starts to grow.
Thank you.
Hibernate will generate a limit-type query for all dialects which support limit queries. As the SQLServerDialect supports this (see org.hibernate.dialect.SQLServerDialect.supportsLimit() and .getLimitString()), you will get a select top 1 query.
If you would like to be absolutely sure, you may turn on debug logging, or enable the showSql option and test.
Maybe the following snippet will help. Assume we have a managed bean class EmpBean and we want only the first 5 records. The code is as follows:
public List<EmpBean> getData()
{
    Session session = null;
    try
    {
        session = HibernateUtil.getSession();
        Query qry = session.createQuery("FROM EmpBean");
        qry.setMaxResults(5);
        return qry.list();
    }
    catch (HibernateException e)
    {}
    finally
    {
        HibernateUtil.closeSession(session);
    }
    return null;
}
Here getSession and closeSession are static utility methods which take care of creating and closing the session.
We use DBAmp for integrating Salesforce.com with SQL Server (it basically adds a linked server), and we run queries against our SF data using OPENQUERY.
I'm trying to do some reporting against opportunities and want to return the created date of the opportunity in the opportunity owner's local date time (i.e. the date time the user will see in Salesforce).
Our DBAmp configuration forces the dates to UTC.
I stumbled across a date function (in the Salesforce documentation) that I thought might be of some help, but I get an error when I try to use it, so I can't prove it. Below is the example usage of the convertTimezone function:
SELECT HOUR_IN_DAY(convertTimezone(CreatedDate)), SUM(Amount)
FROM Opportunity
GROUP BY HOUR_IN_DAY(convertTimezone(CreatedDate))
Below is the error returned:
OLE DB provider "DBAmp.DBAmp" for linked server "SALESFORCE" returned message "Error 13005 : Error translating SQL statement: line 1:37: expecting "from", found '('".
Msg 7350, Level 16, State 2, Line 1
Cannot get the column information from OLE DB provider "DBAmp.DBAmp" for linked server "SALESFORCE".
Can you not use SOQL functions in OPENQUERY as below?
SELECT
*
FROM
OPENQUERY(SALESFORCE,'
SELECT HOUR_IN_DAY(convertTimezone(CreatedDate)), SUM(Amount)
FROM Opportunity
GROUP BY HOUR_IN_DAY(convertTimezone(CreatedDate))')
UPDATE:
I've just had some correspondence with Bill Emerson (I believe he is the creator of the DBAmp Integration Tool):
You should be able to use SOQL functions so I am not sure why you are
getting the parsing failure. I'll setup a test case and report back.
I'll update the post again when I hear back. Thanks
A new version of DBAmp (2.14.4) has just been released that fixes the issue with using ConvertTimezone in openquery.
Version 2.14.4
Code modified for better memory utilization
Added support for API 24.0 (SPRING 12)
Fixed issue with embedded question marks in string literals
Fixed issue with using ConvertTimezone in openquery
Fixed issue with "Invalid Numeric" when using aggregate functions in openquery
I'm fairly sure that because DBAmp uses SQL and not SOQL, SOQL functions would not be available, sorry.
You would need to expose this data some other way: perhaps with a Salesforce report, a web service, or by compiling the data in the program you are using to access the (DBAmp) SQL Server.
If you were to create a Salesforce web service, the following example might be helpful.
global class MyWebService
{
    webservice static AggregateResult MyWebServiceMethod()
    {
        AggregateResult ar = [
            SELECT
                HOUR_IN_DAY(convertTimezone(CreatedDate)) Hour,
                SUM(Amount) Amount
            FROM Opportunity
            GROUP BY HOUR_IN_DAY(convertTimezone(CreatedDate))];
        system.debug(ar);
        return ar;
    }
}
from f in CUSTOMERS
where depts.Contains(f.DEPT_ID)
select f.NAME
depts is a list (IEnumerable<int>) of department ids
This query works fine until you pass a large list (say, around 3000 dept ids); then I get this error:
The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Too many parameters were provided in this RPC request. The maximum is 2100.
I changed my query to:
var dept_ids = string.Join(" ", depts.ToStringArray());
from f in CUSTOMERS
where dept_ids.IndexOf(Convert.ToString(f.DEPT_id)) != -1
select f.NAME
Using IndexOf() fixed the error but made the query slow. Is there any other way to solve this? Thanks so much.
My solution (Guids is a list of ids you would like to filter by):
List<MyTestEntity> result = new List<MyTestEntity>();
for (int i = 0; i < Math.Ceiling((double)Guids.Count / 2000); i++)
{
    var nextGuids = Guids.Skip(i * 2000).Take(2000);
    result.AddRange(db.Tests.Where(x => nextGuids.Contains(x.Id)));
}
this.DataContext = result;
Why not write the query in SQL and attach your entity?
It's been a while since I worked in LINQ, but here goes:
IQuery q = Session.CreateQuery(@"
    select *
    from customerTable f
    where f.DEPT_id in (" + string.Join(",", depts.ToStringArray()) + ")");
q.AttachEntity(CUSTOMER);
Of course, you will need to protect against injection, but that shouldn't be too hard.
You will want to check out the LINQKit project, since somewhere in there is a technique for batching up such statements to solve this issue. I believe the idea is to use the PredicateBuilder to break the local collection into smaller chunks, but I haven't reviewed the solution in detail because I've instead been looking for a more natural way to handle this.
Unfortunately, it appears from Microsoft's response to my suggestion to fix this behavior that there are no plans to address it for .NET Framework 4.0 or even subsequent service packs.
https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=475984
UPDATE:
I've opened up some discussion regarding whether this was going to be fixed for LINQ to SQL or the ADO.NET Entity Framework on the MSDN forums. Please see these posts for more information regarding these topics and to see the temporary workaround that I've come up with using XML and a SQL UDF.
I had a similar problem, and I found two ways to fix it:
Intersect method
join on IDs
To get values that are NOT in the list, I used the Except method or a left join.
Update
EntityFramework 6.2 runs the following query successfully:
var employeeIDs = Enumerable.Range(3, 5000);
var orders =
    from order in Orders
    where employeeIDs.Contains((int)order.EmployeeID)
    select order;
Your post was from a while ago, but perhaps someone will benefit from this. Entity Framework does a lot of query caching; every time you send in a different parameter count, that query gets added to the cache. Using a "Contains" call will cause SQL to generate a clause like "WHERE x IN (@p1, @p2, ..., @pn)", which bloats the EF cache.
Recently I looked for a new way to handle this, and I found that you can create an entire table of data as a parameter. Here's how to do it:
First, you'll need to create a custom table type, so run this in SQL Server (in my case I called the custom type "TableId"):
CREATE TYPE [dbo].[TableId] AS TABLE(
    Id [int] PRIMARY KEY
)
Then, in C#, you can create a DataTable and load it into a structured parameter that matches the type. You can add as many data rows as you want:
DataTable dt = new DataTable();
dt.Columns.Add("id", typeof(int));
This is an arbitrary list of IDs to search on. You can make the list as large as you want:
dt.Rows.Add(24262);
dt.Rows.Add(24267);
dt.Rows.Add(24264);
Create an SqlParameter using the custom table type and your data table:
SqlParameter tableParameter = new SqlParameter("@id", SqlDbType.Structured);
tableParameter.TypeName = "dbo.TableId";
tableParameter.Value = dt;
Then you can call a bit of SQL from your context that joins your existing table to the values from your table parameter. This will give you all records that match your ID list:
var items = context.Dailies.FromSqlRaw<Dailies>("SELECT * FROM dbo.Dailies d INNER JOIN @id id ON d.Daily_ID = id.id", tableParameter).AsNoTracking().ToList();
You could always partition your list of depts into smaller sets before you pass them as parameters to the IN statement generated by Linq. See here:
Divide a large IEnumerable into smaller IEnumerable of a fix amount of item
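As a rough sketch of that partitioning idea (the chunk size of 1000 is arbitrary; depts and CUSTOMERS are the same names used in the question above):
var deptList = depts.ToList();
var names = new List<string>();
for (int i = 0; i < deptList.Count; i += 1000)
{
    // Each chunk keeps the generated IN clause well under the ~2100 parameter limit.
    var chunk = deptList.Skip(i).Take(1000).ToList();
    names.AddRange(
        (from f in CUSTOMERS
         where chunk.Contains(f.DEPT_ID)
         select f.NAME).ToList());
}
Each iteration issues one query, so for around 3000 dept ids this runs three small queries instead of one oversized one.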