As the title says, my team developed a social website based on Zend Framework (1.11).
The problem is that our client wants a debug output (an on-screen list of DB queries) showing the
execution time, the number of rows affected, and the SQL statement.
Zend_Db_Profiler gives us the time and the statement with no hassle, but
we also need the number of rows each query affected (fetched, updated, inserted or deleted).
How can we accomplish this?
This is the current implementation (in Zend_Db_Statement::execute()), and it is what prevents you from getting what you want:
$qp->start($this->_queryId);
$retval = $this->_execute($params);
$prof->queryEnd($this->_queryId);
A possible solution would be:
Create your own statement class, say one extending Zend_Db_Statement_Mysqli (or whichever adapter you use).
Override its execute() method so that $retval is passed on to $prof->queryEnd($this->_queryId).
Override $_defaultStmtClass in Zend_Db_Adapter_Mysqli with your new statement class name.
Create your own profiler extending Zend_Db_Profiler, with queryEnd($queryId) redefined so that it accepts and stores $retval.
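A rough, untested sketch of those pieces (the class names are placeholders; it assumes the MySQLi adapter and uses rowCount() as the "rows affected" value):

class My_Db_Statement_Mysqli extends Zend_Db_Statement_Mysqli
{
    // Same flow as Zend_Db_Statement::execute(), but the row count is handed to the profiler.
    public function execute(array $params = null)
    {
        if ($this->_queryId === null) {
            return $this->_execute($params);
        }

        $prof = $this->_adapter->getProfiler();
        $qp   = $prof->getQueryProfile($this->_queryId);
        if ($qp->hasEnded()) {
            $this->_queryId = $prof->queryClone($qp);
            $qp = $prof->getQueryProfile($this->_queryId);
        }
        $qp->bindParams($params);
        $qp->start($this->_queryId);

        $retval = $this->_execute($params);

        // rowCount() reports the rows affected/fetched by the last statement on this adapter.
        $prof->queryEnd($this->_queryId, $this->rowCount());

        return $retval;
    }
}

class My_Db_Profiler extends Zend_Db_Profiler
{
    protected $_rowCounts = array();

    // Accept the extra argument and remember it per query id.
    public function queryEnd($queryId, $rowCount = null)
    {
        $result = parent::queryEnd($queryId);
        $this->_rowCounts[$queryId] = $rowCount;
        return $result;
    }

    public function getRowCount($queryId)
    {
        return isset($this->_rowCounts[$queryId]) ? $this->_rowCounts[$queryId] : null;
    }
}

You should then be able to pass the profiler via the adapter's 'profiler' config option, register the statement class either by overriding $_defaultStmtClass in a small adapter subclass or via $adapter->setStatementClass('My_Db_Statement_Mysqli'), and read the counts back with getRowCount() when rendering the on-screen list.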
I want to change Opportunity.CreatedById to Lead.OwnerId; please rectify my code:
trigger SetOpportunityCreatedBy on Opportunity (after insert) {
    List<Opportunity> opps = [SELECT Id, CreatedById FROM Opportunity WHERE Id IN :Trigger.New];
    List<Lead> leads = [SELECT Id, ConvertedOpportunityId, OwnerId FROM Lead WHERE IsConverted = true];
    for (Opportunity opp : opps) {
        for (Lead l : leads) {
            if (l.ConvertedOpportunityId == opp.Id) {
                opp.CreatedById = l.OwnerId;
            }
        }
    }
    update opps;
}
You can't, at least not like that.
Audit fields (CreatedById, CreatedDate, LastModifiedById, LastModifiedDate) are read-only and managed by Salesforce internally.
There's a trick to open them up, for example for data migrations (you're loading 7 years' worth of opportunities and you want to record their real creation dates so they don't show up as a spike in reports). But you get only one shot at this, and only during insert.
https://help.salesforce.com/s/articleView?id=000334139&type=1 and https://help.salesforce.com/s/articleView?id=000331038&type=1
You'd need to tick something in Setup, which causes two magic checkboxes to show up in profiles/permission sets. Then you'd edit your profile (or create a new permission set and assign it to yourself).
You'd need to change your trigger to before insert (but if it's before insert, the lead is still being converted and the opportunity Id isn't known yet, so you can't match on Lead.ConvertedOpportunityId)... And even then, this feature is aimed at sysadmins; I doubt a normal user without "Modify All Data" can do it.
Rethink your business scenario. Maybe you want to set Opportunity.OwnerId, not CreatedById?
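If owner is what you are really after, a hedged sketch (the trigger and variable names are made up; it works from the Lead side, once conversion has populated ConvertedOpportunityId) could look like this:

trigger SyncConvertedOppOwner on Lead (after update) {
    // Collect the opportunities created by leads that were just converted.
    Map<Id, Id> oppIdToOwnerId = new Map<Id, Id>();
    for (Lead l : Trigger.new) {
        Lead oldLead = Trigger.oldMap.get(l.Id);
        if (l.IsConverted && !oldLead.IsConverted && l.ConvertedOpportunityId != null) {
            oppIdToOwnerId.put(l.ConvertedOpportunityId, l.OwnerId);
        }
    }
    if (oppIdToOwnerId.isEmpty()) {
        return;
    }
    // OwnerId is writable, unlike the audit fields discussed above.
    List<Opportunity> oppsToUpdate = new List<Opportunity>();
    for (Id oppId : oppIdToOwnerId.keySet()) {
        oppsToUpdate.add(new Opportunity(Id = oppId, OwnerId = oppIdToOwnerId.get(oppId)));
    }
    update oppsToUpdate;
}

Note that lead conversion usually assigns the opportunity to the lead owner anyway, so only keep something like this if your conversion flow produces a different owner than you want.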
Code Migration due to Performance Issues:
SQL Server LIKE condition (BEFORE)
SQL Server Full Text Search --> CONTAINS (BEFORE)
Elastic Search (CURRENTLY)
Achieved So Far:
We have a web page created in ASP.NET Core which has an autocomplete dropdown of 2.5+ million companies indexed in Elastic Search: https://www.99corporates.com/
Due to performance issues we have successfully shifted our code from SQL Server Full Text Search to Elastic Search, using NEST v7.2.1 and Elasticsearch.Net v7.2.1 in our .NET code.
Still Looking for a Solution:
If the user does not select a company from the autocomplete list and simply enters a few characters and clicks Go, then a list should be displayed, which we had done earlier by using SQL Server Full Text Search --> CONTAINS.
Can we call the ASP.NET web service which we have created, using SQL CLR and code like SELECT * FROM dbo.Table WHERE Name IN (dbo.SQLWebRequest(''))?
[System.Web.Script.Services.ScriptMethod()]
[System.Web.Services.WebMethod]
public static List<string> SearchCompany(string prefixText, int count)
{
}
Is there any better or alternate option?
While that solution (i.e. the SQL-APIConsumer SQLCLR project) "works", it is not scalable. It also requires setting the database to TRUSTWORTHY ON (a security risk), and it loads a few assemblies as UNSAFE, such as Json.NET. That is risky if any of them use static variables for caching, expecting each caller to be isolated / have their own App Domain, because SQLCLR is a single, shared App Domain, hence static variables are shared across all callers, and multiple concurrent threads can cause race conditions. This is not to say that this is definitely happening, since I haven't seen the code; but if you haven't either reviewed the code or conducted testing with multiple concurrent threads to ensure that it doesn't pose a problem, then it's definitely a gamble with regards to stability and ensuring predictable, expected behavior.
To a slight degree I am biased given that I do sell a SQLCLR library, SQL#, in which the Full version contains a stored procedure that also does this but a) handles security properly via signatures (it does not enable TRUSTWORTHY), b) allows for handling scalability, c) does not require any UNSAFE assemblies, and d) handles more scenarios (better header handling, etc). It doesn't handle any JSON, it just returns the web service response and you can unpack that using OPENJSON or something else if you prefer. (yes, there is a Free version of SQL#, but it does not contain INET_GetWebPages).
HOWEVER, I don't think SQLCLR is a good fit for this scenario in the first place. In your first two versions of this project (using LIKE and then CONTAINS) it made sense to send the user input directly into the query. But now that you are using a web service to get a list of matching values from that user input, you are no longer confined to that approach. You can, and should, handle the web service / Elastic Search portion of this separately, in the app layer.
Rather than passing the user input into the query, only to have the query pause to get that list of 0 or more matching values, you should do the following:
Before executing any query, get the list of matching values directly in the app layer.
If no matching values are returned, you can skip the database call entirely as you already have your answer, and respond immediately to the user (much faster response time when no matches return)
If there are matches, then execute the search stored procedure, sending that list of matches as-is via Table-Valued Parameter (TVP) which becomes a table variable in the stored procedure. Use that table variable to INNER JOIN against the table rather than doing an IN list since IN lists do not scale well. Also, be sure to send the TVP values to SQL Server using the IEnumerable<SqlDataRecord> method, not the DataTable approach as that merely wastes CPU / time and memory.
For example code on how to accomplish this correctly, please see my answer to Pass Dictionary to Stored Procedure T-SQL
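As a rough illustration of what the T-SQL side could look like (the type, procedure, table, and column names below are made up; adjust them to your schema):

-- Hypothetical user-defined table type for the list of matched company names.
CREATE TYPE dbo.CompanyNameList AS TABLE
(
    CompanyName NVARCHAR(200) NOT NULL
);
GO

CREATE PROCEDURE dbo.Company_SearchByNames
    @Names dbo.CompanyNameList READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- INNER JOIN against the TVP instead of an IN list, as recommended above.
    SELECT  c.CompanyID,
            c.Name,
            c.City
    FROM    dbo.Companies c
    INNER JOIN @Names n
            ON n.CompanyName = c.Name;
END;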
In C#-style pseudo-code, this would be something along the lines of the following:
List<string> companies = SearchCompany(PrefixText, Count);

if (companies.Count == 0)
{
    Response.Write("Nope");
}
else
{
    using (SqlConnection db = new SqlConnection(connectionString))
    {
        using (SqlCommand batch = db.CreateCommand())
        {
            batch.CommandType = CommandType.StoredProcedure;
            batch.CommandText = "ProcName";

            SqlParameter tvp = new SqlParameter("ParamName", SqlDbType.Structured);
            tvp.Value = MethodThatYieldReturnsList(companies);
            batch.Parameters.Add(tvp);

            db.Open();

            using (SqlDataReader results = batch.ExecuteReader())
            {
                if (results.HasRows)
                {
                    // deal with results
                    Response.Write(results....);
                }
            }
        }
    }
}
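And a hedged sketch of the MethodThatYieldReturnsList helper mentioned above (it needs using Microsoft.SqlServer.Server; for SqlDataRecord and SqlMetaData, and the column name/size are assumptions that must match the user-defined table type):

private static IEnumerable<SqlDataRecord> MethodThatYieldReturnsList(List<string> companies)
{
    // Single NVARCHAR column matching the TVP's table type definition.
    SqlMetaData[] tvpSchema = new SqlMetaData[]
    {
        new SqlMetaData("CompanyName", SqlDbType.NVarChar, 200)
    };

    SqlDataRecord record = new SqlDataRecord(tvpSchema);
    foreach (string company in companies)
    {
        record.SetString(0, company);
        yield return record; // rows are streamed; no DataTable is materialized
    }
}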
Done. Got the solution.
I used SQL CLR: https://github.com/geral2/SQL-APIConsumer
exec [dbo].[APICaller_POST]
     @URL = 'https://www.-----/SearchCompany'
    ,@JsonBody = '{"searchText":"GOOG","count":10}'
Let me know if there are any other / better options to achieve this.
So basically I am making a student database that contains student grades. I have a query that gives me a list of the classes a specific student has taken that are part of their major.
This is what the query returns:
Query result
What I want to do is create a report with a section like the one below, where it lists all possible classes they can take for that major:
Report
I want a checkbox next to each class, and I want the box to be checked if they have taken the class; if they have not, I want the box to be empty. I don't necessarily need anything else on the report like grades; I just want the class checked off if they have taken it. How can I go about this? I'm lost on this part.
Although more information may be required to answer your question correctly, I'll give this a shot:
Assuming the check box in question is called "Check1", the table in question is called "Table1", the field is called "Test", and the Report you mentioned is a form called "Report"... AND a class is only in the table when it has been completed.
Something to the effect of an "On Load" event on the form, like:
If Not IsNull(DLookup("Class_UID", "Table1", "Class_UID = 'INF-202'")) Then
Forms("Report").Check1.Value = 1
End If
Of course, this would need to be added for each unique Class_UID
It's a simplistic way to do it, but far from the only option.
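As one alternative, a hedged sketch that avoids hard-coding each class: store the Class_UID in each checkbox's Tag property and loop over the controls in the form's On Load event (this keeps the assumption above that the "Report" is a form, and the "Table1" name, the Tag convention, and the missing per-student filter are all assumptions):

Private Sub Form_Load()
    Dim ctl As Control

    For Each ctl In Me.Controls
        If ctl.ControlType = acCheckBox And Len(ctl.Tag & "") > 0 Then
            ' ctl.Tag holds the Class_UID (e.g. "INF-202"); Table1 holds completed classes.
            ' Add the current student's ID to the criteria in a real report.
            ctl.Value = Not IsNull(DLookup("Class_UID", "Table1", _
                "Class_UID = '" & ctl.Tag & "'"))
        End If
    Next ctl
End Sub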
I am trying to set up an email notification. The condition is that it should go out by the end of the day with the list of content published that day.
I have tried a couple of things using Rules, but got stuck along the way.
Any help?
I tried using rules, and I created a rule like so:
Events:
After updating existing content of type(content type name)
Cron maintenance tasks are performed
Condition: Data to compare: [node:field-img-status], Data value: Approve
When I try to add a second condition to check whether the node was published within the last 24 hours, I am unable to achieve it. When I add strtotime("-1 day"), I get an error like:
Wrong date format. Specify the date in the format 2017-05-10 08:17:18.
I tried date('Y-m-d h:i:s', strtotime("-1 day")) but I did not succeed.
Now I am trying one more method to achieve it, using the Views Rules module as suggested in this answer to the question 'How to create a Drupal rule to check (on cron) a date field and if passed set field "status" to "ended"?'.
Below is a blueprint of how I'd get this to work ...
Step 1: Create a single eMail for each node that was published
Create a view (using Views) of all the nodes that were published in the last 24 hours. Make sure to include a column in that view for each piece of data about the node that you want included in your eMail later on.
Use Rules to create a rule with a Rules Action that consists of a "Rules Loop", in which the "list items" are the list of nodes that you want included in your eMail later on. To create this Rules Loop, use the Views Rules module combined with a Views display type of "Views Rules" for the view that you created. Refer to my answer to "How to pass arguments to a view from Rules?" for way more details on how to use the Views Rules module.
For each list item in the Rules Loop of the previous step, you have access to all data for each column in the View you created. By using these data you could add an additional Rules Action (within the same Rules Loop) to send an appropriate eMail about the node being processed.
Step 2: Group all eMails in a single eMail
Obviously, the previous step creates a single eMail for each node that was published in the last 24 hours. If you only have a few nodes that may not be a real issue to worry about. But if you have dozens (or more?) of such nodes then you might want to consider consolidating all such eMails in a single eMail, which contains (in its eMail body) the complete list of nodes.
A possible solution to implement such consolidation, is similar to what is shown in the Rules example included in my answer to "How to concatenate all token values of a list in a single field within a Rules loop?". In your case, you could make it work like so:
Add some new Rules variable that will be used later on as part of the eMail body, before the start of your loop. Say you name the variable nodes_list_var_for_email_body.
Within your loop, for each iteration, prepend or append the value for each "list item" to that variable nodes_list_var_for_email_body.
Move the Rules Action to send an eMail outside your loop, after the loop has completed. Then fine-tune the details (configuration) of your (new) "send an eMail" Rules Action. When doing so, you'll be able to select the token for nodes_list_var_for_email_body to include anywhere in your eMail body.
Step 3: Schedule the daily execution of your rule
Use the Rules Once per Day module to schedule the daily execution of your rule. Refer to my answer to "How to limit the execution of a rule for sending an email to only run once in a day?" for way more details about this module.
VoilĂ , that's it ...
This is how I would achieve this:
Make a view which lists all nodes created today.
Make an end-point in a custom module (check out hook_menu: https://api.drupal.org/api/drupal/modules%21system%21system.api.php/function/hook_menu/7.x).
It would call this view and grab that node list (e.g. with views_get_view_result: https://api.drupal.org/api/views/views.module/function/views_get_view_result/7.x-3.x), loop through the list, compose the email, and send it.
Then I would set a cron job to call that end-point at the end of every day.
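A minimal sketch of that approach for Drupal 7 (the module name, view name/display, recipient address, and mail key are placeholders; it assumes the view's base table is node so each row exposes nid, and that a hook_mail() implementation builds the subject/body from $params):

/**
 * Implements hook_menu().
 */
function mymodule_menu() {
  $items['mymodule/daily-digest'] = array(
    'page callback' => 'mymodule_send_daily_digest',
    'access callback' => TRUE, // consider a key or permission check instead
    'type' => MENU_CALLBACK,
  );
  return $items;
}

/**
 * Page callback: mails the list of nodes created today.
 */
function mymodule_send_daily_digest() {
  // 'nodes_created_today' / 'default' are the assumed view and display names.
  $rows = views_get_view_result('nodes_created_today', 'default');

  $lines = array();
  foreach ($rows as $row) {
    $node = node_load($row->nid);
    $lines[] = $node->title . ' - ' . url('node/' . $node->nid, array('absolute' => TRUE));
  }

  if ($lines) {
    $params = array('body' => implode("\n", $lines));
    // Assumes mymodule_mail() composes the message from $params.
    drupal_mail('mymodule', 'daily_digest', 'editor@example.com', language_default(), $params);
  }

  return 'OK';
}

The cron side can then simply be a crontab entry that hits that path with wget or curl at the end of every day.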
I just inherited some CakePHP code and I am not very familiar with it (or any other PHP/server-side language). I need to set the id of the item I am adding to the database to the value of the last item plus one. Originally I did a call like this:
$id = $this->Project->find('count') + 1;
but this seems to add about 8 seconds to my page load (which seems weird because the database only has about 400 items), but that is another problem. For now I need a faster way to find the id of the last item in the database. Is there a way, using find, to quickly retrieve the last item in a given table?
That's a very bad approach to setting the id.
You do know that, for example, MySQL supports auto-increment for INT fields and therefore will set the id automatically for you?
The suggested functions getLastInsertId and getInsertId will only work after an insert, and not always.
I also can't understand why your call adds 8 seconds to your page load. If I do such a call on my table (which also has around 400 records), the call itself only needs a few milliseconds. There is no delay the user would notice.
I think there might be a problem with your database setup, as this seems very unlikely.
Also, please check whether your database supports auto-increment (I can't imagine it doesn't), as this would be the easiest way of adding the functionality you want.
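That said, if you really do need the current highest id (rather than letting auto-increment assign it), a quick sketch for a CakePHP 1.x/2.x model would be something like the following; note that computing ids this way is race-prone if two inserts happen at the same time, which is exactly why auto-increment is preferable:

// Fetch only the highest id instead of counting rows (assumes the Project model).
$last = $this->Project->find('first', array(
    'fields'    => array('Project.id'),
    'order'     => 'Project.id DESC',
    'recursive' => -1, // skip associated models for speed
));
$id = $last['Project']['id'] + 1;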
I would try
$id = $this->Project->getLastInsertID();
$id++;
The method can be found in cake/libs/model/model.php in line 2768
As well as on this SO page
Cheers!
If you are looking for the CakePHP 3 solution to this, you simply use last().
i.e.:
use Cake\ORM\TableRegistry;
....
$myrecordstable = TableRegistry::get('Myrecords');
$myrecords = $myrecordstable->find()->last();
$lastId = $myrecords->id;
....