I am working on a project in Hibernate where the user enters a quantity, and based on this number I need to display the corresponding unique fields on the next form (for example, if the user enters a quantity of 5, then 5 MAC Address and Device Serial fields are shown on the JSP page; the other information common to all the devices is entered on the same form as the quantity). All of this information should then be saved to the database. Thanks!
Please take a look at this page: webpage
You should open a transaction, save all the entities in a loop, and at the end commit the transaction and close the session. You should not commit for every single record, only once at the end.
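For example, a minimal sketch of that approach (the Device entity, the devices list and the sessionFactory here are assumptions based on the question, not code from it):

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
try {
    // one Device per MAC address / serial number pair entered on the second form
    for (Device device : devices) {
        session.save(device);
    }
    tx.commit();      // commit once, after the whole loop
} catch (RuntimeException e) {
    tx.rollback();    // nothing is persisted if any save fails
    throw e;
} finally {
    session.close();
}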
Referring to Goran's answer: alternatively, if you can use HibernateTemplate instead of a Session, then you can save your list of objects by using saveOrUpdateAll(Collection entities).
Although HibernateTemplate's saveOrUpdateAll does save a collection in one call, internally it uses the same loop logic of saving each object and flushing.
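If you do take that route, a minimal sketch (assuming Spring's Hibernate 3 support with an already configured hibernateTemplate; buildDevicesFromForm() is a hypothetical helper, and note that saveOrUpdateAll is deprecated in more recent Spring versions):

List<Device> devices = buildDevicesFromForm();   // hypothetical helper mapping the JSP fields to entities
hibernateTemplate.saveOrUpdateAll(devices);      // saves or updates the whole collection in one call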
You should go for batch inserts in Hibernate.
Batch insertion is a powerful Hibernate feature, particularly useful when you are importing data from other systems in bulk. If you do not use Hibernate's batching, your application's performance may degrade dramatically when many records are inserted.
For more information, see the Hibernate tutorials.
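For illustration, a rough sketch of how Hibernate's JDBC batching is usually wired up (hibernate.jdbc.batch_size is the standard property name; the batch size of 50 and the devices list are only examples):

In hibernate.cfg.xml / properties:
    hibernate.jdbc.batch_size = 50

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < devices.size(); i++) {
    session.save(devices.get(i));
    if (i % 50 == 0) {        // same value as the batch size
        session.flush();      // push the current batch of inserts to the database
        session.clear();      // clear the session cache so it does not grow unbounded
    }
}
tx.commit();
session.close();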
We want to know which rows in a certain table are used frequently, and which are never used. We could add an extra column for this, but then we'd get an UPDATE for every SELECT, which sounds expensive. (The table contains 80k+ rows, some of which are used very often.)
Is there a better, and perhaps faster, way to do this? We're using an old version of Microsoft SQL Server.
This kind of logging/tracking is a classic application-server task. If you want to build your own tracking architecture, do it in your own layer.
In any case you will need an application server for this. You are not going to update the tracking field in the same transaction as the select, are you? What about rollbacks? So you need some manager that first runs the select and then writes the tracking information. And what is the point of saving the tracking information together with the entity data, sending it back to the DB? Save it to a file on the application server instead.
You could update the column in the table as you suggested, but if it were me I'd log the event to another table: the id of the record, a datetime, a user id (maybe IP address, browser version, etc.), and just about anything else I could capture that was even possibly relevant. (For example, six months from now your manager decides s/he wants to know not only which records were used the most, but also which users are using the most records, or what time of day that usage happens, etc.)
This type of information can be useful for things you've never even thought of down the road, and if the table starts to grow large you can always roll it up and prune it to a smaller one if performance becomes an issue. When possible, I log everything I can. You may never use some of this information, but you'll never wish you didn't have it available down the road, and it will be impossible to re-create historically.
In terms of making sure the application doesn't slow down, you may want to select the data from within a stored procedure that also issues the logging command, so that the client is not making two round trips (one for the select, one for the update/insert).
Alternatively, if this is a web application, you could use an async AJAX call to issue the logging action, which wouldn't slow down the user's experience at all.
Adding a new column to track SELECTs is not good practice, because it may affect database performance, and database performance is one of the major concerns in database server administration.
Instead you can use a very good database feature called auditing; it is easy to set up and puts less stress on the database.
Find more info here or here,
or search for "database auditing for SELECT statements".
Use another table as a key/value pair with two columns (e.g. id_selected, times) for storing the ids of the records you select from your main table, and increment the times value by 1 every time those records are selected.
To do this you'd have to do a mass insert/update of the selected ids from your select query into the counting table. As a quick example:
SELECT id, stuff1, stuff2 FROM myTable WHERE stuff1 = 'somevalue';

INSERT INTO countTable (id_selected, times)
SELECT id, 1 FROM myTable mt WHERE mt.stuff1 = 'somevalue'  # or just build a list of ids as VALUES from your last result
ON DUPLICATE KEY UPDATE times = times + 1;
The ON DUPLICATE KEY syntax above is off the top of my head and is MySQL-specific. For conditionally inserting or updating in MSSQL you would need to use MERGE instead.
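For reference, a rough sketch of the MERGE equivalent (MERGE is available from SQL Server 2008 onwards; the table and column names are the ones from the example above, and it is shown wrapped in plain JDBC since the thread has no client code of its own):

// connection is an existing java.sql.Connection to the SQL Server database
String sql =
    "MERGE countTable AS target " +
    "USING (SELECT id FROM myTable WHERE stuff1 = 'somevalue') AS source " +
    "ON target.id_selected = source.id " +
    "WHEN MATCHED THEN UPDATE SET times = target.times + 1 " +
    "WHEN NOT MATCHED THEN INSERT (id_selected, times) VALUES (source.id, 1);";
try (Statement st = connection.createStatement()) {
    st.executeUpdate(sql);   // upserts one counter row per selected id
}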
I have a table named "DailyResult" that contains calculated information for all users in my application. This info is generated by running a calc method for each user; the method uses data from several related tables.
After certain actions in my app I need to recalculate the info in the "DailyResult" table for one or all users.
In this situation I create threads to do the calculation, but my problem is that when the calc method is called for multiple users, the threads end up waiting for the shared resource (the "DailyResult" table) to be released, and the method takes time.
We use MVC 3, SQL Server 2012 Enterprise 64-bit, and the Dapper ORM for inserting, deleting and updating the info.
We used NHibernate before but had to replace it with Dapper; Dapper is better, but with more than 2000 parallel users it still takes a long time to insert/delete/update the info in the DailyResult table.
What is the best solution to handle this issue and get maximum performance? What do you suggest?
I presume you want to read from the table during the recalculate phase? If so, and if you can edit the query your app uses to read data from the table, add "set transaction isolation level read uncommitted;" at the beginning.
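The question mentions Dapper (C#), but the idea is simply to relax the isolation level for those reads. Purely as an illustration, the JDBC equivalent looks roughly like this (the UserId column is made up; note that dirty reads become possible):

// connection is an existing java.sql.Connection
connection.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED);
try (Statement st = connection.createStatement();
     ResultSet rs = st.executeQuery("SELECT * FROM DailyResult WHERE UserId = 42")) {
    while (rs.next()) {
        // ... read the calculated values without blocking on the writers
    }
}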
I have one application module, one connection to the DB, and two DataControls based on a single ViewObject. They are placed on the same form. Is it possible that ADF creates two sessions when I insert data into the first DataControl and then try to re-execute the query in the second?
Yours is not really a problem; it is the way it should work. Two users cannot update/change the same row at the same time. The first one to commit the change succeeds, whereas the second gets an error popup telling them that the current row has been updated by someone else. If the users are not working on (changing) the same row but on different rows of the same ViewObject, then you should consider this link:
http://radio-weblogs.com/0118231/stories/2004/03/24/whyDoIGetOraclejborowinconsistentexception.html
I also suggest you take a look at this book; you can find it free to download, just search a bit.
http://www.amazon.com/Quick-Start-Oracle-Fusion-Development/dp/0071744282
Have a nice day, tung.
I'm writing an Apex program that reads through a database and processes records. Each time I process a record, I want to output a message. Currently I'm using System.debug to do this, but the debug log is so cluttered that this doesn't seem like the right approach.
What other ways can I generate screen or log-file output in Salesforce?
Keep using System.debug(), but when you want to view only your output messages, just filter by DEBUG. Otherwise, the only other option is to create a view, and that is more clutter than it's worth.
Please open the log in raw format under Setup >> Administration Setup >> Monitoring >> Debug Logs. Under Monitored Users, go to Filters and enable all the filter levels. Then use Apex code such as
System.debug('StackOverflow >>1234 ' + e.getMessage());
and search the detailed debug logs for the unique marker StackOverflow >>1234. It may also happen that your System.debug was not executed in that particular debug log, so do not forget to check all the recent debug logs. :)
You could think about creating your own Logging__c object and creating a record of it for each record processed. You have to be creative to work around the governor limits, though.
If it's not essential that you output the message in between processing each record, then you could build up a collection of Logging__c records as processing continues and then either insert them periodically or when there's an exception in your process.
Note that if you insert them periodically, you still have to make sure the job is not so large that you'll hit the DML limit of 150 statements together with the processing work you're doing. Also, if you collect the records to insert them all at the end of processing, bear in mind the heap size limit is 6 MB.
Alternatively, have a look at Batch Apex http://www.salesforce.com/us/developer/docs/apexcode/Content/apex_batch_interface.htm
This allows you to create a class that processes a job in asynchronous chunks, and you can set the number of records processed in one go. So you could set this small (~20) and then insert a Logging__c record as each job record is processed, staying within the Batch Apex DML limit of 200. You can then monitor the Logging__c records in real time to view progress.
I'm working on a project in classic ASP (I know :( ).
Anyway, it works with a kdb+ database, which is major overkill, but that's not my call. Therefore, to do inserts etc. we're having to write special functions so they can be handled.
Anyway, we've hit a theoretical problem and I'm a bit unsure how it should be dealt with in this case.
So basically you register a company; when you submit, validation occurs and the page is processed, inserting new values into the appropriate tables. At this stage I want to pull IDs from the tables and use them in the session for the further registration screens. The user will of course never enter a specific ID, so it needs to be pulled from the database.
But how can this be done? I'm particularly concerned about two users registering simultaneously: how can I ensure the correct ID is passed back to the correct session?
Thank you for any help you can provide.
Instead of having the ID set at the point of insert, is it possible for you to "grab" an ID value beforehand and then use that value throughout the process, as sketched after the steps below?
So:
Start the registration.
The system connects to the database, creates an ID (perhaps from an ID table) and stores it in the ASP Session.
The company registers.
You validate and insert the data into the DB (including the ID from the session).
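The original stack here is classic ASP against kdb+, so purely to illustrate the order of operations (reserve the ID first inside its own transaction, keep it in the session, reuse it on the later screens), here is a rough sketch of the pattern in plain JDBC against a hypothetical one-row id_counter table; none of these names come from the question:

// Atomically claims the next company ID; two simultaneous registrations cannot get the same value,
// because the UPDATE locks the counter row until the transaction commits.
public static long reserveCompanyId(java.sql.Connection conn) throws java.sql.SQLException {
    conn.setAutoCommit(false);
    try (java.sql.Statement st = conn.createStatement()) {
        st.executeUpdate("UPDATE id_counter SET next_company_id = next_company_id + 1");
        try (java.sql.ResultSet rs = st.executeQuery("SELECT next_company_id FROM id_counter")) {
            rs.next();
            long id = rs.getLong(1);
            conn.commit();
            return id;   // store this in the user's session and use it on the later screens
        }
    } catch (java.sql.SQLException e) {
        conn.rollback();
        throw e;
    }
}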
The things you put in the Session(...) collection are only visible to that session (i.e. the session is used only by the browser windows on one computer). The session is identified by a GUID value stored in a cookie on the client machine. It is "safe" to store your IDs there (other users won't be able to read them easily).
Alternatively, your ID could include the date and time, so it would be, for example, id31032012200312. But if you still think two people could register at the same time, then I would use recordset locks like the ones here: http://www.w3schools.com/ado/prop_rs_locktype.asp
To create IDs like the one above in ASP you do replace(date(),"/","") and then the same with time() and ":".
Thanks