I am using ADF with JPA as the model layer.
I have a page with a button that adds a row to the database.
My action listener calls this method:
public void addAns() {
    OperationBinding op1 =
        BindingContext.getCurrent().getCurrentBindingsEntry().getOperationBinding("QuestionsFindByQid");
    questions = (Questions) op1.getResult();
    BigDecimal aid = maxAnsID();

    Answer = new Answers();
    Answer.setAnsvalue(AnsValue);
    Answer.setUsers(user);
    Answer.setQuestions(questions);
    Answer.setGroups(questions.getGroups2());
    Answer.setAnsdate(date);
    Answer.setAid(aid);

    // op = BindingContext.getCurrent().getCurrentBindingsEntry().getOperationBinding("Create");
    // op.execute();
    OperationBinding op =
        BindingContext.getCurrent().getCurrentBindingsEntry().getOperationBinding("persistAnswers");
    op.getParamsMap().put("answers", Answer);
    op.execute();
    System.out.println(op.getResult());

    op1 = BindingContext.getCurrent().getCurrentBindingsEntry().getOperationBinding("QuestionsFindByQid");
    op1.execute();

    FacesMessage message = new FacesMessage("answer added succesfully ");
    FacesContext.getCurrentInstance().addMessage(user.getUsername(), message);
}
I am calling a persist method to do the insert.
I have a table on the page that needs to show the updated data after the insert. I have tried re-executing the binding method, but QuestionsFindByQid is not returning the updated data.
I have checked my entity after the persist, and the entity does contain the updated data.
Can someone help me, please?
Krishna, what build of JDev/ADF are you on?
The simplest answer may be that whatever class is implementing persistAnswers() is not implicitly committing the operation. A persist() call itself updates the JPA persistence context but not the DB. To apply the changes, it needs to be committed.
JDev provides wizards to create either an EJB session bean or a simple POJO that will act as a facade to handle the transaction. This evolved in the last few releases, and the latest (12.1.3) provides turn-key solutions for CRUD operations using both EJB and POJO facades over JPA entities.
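For reference, here is a minimal sketch of what a committing facade can look like, assuming an EJB session bean with a container-managed transaction. The class name, persistence-unit name and method body are illustrative, not taken from the original application; only the method name persistAnswers mirrors the question.

import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Hypothetical facade bean for the Answers entity from the question.
@Stateless
public class AnswersFacade {

    // "Model" is an assumed persistence-unit name.
    @PersistenceContext(unitName = "Model")
    private EntityManager em;

    // A @Stateless bean defaults to the REQUIRED transaction attribute, so the
    // persistence context is flushed and the transaction committed when this
    // method returns; no explicit commit call is needed.
    public Answers persistAnswers(Answers answers) {
        em.persist(answers);
        return answers;
    }
}

With a facade along these lines exposed through the data control, the persistAnswers operation binding commits the row, and re-executing QuestionsFindByQid afterwards should pick it up.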
Related
Let’s assume that the primary components in your application are an Angular client, which calls an ASP.NET Web API, which uses Entity Framework to perform CRUD operations on your database. So, for example, in your API controllers, the Post (Add) method adds a new entity to the database context and then commits it to the database by calling the Entity Framework SaveChanges method.
This works fine when only one record needs to be added to the database at a time.
But, what if, for example, you want to add several records of different entity types to your database in one transaction? Where do you implement the Database.BeginTransaction and Database.CommitTransaction/RollbackTransaction? If you add a service layer to accomplish this, then what does the Angular client call?
PLEASE SEE BELOW FOR FURTHER DETAIL AND QUESTIONS.
I want to provide more detail about my current approach to solving this problem and ask the following questions:
(1) Is this a good approach, or is there a better way?
(2) My approach does not port to .NET Core, since .NET Core does not support OData yet (see https://github.com/OData/WebApi/issues/229). Any thoughts or ideas about this?
I have stated the problems that I faced and the solutions that I chose below. I will use a simple scenario where a customer is placing an order for several items – so, there is one Order record with several OrderDetail records. The Order record and associated OrderDetail records must be committed to the database in a single transaction.
Problem #1: What is the best way to send the Order and OrderDetail records from the Angular client to the ASP.NET Web API?
Solution #1: I decided to use OData batching, so that I could send all the records in one POST. I am using the datajs library to perform the batching (https://www.nuget.org/packages/datajs).
Problem #2: How do I wrap a single transaction around the Order and OrderDetail records?
Solution #2: I set up an OData batch endpoint in my Web API, which involved the following:
(1) In the Web API's OData configuration, map the batch request route.
// Configure the batch request route.
config.Routes.MapODataServiceRoute(
    routeName: "batch",
    routePrefix: "batch",
    model: builder.GetEdmModel(),
    pathHandler: new DefaultODataPathHandler(),
    routingConventions: conventions,
    batchHandler: new TransactionalBatchHandler(GlobalConfiguration.DefaultServer));
(2) In the Web API, implement a custom batch handler, which wraps a database transaction around the given OData batch. The batch handler starts the transaction, calls the appropriate ODataController to perform the CRUD operation, and then commits/rolls back the transaction, depending on the results.
/// <summary>
/// Custom batch handler specialized to execute batch changesets in OData $batch requests with transactions.
/// The requests will be executed in the order they arrive, which means that the client is responsible for
/// correctly ordering the operations to satisfy referential constraints.
/// </summary>
public class TransactionalBatchHandler : DefaultODataBatchHandler
{
    public TransactionalBatchHandler(HttpServer httpServer)
        : base(httpServer)
    {
    }

    /// <summary>
    /// Executes the batch request and wraps the execution of the whole changeset within a transaction.
    /// </summary>
    /// <param name="requests">The <see cref="ODataBatchRequestItem"/> instances of this batch request.</param>
    /// <param name="cancellation">The <see cref="CancellationToken"/> associated with the request.</param>
    /// <returns>The list of responses associated with the batch request.</returns>
    public async override Task<IList<ODataBatchResponseItem>> ExecuteRequestMessagesAsync(
        IEnumerable<ODataBatchRequestItem> requests,
        CancellationToken cancellation)
    {
        if (requests == null)
        {
            throw new ArgumentNullException("requests");
        }

        IList<ODataBatchResponseItem> responses = new List<ODataBatchResponseItem>();

        try
        {
            foreach (ODataBatchRequestItem request in requests)
            {
                OperationRequestItem operation = request as OperationRequestItem;

                if (operation != null)
                {
                    responses.Add(await request.SendRequestAsync(Invoker, cancellation));
                }
                else
                {
                    await ExecuteChangeSet((ChangeSetRequestItem)request, responses, cancellation);
                }
            }
        }
        catch
        {
            foreach (ODataBatchResponseItem response in responses)
            {
                if (response != null)
                {
                    response.Dispose();
                }
            }

            throw;
        }

        return responses;
    }

    private async Task ExecuteChangeSet(
        ChangeSetRequestItem changeSet,
        IList<ODataBatchResponseItem> responses,
        CancellationToken cancellation)
    {
        ChangeSetResponseItem changeSetResponse;

        // Since IUnitOfWorkAsync is a singleton (Unity PerRequestLifetimeManager) used by all our ODataControllers,
        // we simply need to get a reference to it and use it for managing transactions. The ODataControllers
        // will perform IUnitOfWorkAsync.SaveChanges(), but the changes won't get committed to the DB until
        // IUnitOfWorkAsync.Commit() is performed (in the code directly below).
        var unitOfWorkAsync = GlobalConfiguration.Configuration.DependencyResolver.GetService(typeof(IUnitOfWorkAsync)) as IUnitOfWorkAsync;
        unitOfWorkAsync.BeginTransaction();

        // This sends each request in the changeSet to the appropriate ODataController.
        changeSetResponse = (ChangeSetResponseItem)await changeSet.SendRequestAsync(Invoker, cancellation);
        responses.Add(changeSetResponse);

        if (changeSetResponse.Responses.All(r => r.IsSuccessStatusCode))
        {
            unitOfWorkAsync.Commit();
        }
        else
        {
            unitOfWorkAsync.Rollback();
        }
    }
}
You do not need to implement Database.BeginTransaction and Database.CommitTransaction/RollbackTransaction if you are using Entity Framework. Entity Framework already implements the Unit of Work pattern. The only thing you need to take care of is to work with a separate DbContext instance per web request (exactly one instance per request) and to call SaveChanges only once, after you have made all the changes you need.
If any exception occurs during SaveChanges, all the changes are rolled back.
The Angular client should not care about this; it only sends the data and checks whether everything went fine.
This is very easy to do if you use an IoC framework such as Unity and have your DbContext injected into your controller or service.
In this case you should use the following settings (if you use Unity):
container.RegisterType<DbContext, YourDbContext>(new PerRequestLifetimeManager(), ...);
Then you can do this if you want to use it in a Controller:
public class YourController : Controller
{
    private readonly DbContext _db;

    public YourController(DbContext context)
    {
        _db = context;
    }

    ...
No need to over-complicate things. Add the code to the WebApi project. Pass around your Transaction object and re-use it. See https://msdn.microsoft.com/en-us/library/dn456843(v=vs.113).aspx for an example.
Sorry for the general question, but is there an approach for still using JPA lazy loading of entities when developing a RESTful AngularJS application?
In the old JSF days, it would all just work when a backing bean accessed the list.
I am using EclipseLink and Spring Data, with Jersey for the RESTful endpoints.
Generally you'd have to trigger the lazy loading of the entities prior to the EntityManager being closed during the lifecycle of the request.
To do so, you can use the "Open EntityManager in View" pattern. Spring provides a Filter you can apply: OpenEntityManagerInViewFilter (read the docs here: http://docs.spring.io/spring/docs/4.1.0.RELEASE/javadoc-api/org/springframework/orm/jpa/support/OpenEntityManagerInViewFilter.html).
Alternatively, you can manually call getMyLazyCollection() on your JPA entity(ies) prior to serializing them to JSON.
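As a rough illustration of the first option, here is a minimal sketch of registering the filter programmatically. It assumes a Servlet 3.0+ container with Spring on the classpath; the class name and URL pattern are made up for the example. Because it is a plain servlet filter, it also sits in front of Jersey endpoints.

import javax.servlet.FilterRegistration;
import javax.servlet.ServletContext;
import javax.servlet.ServletException;

import org.springframework.orm.jpa.support.OpenEntityManagerInViewFilter;
import org.springframework.web.WebApplicationInitializer;

public class OpenEntityManagerInViewConfig implements WebApplicationInitializer {

    @Override
    public void onStartup(ServletContext servletContext) throws ServletException {
        // Keeps the EntityManager open until the response has been written, so lazy
        // associations can still be initialized while serializing entities to JSON.
        // By default the filter looks up a bean named "entityManagerFactory" in the
        // root application context (configurable via the "entityManagerFactoryBeanName"
        // init parameter).
        FilterRegistration.Dynamic filter = servletContext.addFilter(
                "openEntityManagerInViewFilter", new OpenEntityManagerInViewFilter());
        filter.addMappingForUrlPatterns(null, false, "/*");
    }
}

The usual caveat applies: this pattern keeps the persistence context (and potentially a database connection) open for the whole request.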
I think the best course of action depends on the following:
Are you able to retrieve the fully resolved entity, i.e. all of its components, without adversely affecting performance?
If the answer is yes, then resolve the full entity using the JPA fetch option FetchType.EAGER.
If the answer is no, I would go for the following approach:
1) Expose every lazy JPA component/association explicitly as a sub-resource.
e.g.
@Entity
public class Employee {
    @Id
    private long id;
    ...

    @OneToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "ADDR_ID")
    private Address homeAddress;
    ...
}

@Entity
public class Address {
}
2) Expose the service as a controller (although you can have them separated, I don't recommend it).
@RequestMapping(value = "/api/employee")
@Controller
public class EmployeeSvc {

    @RequestMapping(value = "{empID}")
    public Employee getEmployee(@PathVariable long empID) {
        .....
    }

    @RequestMapping(value = "{empID}/homeaddress")
    public Address getHomeAddress(@PathVariable long empID) {
        // will serve http://localhost:8080/api/employee/11111/homeaddress
        return this.getEmployee(empID).getHomeAddress();
        // make sure you are using the 2nd-level cache, as you retrieve the object twice
    }
}
I am using Google App Engine with Google's Datastore interface as the database.
My question is this: I have the following code. I have a network object that I want to either update if it already exists in the db, or create if this is the first time. For this I have to catch an exception and repeat the same code twice - it seems ugly and redundant, and it makes me think I'm doing something wrong.
The second thing that strikes me as odd is that there is no method I can think of that copies an object to an entity or vice versa. Am I expected to implement this myself? It is very uncomfortable to use setProperty or getProperty for each property, and I am just wondering why there is no objectToEntity method or something of the sort.
This is how my code currently looks:
try {
    Entity network = datastore.get(KeyFactory.stringToKey(networks.get(i)._ipDigits));
    // If I get here, no exception was thrown - the entity already exists in the db.
    Network contextNet = // fetch the network object from servlet context ...
    network.setProperty("ip", contextNet._ip); // update the fields using setProperty - no better way??
    network.setProperty("offlineUsers", contextNet._offlineUsers);
    datastore.put(network);
}
// Entity doesn't exist: create a new entity and save it (while repeating the same code)...
catch (EntityNotFoundException e) {
    Entity network = new Entity("network", Long.parseLong(networks.get(i)._ipDigits));
    Network contextNet = // ...fetch the network object from servlet context
    network.setProperty("ip", contextNet._ip);
    network.setProperty("offlineUsers", contextNet._offlineUsers);
    datastore.put(network);
}
You don't have to get and then put the entity in order to update it. If you know the entity's key you can just put it: if it exists it will be updated, and if not it will be created.
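As a minimal sketch of that idea using the low-level Datastore API from your code (the Network type and its fields are assumed from the question, and the helper class is made up):

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;

public class NetworkDao {

    private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

    // put() is an upsert: it creates the entity when it is absent and overwrites it
    // when it already exists, so no get()/EntityNotFoundException round trip is needed.
    public void saveOrUpdate(Network contextNet, long ipDigits) {
        // Same kind and numeric ID on every call, so put() always targets the same entity.
        Entity network = new Entity("network", ipDigits);
        network.setProperty("ip", contextNet._ip);
        network.setProperty("offlineUsers", contextNet._offlineUsers);
        datastore.put(network);
    }
}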
Use Objectify to automatically map your classes to entities.
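For example, a minimal Objectify mapping could look like the sketch below; the field names are assumed from the question, and the registration/save calls are shown as comments:

import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;

@Entity
public class Network {
    @Id Long ipDigits;   // becomes the datastore ID
    String ip;
    int offlineUsers;
}

// Register once at application startup:
//     ObjectifyService.register(Network.class);
//
// Saving (create or update) is then a single call, no manual setProperty needed:
//     ObjectifyService.ofy().save().entity(network).now();

This also addresses the second part of the question: Objectify copies the annotated fields to and from the datastore entity for you.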
I use Entity Framework 4 and Self Tracking Entities. The schema is like:
Patient -> Examinations -> LeftPictures
-> RightPictures
So there is a TrackableCollection for each of these two relationships (Patient 1 - * ...Pictures).
Now, when loading the customers form and browsing the details, I don't need to load these images; they are only needed when another form is loaded for the Examination details.
I am using a class library as a data repository to get data from the database (SQL Server), with this code:
public List<Patient> GetAllPatients()
{
    try
    {
        using (OptoEntities db = new OptoEntities())
        {
            List<Patient> list = db.Patients
                .Include("Addresses")
                .Include("PhoneNumbers")
                .Include("Examinations").ToList();

            list.ForEach(p =>
            {
                p.ChangeTracker.ChangeTrackingEnabled = true;

                if (!p.Addresses.IsNull() &&
                    p.Addresses.Count > 0)
                    p.Addresses.ForEach(a => a.ChangeTracker.ChangeTrackingEnabled = true);

                if (!p.PhoneNumbers.IsNull() &&
                    p.PhoneNumbers.Count > 0)
                    p.PhoneNumbers.ForEach(a => a.ChangeTracker.ChangeTrackingEnabled = true);

                if (!p.Examinations.IsNull() &&
                    p.Examinations.Count > 0)
                    p.Examinations.ForEach(e =>
                    {
                        e.ChangeTracker.ChangeTrackingEnabled = true;
                    });
            });

            return list;
        }
    }
    catch (Exception ex)
    {
        return new List<Patient>();
    }
}
Now, when the Examination details form is called, I need to get all the images for the Examination relationship (LeftEyePictures, RightEyePictures). I guess that is called lazy loading, and I don't understand how to make it happen while I'm closing the Entities connection immediately, which I would like to keep doing.
I use BindingSource components throughout the application.
What is the best method to get the desired results?
Thank you.
Self-tracking entities don't support lazy loading. Moreover, lazy loading works only when entities are attached to a live context. You don't need to close / dispose the context immediately. In a WinForms application the context usually lives for a longer time (you can follow a one-context-per-form or one-context-per-presenter approach).
A WinForms application is a scenario for normal attached entities, where all these features like lazy loading and change tracking work out of the box. STEs are meant to be used in distributed systems where you need to serialize an entity and pass it to another application (via a web service call).
I'm writing a WPF NHibernate Desktop App using Session Per Presenter. I have a list view showing all the saved SalesOrders and an Edit Sales Order form when you double click on a Sales Order.
Each of these forms has a session object which lasts for the lifetime of the form. When a SalesOrder is saved, it publishes an event which tells the list view to re-load. The edit form is definitely saving to the database and the list view is definitely selecting from the database. However, the session that belongs to the ListViewPresenter is not updating its entities with those retrieved from the database. It just returns the same values as when the listSession was first loaded, before anything was saved.
Below is some code which best replicates the scenario:
[Test]
public void SessionPerPresenter()
{
    // This session is the one that is used to load all sales orders from the database. Its lifetime is the
    // lifetime of the form, but as you double click on an entry in the list to edit it, it will stay alive
    // longer than the session in the edit form.
    ISession listSession = NHibernateHelper.OpenSession();

    SalesOrder order = new SalesOrder("P123435", "ACME");
    order.AddLine(new SalesOrderLine("Beans", 15));
    order.AddLine(new SalesOrderLine("Coke", 24));
    order.AddLine(new SalesOrderLine("Pepsi", 3));
    order.AddLine(new SalesOrderLine("Apples", 4));

    int ID; // assuming an integer identifier; adjust to your mapped type

    // This session is the equivalent of the one in the edit form; as soon as the entity is saved
    // the session is disposed.
    using (ISession session = NHibernateHelper.OpenSession())
    {
        session.SaveOrUpdate(order);
        ID = order.SalesOrderID;
    }

    // Retrieve all sales orders from the database and store them in a list.
    IList<SalesOrder> salesOrders = listSession.CreateCriteria<SalesOrder>().List<SalesOrder>();
    foreach (SalesOrder so in salesOrders)
    {
        Console.WriteLine(so.ToString());
    }

    // Edit the selected order, update its order code value and re-save.
    using (ISession session = NHibernateHelper.OpenSession())
    {
        SalesOrder hydratedSalesOrder = session.Get<SalesOrder>(ID);
        hydratedSalesOrder.OrderCode = "1234-5678";
        session.SaveOrUpdate(hydratedSalesOrder);
        session.Flush();
    }

    // Re-retrieve the list of orders from the database. Using SQL Server Profiler / NHibernate Profiler
    // you can see the query being sent to the database, so I don't believe it is in the cache. Indeed, if you run
    // the query directly against the database the value 1234-5678 is returned. Can't work out why
    // the listSession does not have the values read from the database in it but has the values from the
    // original list retrieval.
    salesOrders = listSession.CreateCriteria<SalesOrder>().List<SalesOrder>();
    foreach (SalesOrder so in salesOrders)
    {
        Console.WriteLine(so.ToString());
    }

    listSession.Close();
}
Can someone help me with what is going on here? What am I doing wrong? Am I missing something vital? If it didn't query the database I would think it was something to do with the first level cache but that seems unlikely.
One way to ensure that your entities are not cached is to clear the session with ISession.Clear(). You can also evict individual entities by calling ISession.Evict(object entity).
If you are not sure what is happening in your application, consider a profiling tool such as NHProf.
Quick note: using a session for the lifetime of a dialog can be handy in small applications with no concurrency problems, but you will get into trouble in the long run. A session should be opened late and closed early.