Best Approach for JPA lazy loading over REST/AngularJS

Sorry for the general question, but is there an approach for still using JPA lazy loading of entities when developing a RESTful AngularJS application?
In the old JSF days, it would all just work when a backing bean accessed the list.
I am using EclipseLink and Spring Data, with Jersey for the RESTful endpoints.

Generally, you'd have to trigger the lazy loading of the entities before the EntityManager is closed during the lifecycle of the request.
To do so, you can use the "Open EntityManager in View" pattern. Spring provides a Filter you can apply: OpenEntityManagerInViewFilter (read the docs here: http://docs.spring.io/spring/docs/4.1.0.RELEASE/javadoc-api/org/springframework/orm/jpa/support/OpenEntityManagerInViewFilter.html).
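For a plain Servlet 3.0 deployment, a minimal registration sketch might look like this (the initializer class name and mapping are illustrative, not from the question):
import javax.servlet.ServletContext;
import javax.servlet.ServletException;
import org.springframework.orm.jpa.support.OpenEntityManagerInViewFilter;
import org.springframework.web.WebApplicationInitializer;

// Registers the filter for all requests so the EntityManager stays open
// until the response (including JSON serialization) is complete.
public class OpenEmInViewInitializer implements WebApplicationInitializer {
    @Override
    public void onStartup(ServletContext ctx) throws ServletException {
        ctx.addFilter("openEntityManagerInView", new OpenEntityManagerInViewFilter())
           .addMappingForUrlPatterns(null, false, "/*");
    }
}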
Alternatively, you can manually call getMyLazyCollection() on your JPA entity(ies) prior to serializing them to JSON.
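A hedged sketch of that manual approach (the repository and accessor names are illustrative, not from the question):
// Touch the lazy association while the EntityManager is still open,
// inside a transactional service method, before handing the entity to Jersey.
@Transactional(readOnly = true)
public Employee loadEmployeeForJson(long id) {
    Employee employee = employeeRepository.findOne(id);
    employee.getHomeAddress(); // triggers the lazy load
    return employee;           // now safe to serialize to JSON
}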

I think the best course depends on the following:
Are you able to retrieve the fully resolved entity, i.e. all of its components, without adversely affecting performance?
If the answer is yes, then go for resolving the full entity using JPA's eager fetch option.
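For example, a sketch of the same association mapped eagerly (mirroring the entity shown below):
@OneToOne(fetch = FetchType.EAGER) // resolved together with the owning entity
@JoinColumn(name = "ADDR_ID")
private Address homeAddress;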
If the answer is no, I would go for the following approach:
1) Expose every lazy JPA component/association explicitly as a sub-resource, e.g.
@Entity
public class Employee {
    @Id
    private long id;
    ...
    @OneToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "ADDR_ID")
    private Address homeAddress;
    ...
}

@Entity
public class Address {
}
2) Expose the service as a controller (you can keep them separate, but I don't recommend it):
@RequestMapping(value = "/api/employee")
@Controller
public class EmployeeSvc {

    @RequestMapping(value = "/{empID}")
    @ResponseBody
    public Employee getEmployee(@PathVariable long empID) {
        .....
    }

    @RequestMapping(value = "/{empID}/homeaddress")
    @ResponseBody
    public Address getHomeAddress(@PathVariable long empID) {
        // will serve http://localhost:8080/api/employee/11111/homeaddress
        return this.getEmployee(empID).getHomeAddress();
        // make sure you are using a second-level cache, as you retrieve the object twice
    }
}

Related

Spring MongoTemplate not a part of ongoing transaction

I am attempting to transition to MongoDB transactions via Spring Data Mongo, now that MongoDB 4.0 supports transactions and Spring Data Mongo 2.1.5.RELEASE supports them as well.
According to the Spring Data Mongo Documentation, you should be able to use the Spring MongoTransactionManager and have the MongoTemplate recognize and participate in ongoing transactions: https://docs.spring.io/spring-data/mongodb/docs/2.1.5.RELEASE/reference/html/#_transactions_with_mongotransactionmanager
However, this following test fails:
@Autowired
private TestEntityRepository testEntityRepository;

@Autowired
private MongoTemplate mongoTemplate;

@BeforeTransaction
public void beforeTransaction() {
    cleanAndInitDatabase();
}

@Test
@Transactional
public void transactionViaAnnotation() {
    TestEntityA entity1 = new TestEntityA();
    entity1.setValueA("a");
    TestEntityA entity2 = new TestEntityA();
    entity2.setValueA("b");
    testEntityRepository.save(entity1);
    testEntityRepository.save(entity2);
    // throw new RuntimeException("prevent commit");

    List<TestEntityA> entities = testEntityRepository.findAll(Example.of(entity1));
    Assertions.assertEquals(1, entities.size()); // SUCCEEDS

    entities = testEntityRepository.findAll(Example.of(entity2));
    Assertions.assertEquals(1, entities.size()); // SUCCEEDS

    entities = mongoTemplate.findAll(TestEntityA.class);
    Assertions.assertEquals(2, entities.size()); // FAILS - expected: <2> but was: <0>
}
It appears that the testEntityRepository works fine with the transaction: the asserts succeed, and if I uncomment the exception line, neither of the records is persisted to the database.
However, querying through the mongoTemplate directly doesn't work; it appears not to participate in the transaction.
The documentation I have linked shows using the template directly within a @Transactional method, as I am attempting. However, the text says
MongoTemplate can also participate in other, ongoing transactions.
which could be interpreted to mean the template can be used with different transactions, and not necessarily the implicit transaction. But that is not what the example would indicate.
Any ideas what is happening or how to get the template to participate in the same implicit transaction?
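For reference, the linked documentation drives the transaction support off a registered MongoTransactionManager; a minimal configuration sketch (class and bean method names are illustrative) would be:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.MongoTransactionManager;

@Configuration
public class MongoTxConfig {

    // With this transaction manager registered, both the repositories and
    // MongoTemplate should bind to the same @Transactional transaction.
    @Bean
    public MongoTransactionManager transactionManager(MongoDbFactory dbFactory) {
        return new MongoTransactionManager(dbFactory);
    }
}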

Share class between endpoint and Objectify with different field subset

Say these are my classes:
@Entity
public class Library {
    ...
}

@Entity
public class Book {
    @Load
    @Parent
    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    private Ref<Library> libraryRef;

    @Ignore
    private Library library;
}
I want to send a List<Book> to the Android client: I don't want the client to get libraryRef, but I do want it to get library.
Here is the data access method I have now:
public static List<Book> getAllBooks() {
    return OfyService.ofy().load().type(Book.class).list();
}
My endpoint will just return the List<Book> to Android. I believe I have accomplished the first part: making sure the datastore stores libraryRef rather than library. But how do I accomplish the second part: making sure the client gets library?
I am sure it is not yet loaded. How do I make sure it is loaded? Do I have to iterate over it with my own for-loop?
My advice for anyone working with code shared between client and server is to make a clean separation between your API objects and your domain objects. It's a little more work up front to make DTOs, but it makes your whole system more flexible: if you want to change your domain objects, you don't risk breaking a zillion mobile phone apps that are on a slow (or nonexistent) upgrade cycle.
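A minimal sketch of that separation, assuming Book exposes a getter for its Ref (the DTO and the getTitle accessor are illustrative, not from the question):
// API-facing DTO: carries the resolved Library, never the Ref.
public class BookDto {
    public String title;
    public Library library;

    public static BookDto from(Book book) {
        BookDto dto = new BookDto();
        dto.title = book.getTitle();               // hypothetical getter
        dto.library = book.getLibraryRef().get();  // resolves the Ref server-side
        return dto;
    }
}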

Objectify with Cloud Endpoints

I am using App Engine Cloud Endpoints and Objectify. I have previously deployed these endpoints, but now that I am updating them they no longer work with Objectify. I have moved to a new machine running the latest App Engine SDK, 1.8.6. I have tried putting Objectify on the classpath and that did not work. I know this can work; what am I missing?
When running endpoints.sh:
Error: Parameterized type
com.googlecode.objectify.Key<MyClass> not supported.
UPDATE:
I went back to my old computer and ran endpoints.sh on same endpoint and it worked fine. Old machine has 1.8.3. I am using objectify 3.1.
UPDATE 2:
Updated my old machine to 1.8.6 and got the same error as on the other machine. That leaves two possibilities:
1) Endpoints no longer support Objectify 3.1, or
2) Endpoints have a bug in the most recent version.
Most likely #1... I've been meaning to update to 4.0 anyway...
Because of the popularity of Objectify, a workaround was added in prior releases to support the Key type until a more general solution was available. Now that the new solution is available, the workaround has been removed. There are two ways you can now approach the issue with this property.
Add an @ApiResourceProperty annotation that causes the key to be omitted from your object during serialization. Use this approach if you want a simple solution and don't need access to the key in your clients.
Add an @ApiTransformer annotation that provides a compatible mechanism to serialize/deserialize the field. Use this approach if you need access to the key (or a representation of it) in your clients. As this requires writing a transformer class, it is more work than the first option.
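A hedged sketch of the second option, using a hypothetical Car entity with a Key<Driver> field (similar to the one in the next answer); the DTO and the entity accessors are illustrative:
import com.google.api.server.spi.config.Transformer;

// Hypothetical DTO that clients see instead of the raw entity.
class CarDto {
    Long id;
    Long driverId;
}

// Wired up on the entity with @ApiTransformer(CarTransformer.class).
public class CarTransformer implements Transformer<Car, CarDto> {
    @Override
    public CarDto transformTo(Car in) {
        CarDto dto = new CarDto();
        dto.id = in.getId();             // assumes simple getters on the entity
        dto.driverId = in.getDriverId();
        return dto;
    }

    @Override
    public Car transformFrom(CarDto in) {
        Car car = new Car();
        car.setId(in.id);
        car.setDriverId(in.driverId);
        return car;
    }
}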
I came up with the following solution for my project:
@Entity
public class Car {
    @Id Long id;

    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    Key<Driver> driver;

    public Key<Driver> getDriver() {
        return driver;
    }

    public void setDriver(Key<Driver> driver) {
        this.driver = driver;
    }

    public Long getDriverId() {
        return driver == null ? null : driver.getId();
    }

    public void setDriverId(Long driverId) {
        driver = Key.create(Driver.class, driverId);
    }
}

@Entity
public class Driver {
    @Id Long id;
}
I know it's a little bit of boilerplate, but hey, it works and adds some handy shortcut methods.
At first I did not understand the answer given by Flori, and how useful it really is. Because others may benefit, I will give a short explanation.
As explained earlier, you can use @ApiTransformer to define a transformer for your class. This would transform an unserializable field, like those of type Key<MyClass>, into something else, like a Long.
It turns out that when a class is processed by Cloud Endpoints, methods called get{FieldName} and set{FieldName} are automatically used to transform the field {fieldName}. I have not been able to find this anywhere in Google's documentation.
Here is how I use it for the Key<Machine> property in my Exercise class:
public class Exercise {
    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    public Key<Machine> machine;
    // ... more properties

    public Long getMachineId() {
        return this.machine.getId();
    }

    public void setMachineId(Long machineId) {
        this.machine = new Key<Machine>(Machine.class, machineId);
    }
    // ...
}
Others have already mentioned how to approach this with @ApiResourceProperty and @ApiTransformer. But I do need the key available client-side, and I don't want to transform the whole entity for each one. I tried replacing the Objectify Key with com.google.appengine.api.datastore.Key, and it worked just fine in my case, since the problem here is mainly that endpoints do not support parameterized types.
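A sketch of that swap (field names are illustrative); the raw datastore Key is not parameterized, which is what sidesteps the serialization error:
import com.google.appengine.api.datastore.Key;

@Entity
public class Exercise {
    @Id Long id;
    Key machine; // raw datastore Key instead of com.googlecode.objectify.Key<Machine>
    // ... more properties
}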

Recovering after exception when using a single dbcontext

I am using Entity Framework to persist data in an N-tier WPF application. My DbContext is shared amongst all repositories and is never disposed. When I persist data I mark an object as modified and try to save changes. If an error occurs while persisting the object, it is still marked as modified, and if the user aborts the current operation he will get the same error when saving another object.
I have solved this by overriding SaveChanges in my DbContext: if any error occurs, I accept all changes (see the code below). So after an error, all objects are marked unchanged even if they're not persisted.
This doesn't feel right...
Does anyone agree with this solution?
Another solution would be to new up the DbContext in each method in my repositories and dispose of it right away. That would make my repositories more complicated and "noisy", and I would also lose the ability to lazy load data...
Does anyone have a different solution for me?
// In my repositories
public void UpdateObject(object entity)
{
    dbContext.Entry(entity).State = EntityState.Modified;
    dbContext.SaveChanges();
}

// In my dbcontext class
private ObjectContext ObjectContext()
{
    return (this as IObjectContextAdapter).ObjectContext;
}

public override int SaveChanges()
{
    try
    {
        return base.SaveChanges();
    }
    catch (Exception)
    {
        ObjectContext().AcceptAllChanges();
        throw;
    }
}
Our team uses an approach similar to below:
Repository:
public class StudentRepository
{
    private readonly MyEntities _context;

    public StudentRepository(MyEntities context)
    {
        _context = context;
    }

    // Basic CRUD methods etc.
}
Business Logic:
public void AddStudent(Student student)
{
    using (var context = new MyEntities())
    {
        var studentrepo = new StudentRepository(context);
        studentrepo.Add(student);
        context.SaveChanges();
    }
}
This is an oversimplified example, but should give you an idea. To reduce code, we also use a base generic repository class for the CRUD methods.
If the project we are working on includes a web service, we instantiate the dbcontext in the API Controller and override the Dispose method to get rid of it.
Having such a long-lived context is not a good idea. It will get large and slow with all the entities and changes being tracked, concurrency-related issues may arise, and exceptions thrown by your context can impact your entire application.
http://msdn.microsoft.com/en-us/data/jj729737
Another solution would be to new up the DbContext in each method in my repositories and dispose of it right away. That would make my repositories more complicated and "noisy", and I would also lose the ability to lazy load data.
In a disconnected scenario I would create and dispose with each request/unit of work. Concerned about your repos getting complicated? Then don't use this extra layer of abstraction. Are the repos really necessary? What do you gain over using the DbContext directly?
As for lazy loading I think in a disconnected n-tier scenario that lazy loading is not really appropriate. You should probably use eager loading of required data for your view or have separate method calls to get the related data.

Asynchronous Begin/End pattern for webservices in silverlight project

I found that the proxy generated with SlSvcUtil.exe (or by adding a Web Reference) only supports the event-based async model, which is absolutely inappropriate from a design point of view (events have been second-class citizens from the first days).
I'm going to implement F#'s async-builder approach, and I found the "old style" Begin/End methods are much easier to generalize. I noticed SlSvcUtil.exe generates the Begin/End method pair, but why does it mark them both with the private keyword?
A couple of options off the top of my head are:
expose Begin/End methods by updating the proxy class by hand
use wsdl.exe and create a wrapper library for the missing System.Web classes
use other communication protocols (HttpClient, Tcp)
use third-party proxies (failed to find any so far)
Any ideas?
Say someone created a remote service with one method:
public interface CompressService
{
    byte[] Compress(byte[] inData);
}
After SlSvcUtil I got:
public class CompressServiceSoapClient : ClientBase<CompressServiceSoap...
{
    private BeginOperationDelegate onBeginCompressDelegate;
    private EndOperationDelegate onEndCompressDelegate;

    public event System.EventHandler<CompressCompletedEventArgs> CompressCompleted;

    public void CompressAsync(byte[] inData, object userState);
}
While in fact I need:
public class CompressServiceSoapClient : ClientBase<CompressServiceSoap...
{
    public IAsyncResult BeginCompress(byte[] inData, System.AsyncCallback callback, object asyncState);
    public byte[] EndCompress(IAsyncResult result);
}
Answer
The solution is to declare the contract interface with async methods and not to use the generated code inherited from ClientBase<>. The article http://msdn.microsoft.com/en-us/library/dd744834(v=vs.95).aspx describes this in more detail.
You can access the Begin/End methods by using the channel factory for the endpoint.
Basically, just create a new ChannelFactory and pass in a binding and endpoint. You can use the host source to dynamically update the endpoint so it's not hard-coded. The resulting instance will expose the Begin/End methods for you.
