I am just getting to grips with Zend Framework 2 Database theory, having used version 1 for a long time. I am trying to discern the 'right' way of working with many tables when the business logic requires one table object to defer operations to another table object.
It seems like a lengthy and laborious process to instantiate a different table class in the existing gateway class. If I use the same process as the service manager factory, e.g.
// reuse the adapter from the gateway this class already holds
$yourData = $myData['thisPart'];
$dbAdapter = $this->_myTableGateway->getAdapter();

// build the second table's gateway by hand, mirroring the factory
$resultSetPrototype = new ResultSet();
$resultSetPrototype->setArrayObjectPrototype(new TestObject());
$tbl = new TestTable(new TableGateway('tbl_name', $dbAdapter, null, $resultSetPrototype));
$tbl->insertSomeData($yourData);
... then I suppose it will work, but the service manager is not supposed to be available in the table class. I could inject it using the factory definition but that doesn't seem like a great idea.
So I suppose my question is: what is the best way for a class (representing a table and using this pattern) to insert some of its data into another table using a different gateway class? Or is the method above the only/'right' way?
It seems like the best approach for this problem would be to implement ServiceLocatorAwareInterface in each model that needs to interact with other table gateway classes. You can then define the setServiceLocator(ServiceLocatorInterface $sl) method, which is passed a reference to the service locator automatically.
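A minimal sketch of that approach (assuming the other table class is registered with the service manager under the key 'TestTable'; class and method names are illustrative):

use Zend\ServiceManager\ServiceLocatorAwareInterface;
use Zend\ServiceManager\ServiceLocatorInterface;

class MyTable implements ServiceLocatorAwareInterface
{
    protected $serviceLocator;

    // Invoked automatically when the service manager instantiates this class
    public function setServiceLocator(ServiceLocatorInterface $serviceLocator)
    {
        $this->serviceLocator = $serviceLocator;
    }

    public function getServiceLocator()
    {
        return $this->serviceLocator;
    }

    public function saveData(array $myData)
    {
        // Defer part of the data to the other table through its own gateway class
        $testTable = $this->getServiceLocator()->get('TestTable');
        $testTable->insertSomeData($myData['thisPart']);
    }
}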
I'm writing a variety of model tests in CakePHP (PHPUnit). In TravisCI I get something like "Base table or view not found: 1146 Table 'test.events'", while in Cake's test runner I get an assertion failure.
The problem I am having is that there are methods in my model classes, which I am trying to test, that call other models with App::uses. For example:
Method on User model:
public function getOtherData() {
    App::uses('Event', 'Model');
    $this->Event = new Event;
    return $this->Event->find('all');
}
And the test:
public function testGetOtherData() {
    $result = $this->User->getOtherData();
    $this->assertTrue(!empty($result));
}
Note that the above example is just that: an example, simplified to show the problem. I understand that there are better 'Cake' ways of doing it.
Also, I am defining the required fixtures and they work just fine. (I know this because another method in the model, which uses a join in the find instead of App::uses(), picks them up correctly.)
EDIT:
The code works when run, BUT the unit test looks for the other model's data (when using App::uses) in the default database, not the test database. Why doesn't it use the test database? Am I missing something?
LAST NOTE
Using App::uses() and then instantiating the class works at runtime, but during testing it fails, because it attempts to use the default database connection instead of the test database connection.
Per the selected answer: rather than using App::uses, you can include a model from inside another model method with Cake's built-in class registry, ClassRegistry::init('Model', true);.
It's not generally a good idea to instantiate an object in the middle of your functions using the new statement, for exactly this reason: there's no way to block or redirect that call. It's also not necessarily easy to get the right parameters to the object's constructor from the middle of another function, so it's best to keep that code separate.
The right way to do this is to use a different method call to get your object. If you use Cake's ClassRegistry::init() to create model objects, they should use the test database.
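For example, the method from the question rewritten to use the class registry (a minimal sketch):

public function getOtherData() {
    // ClassRegistry::init() returns a model instance that respects the
    // test datasource when running under the test runner
    $event = ClassRegistry::init('Event');
    return $event->find('all');
}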
If you need to create other, non-Cake objects, it's best to create them using some other function, e.g. $this->fetchMeOneOThemEventThingies(). Then, during testing, you can mock out that function and have it return something else. Or you could use another DI container, like Pimple, which fills the same role as Cake's ClassRegistry.
If you need a mock model object for testing, be sure to pass the appropriate arguments to the model's constructor as the third parameter to getMock(), or it may use the production database.
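For instance, a hedged sketch: CakePHP 2.x's Model constructor takes ($id = false, $table = null, $ds = null), so passing 'test' as the third constructor argument keeps the mock pointed at the test connection:

$user = $this->getMock(
    'User',
    array('getOtherData'),         // methods to stub
    array(false, 'users', 'test')  // constructor args: $id, $table, $ds
);
$user->expects($this->once())
    ->method('getOtherData')
    ->will($this->returnValue(array()));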
Please excuse me if there is something wrong with this question.
My application uses Entity Framework. Most of my forms are of the master-detail-subdetail type. In my master form, I create an instance of the entity context as below:
DataTestEntities dbContext = new DataTestEntities();
and I use it for LINQ queries as below:
objRec = dbContext.ProdInfoes.Where(p => p.ProdCode == oProducts.ProdCode).First();
I can pass the dbContext to the details screen using System.Reflection. The reason I am passing the dbContext is to avoid creating an instance of the context over and over.
Is this a good approach? Does it create any performance issues?
Is there any other way that I can share a common dbContext? How about using a class library with the data model?
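To illustrate the hand-off without reflection, here is a minimal sketch of passing the same context to a detail form through its constructor (assuming WinForms; DetailForm and its members are illustrative):

public partial class DetailForm : Form
{
    private readonly DataTestEntities dbContext;

    public DetailForm(DataTestEntities context)
    {
        InitializeComponent();
        this.dbContext = context; // reuse the caller's context instead of creating a new one
    }
}

// In the master form:
var detail = new DetailForm(dbContext);
detail.ShowDialog();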
I have a WPF application with MVVM. Assuming object composition from the ViewModel down looks as follows:
MainViewModel
    OrderManager
        OrderRepository
            EFContext
        AnotherRepository
            EFContext
    UserManager
        UserRepository
            EFContext
My original approach was to inject dependencies (from the ViewModelLocator) into my view model using .InCallScope() on the EFContext and .InTransientScope() for everything else. This made it possible to perform a "business transaction" across multiple business-layer objects (Managers) that ultimately shared the same Entity Framework context underneath. I would simply Commit() said context at the end, for a unit-of-work type scenario.
This worked as intended until I realized that I don't want long-lived Entity Framework contexts at the view model level, due to the data integrity issues across multiple operations described HERE. I want to do something similar to my web projects, where I use .InRequestScope() for my Entity Framework context. In my desktop application I will define a unit of work which will serve as a business transaction, if you will; typically it will wrap everything within a button click or similar event/command. It seems that Ninject's ActivationBlock can do this for me.
internal static class Global
{
    public static ActivationBlock GetNinjectUoW()
    {
        // assume that NinjectSingleton is a static reference to the kernel
        // configured with the necessary modules/bindings
        return new ActivationBlock(NinjectSingleton.Instance.Kernel);
    }
}
In my code I intend to use it as such:
// Inside a method that is raised by a WPF Button Command ...
using (ActivationBlock uow = Global.GetNinjectUoW())
{
    OrderManager orderManager = uow.Get<OrderManager>();
    UserManager userManager = uow.Get<UserManager>();

    Order order = orderManager.GetById(1);
    userManager.AddOrder(order);
    ....
    userManager.SaveChanges();
}
Questions:
To me this seems to replicate the way I do business on the web, is there anything inherently wrong with this approach that I've missed?
Am I understanding correctly that all .Get<> calls using the activation block will produce "singletons" local to that block? What I mean is no matter how many times I ask for an OrderManager, it'll always give me the same one within the block. If OrderManager and UserManager compose the same repository underneath (say SpecialRepository), both will point to the same instance of the repository, and obviously all repositories underneath share the same instance of the Entity Framework context.
Both questions can be answered with yes:
Yes, this is service location, which you shouldn't do.
Yes, you understand it correctly.
A proper unit-of-work scope, implemented in Ninject.Extensions.UnitOfWork, solves this problem.
Setup:
_kernel.Bind<IService>().To<Service>().InUnitOfWorkScope();
Usage:
using (UnitOfWorkScope.Create())
{
    // resolves, async/await, manual TPL ops, etc.
}
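To make the intended scoping concrete, a small sketch (assuming the extension behaves as described; IService/Service are the bindings from the setup line above):

using (UnitOfWorkScope.Create())
{
    // Within one scope, both resolutions yield the same instance,
    // because IService is bound .InUnitOfWorkScope()
    var first = _kernel.Get<IService>();
    var second = _kernel.Get<IService>();
    // first and second refer to the same Service object here
}
// After the scope is disposed, a new resolution creates a fresh instance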
I am working on a Prism desktop application and would like to know the best way to deal with lookup/reference data lists when using a WCF backend. I think this question may cover a few areas, and I would appreciate some guidance.
For example, consider a lookup that contains Products (codes and descriptions) which would be used in a lot of different input screens in the system.
Does the viewmodel call the WCF service directly to obtain the data to fill the control?
Would you create a control that solely deals with Products, with its own view model etc., and then use that in every place that needs a product lookup, or would you re-implement, say, a combobox that repopulates the Products ItemsSource in every single form view model that uses it?
Would I create a brand new WCF service called something like LookupData service and use that to populate my lookup lists? I am concerned I will end up with lots of lookups if I do this.
What other approaches are there for going about this?
I suggest creating your lookup object/component as a proxy for the WCF service. It can work in several ways, but the simplest that comes to mind would be:
Implement a WCF service with methods that provide all Product entities, as well as a single requested one (e.g. by product code)
Implement a component that uses the WCF client to get products; let's call it ProductsProvider
Have your view models take a dependency on ProductsProvider (e.g. via constructor injection)
The key element in this model is ProductsProvider: it works as a kind of cache for Product objects. First, it asks the web service for all products (or some subset, up to your liking) to start with. Then, whenever you need to look up a product, you ask the provider; it is the provider's responsibility to decide how the product should be looked up. Maybe it's already in the local list? Maybe the web service needs to be called for an update? Example:
// requires: using System.Linq; (for FirstOrDefault)
public class ProductsProvider
{
    private IList<Product> products;
    private IProductsService serviceClient;

    public ProductsProvider(IProductsService serviceClient)
    {
        this.serviceClient = serviceClient;
        this.products = serviceClient.GetAllProducts();
    }

    public Product LookUpProduct(string code)
    {
        // 1: check if our local list contains a product with the given code
        // (assumes Product exposes a Code property)
        Product product = this.products.FirstOrDefault(p => p.Code == code);

        // 2: if it does not, call this.serviceClient.LookUpProduct and cache the result
        if (product == null)
        {
            product = this.serviceClient.LookUpProduct(code);
            if (product != null)
            {
                this.products.Add(product);
            }
        }

        // 3: if the service also doesn't know such a product:
        //    throw, return null, or report an error; here we return null
        return product;
    }
}
Now, what this gives you is:
you only need one ProductsProvider instance
more flexibility over when and how your service is called
your view models won't have to deal with WCF at all
Edit:
As for your second question: a dedicated control may not be needed, but having a view model for the Product entity is definitely a good idea.
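To tie the pieces together, a hedged sketch of a view model taking the provider through constructor injection (OrderEntryViewModel is an illustrative name):

public class OrderEntryViewModel
{
    private readonly ProductsProvider productsProvider;

    public OrderEntryViewModel(ProductsProvider productsProvider)
    {
        this.productsProvider = productsProvider;
    }

    // The view model never touches WCF directly; it only asks the provider
    public Product FindProduct(string code)
    {
        return this.productsProvider.LookUpProduct(code);
    }
}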
I have created a class that inherits from DomainService, and I have a Silverlight app that uses System.ServiceModel.DomainServices.Client to get a DomainContext. I have also created POCO DataContracts that are used in the DomainService's Query, Update, Insert, and Delete methods. I also have a ViewModel that executes all the LoadOperations. Now I'm at the part of my app where I want to add new entities to the generated EntitySets, but I am unsure about what will happen when one user creates a new entity and sets the key value while another user creates a similar entity with the same key value.
I have seen in the documentation that an ObjectContext is used, but in my situation I was not able to use the Entity Framework model generator, so I had to create my data contracts by hand.
So I guess my question is: is there any way I can force other Silverlight apps to update on a database change?
When you make a Save operation to your DomainContext, depending on the load behavior, it will automatically refresh.
TicketContext.Load(TicketContext.GetTicketByIdQuery(ticketId),
    LoadBehavior.RefreshCurrent,
    x =>
    {
        Ticket = x.Entities.First();
        Ticket.Load();
        ((IChangeTracking)TicketContext.EntityContainer).AcceptChanges();
    }, null);
Here I've set the LoadBehavior to RefreshCurrent. When you make a save, RIA will send the entity back across the wire to the client and merge the changes with the entity already cached in your client-side context. I don't know if that quite answers your question or not, however.