Database Driven Front End Controller / Page Management Good or Bad?

I am currently working within a custom framework that uses the database to set up a Page object containing the information about Module, View, Controller, etc., which a Front Controller then uses to handle routing and the like within an MVC (obviously) pattern.
The original reason for handling the pages within the database was that we needed to be able to create new landing pages on the fly from within an admin interface, and that we also needed onLoad and onUnload events to which other dynamic objects could be attached.
However, after reading this post yesterday, it made me wonder whether we shouldn't move this handling out of the database and make it all file-structure and code driven, like other frameworks, so that pages can be tested without the database being a component.
I am currently looking at whether to scrap the custom framework and go with one of the standard frameworks and extend it (which is what's most likely right now), but I'm wondering whether to extend that framework to handle page requests through the database as we do now, or to simply go with whatever routing/handling mechanism comes with the framework.

Usually I'm pretty lenient about what I will allow to go on in a "toy" application, but I think there are some bad habits that should be avoided no matter what. Databases are powerful tools, with reasonably powerful languages via stored procedures for doing whatever you need done... but they really should be used for storing and scaling access to data, and for enforcing your low-level data consistency rules.
Putting business logic in the data layer was common years ago, but separation of concerns really does help with the maintainability of an application over its lifespan.
Note that there is nothing wrong with using a database to store page templates instead of the file system. The line between the two will blur even further in the future, and I have one system in which all of the templates live in the database because of problems with the low-budget hosting and how dynamically generated content needed to be saved. As long as your framework can pull a template from a file or a field equally easily and process it, it isn't going to matter much either way.
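A minimal sketch of that kind of abstraction (Delphi here, but the idea is language-agnostic; every name below is hypothetical): callers ask a store for a template by name and never know whether it came from disk or from a table.

uses
  System.SysUtils, System.IOUtils;

type
  ITemplateStore = interface
    ['{3F2B6C1A-8D44-4E0B-9C7A-5A1D2E3F4B5C}']
    function LoadTemplate(const AName: string): string;
  end;

  // file-system-backed implementation
  TFileTemplateStore = class(TInterfacedObject, ITemplateStore)
  private
    FBaseDir: string;
  public
    constructor Create(const ABaseDir: string);
    function LoadTemplate(const AName: string): string;
  end;

constructor TFileTemplateStore.Create(const ABaseDir: string);
begin
  inherited Create;
  FBaseDir := ABaseDir;
end;

function TFileTemplateStore.LoadTemplate(const AName: string): string;
begin
  // a database-backed TDBTemplateStore would implement the same interface
  // and instead run something like:
  //   SELECT body FROM page_templates WHERE name = :name
  Result := TFile.ReadAllText(TPath.Combine(FBaseDir, AName + '.tpl'));
end;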
On the other hand, the post from yesterday was about generating the UI-layer elements directly from the database (at least, that's how I read it), not merely about an unusual storage location for templates. That is a deep concern for the reasons mentioned... the database becomes locked to web apps, and only web apps.
And on the third hand, never take other people's advice too much to heart if you have a system that works well and is easy to extend. Every use-case is slightly different. If your maintainability isn't suffering and it serves the business need, it is good enough.

Related

Heuristic for dividing front-end and back-end logic in libraries like backbone.js

I am new to learning about MVC.
I am wondering if there is a heuristic (non-programmatically speaking) out there for dividing and deciding what logic goes on the front end as opposed to the back end, especially when using front-end libraries like backbone.js.
That is, libraries like backbone.js separate data from DOM elements, which makes them useful for creating sophisticated client-side logic that, perhaps, used to be carried out on the server side.
Thanks in advance
Joey
The "classic" way to do Model - View - Controller is to have all three on the server. The view layer's output of HTML and some JS is then rendered by the browser.
Rails is an excellent example of this.
The "new cool" way is to treat the browser as the main computing engine with the backend server providing services via APIs.
In this case, the Model, View and Controller software all run (as JavaScript or CoffeeScript) on the client. Backbone is often part of the browser-side solution, but it has alternatives such as Spine, AngularJS and others.
On the backend server, you run the DBMS and a good API system. There are some good frameworks being built on Ruby/Rack; see posts by Daniel Doubrovkine on code.dblock.org. You have many choices here.
Advantages of MVC on the client
Responsive user interface for the user
Cool Ajaxy single page effects
Single-page webapps can provide a much faster UI to the user than regular web sites
Good architecture, and an enabler for purpose-built iPhone/Android apps
Depending on the app, can be used to create standalone webapps which work without a network connection.
This is what many cool kids are doing these days
Disadvantages
Need to decide on approach for old browsers, IE, etc
Making content available to search engines can be tricky. May require a shadow website just for the search engines
Testing can be a challenge. But see new libs such as AngularJS which include a testability focus
This approach involves more software: it takes longer to write and test.
Choosing
It's up to you. The decision depends on your timeframe, resources, experience, needs, etc. There is no need to use Backbone or something similar. Doing so is a tradeoff (see above). It will always be faster/easier not to use it, but doing without it (or something similar) may not accomplish your goals.
You can build a great MVC app out of just Rails, or PHP with add-on libs or other MVC solutions.
I think you're using the word heuristic in a non-programmatic sense, correct? I.e. you're using it to mean something along the lines of 'rule of thumb'?
As a rule of thumb:
You want the server to render the initial page load for UX and SEO reasons.
You could also have subsequent AJAX partial page loads rendered by the server for the same reasons. Profile to see which is faster: having the server render and transfer the extra data (the markup) over the wire, vs. sending a more concise payload (JSON) and having the client render it. There are tradeoffs, especially if you take mobile devices into consideration, where rendering on the client may be slower; then again, there are mobile devices out there with slower internet connections...
Like any client-server architecture: you want the client to do things that require fast responsiveness on the client, and then send an asynchronous operation to the server that performs the same task.
The takeaway is vague, but true: it's all about tradeoffs, and you have to decide what your product's needs are.
The first two things that come to mind for me are security and search.
You will always want to restrict read/write access on the server.
In most instances you will want to have your search functionality as close to the data as possible.

How to abstract my business model using dbExpress and Delphi (maybe DataSnap as well)?

If my question isn't clear, please help me improve it with comments.
I'm new to Delphi and dbExpress and I'm just getting acquainted with the TSQLDataSet, TDataSetProvider, TClientDataSet and TDataSource classes.
The project that I'm working on uses these components in a way that feels strange to me. There is a huge data module unit that contains lots and lots of the quartet of classes previously described. I'm guessing that there are better (and more modular) ways of doing this. DataSnap is used only to place this data module in a server application, so the clients access the data through it.
So, let me try to explain some of my doubts:
What is the role of each one of these classes? I read the documentation but I can't get a practical insight on this subject (especially about TDataSetProvider).
Which classes should be in the data module and which should be in my forms?
Is it possible to create an intermediate layer to abstract my business model from my database setup (maybe by creating functions that return immutable datasets)?
If so, is it wise to use DataSnap to do so?
I'm sorry if I'm not clear enough. Thanks in advance.
Components
TDataSource is the bridge between data-aware controls and the dataset (TDataSet descendant) from which they are to get their values.
TClientDataSet is one such dataset. TClientDataSet can be used in isolation, for example to access data contained in xml files, but can also be hooked up to a TDataSetProvider.
TDataSetProvider is the bridge between the in-memory TClientDataSet and the actual dataset that takes its data from a database through some kind of driver. In Client Server development you will usually see a TRemoteDataSetProvider (name may be different, I don't work with these components that often), which bridges the gap between the client and server.
TSQLDataSet is the actual dataset getting its data from some database.
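To make the chain concrete, here is a minimal sketch of how the quartet gets wired together (assuming an already configured dbExpress connection and a hypothetical CUSTOMERS table; in a real project most of these properties are set at design time):

// on a TDataModule with SQLConnection1, SQLDataSet1, DataSetProvider1,
// ClientDataSet1 and DataSource1 dropped on it
procedure TdmCustomers.WireQuartet;
begin
  // TSQLDataSet: unidirectional dataset talking to the database driver
  SQLDataSet1.SQLConnection := SQLConnection1;
  SQLDataSet1.CommandText := 'SELECT * FROM CUSTOMERS';

  // TDataSetProvider: bridge between the source dataset and the client dataset
  DataSetProvider1.DataSet := SQLDataSet1;

  // TClientDataSet: bidirectional in-memory copy; edits are written back
  // through the provider when you call ClientDataSet1.ApplyUpdates(0)
  ClientDataSet1.ProviderName := 'DataSetProvider1';
  ClientDataSet1.Open;

  // TDataSource: what the data-aware UI controls bind to
  DataSource1.DataSet := ClientDataSet1;
end;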
It would feel strange to me to see this quartet all in one executable. I would expect the TSQLDataSet on the server side, hooked up to the TRemoteDataSetProvider's counterpart. However, I guess that with an embedded database it could be a way to support the briefcase model, which is where TClientDataSet is really helpful (TClientDataSet is very powerful; that is just one of its strong points).
Single datamodule
Ouch. A single huge datamodule is lazy programming or the result of misconceptions about how to use datamodules. It is perfectly normal to have a single datamodule that "hosts" the database connection which is then used by various other datamodules that are more tightly focused on aspects of the application.
Domain abstraction
With regard to abstracting your business model, dbExpress and DataSnap should really not be anywhere in your business model. They should be part of your data layer.
TDataSource, TClientDataSet and custom TDataSetProvider descendant(s) can be used to leverage the power of the data-aware controls in the UI while still keeping the UI separate from the business model. In this case the custom TDataSetProvider would be the bridge between the client dataset and the collections and instances in the domain layer.
Even so, I would still expect to see a separate data layer, using TRemoteDataSetProviders or straight TDataSet descendants (like TSQLDataSet) to provide the domain layer with its data.
The single huge data module you mentioned could be part of that data layer, with the client datasets providing the business layer with its data. As you also mention TDataSource as part of a common quartet, the application was probably developed in a RAD-data-aware fashion where UI controls are basically hooked up straight to database columns/tables.
If you want to transform this application to have a more layered architecture, tread carefully and slowly. Get to know the current architecture first and get to know it well enough to see the impact that this kind of transformation would have. The links provided by Serg will certainly help you there. Pawel Glowacki has written extensively about DataSnap.
The project you are working on is a Delphi Multitier Database Application
Your question is too broad for SO and can hardly be answered in its current form - you should learn to understand the underlying architecture.
You can start from video tutorial by Pawel Glowacki:
http://www.youtube.com/watch?v=B4uxLLIUddg
http://cc.embarcadero.com/item/28188

Entity Framework 4 - lifespan/scope of context in a winform application

Sorry for yet another question about EF4 context lifespan, but I've been wondering about this for a while and I can't seem to find the answer. I'm not very familiar with lots of patterns or overcomplicated stuff (from my point of view), so I want to keep it simple.
I work with an ASP.NET application where the context is managed per HTTP request, which is a very good approach in my opinion.
However, I'm now working with a winforms application, and I sometimes have transactions or reports that will not perform very well if I simply create the context for each query. Not that this performance issue is very problematic; I just want to hear whether there's a simple strategy for winforms, like the per-HTTP-request one in ASP.NET.
Don't create a context for each query. At the same time, don't create a context that gets used for the entire lifetime of a Form (or your application).
Create a context for a single Unit of Work (which can encompass many queries).
That way you have all of your changes encapsulated in the context and you let Entity Framework wrap the database calls in a pretty little transaction without having to worry about it yourself.
I prefer to abstract the Repository and Unit of Work patterns out of the Entity Context so that I can use them independently (which makes everything very clear). MSDN actually has a decent article on the details:
MSDN - Using Repository and Unit of Work patterns with Entity Framework 4

Using VCL for the Web (Intraweb) as a trick for adding a web interface to a legacy non-tiered (2-tier) Delphi win32 application - does it make sense?

My team is maintaining a huge client/server win32 Delphi application. It is a thick-client application that uses DevArt (SDAC) components to connect to SQL Server.
The business logic is often "trapped" in components' event handlers; anyway, with some degree of refactoring it is doable to move the business logic into common units (a big part of this work has already been done during refactoring... maintaining legacy applications someone else wrote is very frustrating, but this is a very common job).
Now there is a request for a web interface. I have several options, of course; in this question I want to focus on the VCL for the Web (Intraweb) option.
The idea is to use the common code (the same .pas files) for both the client/server application and the web application. I have heard of many people who moved legacy apps from Delphi to Intraweb, but here I am trying to keep the thick client too.
The idea is to use common code, maybe with some compiler directives to write specific code:
{$IFDEF CLIENTSERVER}
{here goes the thick client specific code}
{$ELSE}
{here goes the Intraweb specific code}
{$ENDIF}
Then another problem is the "migration plan". Let's say I have 300 features and on the first release only 50 of them will be available in the web application. How do I keep track of that? I was thinking of (ab)using Delphi interfaces to handle this. For example, for the user authentication I could move all the related code into a procedure and declare an interface like:
type
  IUserAuthentication = interface
    ['{0D57624C-CDDE-458B-A36C-436AE465B477}']
    procedure UserAuthentication;
  end;
In this way, as I implement the IUserAuthentication interface in both applications (thick client and Intraweb), I know that the feature has been "ported" to the web. Anyway, I don't know if this approach makes sense. I made a prototype to simulate the whole process. It works for a "Hello world" application, but I wonder if it makes sense in a large application, or whether this interface idea is only counter-productive and could backfire.
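For example (a sketch of what I have in mind; TIWLoginController and MarkFeatureAsPorted are made-up names from my prototype), the web side would only count as "ported" once it has a class that actually implements the interface:

type
  TIWLoginController = class(TInterfacedObject, IUserAuthentication)
  public
    procedure UserAuthentication;
  end;

procedure TIWLoginController.UserAuthentication;
begin
  // Intraweb-specific login flow goes here
end;

// a small audit routine can then use Supports (from System.SysUtils) to
// check whether a given feature object exposes the interface:
if Supports(Controller, IUserAuthentication) then
  MarkFeatureAsPorted('UserAuthentication');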
My question is: does this approach make sense? (The interface idea is just an extra; it is not as important as the common-code part described above.) Is it a viable option?
I understand it depends a lot on the kind of application; anyway, to be generic, mine is in the CRM/accounting domain, and the number of concurrent users on a single installation is typically less than 20, with peaks of 50.
EXTRA COMMENT (UPDATE): I ask this question because, since I don't have an n-tier application, I see Intraweb as the only option for having a web application that shares common code with the thick client. Developing web services from the Delphi code makes no sense in my specific case, so the alternative is to write the web interface using ASP.NET (duplicating the business logic), but in that case I cannot take advantage of the common code in an easy way. Yes, maybe I could use DLLs, but my code is not suitable for that.
The most important thing that you have to remember is this:
Your thick client .EXE process is used by one person at a time (multiple persons will have multiple instances of that .EXE).
Your Intraweb .EXE process will be used by many persons at a time. They all share the same instance of the process.
That means your business logic must not only be refactored out into common units; the instances of the business logic must also be able to reside in memory multiple times without interfering with each other.
This starts with the business logic that talks to the database: you must be able to have multiple database connections at the same time (in practice a pool of database connections works best).
In my experience, when you can refactor your business logic into datamodules, you have a good starting point to support both an Intraweb and a thick client version of your app.
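As a sketch of what that means in Intraweb terms (names are illustrative: TdmBusiness is a hypothetical datamodule owning a database connection plus business objects, and the exact base class and event names vary per Intraweb version), give each web session its own datamodule instance instead of one shared global:

type
  TIWUserSession = class(TIWUserSessionBase)
  private
    FdmBusiness: TdmBusiness; // one instance per browser session
  public
    procedure SessionCreate(Sender: TObject);  // hook to the session's OnCreate
    procedure SessionDestroy(Sender: TObject); // hook to the session's OnDestroy
    property dmBusiness: TdmBusiness read FdmBusiness;
  end;

procedure TIWUserSession.SessionCreate(Sender: TObject);
begin
  // each session gets its own datamodule, and with it its own
  // database connection (or one drawn from a connection pool)
  FdmBusiness := TdmBusiness.Create(nil);
end;

procedure TIWUserSession.SessionDestroy(Sender: TObject);
begin
  FdmBusiness.Free;
end;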
You should not forget the user interface:
Thick clients support modal forms, and have a much richer UI
Web browsers support only message dialogs (and even those are very limited); all the fancy UI stuff costs a lot of development time (though TMS, for instance, has some nice components for Intraweb)
Then, to top it off, you have to cope with the stateless nature of the HTTP protocol. To overcome this, you need sessions. Intraweb will take care of most of the session part.
But you need to ask yourself questions like these:
what should happen if a user is idle for XX minutes?
how much session state can I store in memory? and what if it doesn't fit?
what do I do with the session state that does not fit into memory?
This is just a start, so let us know when you need more info.
If it gets very specific to your app, you can always contact me directly: just google me.
--jeroen
I think moving your application to n-tier will be a better solution; it will be easier after that for it to be used by both the desktop and web applications.
You have already done the first part by decoupling the business logic from the presentation; you can use RemObjects SDK or DataSnap, which is bundled with Delphi.
After that you will have a working desktop application, and you can use Intraweb, ASP.NET or whatever for the web part; this way you will not have to duplicate the business logic again for the web part.
Usually, converting a desktop application to the web is not as easy as you might think, because they work in different environments, and you need to build each one according to its nature.

Web application in drupal?

I am going to be creating a work order system with three roles
The "client" - The client can request projects to be completed by the worker. The project must be selected from a list of templates and various sub-options, all referred to as a campaign (campaign types come and go throughout the year).
The worker - The worker must be able to view work orders and mark them as accepted/rejected, work in progress and completed.
The overlord - He/She needs to see stats concerning the activity of the other two types of users.
So.
This is a web app, but a very simple one in terms of logic. Could something like Drupal handle this? Or would I have to write my own modules? The other out-of-the-box aspects of Drupal make it attractive (admin, user creation, news feeds, etc...)
I have looked at Views and Webforms. Views seems great for querying and displaying data from the work order database (great for a portion of all three roles), but I am not clear on how I would interface with my work order database when creating and modifying work orders.
Webforms doesn't seem to be the answer; I am sure I'm just missing something right under my nose.
Any hints in which direction to look would be great!
Thanks.
If you use a simpler, less powerful CMS, you may save time with the learning curve but lose time struggling with a less flexible framework. Also: Check how active the developer community is when evaluating Open Source software. You'll need support.
Views and Webforms may be tools that you'll end up using, but what you're really talking about is workflow. You could build your own workflow with a combination of CCK and Views, yes. There are also workflow modules.
Are you an IRC user? See: http://drupal.org/irc
I am pretty certain that you can do this with Drupal, but I would suggest looking into using an easier CMS than Drupal for something simple like this. Using something like MediaWiki for this application might be quicker to develop and have less of a learning curve. If you don't mind putting in the time to learn Drupal, I think you will ultimately have more freedom.
First of all, don't underestimate Drupal's learning curve, especially if your PHP and/or programming skills are relatively new. Drupal does a lot of things in its own way, and it's good to know that way.
Secondly, Drupal is (imho) made first of all for outward-facing sites. It can have a lot of stuff just for the users and not for the public, but a lot of its functionality is made for the CMS part of the system. You might consider using a more framework-style system like Zend Framework, whose components are a bit more "loose" but which also offers less functionality out of the box.
Thirdly, depending on what a work order is and how it should be treated, a custom module may be needed. If a work order has a really simple data model, it could probably be done without programming, but if it is complex you'll have to fire up your favorite editor. Don't worry, writing a module sounds scarier than it really is.
I don't know how good your knowledge of Drupal is, but to me this has CCK, Views 2 and user roles written all over it.
Basically, use CCK to create your content types (remember the user reference field might come in handy to assign a node/record to a particular user)
Then create views for each user group (they could be shared, as you can assign them to more than one role type)
Creating a view where you filter the cck user reference field by the user looking at the screen may also come in handy here.
OKAY, there might be a little bit more to it than that, but what you want is doable.
UPDATE: To protect your site from unwanted eyes, check out the site security module, as it puts a security wrapper around your entire website.
Views - Create lists - allow access by user roles
CCK - Define your own content types (add your own fields)
