CakePHP: plugins in production environment

In a production deployment, you wouldn't have your /app folder in the public webroot; it only takes one sloppy-fingered mistake to turn private business assets such as PHP files into plain text on the web. The advanced and production installation instructions in the book describe how to isolate the webroot directory from the rest of the app, so you can make the folders containing PHP files readable only by you. This technique has worked great for me for a while - but now I want to use and develop plugins.
The CakePHP book doesn't discuss plugins in production deployments. For those of you who use CakePHP in production with plugins, how do you make the assets in a given plugin's webroot directory publicly accessible?
Do you create a plugins folder in webroot and then symlink plugins/my_plugin_name to the plugin webroot? Do you manually copy the plugin webroot folder into webroot? Do you bite the bullet and just relocate the plugins folder to webroot? Do you use some kind of fancy content feeding plugin?
What's the best solution?

Sounds to me like you should manually copy files from the plugin webroot to your own webroot. If it's in production, I presume you aren't going to be turning the plugin on/off frequently (as one might do in development), so this would be a one-time process.
Also, doing it this way keeps your private assets out of publicly-accessible folders, maintaining the security you are concerned about.
On the other hand
If the plugin is open source, take the easy way out and drop that whole sucker in the webroot. If it gets compromised, you shouldn't really care, since the files are freely distributed anyway.
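If you do go the copy route, a small deploy-time script keeps it from becoming a manual chore on every release. Below is a minimal sketch of such a script; the /var/www paths, the publish_plugin_assets.php name and the layout (plugins living under app/plugins, each with its own webroot) are assumptions about your setup, not CakePHP requirements. If your host allows symlinks inside the webroot, you could swap the copy for PHP's symlink() and get the symlink variant you mention.

<?php
// publish_plugin_assets.php -- a deploy-time sketch, not an official CakePHP tool.
// Copies each plugin's webroot into the public webroot so its css/js/img assets
// are served directly by the webserver while the app code stays outside it.
$appDir    = '/var/www/private/app';   // app folder kept out of the document root (assumed path)
$publicDir = '/var/www/public';        // the only directory the webserver exposes (assumed path)

foreach (glob($appDir . '/plugins/*/webroot', GLOB_ONLYDIR) as $pluginWebroot) {
    $plugin = basename(dirname($pluginWebroot));   // e.g. "my_plugin_name"
    $target = $publicDir . '/' . $plugin;          // assets end up under /my_plugin_name/...
    copyDir($pluginWebroot, $target);
    echo "Published assets for plugin: $plugin\n";
}

// Recursive copy helper (PHP's copy() only handles single files).
function copyDir($src, $dst) {
    if (!is_dir($dst)) {
        mkdir($dst, 0755, true);
    }
    foreach (scandir($src) as $item) {
        if ($item === '.' || $item === '..') {
            continue;
        }
        $from = $src . '/' . $item;
        $to   = $dst . '/' . $item;
        is_dir($from) ? copyDir($from, $to) : copy($from, $to);
    }
}

Run it once per deployment (or from a post-update hook) and the public folder stays in sync without ever exposing the rest of /app.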

Related

Replace main.js (compiled JS file) from one Angular build artifact in another build artifact

We have multiple environments, and we have environment files which contain backend configurations; we use these files during the builds.
Example: ng build --c UAT
But I have an issue here: we have now decided to build only once and deploy the same artifact to multiple environments.
I know this is quite achievable using an Angular service and the APP_INITIALIZER token, but for some reason we can't use this.
So I decided that, after the build, I would modify the compiled JS files (main.js) with the respective environment configuration values. But this is becoming difficult because of the increasing number of environment variables and their patterns.
So I thought of following the process below; please suggest whether it is usable or not:
1. I'll build the UAT artifact (dist/webpack output) using "ng build --c UAT".
2. I'll do the same for all other environments, so I end up with three dist folders (webpacks) in total.
3. I'll deploy the UAT artifact to all environments, but before deploying it to Preprod I'll replace the main.js file with the main.js from the Preprod artifact, because only main.js contains all the environment configuration, and keep all the other JS files the same.
4. I'll repeat the same for the prod deployment.
Please advise on this approach.
You made a good choice in deciding against environment-specific builds, as they'll always come back to haunt you. But with your approach you have only shifted the problem, since you still need to tweak the build artifact. When your configuration is highly dynamic, I would suggest reconsidering your decision not to use a service to load the data dynamically at runtime, or at least stating the constraints that prevent that approach from working for you.
Assuming you still want to rely on static file content, the article How to use environment variables to configure your Angular application without a rebuild could be of interest to you. It suggests loading the data from an embedded env.js script and exposing it from there as an Angular service. While one could argue that this also only shifts the problem further, it at least allows your build artifact to remain untouched. For example, if you run your app from, say, an nginx docker container, you could replace the values in env.js dynamically before the webserver starts. I'm using a simplified version of this approach, and while it still feels a bit hacky, it does work! Good luck!

Setting up an Ant script to upload files

I've only come across a few examples of how to do this and they didn't work for me, mainly since I've only used an Ant script to auto-build jar files through Jenkins. Now, though, I need to build those files in Jenkins and then upload them to a 3rd party file site like SourceForge. This is both to save hard drive space on the server, since I don't own it, and to allow external downloads. Any help is welcome, but no comments on the fact that I don't know too much about Ant scripts.
Also, something related but a bit separate: the jar file I'm building depends on another jar file with its own version. I also want to make a new folder each time it uploads with a different dependency version. This way the users that download this file can easily understand which main jar version it goes with, while allowing me to upload 20+ sub-builds.
There are several ways to upload files, so there are several kinds of Ant tasks to do the job.
For instance, if you want to upload to SourceForge, you can use the Ant scp task. But it also seems possible to upload there via FTP: for that there is the ftp task.
Maybe you'll find some other service which requires you to upload via HTTP: ant-contrib has the post task.
I used to do publication as part of my Ant build logic, creating a special "publish" target that issued the scp or ftp command. Now I'm more inclined to leverage one of the "Publish Over" plugins for Jenkins.
The main reason for this shift is the management of access credentials. Using the Ant-based approach I was forced to run my build on a Jenkins slave that was pre-configured with the correct SSH key to talk to the remote server. The Jenkins plugin manages private keys centrally and ensures all slaves are properly configured.
Finally, if your build has dependencies on 3rd party jars, use a dependency manager like Ivy to download them and include them in your project. It then becomes trivial to include their upload as part of your publish step.

Where in my app's directories should I put zepto.js?

I'm totally new to trigger.io and I was wondering if there was a suggested directory for putting files like zepto.js or jquery.js (if I were to use that).
In theory one can place it anywhere within the "src" directory tree. However, I would suggest placing it in the "js" directory. Or even better... to help separate your own .js files from 3rd party .js files... place it in a "js/lib" subdirectory. Most (if not all) of the trigger.io examples in their documentation and on GitHub tend to do this.
https://github.com/trigger-corp
https://github.com/amirnathoo
One of the great things about developing applications using trigger.io (or PhoneGap) is that you can utilize your existing web programming knowledge and best practices.
Still... at the end of the day it's up to you to decide how to organize your application. With smaller apps it might not be that important, but as your application grows, having a "clean" and manageable structure helps.
Update: In their weather app tutorial they use a "resources" directory.
http://docs.trigger.io/en/v1.3/tutorials/weather/tutorial-2.html

Oracle driver for Java web application

Is it okay to place an Oracle driver jar within the web application's /lib directory, or is it better practice to place it in Tomcat's lib directory?
I'm wondering about this because on my local host, my web app runs fine with the jar in the web app's /lib directory, but when I move the web app to a real development server, I continue to receive null pointer exceptions when trying to close a connection pool object. I thought this issue may be why I'm unable to free the connection.
Thanks.
To answer the initial question, about placement of the .jar file, there are some things to consider:
Are there other apps on the same server that use this? If yes, and you want to ensure all of them use the same version, then placing it in the server/lib folder would be better.
If you want some flexibility in terms of which version of the .jar each app uses, then webapp/lib is better.
If you are packaging your app as an ear or war and there are size considerations, then the server/lib option has some advantages, provided it makes sense considering the two points above.
If you run into classloader issues from dependencies, you may have to consider other jars when deciding on placement.
Whatever you decide, it's best to make sure each jar exists only once in each app's classpath.

How to handle the deployment and updates of multiple installations of one CakePHP web-app?

I made a simple CMS with CakePHP to handle a small (but growing) number of websites. The CMS is constantly evolving as I regularly add features to a development version on my own machine.
I use SVN to trace the evolution of this development version, and that's pretty much it. To make a new website, I basically copy/paste the dev folder and modify the necessary files before uploading the new website by FTP.
One problem is that, in the end, every website installation is independent, and if I want to add some new features to existing websites, I have to copy files by hand.
Another problem is that some websites have modified versions of the CMS because of specific needs: some controller classes have specific methods not present on the local version.
To sum it up:
I have one base CakePHP app regularly evolving
There are multiple versions (=websites) of this app already installed on different servers
Some websites have custom code included not present in the base version
I want to be able to easily update all the present and future websites when I improve the base app, without breaking some possible specific parts
Knowing it's a CakePHP app, what would you do? How should I change my code to manage at the same time the core and the specific code?
Thanks in advance!
... some controller classes have specific methods not present on the local version.
You might also consider the option of setting up additional class paths within each of your website applications. You can tell CakePHP to check other directories entirely for files missing from the current application. For example, you could have the following directory structure:
/app1 - a standard client's website application
/app2
/app3 - a custom client's website application (with custom controller)
/core - the core CMS application
/cake
By adding the following to your /appN/config/bootstrap.php files, you are telling CakePHP to look for controllers in /core/controllers if it can't find one it's looking for in the current application.
App::build(array(
    'controllers' => array(ROOT . DS . 'core' . DS . 'controllers' . DS),
));
This would, for example, allow you to put all your CMS controllers in /core/controllers and delete all your /appN/controllers directories. Only where a client needed a controller customized (i.e. /app3 above) would you create a controllers directory, copy the specific controller across and make modifications (adding methods and such).
(This is the approach the developer of Wildflower CMS took - note the /wildflower directory.)
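For example, the customized controller in /app3 might look something like this. It's only a rough sketch: the PagesController name, the Page model and its slug field are hypothetical stand-ins, not something taken from the answer above.

<?php
// /app3/controllers/pages_controller.php -- hypothetical client-specific copy.
// Because /app3 has its own controllers directory, CakePHP finds this file first
// and never falls back to /core/controllers/pages_controller.php.
class PagesController extends AppController {

    var $name = 'Pages';

    // Standard CMS action, copied unchanged from the core controller.
    function view($slug = null) {
        $page = $this->Page->findBySlug($slug);   // magic findBy finder, assumes a slug field
        $this->set(compact('page'));
    }

    // Client-specific addition that does not exist in the core version.
    function sitemap() {
        $pages = $this->Page->find('all', array('fields' => array('Page.slug', 'Page.modified')));
        $this->set(compact('pages'));
    }
}

All the other apps keep using the shared controller from /core, so an improvement to the base CMS still reaches them untouched.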
Using version control software (like SVN) would probably do the trick for you. Each website you work on could be a branch that follows the main branch of development. Every time you make a change that you want to apply to every site, you'd update the main branch and then have every sub branch import the changes from the main branch.
Mind you, I'm more used to how Git works, so this scenario might work better in Git than in SVN, ymmv.
