Team working on Drupal - Tips - database

I am working on a Drupal site with a few friends. Obviously we can version-control the code... but what do we do to keep each other's databases in check?
I have managed to get all the theming into files (Contemplate etc.), but ideally my Views settings and menu settings would live in files as well... (Not worried about content either way, as we're just building the framework.)
Any suggestions?

Using Features along with Context is very powerful. Context lets you create a "section" for your site. It's best illustrated through an example:
Let's say we define the "Forum" context as anything with a URL of forums/*. Context lets us say: "I want to show these three views in the right sidebar, only when I am in the 'Forums' context."
Now, using Features, we can create a module defined by that context. We will end up with a module called "yoursite_forums", which includes all the views, blocks, etc. that were defined in your Forums context. It also works out the correct dependencies, as well as the content types used in the context. All of it is bundled up nicely in a module.
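To give a feel for what ends up in that generated module, here is a rough sketch of the kind of view export Features writes out for you (the module name yoursite_forums and the view name are made up; the real file is generated automatically):

```php
<?php
// yoursite_forums.views_default.inc -- the kind of file Features generates.
// Shown only as a sketch; module and view names are hypothetical.

/**
 * Implements hook_views_default_views().
 */
function yoursite_forums_views_default_views() {
  $views = array();

  $view = new view();
  $view->name = 'forum_recent_topics';
  $view->human_name = 'Forum: recent topics';
  $view->base_table = 'node';
  $view->api_version = '3.0';
  // ... the rest of the display, field and filter configuration that
  // Features captured from the database goes here ...

  $views[$view->name] = $view;
  return $views;
}
```

Because the exported configuration now lives in code, it can be committed to SVN and re-enabled (or reverted) on every developer's copy of the site.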
As for versioning content such as nodes, you can use either Node Export or just do a DB dump using Backup and Migrate. We use these occasionally, but we never have every node versioned in SVN.
Links:
Features
Context
Backup and Migrate

Problem Solutions in Database Migration from Development to Live Sites

You can find some more opinions on this here: Drupal DATABASE deployment strategies?

Related

2SXC/DNN - Delete ADAM Files in Entity

We're designing a system for a client that allows authenticated users to upload images. We've created an API to upload the files, but the client only wants to keep the latest file and delete all previous ones, so that there is only ever one.
We've looked through the docs and can't find a way for ADAM to handle this in either 2sxc or DNN's file system.
Internally, when deleting images, we see API calls like the following to the internal 2sxc API, but we're wondering if this is exposed somewhere in the public API?
https://somedomain.com/api/2sxc/app/auto/data/61393528-b401-411f-a001-f423ea46700a/b7d04e2c-c565-496c-8efb-aa133cf90d33/Photo/delete?subfolder=&isFolder=false&id=189&usePortalRoot=false&appId=3
We could probably use the same endpoint above, but we'd likely run into permission issues, or future changes to that internal API could break us.
Thank you for any advice you can give! Perhaps #iJungleBoy can provide some thoughts on this.
As a solution from a completely different direction: if you are on a later release of 2sxc (v12.8+, v13+) and comfortable programming in C#, you might consider doing this as a "cleanup" from a DNN Scheduled Task. This can be done with a relatively easy setup. We have a Gist that we use as a starter: you simply put the code in the /App_Code folder and then set up a normal DNN Scheduled Task. Note that you can scroll down to the first comment on the Gist to see a screenshot of a complete working setup.
Accuraty's AccuTasks template on GitHub Gists
There are two more key things to note:
You need to install DNN's CodeDom 3.6 because the example uses a later C# version's string interpolation - or remove the few $"ASL2021 - {this.GetType().Name}, Task Scheduled Email" bits, or convert them to string.Format() or something similar.
Since your task's code is NOT running in a (2sxc) module, if needed you'll do things like this: 2sxc Docs - Use 2sxc Instance or App Data from External C# Code
So, if you are comfortable writing code that "finds and deletes stuff older than NN days" - this might be the way to go.

JSF application: should I use microservices, and how?

I have a web application developed with JSF 2 and PrimeFaces. The project has been frozen for months, but it's quite advanced. The whole application runs inside the same container under GlassFish, so it's a monolith.
My application has a user interface, and its purpose is to let users organize URLs to tutorials (of any kind) as cards, with tags for classification, into folders. Each user has their own tree; they can search inside other users' trees, create a link to a file in their own tree, copy an entire folder, reorganize it, etc.
Nowadays we hear a lot about microservices, Spring Boot, AngularJS, React, etc. I like developing with JSF - it's a great framework - but I'm asking myself whether I should refactor my application, at least the necessary parts, into microservices, and whether JSF is appropriate for that or whether I should use other tools.
What I like about JSF, for example, is how easy it is to create views, its component approach, and how it handles the full cycle of a request.
For example, with a simple folder-creation form:
I have to choose the parent folder, so I bind a search component to a backing bean that queries my DB indirectly through a DAO (in my app, an EJB using JPA). That happens in the "invoke application" phase and refreshes my form's list with Ajax at the end. When I submit the form, I can also bind a converter to the search component to retrieve a Folder object directly; the converter also uses a DAO to retrieve the object I need in the "invoke application" phase to finish the job.
I also use validators to check different attributes of a new folder; usually I declare them inside my entity classes (Folder, User, ...) with annotations like @NotNull etc. Before I save the folder to my DB, I also check the user's rights to see whether they can write inside the parent folder, and so on. I do that inside the backing bean, so in the "invoke application" phase, and return a FacesMessage if anything goes wrong.
When I read about microservices, I see that you can use them directly inside a form, using JSON for communication, so it seems quite different. For example, if I have a microservice for the CRUD operations on my folders, are the validators and the converters part of that service, or are they stand-alone services? And what about the security checks? That kind of architecture is quite mysterious to me.
PS: English is not my mother tongue, so please be indulgent :)
AngularJS is pretty ancient, man :)
You have to look at the pain points to identify ways to tear down your monolith. Monolith pains are usually a slow and painful dev cycle and difficult manual test phases. If you did the entire Arquillian thing and have full continuous integration with single-button deployments, you've slain the beast the hard way; not many have braved that route. But if you're looking at mounting feature creep with code freezes and manual test cycles, then yes, you want to try to pull some of those features out into a service you can redeploy very quickly.

How to design permissions and conditional loading in ExtJS

Background
I have to migrate an existing JavaScript application (a one-page app) to ExtJS. The display and behavior of the application depend on the user's permissions.
Current design
The application is divided into plugins, which represent feature sets to which permissions are granted. Each of those plugins consists of a single JavaScript file. A user can have permissions for one or more plugins. Depending on the permissions, those files are loaded in the head of the page. Each plugin adds its entries to the main menu and exposes the methods used to drive the application.
The permissions are stored in a MySQL database.
ExtJS's default design
In ExtJS, each source file contains a class. During the build process, all .js files are concatenated to yield one big .js file that contains everything.
What would be the best design approach?
I considered using custom compilation with Sencha Cmd to create a separate .js file for each plugin. Then I could load these plugins the same way I do now, but this results in a complicated build and deployment process.
I also thought about creating one single .js file with a standard Ext build process. I would then load the permissions from the server via Ajax in order to construct the menu. All the objects and methods would exist, but only those the user has permissions for would be accessible.
In my opinion, the second approach is much easier to maintain, but it seems to have a security problem, because anyone could look at the source and discover the data interfaces exposed on the server and consumed via Ajax.
Any comments, ideas or advice are welcome. Thanks!
Number two would be the way to go. If you keep your server-side permission checks in place (while updating data etc.), you only need ExtJS to show/hide menu items based on permissions. That way, malicious users can turn certain plugins/items on or off in the client, but they can never execute something that requires more permissions than they would normally have.

Drupal 7 Views and custom Modules?

I have implemented several custom modules which essentially bridge Drupal with some third-party applications we build in-house.
Now, as my knowledge of Drupal grows, I am curious to know whether it is possible to integrate my modules with Drupal Views for reporting purposes.
My modules connect directly to several MySQL databases on an external server and perform queries and render those results within the context of Drupal.
What I would like to do is somehow use Views to select and manipulate the query as a Drupal view, to avoid having to re-purpose the module queries each time a change is requested.
Basically, is it possible to expose the SQL tables of an external app in a form that Drupal Views can pick up? I have googled and found a few suggestions, but no one seems to know for sure.
It seems Drupal expects all content to be of its own internal "content type" and thus connected to its core "node" - is this an assumption Views makes?
I don't want to export my third-party data into Drupal nodes, content types, etc. simply so Views can read and render the results.
It seems to me this should be possible with a Drupal module bridging the gap somehow.
Cheers,
Alex
You could use the Data module: http://drupal.org/project/data
It gives you the option to use Views on your custom tables. I hope this helps.
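If you want to see what sits behind that kind of integration (or do it without the Data module), a custom bridge module can expose a table to Views itself via hook_views_data(). A minimal Drupal 7 / Views 3 sketch; the module, table and column names are hypothetical:

```php
<?php
/**
 * Implements hook_views_data().
 *
 * Exposes a custom table to Views by hand. Module, table and column
 * names are placeholders for illustration only.
 */
function mybridge_views_data() {
  $data = array();

  // Declare the table and allow it to be used as a base table for views.
  $data['inhouse_report']['table']['group'] = t('In-house reports');
  $data['inhouse_report']['table']['base'] = array(
    'field' => 'id',
    'title' => t('In-house report'),
    'help' => t('Rows from our third-party reporting table.'),
  );

  // Expose one column with field, filter and sort handlers.
  $data['inhouse_report']['title'] = array(
    'title' => t('Title'),
    'help' => t('The report title.'),
    'field' => array('handler' => 'views_handler_field'),
    'filter' => array('handler' => 'views_handler_filter_string'),
    'sort' => array('handler' => 'views_handler_sort'),
  );

  return $data;
}
```

The table still has to be reachable through Drupal's database layer (for an external server that typically means an extra connection defined in settings.php), so this covers the "expose to Views" half of the bridge rather than the connection itself.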

How to merge Drupal database changes

We currently use an SVN repository to ensure everyone's local environments are kept up-to-date. However, Drupal website development is somewhat trickier in that any custom code you write (for instance, PHP code written for a node body) is stored in the DB and the changes aren't recognized by the SVN working copy.
There are a couple of developers presently working on the same area of a Drupal site, but we're uncertain how best to merge our local Drupal database changes together. Committing patches of database dumps seems clumsy at best, and is most likely inefficient and error-prone for this purpose.
Any suggestions about how to approach this issue are appreciated!
Unfortunately, database deployment/update is one of Drupal's weak spots. See this question & answers, as well as this one, for some suggestions on how to deal with it.
As for CCK, you could find some hints here.
As for PHP code in content, I agree with googletorp that you should avoid doing this. However, if for some reason you absolutely have to, you could try to reduce the code to a simple function call, so that the function itself lives in a module (and is therefore tracked via SVN). But then you are only a small step away from removing the need for the inline code anyway...
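As a concrete illustration of that "reduce it to a function call" idea, the logic could move into a tiny custom module that SVN tracks (module and function names below are made up):

```php
<?php
/**
 * Part of a small custom module (mysite_extras); the module and
 * function names are hypothetical.
 */
function mysite_extras_promo_block() {
  // Whatever used to live inline in the node body goes here, where it
  // is versioned in SVN like the rest of the code base.
  return t('Hello from a version-controlled function.');
}
```

The node body (using the PHP input format) then shrinks to a single line such as <?php print mysite_extras_promo_block(); ?>, and the real logic gets diffed and merged like any other code.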
If you are putting PHP code into your database, then you are doing it wrong. Some things do live in the database, like views and CCK fields, plus some settings. But if you put PHP code inside the node body, you are creating a big code-maintenance problem. You should really use the API and hooks instead. Create modules instead of ugly hacks with eval() etc.
All that has been said above is true and good advice. To answer your practical question, there are a number of recent modules that you could use to transport the changes made by the various developers.
The "Features" modules is a cure the the described issue of Drupal often providing nice features, albeit storing lots of configs and structure in the DB. This module enables you to capture a feature and output it as a pseudo-module (qualifies as a module with .info and code-files and all). Here is how it works:
1. Select the functionality/feature to export.
2. The module analyses the modules, files and DB content required to rebuild that feature elsewhere.
3. The module creates a pseudo-module that contains the instructions from step 2 and outputs everything (even SQL to rebuild the stuff in the DB) into a module package (it also sets dependencies on the other modules required).
4. Install the pseudo-module on your new site and enable it.
5. The pseudo-module replicates the feature you exported, rebuilding DB data and all.
And you can tell your boss you did it all manually with razor focus to avoid even 1 error ;)
I hope this helps - http://drupal.org/project/features
By committing patches of database dumps, do you mean taking an entire extract of the db and committing it after each change?
How about a master copy of the database? Extract all tables, views, stored procedures, etc. into individual files, put them into SVN, and do your merge edits on the individual objects?
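If you go that route, the per-object dump can be scripted rather than done by hand. A rough sketch in PHP (database name, credentials and output path are placeholders, and it assumes the mysqldump client is available on the machine):

```php
<?php
// Rough sketch: dump every table of the site database into its own .sql
// file so individual objects can be committed, diffed and merged in SVN.
// Database name, credentials and output directory are placeholders.
$db  = 'drupal_dev';
$out = __DIR__ . '/sql';

$pdo = new PDO("mysql:host=localhost;dbname=$db", 'devuser', 'devpass');
if (!is_dir($out)) {
  mkdir($out, 0775, TRUE);
}

foreach ($pdo->query('SHOW TABLES')->fetchAll(PDO::FETCH_COLUMN) as $table) {
  // Assumes mysqldump reads its credentials from ~/.my.cnf so they do not
  // have to be repeated on the command line.
  shell_exec('mysqldump ' . escapeshellarg($db) . ' ' . escapeshellarg($table)
    . ' > ' . escapeshellarg("$out/$table.sql"));
}
```

Per-table files at least make diffs readable, though content-heavy tables will still churn on every dump.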
