When I do a git pull origin master from Pantheon, it doesn't seem to pull the database

I'm new to this workflow using Git, and I feel like I'm missing one piece of information that's just not obvious to me. I set up a sandbox on Pantheon and did a Drupal install through Pantheon. It works fine on dev. Then I cloned it to my local machine, but when I open the local copy in a browser it wants to install Drupal, as if it had never been set up on Pantheon. My best guess is that it's not pulling the database, can't find it, and figures it's a fresh install. But how do I connect the dots here? Thanks!!

You are correct: the database does not come down with git, only the code.
You will either need to download the database manually from their UI or use their command-line tool, Terminus. If you're comfortable with the command line, Terminus is the most convenient option.
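For example, with a recent Terminus release, pulling a copy of the dev database might look roughly like this; the site name my-site and the local database name are placeholders, and the exact commands depend on your Terminus version:

    terminus backup:create my-site.dev --element=db
    terminus backup:get my-site.dev --element=db --to=database.sql.gz
    # import the dump into the database your local settings.php points at
    gunzip -c database.sql.gz | mysql -u root -p drupal_local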
Another option would be Kalabox, a local dev environment tool that was just released. I haven't tried the latest release yet, but users report that it integrates nicely with Pantheon.

Related

Check the database of a TYPO3 website

It might be a strange question, but does anybody know how to check the name of the database that is used for a TYPO3 website? I need this DB but I can't remember its name, and I have got a lot of DBs. Thanks if somebody knows the answer.
You can log into the Install Tool, via URL (/typo3/install) or the backend module.
Depending on your TYPO3 version you will find the information in different places there.
In the latest version you will see the information directly after accessing the Install Tool.
Log into the Install Tool, either under typo3/install, or via the menu in the backend when logged in as admin.
Go to "All configuration" and check the settings under $TYPO3_CONF_VARS['DB'] - everything database-related is listed there.
TYPO3 7 LTS
Open the Install Tool of your TYPO3 installation at a URL like http://example.com/typo3/install (just an example). Make sure the Install Tool is enabled by placing the file ENABLE_INSTALL_TOOL in the typo3conf folder.
After logging in to the Install Tool you can see the database information. The information is also available under "All configuration": there you can find the Database [DB] section, and the name of the database in use is under [DB][database].
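If the Install Tool is locked, creating the flag file from a shell is enough; this sketch assumes you run it from the web root of the TYPO3 installation:

    touch typo3conf/ENABLE_INSTALL_TOOL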
As nobody has mentioned it yet:
If you have file access you can directly view the file typo3conf/LocalConfiguration.php (typo3conf/localconf.php for TYPO3 before version 6.x).
All configuration from the Install Tool is saved there; just search for "database".
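For instance, a quick grep shows the settings without opening the file in an editor; the exact keys and variable names depend on your TYPO3 version, so treat this as a sketch:

    # TYPO3 6.x and later
    grep -n -A 6 "'DB'" typo3conf/LocalConfiguration.php
    # older versions (localconf.php)
    grep -n "typo_db" typo3conf/localconf.php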

Deploying relevant magento backend changes

I'm thinking about a good deployment strategy for Magento. I have already managed to deploy code with git from my local installation to my staging server. (The jump to live is not a problem then.)
Now I'm thinking about how to deploy backend changes like the following:
I'm adding a new attribute set and I want it to be available on my staging and, later, the live server. Since these settings live in the database, I could just do a mysqldump and restore that dump on my staging/live systems.
But I can't do this, since the database also holds data like orders, articles (with current stock availability) and a lot more stuff that I don't want to deploy from my testing system.
How are others handling this deployment "problem"?
After some testing, I chose the extension Mageploy, which is easy to install via modman (I prefer modgit, which relies on the same data for installation) and already captures a lot of important backend settings.
If you need more, you can extend it to cover additional backend settings yourself (and then contribute to the git project - pull requests are considered quickly).
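As a rough sketch, the modman-based install looks something like this; the repository URL is a placeholder, so use the one from the Mageploy project page:

    cd /path/to/magento
    modman init
    modman clone https://github.com/<vendor>/Mageploy.git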

How to transfer live WordPress site to Wamp?

I've got a WordPress site that I have been using for a year now, and it is hosted with HostGator. I have a few tests I would like to run on the site, but I would like to test them offline using WAMP first before going LIVE.
The problem is that previously I was always making changes to the LIVE site, usually at hours when I get little to no traffic. However, that has changed now and I get traffic most hours throughout a 24-hour day.
So my problem is:
How do I download my existing website to my laptop (WAMP) and make those changes with the new theme? (Total newbie, sorry!)
I use Windows 7, so I'm not sure what I need to do to get the site working like a live site offline.
Once I have implemented the new changes, what is the best way to upload the updated site back to the HostGator server without causing any downtime or errors for site visitors?
Is there anything else I need to install or do in order for this to work? I hope you can give me as much information as possible, or any links to guides or articles that explain how to do this.
Thanks so much for any help you can offer!!!
If you're using HostGator, the process is simple:
Install XAMPP or WAMP on your computer;
Go to your cPanel, back up and download your website;
Extract the backup on your computer, especially the homedir folder and the SQL dump;
Go to your local environment and access http://localhost/phpmyadmin;
Create a new database; the name doesn't matter, but for this example let's call it "database";
Inside that database, import the one taken from the backup (see the command sketch after these steps);
Create a new folder inside your htdocs with the name of your website, e.g. "example.com";
Extract the contents of the homedir there;
Edit wp-config.php with the following data:
Host: 'localhost'
Username: 'root'
Password: blank
Access http://localhost/example.com
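If you prefer the command line over phpMyAdmin, the database steps above can be done roughly as follows. This sketch assumes the default WAMP/XAMPP root account with an empty password, the standard wp_ table prefix, and that the SQL dump from your backup is named backup.sql; WordPress also stores the site address in the wp_options table, so you will likely need to point it at the local URL:

    mysql -u root -e 'CREATE DATABASE `database`'
    mysql -u root database < backup.sql
    # point WordPress at the local copy of the site
    mysql -u root -e "UPDATE wp_options SET option_value='http://localhost/example.com' WHERE option_name IN ('siteurl','home');" database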
You can check a good tutorial about the subject here.
About putting the site live, I recommend using a git repository; however, it's understandable that this might be a little complicated and perhaps too much work for what you're trying to achieve.
Try moving your files directly from your local to the live environment using FileZilla or WinSCP; the drag and drop will replace the files on the live site and the downtime should be minimal.
Instead of WAMP, you can always use VirtualBox to install CentOS or Ubuntu/Debian.
You can go one step further and install either CentminMod to automate creating a LAMP stack, or a full panel like ISPConfig or Virtualmin.
That takes care of creating the environment.
Create a new account on the LAMP stack using the same domain name.
You can FTP from Windows to get the files, but networking Windows and Linux is a pain. The better option is to use the command line (CLI) in the Linux VM to FTP the files from HostGator to the VM. This guide will help with that process: http://www.tldp.org/HOWTO/FTP-3.html
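As a sketch, mirroring the site over FTP from the VM's shell can be as simple as the following; the host, user, password and path are placeholders for your HostGator details:

    wget -m --user=ftpuser --password='secret' ftp://example.com/public_html/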
Then your only concern is the MySQL database. And for this, you have several options.
For me, the easiest is to buy (or try!) SQLyog on Windows and then copy the database from the HostGator source to the localhost destination. Some mild networking is needed for Windows to see the Linux VM, but nothing as complex as file sharing (the FTP issue). SQLyog is far quicker than backing up the database and then restoring it, especially since you can run into memory issues doing it that way. It all depends on the size of the database.
The cheap/free backup>restore method is to use phpMyAdmin.
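If your hosting plan also happens to include SSH access, the same backup>restore can be done from the command line; the database names and credentials below are placeholders:

    # on the HostGator side
    mysqldump -u dbuser -p live_db > live_db.sql
    # copy live_db.sql to the VM, then restore it locally
    mysql -u root -p local_db < live_db.sql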
WordPress also has plugins, of varying cost, but you still have the possible backup>restore memory issue there as well.
When done, just copy it the other way, again using SQLyog and CLI ftp. You'll still have some downtime, but it will hopefully be minimal.
As a newbie, this probably seems like rocket science, but at least it gives you a good place to start. Welcome to the world of locally dev'ing sites!

Can I have multiple versions deployed on openshift?

For a research project I am comparing PaaS providers, but I'm not sure about the following. On App Engine I can have multiple live versions of my application. If I deploy a new version I can reach it on a non-default URL like versionX.myapp.appspot.com and use that URL to test it while it is running on the PaaS. Once I'm happy with the result I change the default version and my visitors will see the changes too.
I am wondering if OpenShift has something similar? The only thing I have found so far is that it deploys on git push and, if the build fails, it leaves the old version live. This of course still leaves a risk of functional errors. If I then still have to set up a test server locally, I am still doing system administration, and it would be nice if this could be avoided.
How is this best resolved when using OpenShift?
Edit: I did find this article: https://www.openshift.com/blogs/release-management-in-the-cloud
Is that the way to go, or are there other common ways to do this?
The best way to re-create the App Engine functionality would be to run a dev/QA instance on a separate gear and add those git repositories as remotes to your local working copy; then you can git push to any environment for testing before you deploy to production.
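A rough sketch of that setup, assuming OpenShift v2-style gears; the remote URLs are placeholders for the git URLs shown on each application's page:

    git remote add qa   ssh://<qa-app-id>@qa-myapp-mydomain.rhcloud.com/~/git/myapp.git/
    git remote add prod ssh://<prod-app-id>@myapp-mydomain.rhcloud.com/~/git/myapp.git/
    git push qa master       # deploy to the test gear first
    git push prod master     # then to production once verified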

Integration of different works by different people in moodle

We are developing a Moodle site. We are a group of 5 people and each of us is working on a different module locally. Now we want to integrate everyone's work on one machine or server. Is there any way to version control or integrate it, given that each person's database is different because of different data? Please provide a solution as early as possible.
It is not completely clear whether you are working separately on the content of the site or on the code for the new site, so I will attempt to answer both questions.
For content, the easiest way to integrate it all into one site is to use the Moodle backup and restore mechanism ( http://docs.moodle.org/26/en/Course_backup ): back up each of the courses and then restore them onto the main site. If you have a lot of courses to transfer, it may make sense to write some code to automate parts of this, but that can be quite a bit of work, so it is usually easier to just do the backup and restore manually.
For code, the answer is Git. All the core Moodle code is version-controlled with git. Make sure that each developer works with their own clone of your main git repository (the address of the core Moodle repository is listed in the documentation linked at the end of this paragraph). Once they have committed their changes, these can be pushed (to a central repository) or pulled to your production site. Read more at http://docs.moodle.org/dev/Git_for_developers
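A minimal sketch of that per-developer workflow; the repository URL, remote and branch names are placeholders:

    git clone git@example.com:yourteam/moodle-site.git
    cd moodle-site
    git checkout -b feature/my-module
    # ...develop and commit...
    git commit -am "Add my module"
    git push origin feature/my-module
    # after the branch is reviewed and merged into master,
    # pull the changes on the production site:
    git pull origin master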
Note that if the code for each module has been written with the proper DB installation/upgrade code ( http://docs.moodle.org/dev/Upgrade_API ), then it should be possible simply to take the code from each of the developed modules, combine it into one codebase and create a fully working fresh install. Once you have that, you should be able to use backup and restore to transfer any required courses from the development servers to the live server.
