Unfortunately an Ubuntu machine I manage has stopped working, and after much work it seems like I'll have to reinstall the system. All the data from the old system is intact and backed up.
Among this data is a PostgreSQL installation with some databases (it was running in isolation on this machine). My goal is to move this data as is and run it on the fresh install.
Since the old system is not running, I can't do a pg_dump.
According to this article it should be possible to move the data folder, but there are two restrictions mentioned. What I do not fully understand is whether these will be a problem for me.
I can't seem to find much information on this online, since everything refers to the preferred pg_dump method.
Any help would be highly appreciated.
As per the suggestions in the comments to the initial answer, the Postgres directories were copied to a new host and the database started without any problems.
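For anyone attempting the same, a minimal sketch of the copy, assuming the new host runs the same PostgreSQL major version and architecture as the old system (the usual restrictions on moving a data directory) and Ubuntu's default cluster path; the version number 9.5 and the backup location below are placeholders:
# Stop the fresh cluster before overwriting its data directory:
sudo systemctl stop postgresql
# Copy the old cluster into place, preserving permissions, then fix ownership:
sudo rsync -a /backup/var/lib/postgresql/9.5/main/ /var/lib/postgresql/9.5/main/
sudo chown -R postgres:postgres /var/lib/postgresql/9.5/main
sudo systemctl start postgresql
sudo -u postgres psql -c '\l'   # verify the old databases are listed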
Has anyone experimented with creating a Salesforce package.xml automatically for continuous integration? If there is any script or idea, please share.
As you know, an incremental package.xml helps to deploy only the modified files, rather than a complete package.xml that redeploys unmodified files as well, which takes a lot of time.
Thanks in advance!
Tricky. And not really a programming-related problem; consider cross-posting this to https://salesforce.stackexchange.com/ or maybe even https://devops.stackexchange.com/
I don't think there's a clear answer; you'll have to experiment. Especially since you tagged "migration tool" (the old-school, battle-tested but lower-priority Metadata API; it seems all focus is now on the SFDX style of deployments). Do you use any version control (ideally Git), or do you hope to somehow compare the source and target orgs, figure out the deltas and deploy only them?
Remember that SF often gets better at detecting "no changes" with every release (how old is your migration tool's jar file?). For example, when I deploy my current project to an empty sandbox (exact copy of prod, no custom objects, code etc. yet) the initial deploy takes ~7 minutes, but any subsequent deploy with the same content or slight changes takes just 3-4. So try to calculate the time lost in the grand scheme of things and decide what gains you want to see / how much time you want to spend on experimenting and tweaking the solution.
You could look into dedicated deployment solutions such as Gearset, Autorabit, Odaseva (I'm not affiliated with any of them and this list is not exhaustive). They are often capable of running a comparison for you.
There are several projects that try to compose a package.xml based on the Git diff(erence) between two commits. Of course you need to have a repo first, and some discipline:
https://github.com/cloudsandbox/sfdx-gen-pack - I saw a presentation about it at Cloudforce London 2019
https://github.com/Accenture/sfpowerkit seems to have a "diff" command (disclaimer: I used to work for Accenture but not affiliated now, haven't worked on the tool, haven't used it personally)
https://cumulusci.readthedocs.io/en/latest/ - this seems to be interesting and mature. Built by SF employees; not an official tool, but used to CI-deploy the non-profit packages they build (maybe you've heard of the Non Profit Starter Pack, especially if you ever considered enabling Person Accounts). I'm not sure if they do delta deployments as such, but there seems to be a command that updates package.xml with the files in the repository, so it's a start? https://cumulusci.readthedocs.io/en/latest/tutorial.html#part-4-running-tasks
I'm not saying CumulusCI will be a silver bullet, but of these three it seems to be the most actively maintained ;) It does sound like you'd have to get familiar with SFDX (if not the whole thing, then at least the commands to convert a project back and forth between the "source" (SFDX) structure and the Metadata API structure).
Answering my own question: I found that git diff master feature/vat | force-dev-tool changeset create vat works!
Thanks to Roman, who answered in https://salesforce.stackexchange.com/questions/184332/is-there-a-pre-build-solution-for-generating-a-package-xml-from-a-git-repo
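For anyone wanting to reproduce this, a hedged sketch of the full flow, assuming Node.js is available, the repo uses the Metadata API (src/) layout, and a remote named mysandbox has already been configured for force-dev-tool (the branch and remote names are placeholders):
# Install the tool once:
npm install -g force-dev-tool
# Turn the diff between two commits into a deployable changeset:
git diff master feature/vat | force-dev-tool changeset create vat
# The generated config/deployments/vat folder should contain only the changed
# components plus a matching package.xml; deploy that folder:
force-dev-tool deploy mysandbox -d config/deployments/vat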
For some reason I can't get SQL Server 2017 installed on my Windows 10 machine.
First off, with this buggy installer I had to uninstall VCRuntime 2017 just to get the installer to run at all.
And now, the installer is stuck at this point exactly every time I try to install it:
What I've tried so far:
Killing msiexec process
Running the setup with an additional parameter, as mentioned here:
Setup.exe /SkipInstallerRunCheck
Restarting ... reinstalling ... turning off anti-virus ...
[Solved]
The problem was due to a background download that was taking forever, especially on a low internet speed (e.g. the Python or R support components).
[Solution]
If you really need Python or R support, just wait until the download is complete.
Otherwise, deselect Python and R support from the component list,
or kill the child process for the Python or R support component downloader from Task Manager.
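If you'd rather do that last step from a console than from Task Manager, a hedged sketch (the findstr filter is a guess; identify the real downloader process in the list and substitute its actual PID for 1234):
tasklist | findstr /i "setup"
taskkill /PID 1234 /F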
UPDATE:
The actual problem turned out to be the R support component(s) slowly downloading in the background, locking up the installation GUI with no notification or warning shown to the user as to what is actually going on.
So it seems this "locked install" problem can be caused by several different components, at least Python or R support. As mentioned below, please check any available logs or event logs for clues.
In summary, options:
Maybe try to deselect such components during install if you do not need them.
If you need the components, leave the setup to complete and check progress in the log files as explained below. Also verify Internet access (proxy?).
Stuck Download?
UPDATE: Did you see this blog? It looks like the setup tries to download and install the Python runtime, and this can take forever. Are you behind a proxy, by the way, with no direct connection to the Internet? If so, I suppose that could also cause further problems. Probably not the cause, but worth a mention.
Apparently you can check the following log file for installation progress:
%ProgramFiles%\Microsoft SQL Server\140\Setup Bootstrap\Log\DATE_TIME\RSetup.log
Replace DATE_TIME in the above path with your actual date and time values, for example: 20170804_162723.
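To watch that log grow while the GUI appears frozen, a hedged PowerShell one-liner (keep DATE_TIME as your actual folder name):
Get-Content "$env:ProgramFiles\Microsoft SQL Server\140\Setup Bootstrap\Log\DATE_TIME\RSetup.log" -Wait -Tail 20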
See this answer as well: SQL server 2016 installation freeze. You could also try the suggestion to deselect all components you do not need to prevent any background downloads?
General Debugging
Leaving in the general purpose debugging suggestions below.
Generic Advice: From experience I would create a new local admin user and try to install using that account. This is to avoid any "unclean" or special conditions that have occurred in your user profile or registry during regular Windows use. Might not do much, but sometimes it gets the job done with surprising ease. Worth a try I think.
Some Further Things: I wrote up a little checklist a while back; I'll add it and see if it inspires some new ideas that can help you. See under "Core Deployment Problems". That first "checklist" was condensed from a longer and somewhat excessive first write-up - one of those answers that unintentionally turned into a blog post, and maybe a hard one to read.
Logging: Did you check the log files and/or event logs properly for clues as to what is happening? I find the best approach for deployment is to enable logging for all MSI installations. The performance hit it triggers is minuscule compared to the benefit of having a real log file always available when you suddenly need one. You can enable logging for all MSI files as explained on installsite.org (section: "Globally for all setups on a machine"). MSI log files will then just sit in your %TEMP% folder after installation. They have random hex names, and you can flush them all regularly if you do not need them. Sort by modify date/time to find the latest one(s) created.
Jedi trick: You will want to go home and re-think your life if you don't enable logging for all MSI files. Moral of the story: MSI log files are cool. They are very verbose, but they are beautiful. There are some hints on interpreting them here (bottom).
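For reference, a hedged sketch of the installsite.org policy tweak, run from an elevated command prompt (the trailing "x" in the value adds extra debugging output; logs then appear in %TEMP% as randomly named .LOG files):
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\Installer" /v Logging /t REG_SZ /d voicewarmupx /f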
My 2 cents: the SQL Server installer consists of several small MSI installers. MSI installations can only run one after another (as far as I know). In my case, I launched another MSI setup while installing SQL Server. This caused SQL Server setup to hold until the concurrently running setup finished.
So, at least in my case the problem was self-made.
You have to remove the leftover SQL Server configuration settings from the Windows Registry using Registry Editor.
So my company installed PostgreSQL on my computer, which I use, rarely and without understanding, for one specific function.
I'm trying to follow Lynda etc. tutorials to understand (Postgres)SQL better, since that's what we use, but all the tutorials ask students to reconfigure certain aspects of their system in order to follow along with example files (which I would really like to do).
Since I've messed up my dev env once already, I'm hesitant to touch anything that will cause issues with the local versions of our project.
I know this is an extremely wide-angle question with no easy answer, but if anyone has any general advice for playing with sample databases in MAMP Pro (or anywhere else) using Postgres without interfering with the servers I'm currently running, it would be a huge help.
I would recommend you use Vagrant and set up an isolated PostgreSQL instance. Here is a great wiki you can follow to do this.
UPDATE: Given your comment, an easy solution is to just back up your data and proceed with trying out the Postgres examples; you can always restore your data after you are done.
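A hedged sketch of that backup-and-restore safety net, assuming the default postgres superuser and a placeholder database name work_db:
# Take a custom-format backup before experimenting:
pg_dump -U postgres -Fc work_db > work_db.dump
# ...experiment with the tutorial databases freely...
# When finished, drop and recreate work_db from the backup:
pg_restore -U postgres --clean --create -d postgres work_db.dump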
I've got a WordPress site that I have been running for a year now, hosted with HostGator. I have a few tests I would like to run on the site, but I would like to try them offline using WAMP first before making them LIVE.
The problem is that previously I was always making changes to the LIVE site, usually at hours when I get little to no traffic. That has changed now, and I get traffic most hours throughout a 24-hour day.
So my problem is:
How do I download my existing website to my laptop (WAMP) and make those changes with the new theme? (Total newbie, sorry!)
I use Windows 7, so I'm not sure what I need to do to get the site working like a live site offline.
Once I have implemented the new changes, what is the best way to upload the updated site back to the HostGator server without having any downtime or errors for site visitors?
Is there anything else I need to install or do in order for this to work? I hope you can give me as much information as possible, or links to any guides or articles that explain how to do this.
Thanks so much for any help you can offer!!!
If you're using Hostgator, the process is simple:
Install XAMPP or WAMP on your computer;
Go to your cPanel, back up and download your website;
Extract the backup to your computer, especially the homedir and the SQL dump;
Go to your local environment, access http://localhost/phpmyadmin
Create a new database; the name doesn't matter, but for this example let's call it "database";
Into that new database, import the SQL dump taken from the backup;
Create a new folder inside your htdocs with the name of your website, e.g. "example.com";
Extract the content of the homedir there;
Edit wp-config.php with the following data, pointing DB_NAME at the database created above:
define('DB_NAME', 'database');
define('DB_HOST', 'localhost');
define('DB_USER', 'root');
define('DB_PASSWORD', '');
Access http://localhost/example.com
You can check a good tutorial about the subject here.
About putting the site live: I recommend using a Git repository, though it's understandable that might be a little complicated and perhaps too much work for what you're trying to achieve.
Try moving your files directly from your local to the live environment using FileZilla or WinSCP; the drag and drop should replace the files live, and the downtime should be minimal.
Instead of WAMP, you can always use VirtualBox to install CentOS or Ubuntu/Debian.
You can go one further and install either CentminMod to automate creating a LAMP, or a full panel like ISPConfig or Virtualmin.
That takes care of creating the environment.
Create a new account on the LAMP, using the same domain name.
You can FTP from Windows to get the files, but networking Windows and Linux is a pain. The better option is to use the command line (CLI) in the Linux VM to FTP the files from HostGator to the VM. This guide will help with that process: http://www.tldp.org/HOWTO/FTP-3.html
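As a hedged illustration of that CLI transfer, assuming lftp is installed in the VM (the credentials, domain and paths are placeholders):
# Mirror the remote site into the VM's web root:
lftp -u YOUR_FTP_USER ftp.yourdomain.com -e "mirror --verbose /public_html /var/www/example.com; quit"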
Then your only concern is the MySQL database. And for this, you have several options.
For me, the easiest is to buy (or try!) SQLyog on Windows, and then copy the database from the Hostgator source to the localhost destination. Some mild networking is needed for Windows to see the Linux VM, but nothing as complex as file sharing (the FTP issue). SQLyog is far quicker than backing up the database, then restoring it -- especially since you can run into memory issues doing it this way. It fully depends on the size of the database.
The cheap/free backup>restore method is to use phpMyAdmin.
WordPress also has plugins, of varying cost, but you still have the possible backup>restore memory issue there as well.
When done, just copy everything the other way, again using SQLyog and CLI FTP. You'll still have some downtime, but it will hopefully be minimal.
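If you'd rather avoid SQLyog, a hedged command-line version of the same database copy (host, user and database names are placeholders, and this assumes remote MySQL access is enabled on the host):
# Dump from the remote host to a file, then load it into the local server:
mysqldump -h yourhostgatorhost.com -u remote_user -p remote_db > site.sql
mysql -h localhost -u root -p local_db < site.sql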
As a newbie, this probably seems like rocket science, but at least it gives you a good place to start. Welcome to the world of locally dev'ing sites!
This is a subject of common discussion, but through all my research I have not actually found a sound answer to this.
I develop my websites offline, and then launch them live through my hosting account.
I use CodeIgniter, and on that basis there are some fundamental differences between my offline and online copies, namely base URLs and database configurations. As such, I cannot simply develop and test my websites offline and then upload them, as that requires small configuration changes which are easy to overlook and could lead to a non-working live website.
The other factor is that when I am developing offline, I might add a database table or a column whilst creating some functionality. When I upload my local developments to my host, they often do not work because I have forgotten to upload the new database structure. Obviously this cannot happen - there cannot be any opportunity for a damaged or broken live website.
Further to this, I'd like to have logs of my development - version control of sorts - such that if I develop a feature and then something else stops working, I can easily look backwards to at least see the code changes which could have caused it.
My fourth requirement is as follows: if I go away on holiday for a week without my development laptop and then get a bug report, I have no way of fixing it. If I fix it on the live copy, not only is it dangerous, but I'll inevitably forget to update my local copy - so when I next update the live copy, that change will be lost. Is there a way that, on any computer, I can access my development setup, edit and test, push to the live site, while also committing the change so that my laptop's local copy stays up to date?
So yes, in general I'm looking for a solution to make my development process more efficient and robust. Any ideas?
Thanks
Don't deploy by simply copying. Deploy using a script (I use Apache Ant) that automates the copying of specific files for each environment, the replacement of some values, etc. (a sketch follows after these four points).
This just needs rigor. Make a todo list while developing, and check that every modification on the server is done. You might also test the deploy procedure on a pre-production server which has a similar configuration to the production server, make sure everything is OK, and then apply the same, tested procedure on the production server.
Just use a version control system. SVN or Git are two free candidates.
Make your version control server available from anywhere. If it's an open-source project, free hosting solutions exist. Of course, if you don't have a development computer available, you'll have to check out the whole project and probably install some tools to be able to develop, test and deploy. Just try to make it as easy as possible, or always have your laptop available. If you plan to work, have your toolbox with you. If you don't plan to work, then don't work. When you have finished some development, commit to the server. When you go back to your laptop, update your working copy from the server.
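A hedged sketch of the scripted deploy from the first point, using rsync in place of Ant (the host, paths and CodeIgniter config file locations are placeholders):
# Sync the code, but never the environment-specific configs:
rsync -az --delete --exclude 'application/config/config.php' --exclude 'application/config/database.php' ./ deploy@example.com:/var/www/site/
# Push the production versions of the configs, kept in a separate folder:
scp config/production/config.php config/production/database.php deploy@example.com:/var/www/site/application/config/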
Small additions and clarifications to JB's answer:
Use any VCS which can work (in a good way) with branches - your local and prod systems are good candidates for separate branches, where you share common code but have branch-specific config. It'll require some changes in your everyday workflow (code in "test", merge finished work into "prod", deploy /by tools, not by hand/ only after the merge - see the sketch at the end of this answer), but it's a fair price.
Change your workflow, again. As JB noted - don't deploy by hand, don't deploy the wrong branch, don't deploy "prod" before the merge is finished. Build tools are rather smart these days; you can check such preconditions inside the builder.
Just use a VCS; maybe a DVCS will suit you somewhat better. I say a strong "no-no" to Git as a first VCS, but you have a wide choice even without it - SVN (poor branching/merging compared to a DVCS), Bazaar (not the tool of my dreams, but who knows), Mercurial, Fossil SCM, Monotone.
Don't work on live; never do anything outside your SCM. One source of changes is the rule of a happy developer. Either don't work at all in your free time, or have the codebase always reachable (free code hosting /Google Code, SourceForge, BitBucket, GitHub, Assembla, LaunchPad/ or your own server): get it as needed, change, save, deploy.
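For the branch workflow in the first point above, a hedged minimal sketch (the branch names and the deploy command are placeholders):
# Day-to-day work happens on the test branch:
git checkout test
git commit -am "Implement feature"
# Only finished, merged work reaches prod, and only tools deploy it:
git checkout prod
git merge test
./deploy.sh prod   # your Ant/rsync deploy script, run against the prod branch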