So my web hosting company has restored my hosting files (they were deleted due to a complication), but of my 4 WordPress installations, 2 could not be restored.
I have googled how to restore the database, but I only come across people who are restoring from a backup. I have no backup to restore from; I only have the WordPress files.
What steps must I take to get the sites back online? I.e., I guess I need to rebuild the database from scratch, but I already have the files (hope that makes sense).
I'm not a programmer or an SQL expert, but I do have a lot of hosting experience, and I can tell you this for sure - if all you have are the WordPress files (the files in your hosting account) but you don't have a copy of the actual database (usually a .sql file or a gzip of it), then you cannot simply restore your WordPress site content.
The files in your hosting account - the WordPress files such as index.php and so forth - are not where your posts and page content are stored. They are just files that tell WordPress how to function. The actual content of your posts and pages is stored in your WordPress database.
So you need the database backup in order to restore your WordPress to what it was before your mishap.
If you're on typical shared hosting (such as a cPanel host), then you should be able to access your database through your hosting control panel. Most modern hosts provide a control panel that includes direct access to your databases - either through a MySQL tool or a phpMyAdmin tool.
So for example - if you're on a cPanel host you can log into your cPanel and then click the phpMyAdmin icon to discover the databases you have stored there, and from there you can obtain a copy (export) of your database.
If for some reason you don't have access to a hosting control panel with a tool for accessing your databases, then the next thing would be to get the database through shell (ssh) access, which in a lot of cases is not granted to shared hosting customers.
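If you do end up with SSH access, a minimal sketch of exporting a WordPress database from the command line might look like this (the user and database names below are placeholders; the real values are defined in your site's wp-config.php):

    # See which database name, user and password WordPress is configured to use
    grep DB_ wp-config.php

    # Export that database to a .sql file you can keep as a backup
    mysqldump -u example_wpuser -p example_wpdb > wordpress-backup.sql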
Ultimately, if you have a WordPress database on the server through your hosting account, your web host can give you a copy of your WP database(s), because they're stored on the web server (either locally on the same server as your web site, or on a separate database server where the host keeps them).
At that point you should submit a request to your host asking for copies / dumps / exports of all your databases, and when they provide you with the database files you can import them back into the corresponding database names via a tool within your hosting control panel (such as phpMyAdmin or the MySQL section).
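If they hand you a plain .sql dump and you prefer the command line over phpMyAdmin, importing it back into the (already created) database is roughly this (names again placeholders):

    mysql -u example_wpuser -p example_wpdb < wordpress-backup.sql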
Bottom line - you can't restore your WordPress pages and posts to what they were without a copy of the database for each WordPress site you run. Your host still has those databases on their server, unless you accidentally deleted them through one of the tools I've mentioned. If your host tells you they cannot help you obtain a copy of your database files, then you have a real problem if you don't have your own backup. Database files are not something you simply upload/download via FTP like your standard HTML / PHP files; they are stored on your host's server, and in most cases your host can simply provide you with a dump / export / copy of your databases if you request them. If for some reason you don't have access to a hosting control panel tool where your databases are stored, then request them from your host. If they cannot provide the database files and you don't have a backup, then you may be looking at starting from scratch.
As a hosting support tech myself, I can tell you that any good host can easily dump a copy of each of your databases into a folder in your account so that you can import them back using a tool like phpMyAdmin within your hosting account. If they tell you they don't have copies of your databases then you either deleted them (not likely, unless you logged into your control panel and did so unwittingly) or you're with a bad service. (Not jumping to conclusions there, just pointing out the fact that if you didn't remove your databases then they're on the server and any good server admin can give you a copy to restore, along with the instructions.)
It doesn't matter that you have all of the regular files (such as the .php files and .jpg files etc.) on the server if you don't have the database they were connected to in place, because the database is where much of the content, paths, and specifics are stored and organized. This characteristic is not exclusive to WordPress - almost every database-driven PHP script relies on a database for serving the content.
NightOwl's answer is excellent; I would just like to add this: you might have automatic database backups set up for your account without knowing it, so I suggest you have a look at your control panel to see whether this is the case.
Otherwise, ask your provider again for a more complete restore.
Related
I've been working with Umbraco to create my first website.
So far I have almost completed the framework of my website, and now I'm thinking about how to move it to my server. My idea is to upload the website when the framework is ready, and then create the content remotely. I'm updating an existing website, which means the host server already hosts a site that I have to replace with the one I'm creating with Umbraco.
How can I deploy the website?
Do I have to install Umbraco on the host server?
What about the Database?
I will use the database of the host server. I want to upload my new website, but still keep the old one in case something goes wrong. Please keep in mind that I'm a total newbie at this and have never uploaded any website. Would really appreciate your help!
Thanks in advance, have a great day!
This depends on the capabilities of your hosting provider. Create a new folder on the server and place your Umbraco files there.
Then change the IIS webroot location to the new folder (by changing it back to the old location, your old site is back again).
You also need to back up your local database, restore it on your hosting provider's database server, and configure the connection string.
At low-end hosting providers you often cannot restore database backups.
One option may be to generate a SQL script including the data (often you need to change the database schema/user to the one you own at the hosting provider). Otherwise, start with a clean database and import your Umbraco items with an Umbraco package.
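As a rough sketch of the local backup step (assuming a local SQL Server Express instance and a database called UmbracoCms, both placeholders), you could run something like the following from the command line; whether you can restore that .bak on the hosting side, or have to fall back to a generated SQL script, depends on your provider:

    sqlcmd -S .\SQLEXPRESS -Q "BACKUP DATABASE [UmbracoCms] TO DISK = 'C:\backups\UmbracoCms.bak'"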
I have a Web application that I usually deployed using Web Deploy directly from Visual Studio (whatever branch I am currently using in VS - normally master). But now I'm introducing a second web app on Azure that will be built from the same repo but a different branch. To make things simpler, I will be configuring both web apps on Azure to integrate directly with GitHub and associate each of them with a specific branch.
I also added two additional web.config files, Web.Primary.config and Web.Secondary.config, and configured the app settings of each web app in the Azure portal by adding an additional value, SCM_BUILD_ARGS, setting it to
SCM_BUILD_ARGS=-p:PublishProfile=Primary // in primary web app
SCM_BUILD_ARGS=-p:PublishProfile=Secondary // in secondary web app
which I understand will transform the correct config file with the specific external services' configuration (DB connection, mail server, etc.).
Now, the additional step I would like to include in continuous deployment is to run a set of SQL scripts that I have in my repo and that I used to run manually to upgrade the database during Web Deploy in VS. The individual scripts perform specific database upgrade steps:
back up current tables - this creates a set of Backup_OriginalTableName tables that are copied from the existing ones and populated with the existing data
drop the whole DB model - all non-backup objects are dropped: procedures, functions, types, views, tables...
create model - creates all tables, views and indices
create user types
create user functions
create stored procedures
restore data to the new tables from the backup tables - this step may occasionally break if we introduce new non-nullable columns without defaults to tables in the new model; I will have to mitigate this somehow by adding an additional script that adds the missing columns to the backup tables and gives them some defaults, but that's a completely different issue.
I also used to have a set of batch files (BAT) in my VS solution that simply executed sqlcmd against a specific database instance and ran these scripts in the predefined order above (a simplified sketch of such a batch follows the list). Hence I had these batches:
Recreate Local.bat - this one used additional SQL scripts to not restore from backup but rather to recreate an empty DB with only lookup tables being populated and some default data for development purposes (like predefined test users)
Restore Local.bat - I used this script to simply restore database from backup tables discarding any invalid data I may have created while debugging/testing since last DB recreate/upgrade/restore
Upgrade Local.bat - upgrade local development DB executing scripts mentioned above
Upgrade Production.bat - upgrade production DB on Azure executing scripts mentioned above
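A simplified sketch of what one of those upgrade batches boils down to (server, database, credentials and script file names here are placeholders, not the real ones):

    @echo off
    REM Run the upgrade scripts in a fixed order; -b makes sqlcmd return an error code on failure
    set SERVER=myserver.database.windows.net
    set DB=MyAppDb

    for %%F in (01_backup_tables.sql 02_drop_model.sql 03_create_model.sql 04_user_types.sql 05_user_functions.sql 06_stored_procedures.sql 07_restore_data.sql) do (
        sqlcmd -S %SERVER% -d %DB% -U deployuser -P secret -b -i "%%F" || exit /b 1
    )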
So, to support the whole deployment process I have been doing manually in VS, I would now like to also execute these scripts against the specific Azure SQL DB during continuous deployment. I suppose I should run them right after code deployment, because if that fails, the DB shouldn't be upgraded either.
I'm a bit confused about where and how to do this. Can I configure it somewhere in the Azure portal? I was looking for resources on the Web, but I can't seem to find any relevant information on how to add deployment steps that execute these scripts. I think this is an everyday scenario, as it's hard to think of web apps that don't require a database these days.
Maybe it's just my process for DB upgrade/deployment that is wrong, so let me also know if there is another normal way to do DB upgrade/migration with continuous deployment on Azure... I may change my process to accommodate this.
Note 1: I'm not using Entity Framework or any other full-blown ORM. I'm using NPoco, and all my DB logic is built into SPs that the DAL uses.
Note 2: I'm aware of the recently introduced staging capabilities of Azure, but my apps are on a cheaper plan that doesn't support staging, and I want to keep it this way, as I may be introducing additional web apps along the way that will use additional code branches and resources (DB, mail, etc.).
It sounds to me like your db project is a good candidate for SSDT and inclusion in source control. You can create a MyDB.sqlproj that builds your db as a dacpac, and then you can use SqlPackage.exe Publish to accomplish your deployment to Azure.
We recently brought our databases under source control and follow a similar process to build and automatically deploy them (but not to a SQL Azure DB). We've found the source control, SSDT tooling support, and automated deployment options to be worth the effort of setting up and maintaining our project this way.
This SO question has some good notes for Azure deployment of a dacpac using SSDT:
How to publish DACPAC file to a SQL Server database project via SQLPackage.exe of SSDT?
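For reference, a publish call with SqlPackage.exe might look roughly like this as a build/deployment step (the dacpac path, server name, database and credentials below are placeholders):

    SqlPackage.exe /Action:Publish ^
        /SourceFile:"bin\Release\MyDB.dacpac" ^
        /TargetServerName:"yourserver.database.windows.net" ^
        /TargetDatabaseName:"MyAppDb" ^
        /TargetUser:"deployuser" /TargetPassword:"secret"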
I've recently been hired on as an intern to take over a previous intern's Access 2003 Database. I have no prior experience in Access, and only a fundamental understanding of relational databases/SQL.
I'm looking to make the database faster, and more secure. Right now it's split on the network drive, with the backend database in a subfolder within the main project folder. It's being used by around 70 employees to take tests and store certifications. Several admins use it to create and print these tests.
It's extremely slow. The files are currently stored on a server several states away. If I transferred this database to Sharepoint, would it be faster and more secure? Is it worth the time and effort to do so?
The employees that use this database currently access it from a .exe on their desktop. Would SharePoint be more user-friendly for them?
Alternatively, would moving the .mdb files to a closer server solve the speed problem? I'm currently using Access 2010. The forms are painfully slow to use as of right now.
Thank you
Moving the files to a local server would alleviate a lot of the speed concerns. Moving the file to SharePoint wouldn't make much difference in terms of performance. But I'm assuming the files aren't local already for an unstated reason? Ideally, if you want to move the database, it should be moved to MS SQL Server, but that requires MS SQL knowledge.
Moving to SharePoint will only work if you up-size the data tables to SharePoint lists.
You cannot place the Access mdb/accdb file on SharePoint in some shared folder and have multiple users update it at the same time. The reason, of course, is that SharePoint files cannot accept "partial" writes. You have to pull the whole file to the client, update it, and send the whole file back. So this is not a possible setup with Access.
In multi-user mode, Access requires that individual users can update only bits and parts of the file at the same time. When you place a Word, Excel, or in this case an Access file on SharePoint, the whole file must be downloaded to the client. The user then edits and saves the file back up to SharePoint. So SharePoint is whole-document based, not file based like Windows is. There is no NTFS file system - only a web-based up/down file system (very much like FTP).
So SharePoint is a web-based interface, whereas Access requires the Windows networking system plus the ability to update bits and parts of the file (something SharePoint does not support, nor any web site for that matter).
However, if you move your back-end tables out of Access and up-size the data to SharePoint tables (lists), then the Access front-end clients can connect to and edit that data. This is not much different in concept from up-sizing the data tables to SQL Server.
So Access front ends can connect to an Access back end on a file server (your current setup), to SQL Server tables, or to SharePoint tables.
I explain how to up-size data tables to SharePoint in this video:
https://www.youtube.com/watch?v=3wdjYIby_b0
In some cases Access with SharePoint tables will run absolute circles around Access with SQL Server. However, in other cases such a setup will run slower than SQL Server. Only an experienced Access developer, on a case-by-case basis, can determine whether SharePoint tables would be appropriate for your application. As the other poster points out, adopting SharePoint or SQL Server will require experience with those technologies, along with likely a few good years of Access experience. Remember, Access has a rather long learning curve - in most cases longer than, say, learning C++.
In your case, due to the Wide Area Network (WAN), I suggest Terminal Services is your best bet.
I explain in easy-to-grasp terms why your current setup is slow, and what solutions you can adopt, in this article:
http://www.kallal.ca//Wan/Wans.html
I have a local Ubuntu server on which we keep all our development websites. They are all PHP-based sites. I would like to know whether we can have a script or something, run via cron, to back up the files and databases daily to an external hard disk?
Please let me know.
Here is a list of services which should help you:
http://jarvys.io - a command-line tool/service that provides server backup to the cloud, with a simple restore process.
https://bitcalm.com - SaaS for Linux server file and DB backups. It has a web UI to configure and manage the backups for all your servers. It backs up your data to Amazon S3 (you can use your own storage), and recovery is really simple.
I don't have enough reputation to post more links, but you can google services like gobitcan, backuprun, and tarsnap - all of these services can solve your problem too.
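If you would rather roll your own instead of using a service, a minimal sketch of a daily backup script might look like this (the web root, MySQL credentials and the mount point of the external disk are assumptions you would adapt to your server):

    #!/bin/bash
    # Daily backup sketch - adjust paths and credentials to your own setup
    BACKUP_DIR=/media/external-disk/backups   # assumed mount point of the external hard disk
    DATE=$(date +%F)
    mkdir -p "$BACKUP_DIR/$DATE"

    # Archive the site files (assumed to live under /var/www)
    tar -czf "$BACKUP_DIR/$DATE/www-files.tar.gz" /var/www

    # Dump all MySQL databases (assumes a MySQL user with enough privileges)
    mysqldump -u backupuser -p'secret' --all-databases | gzip > "$BACKUP_DIR/$DATE/all-databases.sql.gz"

    # Remove backups older than 14 days
    find "$BACKUP_DIR" -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} \;

Saved as, say, /usr/local/bin/daily-backup.sh and made executable, it could be scheduled with a crontab entry like 0 2 * * * /usr/local/bin/daily-backup.sh to run every night at 02:00.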
I have a client who owns a business with a handful of employees. He has a product website that has several hundred static product pages that are updated periodically via FTP.
We want to change this to a data-driven website, but the database (which will be hosted at an ISP) will have to be updated from data on my client's servers.
How best to do this on a shoestring? Can the database be hot-swapped via FTP, or do we need to build a web service we can push changes to?
Ask the ISP about the options. Some ISPs allow you to FTP-upload the .mdf (database file).
Some will allow you to connect with SQL Server Management Studio.
Some will allow both.
You've got to ask the ISP.
The last time I did this, we created XML documents that were FTP'd to the website. We had an admin page that would clear out the old data by running some stored procs to truncate the tables, then import the XML docs into the SQL tables.
Since we didn't have the whole server to ourselves, there was no access to SQL Server DTS to schedule this stuff.
There is a Database Publishing Wizard from MS which will take all your data and create a SQL file that can then be run on the ISP. It will also, though I've never tried it, go directly to an ISP database. There is an option button on one of the wizard screens that does it.
It does require the user to have a little training, and it's still a manual process, so maybe not what you're after, but I think it will do the job.
Long-term, building a service to upload the data is probably the cleanest solution, as the app can then control its import procedures. You could go grossly simple with this and just have the local copy dump some sort of XML that the app could read, making it not much harder than uploading the file while still being automatable. Having this import procedure would also help with development, as you now have an automated and repeatable way to sync data.
This is what I usually do:
You could use a tool like Red Gate's SQL Data Compare to do this. The tool compares data between two catalogs (on the same or different servers) and generates a script for syncing them.