CiviCRM "database looks to have been partially upgraded" - drupal-7

I recently upgraded both Drupal and CiviCRM to the latest versions. Drupal works fine, and so does Civi, except that when I go to the Civi menu I get a message that says "Database check failed - the database looks to have been partially upgraded. You may want to reload the database with the backup and try the upgrade process again." This happened once before, and reloading the most recent backup didn't help. We had to go back quite a ways before we found one that did, then had to reload a lot of data from .CSV files and by hand. I'd rather not go through that again.
One thing we found when comparing the development site on my WAMP desktop (a new install that works well) with the one on my ISP's server is that the server version contained two MyISAM-format tables from, or generated by, CiviCase where Civi expects InnoDB-format tables. My ISP, far more knowledgeable than I am about MySQL, converted these two tables to InnoDB, but the problem remains. This leaves me with two questions:
Could the MyISAM tables be the source of the "incomplete upgrade"? and
Is there some way to reset whatever flag tells Civi that the database is incomplete, or to run the database check manually?
Thanks for any help. Civi seems to work OK as is, but the error message will be disturbing to my end users.

That message happens when you have begun the CiviCRM database upgrade but it hasn't finished. CiviCRM edits the version number in the civicrm_domain table to flag that you're in the middle of an upgrade, and when the upgrade completes it clears that flag.
The simple way to remove the message is to edit that value in the database, but it gets set there for a reason: your database upgrade never completed.
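If you do decide to clear the flag by hand, the edit described above is roughly the following sketch. The '4.x.y' value is a placeholder, not a real version string; replace it with the version your CiviCRM code is actually running, and take a backup first:

```sql
-- See what version the schema claims to be at.
-- civicrm_domain normally holds a single row.
SELECT id, version FROM civicrm_domain;

-- Set it back to the version of the CiviCRM code you are running.
-- '4.x.y' is a placeholder, not a real version string.
UPDATE civicrm_domain SET version = '4.x.y';
```

Again, this only silences the message; it does not fix whatever stopped the upgrade partway through.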
You should restore everything to the last version where it all was working--restore both the code and the database. Play around for a bit and make sure nothing funny is happening.
Run a normal CiviCRM upgrade, replacing the files and running the upgrade script. Take note of anything that seems funny when the upgrade script runs. You might try doing a minor upgrade--just a point release--simply to be sure that any upgrade is working fine.
At this point, you should either have no problems or a much more detailed problem.
Finally, please note that there is now a CiviCRM-specific StackExchange site, which is where you'll find the most CiviCRM experts to answer your questions.

Related

Can I temporarily install fresh Joomla and connect to old database while I fix it

My site is messed up and I am trying to fix it. Regardless of whether I get help, it is likely going to take a while, and it's really important that my site be live, even if it's a crappy version with just the articles and no template.
Would it not work to make a backup of the database, install Joomla fresh (the same version), connect it to that duplicate database (then point my domain there), and then go back to working on fixing the current site that is live now? Are there any issues I should know about going in? There's a good chance the issues are related to the template or extensions (at least that's my understanding so far, see my other post for details), so I would think it would be faster to do this to get a working site rather than trying to turn each extension off and on, especially when I have to do it manually (and I don't know how yet) as I can't access the backend.
If this will work, do I choose the database when I install, or install empty and then change which database it connects to, or install empty and import the tables (and how)? I still have to figure out whether I can clone just the database and not all the files, as that takes hours.
Thanks for the help, and if I should have appended this to the other post I apologize, but I figured it's a separate issue.
First, ensure you have backups of both the files and the database. Then make a local copy of your site where you will work later.
The infection may lie:
in the Joomla core files, with extra content (which is usually fairly easy to spot, for example an eval of a large base64-encoded variable);
in extra files (keep in mind that even images can contain malicious code); these are usually triggered from outside Joomla for spamming or other nefarious purposes;
in the database content.
Fix:
Apply a fresh Joomla update package over your site; this only fixes item 1 above. It may restore some functionality for the first hour of survival.
Analyse the logs, and try to figure out how they got in. You need to step up security as obviously what you have is not enough.
Install a fresh Joomla, add all the extensions that your site uses, copy the images folder, then connect it to a copy of the compromised database. This fixes items 1 and 2 above (since you got rid of any extra files). It may survive until they figure out you fixed it; but if you haven't patched your security, they will hack into your site again. Keep a copy of this, and restore it as needed as you proceed with the following step.
Export the db to SQL format (mysqldump or phpMyAdmin may come in handy), then search for any XSS traces, PHP code, or JavaScript that may have been injected (see the sketch after this list). Since a complete check could take days, and assuming the malicious code links elsewhere, look for strings such as "https://" and "http://"; also search for / escaped as \/ and \\\/ to account for JSON-encoded data.
Once the db is clean, your local copy is reasonably safe; update all extensions and Joomla, and use it to restore the website until you fix your security.
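As a hedged illustration of that search step, instead of grepping the dump you can also run LIKE queries directly against the copied database. The jos_ prefix and the column choice below are assumptions; substitute your real table prefix and extend the pattern list as you go:

```sql
-- A starting point only: look for commonly injected payloads in article bodies.
-- 'jos_' is an assumed table prefix; replace it with the one your site uses.
SELECT id, title
FROM jos_content
WHERE introtext LIKE '%<script%'
   OR introtext LIKE '%base64_decode%'
   OR introtext LIKE '%eval(%';
```

Anything this turns up still needs a human look, since legitimate articles can contain scripts or links too.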
It might work, I mean cloning the DB, as long as the Joomla version is the same. It won't break outright, but it may fail if the files for your extensions are not found. The approach is somewhat flawed, though; the real question is how many extensions you are using and how much cleansing you need.
On the other hand, you mention that the site should stay live. Just do everything on localhost: test, fix templates, etc. Then, once you're sure you're done, use Akeeba Backup to deploy the new version to your server without long delays.
Any kind of cleansing needs a starting point:
1. You can clean the site while it is live, depending on complexity.
2. The cleanup can be done offline and then deployed.
3. Sometimes custom import/export routines are needed, so you have to build your own tools for everything. This happens with large data sets, for example when people have made a mess inside the images folder or something like that.
4. It can be pointless to copy the DB at all: you install the same version of Joomla on your local server, then you install the same template, copy the styles, etc. Then you import the data with your own tools or paid ones. Estimated time is from a few hours to a few days; it's just data :)

WSO2 EMM - App Management Database Bug

Running WSO2 EMM 1.1.0, everything has been working just fine except for one big issue.
From the moment I first click on an app in the App Management tab, the WSO2EMM_DB.h2.db file starts growing steadily for as long as the server is running, even with absolutely no changes. Eventually it gets so big that clicking an app on that tab takes a ridiculously long time to load the list of devices using the app. We're talking 5+ minutes; it becomes completely unusable. I have checked the error logs and found no errors at all, every time.
Restarting the server does nothing to correct the issue. Even if I click an app on the App Management tab once, and never again, the database file will continue to grow. Even restarting the server and not logging into the EMM page, it will continue to grow.
The only thing I've found so far that can possibly help is keeping backup copies of the database file and overwriting the current file when it gets too big. Obviously that's not a solution, as I'd need to create a new backup file every time there's a change on the server, and eventually the database file would grow too big from that too.
It's not an issue with the H2 database either. Not only have I tried starting over fresh several times and have had the same behavior, but here is the only info I could find regarding this issue, and they were having the issue regardless of whether or not it was on H2 or MySQL.
I've been trying to find a solution for this for over a month with no success. Any help would be appreciated!
EDIT: It looks like this might be the subject of EMM-826. Unfortunately there seems to be no response to that bug report so far.
EDIT 2: EMM-826 was closed with a message saying the following:
This issue is fixed in the EMM 1.1.0 GA latest pack. Please get all the patches for the product/build the product from the latest source [ https://github.com/wso2/product-emm ] and try again.
Unfortunately, that did not work for me. I'm not sure what exactly I'm doing wrong, so I'll list what I did to try to fix it:
Downloaded the EMM 1.1.0 zip from http://wso2.com/products/enterprise-mobility-manager/.
Downloaded the zip from https://github.com/wso2/product-emm and pasted the files from that into my EMM_HOME directory.
When that didn't work, I searched for patches and found I was only using patches 1-6. In the documentation I found I could download patches 7-12 here. Patches 9 and 10 didn't work right for some reason, leaving me unable to reach the EMM dashboard or publisher; I could only access the Carbon manager. I was able to get patches 7, 8, 11, and 12 working, though - with no change in behavior.
Here are the steps I take to reproduce the issue:
After setting a fresh copy of the EMM up, I log in to the EMM dashboard as Admin, set up a user account, and upload an app through the Publisher.
Register a device to the user account I set up. In this case, an Android device running Android 4.2.2.
From the dashboard, I go to App Management and click the app I uploaded. The list of devices loads, but from that point on the database file starts growing and eventually, after several hours, becomes so large that the device list will never load.
Please help!
Found this happening also; from a quick look, it's the WSO2EMM_DB notifications table. It seems to keep a history of all notifications over time, and the info for app installs is retrieved by non-optimized queries, which degrade as the table grows. You 'could' delete all rows from the table, and it will re-populate as devices 'check back' and report their info.
But you'd probably want to write a query that keeps only the latest notification of each type for each user (a hedged sketch follows below), and as was mentioned, it is apparently fixed in the latest version.
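Since the actual schema isn't documented in this thread, the following is only a sketch under assumed names: a NOTIFICATIONS table with ID, DEVICEID and NOTIFICATIONTYPE columns, where a larger ID means a newer row. Check the real table definition in the H2 console and back up the .h2.db file before trying anything like it:

```sql
-- Keep only the most recent notification per device and notification type;
-- table and column names here are assumptions, not the documented schema.
DELETE FROM NOTIFICATIONS
WHERE ID NOT IN (
    SELECT MAX(ID)
    FROM NOTIFICATIONS
    GROUP BY DEVICEID, NOTIFICATIONTYPE
);
```

H2 accepts a self-referencing subquery like this; on MySQL you would have to wrap the subquery in a derived table first.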
Issue appears to be resolved in EMM 2.0, which can be found here.

VS 2013 Express - VB Update Database

I've looked all over and can't find an answer to my question. I'm trying to do the simplest database I can. I want to be able to add/edit/delete items and have it save them to the database, so they're in there the next time I open it.
I'm currently using VS 2013 Express and still have the 2012 Express installed. I was going through the MSDN tutorials and even tried the Northwind database, but it said I had to upgrade it and then wouldn't allow me to. So now I'm stuck.
Currently, I've created a database, then a new project. I connected the new project to the database and drag-and-dropped one of the tables onto the form as a DataGridView. In the interest of saving space, I won't go into a lot of detail. I basically followed the MSDN tutorial on creating two tables called Customers and Orders, with a few fields and 3 entries apiece. There is a primary key in Customers and a foreign key in Orders.
I used the code MSDN provided, which was similar to what was already in the save button's auto-generated code, except wrapped in a Try/Catch block. What I have so far is:
```vb
Private Sub CustomersBindingNavigatorSaveItem_Click(sender As Object, e As EventArgs) Handles CustomersBindingNavigatorSaveItem.Click
    ' The auto-generated version saved every table at once:
    ' Me.Validate()
    ' Me.CustomersBindingSource.EndEdit()
    ' Me.TableAdapterManager.UpdateAll(Me.SampleDatabaseDataSet)

    Try
        ' Commit any pending grid edits into the dataset...
        Me.Validate()
        Me.CustomersBindingSource.EndEdit()
        ' ...then push the Customers changes to the database file.
        Me.CustomersTableAdapter.Update(Me.SampleDatabaseDataSet.Customers)
        MsgBox("Update successful")
    Catch ex As Exception
        MsgBox("Update failed")
    End Try
End Sub
```
So, from what I can determine, it looks like this code updates the dataset and not the database. Is that true? If so, how do I update the database? When I run the program and add a row, I can click Save and it returns a message box saying it ran successfully. However, I can't rerun the program and have it display the new row, much less shut VS Express down and bring it back up to have it show the changes. Can anyone please tell me what I'm doing wrong here? Thanks.
Okay, I think I've found the answer to this question. It appears that there are two copies of the database: one is stored in the root of the project folder, and the other is stored in the bin/debug folder.
Apparently, when you run your program to test your code, Visual Studio copies the database from the project root into bin/debug, and your running program reads and writes that bin/debug copy; the next build copies the root version over it again, so your saved changes disappear. I assume this is so that you can have stock information in your database and not have it altered by code that may or may not work the way you want. I understand that, but I think there is just as much reason to want it to save changes. I wish there were an option to make it work that way, and if anyone knows how, please share it.
I've spent a couple of years playing with databases, on and off, thinking I was doing something wrong, when in reality it's just the way VS handles it. You can do one of two things to test whether this is your issue. One is to go into bin/debug, copy the .mdf and the .ldf (the .ldf is the transaction log; the database won't run without it), and paste them into the root folder, overwriting the two there. Then, at runtime, your changes will be reflected. The other option is to build your project and run the .exe in your bin/debug folder. Any changes you make will stick, but again, only in the bin/debug copy. I assume that a published project will save to the root directory.
I found the article that led to this breakthrough here: http://visualstudiomagazine.com/blogs/tool-tracker/2012/05/dealing-with-local-databases-or-why-your-updates-dont-stick.aspx

Tidy up DotNetNuke database tables

I've inherited the maintenance of a DotNetNuke (v6.2.0.1610) site, and one of the things I'd like to do is to tidy up the database tables being used.
It looks like there might have been two installations of DNN into the same database (I'm guessing; I don't know its history and cannot find out). I'm making this assumption because there are two sets of DotNetNuke tables.
For example, we have:
dbo.Portals, dbo.PortalSettings, dbo.Profile, dbo.Roles, etc.
However, then we also have the same set, prefixed with dnn_ -
dbo.dnn_Portals, dbo.dnn_PortalSettings, dbo.dnn_Profile, dbo.dnn_Roles, etc.
I spent a good while tearing my hair out when I could not get our portal to load, until I discovered it was because I was editing the dbo.PortalAlias table when I needed to be editing the dbo.dnn_PortalAlias table instead.
I wanted to avoid this future maintenance headache, so I backed up the database, and set about deleting all the tables without the dnn_ prefix (web.config specifies objectQualifier="dnn_"). I diligently ensured there was a matching dnn_ table before deleting any.
At first it seemed fine - the portal loaded and all the content was there, and I thought I was on to a winner. However, when I logged in and accessed the site admin section, I started to get lots of error messages. So I figured I'd deleted too much; I restored the backup, and all is well - the portal is working again.
However, I really would like to get rid of the unnecessary tables, because no doubt at some point in the future I'll start doing some work on the database, forget about the dnn_ prefix and waste a bunch of time wondering why something isn't working.
So, as a bit of a DotNetNuke newbie, I'm after some help - how can I know what tables are in use, what aren't, and how can I set about tidying up the SQL Server tables? Thanks.
I suggest you delete only the tables which have an equivalent with the "dnn_" prefix.
The DNN database should contain at least the "aspnet_"-prefixed tables, which are used for authentication on the portals.
You could also have some extensions that use tables without the "dnn_" prefix; it depends on the SQL scripts those extensions ran during installation. Hopefully those extensions don't run queries against the DNN tables without the "dnn_" prefix, otherwise that could explain the errors you've encountered.
You can use SQL Server Profiler to check.
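If you do go table by table, the system catalog can at least tell you which unprefixed tables have a dnn_ twin and are therefore candidates. A sketch (it deliberately skips the aspnet_ tables, and you'd still review each result and any extension that might use it before dropping anything):

```sql
-- Tables with a dnn_-prefixed counterpart; [_] escapes the LIKE wildcard.
SELECT t.name
FROM sys.tables AS t
WHERE t.name NOT LIKE 'dnn[_]%'
  AND t.name NOT LIKE 'aspnet[_]%'
  AND EXISTS (SELECT 1 FROM sys.tables AS d WHERE d.name = 'dnn_' + t.name);
```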
It turns out there was a view, dnn_Lists, which was still referencing dbo.Lists without the dnn_ prefix.
I fixed this view and it's fine now.
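For anyone hunting the same kind of leftover reference, a catalog query can list views, procedures and functions whose definition still mentions an unprefixed table. A sketch, using dbo.Lists as the example name:

```sql
-- Objects whose definition still references the unprefixed Lists table.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id) AS object_name
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%dbo.Lists%';
```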
(PS: It turns out to be useful to set IsSuperUser = 1 in the Users table for the account you're logged in as, because then you get the full exception details and can fix things.)
Thanks
It would make sense to delete all tables WITHOUT the "dnn_" prefix, but you said that caused problems.
If you have the time and patience and are adamant about tidying things up, I would delete one table at a time and retest the admin feature that broke last time, until you find the culprit. That is a long shot, but that is how I would approach it.
What might be happening here is that you have a third-party module installed that ignores the objectQualifier, and when you deleted those tables you broke that module.

SQLite vs. SQLCE Deployment

I am in the process of writing an offline-capable smart client that will sync back to the main backend when a connection can be made. As a side note, I considered the Microsoft Sync Framework, but since I'm really only going one way I didn't feel it would buy me enough to justify it.
The question I have is related to SQLite vs. SQLCE and ClickOnce deployments. I've dealt with SQLite before (impressive little tool) and I've dealt with ClickOnce, but never together. If I set up an installer for my app via ClickOnce, how do I ensure the local database doesn't get wiped out during upgrades? Is it possible to upgrade the database (table structure, etc., if necessary) as part of the installer? Or is it better to use SQLCE for something like this? I definitely don't want to go the route of installing SQL Express or anything like that, as the overhead would be far too high for what I am doing.
I can't speak about SQLite, having never deployed it, but I do have some info about SQLCE.
First, you don't have to deploy it as a prerequisite. You can just include the DLLs in your project; this article explains how. That gives you fine-grained control over which version is being used, and you don't have to deal with installing it per se.
Second, I don't recommend that you deploy the database as a data file and let ClickOnce manage it. When you change that file, ClickOnce will publish it again and put it in the data directory, moving the previous one into the \pre subfolder, and if you have no code to handle that, your user will lose their data. So if you open the database file just to look at the table structure, you might be unpleasantly surprised to get a phone call from your user about the data being gone.
If you need to retain the data between updates, I recommend you move the database to the [LocalApplicationData] folder the first time the application runs, and reference it there. Then if you need to do any updates to the structure, you can do them programmatically and control when they happen. This article explains how to do this and why.
The other advantage to putting the data in LocalApplicationData is that if the user has a problem and has to uninstall and reinstall the application, his data is retained.
Regardless of which embedded database you choose, your database file (.sqlite or .sdf) will be part of your project, so you will be able to use the "Build Action" and "Copy to Output Directory" properties of that file to control what happens to it during an install/update.
If you choose "Do not copy", the database file will not be copied; if you choose "Copy if newer", it will only be copied when you have a newer version of the database file.
You will need to experiment a little, but by using these two properties you can have full control over how and when your database file is deployed or updated...
