Old links to backend still within frontend but inaccessible, cannot update them - database

My users are working with an Access database that has been split into a frontend (DB.accdb) and a backend (DB_be.accdb). As they occasionally have to move the files, I've written a function to relink the tables on startup.
Now they have somehow managed to break the file; I really don't know how. Whenever the RefreshLink function is called for a table, a run-time error is raised (different ones, actually).
For example, error 3022:
The changes you requested to the table were not successful because they would create duplicate values in the index, primary key, or relationship. Change the data in the field or fields that contain duplicate data, remove the index, or redefine the index to permit duplicate entries and try again
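For reference, the relink routine follows the usual DAO pattern; a minimal sketch of that pattern (simplified, with the path handling as a placeholder rather than the exact code):

    Sub RelinkTables(ByVal backEndPath As String)
        ' Re-point every linked table at the back-end file and refresh the link.
        Dim db As DAO.Database
        Dim tdf As DAO.TableDef
        Set db = CurrentDb
        For Each tdf In db.TableDefs
            ' Linked tables have a non-empty Connect string; local tables don't.
            If Len(tdf.Connect) > 0 Then
                tdf.Connect = ";DATABASE=" & backEndPath
                tdf.RefreshLink   ' this is the call that raises error 3022
            End If
        Next tdf
    End Sub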
I opened the frontend in exclusive mode, deleted the tables and manually relinked them. But a 1 is appended to their names: someTable --> someTable1. It seems the tables still exist somewhere, maybe in a system table? Relinking would insert the linked tables' names there, so there would obviously be several tables with duplicate names.
I opened the connection manager, and indeed, it listed the old, wrong links among the new ones I just added.
I cannot refresh the old links - "duplicate values" etc.
I can refresh the new links, but of course I cannot rename the tables (removing the 1) because somehow tables with those names already exist.
I cannot delete the old tables either, as they're not displayed in the sidebar! They don't appear even if I turn on "Show system objects" etc.
I cannot remove the new links and then update the old links, as the connection manager button is greyed out then. Presumably Access thinks there are no tables.
And when I try to Compact & Repair the database, it uses the old links again...
How can I completely remove all traces of the previous links?
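One thing worth knowing here: every link Access still tracks, including ones hidden from the navigation pane, shows up in the DAO TableDefs collection, so a short loop can at least list the leftovers and attempt to drop them. A rough sketch (the old back-end path is a placeholder; the delete can still fail if the frontend itself is corrupt):

    Sub ListAndDropOrphanLinks(ByVal oldBackEndPath As String)
        ' List every table whose link still points at the old back-end path,
        ' then try to drop it.  Iterate backwards because the collection
        ' shrinks as links are deleted.
        Dim db As DAO.Database
        Dim tdf As DAO.TableDef
        Dim i As Integer
        Set db = CurrentDb
        For i = db.TableDefs.Count - 1 To 0 Step -1
            Set tdf = db.TableDefs(i)
            If InStr(1, tdf.Connect, oldBackEndPath, vbTextCompare) > 0 Then
                Debug.Print tdf.Name & "  -->  " & tdf.Connect
                db.TableDefs.Delete tdf.Name
            End If
        Next i
    End Sub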

To recover from what appeared to be some corruption in the old front-end database file, the solution was to:
1. create a new, empty .accdb file,
2. import all of the Queries, Forms, etc. from the old front-end file into the new one, and then
3. create the proper table links in the new front-end file.
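The last step can be done through the Linked Table Manager or from code; a minimal sketch, assuming a hypothetical table list and a back end sitting next to the front end:

    Sub LinkBackEndTables()
        ' Create fresh links in the new front end.  The folder and the table
        ' names below are placeholders; substitute your own.
        Dim backEnd As String
        Dim tbl As Variant
        backEnd = CurrentProject.Path & "\DB_be.accdb"
        For Each tbl In Array("someTable", "anotherTable")
            DoCmd.TransferDatabase acLink, "Microsoft Access", backEnd, _
                                   acTable, tbl, tbl
        Next tbl
    End Sub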

Related

WordPress site without db table prefix. Best way to add a prefix?

I'm trying to migrate a WordPress site to new hosting. The site is an older one, created probably 7+ years ago. On the new server, WordPress gave me a heads-up that the WP tables have no prefix. Sure enough, I logged in to phpMyAdmin, and there are no prefixes on the tables. I have since tried several methods to add prefixes and get the site migrated, but all are coming up short:
Direct DB dump from phpMyAdmin on the old server, import into the new server. Triggers the "tables have no prefix" error.
Used UpdraftPlus to get a backup from the old server and restore it to the new server (triggers the "tables have no prefix" fatal error).
Revised #1 (above) approach: DB dump/import/restore, then manually add the wp_ prefix to all tables (via the Operations tab in each table). The site loads, but logging in as admin then yields no access to the wp-admin dashboard (black bar at the top after login, but no options for doing anything on the dashboard). I have tried the various tricks suggested, like adding a new temporary admin user or making the necessary changes to my admin user in the usermeta table, but that doesn't resolve it.
Tried using the All In One WP Security plugin to add a new prefix. It recognizes that there is currently no prefix, but won't successfully complete adding the prefixes to the DB. Tried another plugin and didn't even get that far.
Using phpMyAdmin, I've tried the built-in add-prefix function, and that doesn't seem to work either.
I just looked at phpMyAdmin again after trying unsuccessfully to use the built-in add-prefix function, and noticed that while all tables are InnoDB, the collation is a mix of utf8_general_ci and utf8mb4_unicode_ci. I understand that the second option is probably what I want all of them to be. I was able to add prefixes to the tables that are utf8mb4_unicode_ci, but the rest of them simply do not respond to that function in phpMyAdmin.
The next thing I tried was to change all the collations to utf8mb4_unicode_ci using phpMyAdmin. I used "change all tables collations" and "change all tables columns collations". That seemed to work and then allowed me to add a prefix. However, after that, while the site was still working, I could again no longer get into the wp-admin dashboard. I wondered whether the same conversion might work if I only did "change all tables collations" and skipped the second option. I've just tried converting the tables without converting all the columns, and the site still seems to work OK after doing that. But trying again to add a prefix using the AIO WP Security plugin still doesn't complete.
It would seem that adding the prefix is causing the issue. Is this something that a theme could be interfering with for some reason? What am I missing here?
Any tips for moving forward with this site? Thanks so much.
You can simply dump the database, open the dump in a text editor, replace all occurrences of "CREATE TABLE " with "CREATE TABLE yourprefix_" (and give the "INSERT INTO " statements the same treatment so the data follows the renamed tables), and restore the database.
It should be easy, depending on the size of the dump.

I've renamed a bunch of tables in Access on the back-end and the front-end isn't reflecting those changes. Quick fix?

I tried using the Linked Table Manager but it didn't work. I changed the names of a bunch of BE tables because they just didn't reflect their content (I'm taking over the DB from the previous DBA, so it's a mess). Is there a quick way to make the FE tables link back to their BE counterparts?
There are also queries and forms that I assume I will need to edit.
The Linked Table Manager did work; it just doesn't add new tables, and as far as the front end is concerned, your renamed tables are new tables.
So link the renamed tables as new linked tables, and then either rename the links to the names the front end expects or adjust all your queries, forms, etc. to the default linked table names.
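A rough sketch of the first option, called once per renamed table (the parameters are placeholders for your own path and table names, not the actual objects):

    Sub LinkRenamedTable(ByVal backEndPath As String, _
                         ByVal newBackEndName As String, _
                         ByVal oldFrontEndName As String)
        ' Drop the dead link, link the renamed back-end table, then rename the
        ' new link to the name the existing queries and forms expect.
        On Error Resume Next
        DoCmd.DeleteObject acTable, oldFrontEndName   ' remove the broken link if present
        On Error GoTo 0
        DoCmd.TransferDatabase acLink, "Microsoft Access", backEndPath, _
                               acTable, newBackEndName, newBackEndName
        DoCmd.Rename oldFrontEndName, acTable, newBackEndName
    End Sub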

MS Access linked tables different for each user

I have a database that I have split into 3 pieces: a front end, and two back ends that contain the tables. Copying the front end to a user's desktop cuts the runtime from 90 minutes to 30 minutes. However, when I move the back end to the desktop as well, the runtime is less than 8 minutes. The problem I am facing with doing this is that I had to manually update the table links.
Is there a way to make it so that Access automatically updates the links based on what users computer it is on?
For example, I created a batch file to move the database files from the shared drive to a folder on the user's desktop, using
"%userprofile%\Desktop\Folder1\"
as the location to move the files to. The "%userprofile%" part automatically identifies the user and routes the files properly. I didn't know if there was something similar for automatically updating the links in Access.
Please let me know if you don't understand what I am trying to ask.
Assuming the first back end is on the user's computer and the second back end is on the shared network, I think what you can do is run a query when the database opens to update the user's local table, and then, on exit, a second query that exports the user's table back to the second back end on the network share. You'll have to add a field to the table so that your database knows which records are to be added or exported.
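For the relinking part of the question, the same trick the batch file uses carries over to VBA, because Environ reads the same variables. A minimal sketch that re-points every linked table at the local copy (the folder and file names are placeholders), run from an AutoExec macro or a startup form:

    Sub RelinkToLocalBackEnd()
        ' Re-point every linked table at the back-end copy that the batch file
        ' placed in the current user's Desktop\Folder1.  With two back ends,
        ' branch on the table name (or the old Connect string) to pick the
        ' right file for each link.
        Dim localBackEnd As String
        Dim db As DAO.Database
        Dim tdf As DAO.TableDef
        localBackEnd = Environ("USERPROFILE") & "\Desktop\Folder1\Backend1.accdb"
        Set db = CurrentDb
        For Each tdf In db.TableDefs
            If Len(tdf.Connect) > 0 Then
                tdf.Connect = ";DATABASE=" & localBackEnd
                tdf.RefreshLink
            End If
        Next tdf
    End Sub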

Delete redundant MOSS content database entry

In a SharePoint content database we have noticed there are a couple of records in the all_docs table that reference documents that no longer exist. The list ID GUID they are associated with is not in the site, so we have no way to view them and delete them. I think this was a result of moving a content database from another environment into this one.
Can anyone suggest the best approach to clean this up? I need to delete it as it is referencing a page layout that I cannot delete until this page has been removed.
Have a look at Fear and loathing; it describes an undocumented stsadm command to delete an object by its ID.

Development standards for SQL Server supporting services?

I am trying to find some development best practices for SQL Server Reporting Services, Analysis Services and Integration Services.
Does anyone have some useful links or guidance they can offer on this subject?
I can only talk specifically to SSIS, although some of this will be applicable to the others as well.
Save your packages as files and put them in Source Control.
Where possible use variables for things that will change from server to server or run to run.
Use configuration files to save the configuration for different environments.
When processing data that comes from an outside source, assume it will change format without warning (i.e. check that the data you expect in each column is the data you got!). There's nothing like putting the emails in the last-name field (or, as happened to us once in DTS, the social security number in the field that said how much to pay the person; sure glad we caught that before someone got paid that amount).
Things I have seen happen include: adding new columns; removing columns that are critical to your process; rearranging the order of the columns (especially bad when the file itself does not have column names); leaving the column titles the same but changing the data they contain (yes, I once got a file where the last-name data was in the column labelled First_name and vice versa); data with new values that don't have a match to values in your system (I'm thinking of lookup-type things here, like medical specialties); flat-out strange data such as notes in an email field; and names in this format: lastname - 'Willams, Jo', first_name - 'hn' (combine the two fields to get the whole name; apparently their data entry people just typed the name until they ran out of space and continued in the next field, no matter where they were in the name!).
Don't put uncleaned data into your database.
Always retain a copy of any files that you process or send out. Amazing how often you will need to research back.
Log errors, and log records that needed cleaning, especially if the problem in the field was such that it caused the process to fail. It is a whole lot easier to see the errors in a table than to know your 20-million-record file failed because one record had an extra | in it and have to figure out which one it was.
If you do a lot of similar imports in SSIS, create a template project that has all the standard logging and data cleaning in it. It is a whole lot faster to start from a template, adjust the mappings to the new file you are working with, and make minor adjustments to things specific to that file than to rewrite every SSIS package from scratch.
Store metadata. Sooner or later you will be asked how often the import failed, or how soon after the file was received the import happened, or even when the last import was. All our packages start and end with a task to store start and stop times in our metadata table. All failure paths include a task to mark the import as failed in our metadata. Eventually you can build a system that knows how many records to expect and fails the load if the new file is significantly off. Metadata can also be used to store things like the number of records, which can help identify when they sent a partial file instead of the whole file you were expecting and prevent you from blowing away 300,000 sales targets they actually still want.
