Running EXEC sp_msforeachtable @command1="ALTER TABLE ? NOCHECK CONSTRAINT ALL" will disable foreign keys on existing tables.
What if the table creation and data insert queries that enforce foreign key constraints run after this query?
I am encountering this issue during build automation, and what I am ideally looking for is a permanent switch to disable all constraints on the database (I can do that since the database is created as part of the build process).
NOTE: See the 5 steps listed towards the end to get an idea of the issue faced during build automation.
I have created a build step, before processing the scripts, to disable all existing foreign key constraints. The next step is to package and run all release SQL scripts, which may create tables and insert data. The earlier build step that disables constraints has no knowledge of these forthcoming tables and insert scripts, which will enforce foreign key constraints once the data inserts run, failing my build process.
Is there a way I can set a flag in the database to stop checking foreign keys?
Adding some more context to what I am doing specifically. I am automating the build using Bamboo, and the following steps are performed at a high level:
1. Locate the last available deployed DB schema.
2. Build a database using the schema-generated script (no master data copied).
3. Disable all foreign keys (unable to disable FKs for tables yet to be created in the next step).
4. Merge all release-specific DB scripts (may contain new table and insert scripts).
5. Apply other transformations such as code generation, script compare, delta finding, etc.
Step 3 is the challenge.
Note: This is automating a legacy system with around 300 master data tables and their data. Since Codesmith tools are used, schema changes have to be detected and the auto-generated code has to be checked against the last deployed schema. Since the master data is so large, keeping a reference DB with data for build purposes is out of the question, so the referential integrity constraint issue will be more prominent.
The only thing I can think of is to create a DDL trigger which listens for constraint creation and, if any constraints are detected, drops them. However, I'm not sure this approach is viable if a constraint is created as part of the CREATE TABLE statement. You should test it thoroughly before using it.
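A minimal, untested sketch of what such a trigger might look like (the trigger name and the choice to simply drop every foreign key found on the altered table, rather than parse EVENTDATA() for the specific constraint, are my own assumptions):

    -- Hypothetical sketch: drop any foreign keys present on a table right
    -- after an ALTER TABLE. Constraints declared inline in CREATE TABLE
    -- raise a CREATE_TABLE event instead, so they would not be caught here.
    CREATE TRIGGER trg_drop_new_foreign_keys
    ON DATABASE
    FOR ALTER_TABLE
    AS
    BEGIN
        DECLARE @data   xml = EVENTDATA();
        DECLARE @schema sysname = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(128)');
        DECLARE @table  sysname = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)');
        DECLARE @sql    nvarchar(max) = N'';

        -- Build one DROP CONSTRAINT statement per foreign key on the altered table.
        SELECT @sql = @sql
            + N'ALTER TABLE ' + QUOTENAME(@schema) + N'.' + QUOTENAME(@table)
            + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N'; '
        FROM sys.foreign_keys fk
        WHERE fk.parent_object_id = OBJECT_ID(QUOTENAME(@schema) + N'.' + QUOTENAME(@table));

        IF @sql <> N'' EXEC sp_executesql @sql;
    END;

Note that the DROP CONSTRAINT issued inside the trigger is itself an ALTER TABLE and will fire the trigger again; the second pass finds no remaining keys and does nothing, but this is exactly the kind of behaviour to verify before relying on it.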
Personally, however, I usually solve this by properly ordering the sequence in which the data is inserted. It's much safer, not prohibitively difficult and, last but not least, always possible to do.
Your basic problem is that the database migrations that create your database are running in the wrong order. Adjust the order of table creation and data insertion so that only data that references already existing data is inserted at any one time.
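If it helps to work out that order, here is a rough sketch of a query (my own addition, not from the original answer) that assigns each table a load level based on its foreign key dependencies, so lower levels can be loaded first:

    -- Level 0 tables have no outgoing FKs (ignoring self-references) and can be
    -- loaded first. Tables caught in mutual FK cycles can push the recursion
    -- past MAXRECURSION and need manual handling.
    WITH deps AS (
        SELECT t.object_id, 0 AS lvl
        FROM sys.tables t
        WHERE NOT EXISTS (
            SELECT 1 FROM sys.foreign_keys fk
            WHERE fk.parent_object_id = t.object_id
              AND fk.referenced_object_id <> t.object_id)
        UNION ALL
        SELECT fk.parent_object_id, d.lvl + 1
        FROM sys.foreign_keys fk
        JOIN deps d ON fk.referenced_object_id = d.object_id
        WHERE fk.parent_object_id <> fk.referenced_object_id
    )
    SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
           OBJECT_NAME(object_id)        AS table_name,
           MAX(lvl)                      AS load_level
    FROM deps
    GROUP BY object_id
    ORDER BY load_level, table_name;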
Turning all the constraints off, loading data, and turning them all back on at the start and end of each script that does DB data alterations is also an option, but you should separate your scripts that do schema changes from your scripts that do data loading, and run all the schema changes first.
Related
Initial question (solution comes afterwards):
I have the following challenge: I have an Oracle database in which a piece of software (Infor Supplier Exchange) once created tables and filled them with data. This DB shall be migrated to SQL Server, and then an upgrade of the Infor software shall be executed against the migrated data.
A colleague of mine already used a script by Microsoft to migrate the Oracle DB to SQL Server, and the result is now available to me. Even though the "Keep Identity" flag was set, no primary key in the new DB has its Identity (autoincrement) property set - but that is needed by the Infor software to add data later.
I found a way via SSMS to change the Identity (as well as its seed) for each relevant DB table: right-click on the table, choose Design, and change the "Identity Specification" manually. But I have over 300 tables: the effort would cost hours (and sanity).
I also found out that I can use SSMS's "export data" task. You have to know that the Infor software provides a db installer which creates all necessary tables, keys, identity properties, etc. with an EMPTY database. So I can basically export the data from the "Oracle migrated old db" to the "Infor prepared new db" since they (should) have the same table names, keys etc. - except the Identity property and the user data.
In the export data task you can check "Enable identity insert". The problem is that this SSMS feature aborts when it processes a table with foreign keys where the referenced data does not exist yet. So I could go through the old DB again, execute the "copy data" task for all tables without foreign keys first, then try the remaining tables until all data is copied to the new DB. But this is again a lot of effort, since I have to go back on every error or check all constraints beforehand.
Do you have a better approach? Is it possible to copy data from DB A (with 300+ tables) to DB B (with the same table structure), hoping that a tool resolves the correct order of tables based on their foreign key constraints?
If you have questions on the issue I can explain in more detail. Thanks in advance.
Solution:
I solved the task by disabling constraints and triggers temporarily. The steps are:
EXEC sp_MSForEachTable "ALTER TABLE ? NOCHECK CONSTRAINT all"
sp_msforeachtable "ALTER TABLE ? DISABLE TRIGGER all"
EXEC sp_MSForEachTable "DELETE FROM ?"
I had to clear the target database's tables since they are filled with some sample data by the Infor installer. The data export task can append rows or can try to remove existing rows (with the same primary keys). But that uses TRUNCATE internally, which doesn't work with foreign key constraints, even when they are disabled by the above command.
Next: Execute the SSMS database task "Export data". Ignore data type conversion errors (some types differ between the Oracle migration and the target SQL schema, like varchar vs. nvarchar, which I checked and judged as not critical).
exec sp_MSForEachTable "ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all"
EXEC sp_MSForEachTable @command1="print '?'", @command2="ALTER TABLE ? ENABLE TRIGGER all"
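As an optional last check (my own addition, not part of the original steps), a quick query can confirm that the re-enable step actually re-validated every key; any constraint left disabled or untrusted points at data that still violates it:

    -- List foreign keys that are still disabled or untrusted after the
    -- WITH CHECK CHECK CONSTRAINT step, i.e. keys whose data was not re-validated.
    SELECT OBJECT_SCHEMA_NAME(parent_object_id) AS schema_name,
           OBJECT_NAME(parent_object_id)        AS table_name,
           name                                 AS constraint_name,
           is_disabled,
           is_not_trusted
    FROM sys.foreign_keys
    WHERE is_disabled = 1 OR is_not_trusted = 1;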
Using the vendor's SQL Server database schema and loading the data yourself is typically the correct approach for migrating to SQL Server with packaged software. But there may be additional guidance available from the vendor.
Instead of trying to load the tables in an order that is compatible with the foreign key constraints, which is not always even possible, disable all the foreign keys before loading the database and re-enable them after. See e.g. Temporarily disable all foreign key constraints.
Once FK constraints have been created via Entity Framework Core 1.1 with Migrations in an ASP.NET Core Code First app, would it be OK to temporarily enable/disable the constraints directly in SQL Server without using EF? Would it break the migrations created via EF, etc.? Background: I need to truncate data from a table that is referenced by several other tables via FKs that were created through EF Code First. SQL Server, as expected, complains that you can't truncate the table since it's referenced by a FK.
No, it would not break migrations. If you do anything on the database and then revert the database schema back to its earlier state, migrations will just run fine. Migrations, when applied, expect the shape of the EF-managed objects in the database to remain the same as it was known earlier. Any temporary change is invisible to migrations. And the state needs to be the same afterwards, because when future migrations are applied the appropriate objects must be present, otherwise the DDL can fail.
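One caveat worth noting: TRUNCATE TABLE fails on a referenced table even when the constraint is merely disabled, so the temporary change usually has to be a drop-and-recreate. A rough sketch, with hypothetical table and constraint names (Orders referencing Customers via FK_Orders_Customers) standing in for whatever EF actually generated:

    -- Hypothetical names; substitute the FK and tables EF created for you.
    ALTER TABLE dbo.Orders DROP CONSTRAINT FK_Orders_Customers;

    TRUNCATE TABLE dbo.Customers;

    -- Recreate the key exactly as EF defined it so model and database stay in sync.
    -- This assumes the referencing rows were also cleared or fixed up first,
    -- otherwise the recreate fails validation.
    ALTER TABLE dbo.Orders ADD CONSTRAINT FK_Orders_Customers
        FOREIGN KEY (CustomerId) REFERENCES dbo.Customers (Id);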
Anything you change in the database schema in SQL Server without going through the code will break the migrations. You should delete the foreign key references in the code for the operation you want to do and then recreate them later. Though be careful: if your data is left in an inconsistent state, you might not be able to recreate the constraints without losing data.
I have a relational database on my server which I've used for developing a system. Now I want to make it live and truncate all data from the tables. I've manually deleted data from the tables and after that I've run the truncate command, but it shows this error:
Cannot truncate table 'dbo.Building' because it is being referenced by a FOREIGN KEY constraint.
Is there any way to empty my database using a single command? I've searched Google; all the results say to use the truncate command, but I cannot use it for all the tables because of the error above.
I want to enter data starting from ID no. 1 in all tables.
Please give me a guideline to truncate all the data from my database.
Now I want to make it live and truncate all data from the tables
You are approaching this completely wrong. Even if you succeed, you will deploy a system which will be impossible to upgrade. As you continue to develop you will modify the development database, and when you have to deploy the next version of your application you'll realize you need to modify the production database while keeping all of its data.
Stop the deployment right now and go back to the drawing board to design a proper deployment strategy. I recommend migrations. Another alternative is using diff tools.
Truncating tables is completely irrelevant for what you're actually trying to achieve.
There are two options I can think of:
You can drop (not just disable) all foreign keys, then run truncate to delete all table data using any method, and finally recreate all foreign keys (see the sketch after these two options).
You can also script out only the DDL and deploy the database using that script, instead of handing a database over to the deployment team.
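A rough sketch of option 1 (my own addition, assuming SQL Server 2017+ for STRING_AGG): generate the DROP statements and the matching recreate statements for every foreign key from the catalog views, run the drops, truncate everything (which also reseeds identity columns, so new rows start from ID 1), then run the saved recreates:

    -- 1) Generate a DROP statement for every foreign key.
    SELECT 'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.parent_object_id))
         + '.' + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
         + ' DROP CONSTRAINT ' + QUOTENAME(fk.name) + ';'
    FROM sys.foreign_keys fk;

    -- 2) Generate the matching recreate statements (save these BEFORE dropping).
    -- Note: ON DELETE/ON UPDATE rules are not preserved by this simple sketch.
    SELECT 'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.parent_object_id))
         + '.' + QUOTENAME(OBJECT_NAME(fk.parent_object_id))
         + ' ADD CONSTRAINT ' + QUOTENAME(fk.name) + ' FOREIGN KEY ('
         + STRING_AGG(QUOTENAME(pc.name), ', ') WITHIN GROUP (ORDER BY fkc.constraint_column_id)
         + ') REFERENCES ' + QUOTENAME(OBJECT_SCHEMA_NAME(fk.referenced_object_id))
         + '.' + QUOTENAME(OBJECT_NAME(fk.referenced_object_id)) + ' ('
         + STRING_AGG(QUOTENAME(rc.name), ', ') WITHIN GROUP (ORDER BY fkc.constraint_column_id)
         + ');'
    FROM sys.foreign_keys fk
    JOIN sys.foreign_key_columns fkc ON fkc.constraint_object_id = fk.object_id
    JOIN sys.columns pc ON pc.object_id = fkc.parent_object_id AND pc.column_id = fkc.parent_column_id
    JOIN sys.columns rc ON rc.object_id = fkc.referenced_object_id AND rc.column_id = fkc.referenced_column_id
    GROUP BY fk.object_id, fk.name, fk.parent_object_id, fk.referenced_object_id;

    -- 3) With the keys gone, truncate every table; TRUNCATE resets identity seeds.
    EXEC sp_MSforeachtable 'TRUNCATE TABLE ?';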
I've got a reasonably large / complicated DB which I need to upgrade in the field from version 1 to version 2. There are a lot of changes in schema and, importantly, data between the two.
Yes, I know this should have been version controlled, à la:
http://www.codinghorror.com/blog/2008/02/get-your-database-under-version-control.html
but it wasn't - it will be when I am done.
So, the current problem: I'm faced with the choice of either going through all the commits or trying to diff between two versions of the DB. So far I've tried:
http://opendbiff.codeplex.com/
http://www.sqldelta.com/
http://www.red-gate.com/
However, none of them seem to be able to successfully generate schema upgrade scripts, because they don't also handle the data at the same time. This results in foreign key violations when adding new keys to tables: the table being referenced is new, and while the schema for that table has been created, the data it contains has not. Well, it could be, but that requires me to use a different part of the tool and then mix the two scripts together.
I know this may look like a duplicate of:
What is best tool to compare two SQL Server databases (schema and data)?
which is where I found most of the existing tools I've tried, but so far I've not managed to get any of them to produce a working schema migration script (I'm really not too fussed about the data, but I do need the data that is required for the foreign keys - which, to be honest, is all the difference, as I've deployed both the old version and the new version).
Am I expecting too much?
Should I give up and start manually stitching together what I do have?
Or do I go through all the commits and manually create upgrade scripts?
I can't think of more powerful tools available than the ones you seem to have tried. If those fail, my homegrown versioning system probably won't help you much either.
However, you should be able to generate an update script and then manually edit it to add the data transformations to it.
And/or you could disable the foreign key constraints for the time that the update script runs.
There is no such thing as doing schema and data "at the same time". Even if you have them in one big script, you would still be doing the schema first and then the data. If the schema script creates a new table and adds a constraint to it, there is no reason you should get a referential integrity violation error, as there are no rows in those tables yet.
In any case, you should give our xSQL Schema Compare and Data Compare tools a try, you will be impressed with the performance and the level of control you get.
I'm fairly new to VS data capabilities, and this is my first data generation plan. I have implemented a database using a VS2010 database project and used it to deploy to a SQL Server Express 2008 database. All the tables use identity columns as their primary keys, and they're related to one another with foreign keys.
I set up a data generation plan, but when I try to generate data with it, the tables are simply populated in alphabetical order, which is of course going to fail. The only tables that populate correctly are the lookup tables and other sorts of independent entities with no FK constraints. The rest are skipped after the first table fails.
Supposedly the generation plan determines the population order based on FK dependencies. What happened?
edit: someone with the rep for it should make a visual-studio-data-tools tag, since DBPro is no longer (nor really ever was) a product name.
So apparently according to this thread the data generation plan blows up when you have a table containing only a primary key and no other columns. It turns out that one of my independent entities, whose only purpose is to serve as a joinder to one of my other tables, fit this description. After adding a harmless Description column, I was able to proceed fixing other problems until the generation plan completed successfully.