When generating a database in PowerDesigner, I have the option to drop all tables/tablespaces and so on before generating the database.
But this option isn't there when applying changes to a database. I need it because some local databases may already contain some of the changes I made to the model, and I can't drop all tables because of the data.
I'm not sure I understand the question: it sounds like you are ready to drop some tables instead of modifying them, but you don't want to drop all tables.
It seems to me that you could use Database > Apply Model Changes to Database with the Connect to a Data Source option to compare the model with an existing database, and generate just the necessary modification scripts.
Maybe you could use a saved Selection to generate an Alter Script for most tables, and another to generate a Drop+Create script for some "transient" tables.
I need to sync data from several tables in a legacy SQL Server db (source) to a single table in a Postgres db (target). The schema of the source db is absurd, so the query to select the data takes a very long time to run. I'm planning to create an indexed view in the source db, and then somehow sync that indexed view to the Postgres table.
Right now, I simply have a scheduled task that drops the Postgres table (target) and then recreates it from scratch by running the complex query in the source db. This was quick to set up, and it ensures that changes in the source db always eventually make it to the target db, but recreating the table every few hours is (understandably) very slow and expensive. I need a way to replicate ongoing changes (only the new/updated data) from the source view to the target table. Is there a (relatively) simple way to do this?
I'm somewhat familiar with CDC, but I understand that CDC cannot be used on a view, so I don't believe that's an option. Adding "updated at" timestamps to the source tables is not an option, so I can't use that approach. I could add a hash column to the source tables, or maybe add a hash column to the view, so that's an option if that would work. Is there an existing tool/service that does what I need?
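One way the hash-column idea could look on the SQL Server side is a query (or an extra view column) that exposes a per-row hash next to the key, so the target side can detect changed rows without timestamps. This is only a sketch; the view name vw_Export, the key Id, and the columns are placeholders, not your schema.
-- A sketch only: vw_Export, Id, Col1..Col3 are placeholders.
-- CONCAT and the SHA2_256 algorithm require SQL Server 2012 or later.
SELECT Id,
       HASHBYTES('SHA2_256', CONCAT(Col1, '|', Col2, '|', Col3)) AS RowHash
FROM dbo.vw_Export;
Rows whose RowHash differs from the value stored on the Postgres side would then be upserted; everything else can be skipped.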
If you want to view SQL Server DB data in PostgreSQL, you can also use tds_fdw:
https://github.com/tds-fdw/tds_fdw
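A minimal sketch of what that could look like on the PostgreSQL side, assuming tds_fdw is installed; the server name, database, credentials, and view/column names below are placeholders:
CREATE EXTENSION IF NOT EXISTS tds_fdw;

CREATE SERVER mssql_source
    FOREIGN DATA WRAPPER tds_fdw
    OPTIONS (servername 'legacy-sql-host', port '1433', database 'LegacyDb');

CREATE USER MAPPING FOR CURRENT_USER
    SERVER mssql_source
    OPTIONS (username 'readonly_user', password 'secret');

-- Expose the SQL Server view as a foreign table; the column list must match the view.
CREATE FOREIGN TABLE source_view (
    id   integer,
    name text
)
SERVER mssql_source
OPTIONS (table_name 'dbo.vw_Export');
From there you can INSERT ... SELECT from source_view into the local target table, or join against it to find new and changed rows.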
Also, there are some third-party tools that could help you achieve your goal, for example SymmetricDS:
http://www.symmetricds.org/about/overview
I have a scheduled SSIS package that loads data overnight into the data warehouse. Before loading, it drops the entire database along with all the tables. But now I have a situation where I don't want to drop one table, and instead want to do an incremental load on it using a MERGE SQL statement. Because the package drops the entire database, I can't do that in the current setup. If I change the drop-database step to a delete (clearing the data rather than dropping the objects), I think I should be able to do the incremental load on the table I want. Are there any possible complications in doing that? Can you foresee any problems, or will I be missing something? Any thoughts are highly appreciated. Thanks for your time.
As far as I know, deleting only removes the rows, whereas dropping the database removes all the tables including their rows. If your logic works, you could do the delete instead, then drop all tables except the one you want to keep.
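If you go that route, a minimal sketch (outside SSIS) of dropping everything except the table kept for the incremental load; the table name [dbo].[KeepForMerge] is a placeholder:
-- Drop every user table except the one kept for the MERGE-based load.
-- Foreign-key dependencies may force a particular drop order, or dropping constraints first.
EXEC sys.sp_MSforeachtable
    'IF ''?'' <> ''[dbo].[KeepForMerge]'' DROP TABLE ?;';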
A drop/delete of the database will remove all of the contents of the database. If the requirement is to retain a single table, you'll need to retain the schema and database that holds it as well.
If I'm understanding correctly, you're dropping the target database. Is this a STAGE database for the data warehouse? If so, you'll also have a TARGET (the main tables of the warehouse) that are loaded from STAGE. If this is the case, you should be able to run a MERGE statement from the newly STAGED table to the TARGET table.
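For example, a sketch of such a MERGE, assuming hypothetical STAGE and TARGET tables keyed on CustomerId (all names are placeholders):
MERGE dw.DimCustomer AS tgt
USING stage.DimCustomer AS src
    ON tgt.CustomerId = src.CustomerId
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name,
               tgt.City = src.City
WHEN NOT MATCHED BY TARGET THEN
    INSERT (CustomerId, Name, City)
    VALUES (src.CustomerId, src.Name, src.City);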
We have a legacy database that has dozens of schemas in it, and we're looking to split that database up into several smaller distinct databases instead.
Is there any way I can create a new database on the same physical server, and then transfer an entire schema over to the new database?
Our tables look like:
Foo.Table1
Foo.Table2
Foo.Table3
...
Bar.Table1
Bar.Table2
...
Xxx.Table1
Xxx.Table2
...
...and I want to move Foo.* to a new database.
Typically this is recommended to be done by some kind of per-table export/import, but that's quite cumbersome with the 150+ tables in the schema.
As far as my trivial research goes the options appear to be:
Export/import each table individually.
Back up the entire database, restore it to a different destination, and delete everything else (painful, since the entire database is ~900GB).
Deploy the dacpac of the single schema to the new database, and do a cross-database initial seeding, i.e.:
INSERT INTO newDb.Foo.Table1 SELECT * FROM oldDb.Foo.Table1;
INSERT INTO newDb.Foo.Table2 SELECT * FROM oldDb.Foo.Table2;
INSERT INTO newDb.Foo.Table3 SELECT * FROM oldDb.Foo.Table3;
...
All of these options are a lot of effort... is there any other approach that will simply move an entire schema into a new database?
I am not aware of any fully automated way, but this can be done relatively simply with the help of Excel.
In SSMS you can use "Object Explorer Details" to easily (with a few mouse clicks) script the schema of multiple tables.
With the help of system views (sys.tables, sys.columns, etc.) and Excel you should be able to generate 'INSERT INTO ... SELECT ...' scripts for all of your tables in minutes.
In Excel (or a similar application) you paste the list of your tables (obtained from sys.tables) and then write a formula to generate a script for each table.
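If you would rather skip Excel, a sketch of the same idea done directly in T-SQL, using the oldDb/newDb names from the question:
-- Generate one INSERT ... SELECT statement per table in the Foo schema.
SELECT 'INSERT INTO newDb.' + s.name + '.' + t.name +
       ' SELECT * FROM oldDb.' + s.name + '.' + t.name + ';'
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE s.name = 'Foo'
ORDER BY t.name;
Copy the result set into a query window and run it; tables with identity columns will need extra handling (SET IDENTITY_INSERT and an explicit column list).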
You can create a filegroup for each schema and move the tables of each schema into the related filegroup. After that, you back up each filegroup and restore it.
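A minimal sketch of the filegroup part, assuming the database is oldDb, the schema is Foo, and Foo.Table1 already has a clustered index named IX_Table1 on Id; file paths and names are placeholders:
ALTER DATABASE oldDb ADD FILEGROUP FooFG;
ALTER DATABASE oldDb ADD FILE
    (NAME = 'oldDb_Foo', FILENAME = 'D:\Data\oldDb_Foo.ndf')
    TO FILEGROUP FooFG;

-- Move a table by rebuilding its clustered index on the new filegroup.
CREATE UNIQUE CLUSTERED INDEX IX_Table1 ON Foo.Table1 (Id)
    WITH (DROP_EXISTING = ON) ON FooFG;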
I need our end-users to "clone" a database from our web application UI. I am able to "clone" a database by backing up a source database and restoring it into a new database. In this way, I "clone" the table schema and data.
My question is - is there any way I can "clone" just the table schema, without the data? I am aware that we can script the database manually, and run that script. But our table schema changes frequently (we add new columns and tables regularly) and we wouldn't want to update this script. Thank you.
Basically you have to:
back up the database
restore the backup to a new database (use the WITH MOVE option)
connect to the new database and then exec sys.sp_MSforeachtable 'truncate table ?'
PS: you may have to disable FK constraints, if any.
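Put together, a minimal sketch of those steps with placeholder database names, logical file names, and paths:
BACKUP DATABASE SourceDb TO DISK = 'C:\Backups\SourceDb.bak';

RESTORE DATABASE CloneDb
    FROM DISK = 'C:\Backups\SourceDb.bak'
    WITH MOVE 'SourceDb'     TO 'C:\Data\CloneDb.mdf',
         MOVE 'SourceDb_log' TO 'C:\Data\CloneDb_log.ldf';

USE CloneDb;
-- TRUNCATE TABLE fails on tables referenced by foreign keys, so disable the
-- constraints and fall back to DELETE, then re-enable them once everything is empty.
EXEC sys.sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';
EXEC sys.sp_MSforeachtable 'DELETE FROM ?';
EXEC sys.sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';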
Another solution: use Entity Framework Database First to build a small program. Launch said program with a connection string pointing to a new db.
Right-click the database -> Tasks -> Generate Scripts
On the Set Scripting Options page, click Advanced
General -> Types of data to script = Schema only
With SQL Server Data Tools you can generate the schema difference and apply only the changes to the target database.
Is there any way to log the changes made to a table's schema whenever I make schema changes?
I was reading an article here about DDL triggers, but it does not describe the specific changes made to a table's schema.
This would be very difficult, as quite often in SSMS the table is actually dropped and rebuilt in the background (depending on the complexity of the schema change and whether or not you enabled the "Prevent saving changes that require the table to be re-created" option in SSMS). Logging all the different types of changes would be a nightmare: constraints dropped only to be re-created, bulk re-inserts, renames, etc., when all you might have done is re-arrange columns in a joined table.
If you're serious about tracking schema changes, I'd strongly recommend you script the schema (using the Generate Scripts option in SSMS), check the resulting file into SVN / SourceSafe / TFS, and use the many comparison tools available for those systems.
Or you can use third-party products that do all this for you, such as Red Gate's SQL Source Control:
http://www.red-gate.com/products/sql-development/sql-source-control/
Edit: You may find this useful - it makes use of the Service Broker (SQL 2005+) and SSB queues:
http://www.mssqltips.com/sqlservertip/2121/event-notifications-in-sql-server-for-tracking-changes/
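A minimal sketch of that Event Notifications approach, assuming Service Broker is enabled in the database; the queue, service, and notification names are placeholders:
CREATE QUEUE dbo.SchemaChangeQueue;

CREATE SERVICE SchemaChangeService
    ON QUEUE dbo.SchemaChangeQueue
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

CREATE EVENT NOTIFICATION SchemaChangeNotification
    ON DATABASE
    FOR DDL_TABLE_EVENTS
    TO SERVICE 'SchemaChangeService', 'current database';

-- Each message carries the same XML that EVENTDATA() produces and can be read with RECEIVE.
RECEIVE TOP (1) CAST(message_body AS XML) AS EventXml
FROM dbo.SchemaChangeQueue;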
For this issue I would probably use Event Notifications, although DDL triggers, in my opinion, do tell you about the specific changes made to a table. Here is just the trigger definition:
CREATE TRIGGER tr_DDLNotikums
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
    SELECT EVENTDATA();  -- inspect or log the event XML here
Use a DDL trigger in the below format:
CREATE TRIGGER tr_DDL_Database ON DATABASE
FOR DDL_SCHEMA_EVENTS
AS
BEGIN
    INSERT INTO LogTable (XmlColumn)
    SELECT EVENTDATA();
END
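For completeness, a sketch of the assumed LogTable and of reading the specific change back out of the logged XML (names match the trigger above but are otherwise placeholders):
CREATE TABLE LogTable
(
    Id        INT IDENTITY(1, 1) PRIMARY KEY,
    XmlColumn XML,
    LoggedAt  DATETIME2 DEFAULT SYSDATETIME()
);

-- EVENTDATA() records the event type, the object affected, and the exact T-SQL that ran.
SELECT XmlColumn.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)') AS EventType,
       XmlColumn.value('(/EVENT_INSTANCE/SchemaName)[1]', 'nvarchar(128)') AS SchemaName,
       XmlColumn.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)') AS ObjectName,
       XmlColumn.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)') AS CommandText
FROM LogTable;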