Importing data from one SQL Server DB to another using the SSMS Import/Export wizard - sql-server

When I'm importing data from one SQL Server 2008 DB to another using SSMS 2008, I get errors during the import because it tries to insert data into "read only" fields, or because of conflicts between the tables' key relationships.
I'm wondering how I could make SSMS close its eyes until it finishes the transfer :D
Thanks, Regards

Yes, if you reorder your table data insert statements you should be able to resolve foreign key/relationship issues. You could use an ER diagram (e.g. in SSMS Database Diagrams, select all tables) to first insert the data for the tables other tables depend on/point to with foreign keys, and then work your way down the dependencies.
I wonder how you generated these scripts; I'd imagine that any tool worth its salt would generate data insert scripts in the proper order.
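If you'd rather derive that order from the catalog than from a diagram, a rough sketch like the following (against sys.foreign_keys; the aliases are illustrative) lists which tables reference which, so you can load referenced tables first:
-- Load order helper: each row shows a child (referencing) table and the
-- parent (referenced) table its foreign key points to. Insert data into
-- parent tables before their children.
SELECT fk.name                                     AS fk_name,
       OBJECT_SCHEMA_NAME(fk.parent_object_id)     AS child_schema,
       OBJECT_NAME(fk.parent_object_id)            AS child_table,
       OBJECT_SCHEMA_NAME(fk.referenced_object_id) AS parent_schema,
       OBJECT_NAME(fk.referenced_object_id)        AS parent_table
FROM sys.foreign_keys AS fk
ORDER BY parent_table, child_table;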

Related

Can in-memory tables be added to a database diagram

I have a SQL Server 2016 database with in-memory tables. I'd like to use the database diagram feature to create a graphic to match.
Running SSMS 18.3.1. When I start a new diagram, the in-memory tables are not shown in the drop down. Is there another way to get them on the diagram?
Note: In the official documentation these are called memory-optimized tables. See Introduction to Memory-Optimized Tables
You can't add In-Memory OLTP objects to a Database Diagram, not even in SQL Server 2019.
I thought there should be a way to modify the [definition] column in [dbo].[sysdiagrams], but it is a hex string of an unknown file type. (I tried many formats, but it's obviously an internal Microsoft format.)
Unfortunately, there is no official reference stating that this is an unsupported feature. (I sent a comment to this page.)
In-Memory OLTP is not supported by database diagrams. You don't have access to in-memory tables in the diagram because the diagram designer does not recognize a memory-optimized table as a table; in fact, SQL Server generates a DLL for each created Memory-Optimized Table Type that includes the functions required for accessing the indexes and retrieving data from the related Memory-Optimized Table Variable.
If you run the SQL Profiler tool you'll see a column named IsMemoryOptimized in the table data result set that is returned for the memory-optimized table. I think that since the Database Diagrams functionality is old (it dates back to SQL Server 2000) and is not updated regularly, it does not support viewing the newer memory-optimized tables.
more info here:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/71aa7b6e-c281-4417-8149-2eb6f3830110/sql-server-2016-memory-optimized-tables-not-visible-in-database-diagrams?forum=sqlinmemory
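As a small consolation, you can at least list which tables are memory-optimized (and therefore missing from the diagram designer) straight from the catalog; the is_memory_optimized flag on sys.tables is available from SQL Server 2014 onward:
-- Memory-optimized tables won't appear in the diagram's table list;
-- this flags them explicitly (sys.tables.is_memory_optimized, SQL Server 2014+).
SELECT SCHEMA_NAME(schema_id) AS schema_name,
       name                   AS table_name,
       durability_desc        -- SCHEMA_AND_DATA or SCHEMA_ONLY
FROM sys.tables
WHERE is_memory_optimized = 1;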

How to easily copy a SQL Server database without using restore or attach?

I would like to copy a SQL Server 2012 database from one server to another with the least amount of manual work, and without doing a restore or attach database because I don't have access to the source server or its backup files.
I would like to have a copy of all the objects and data. This includes tables with primary (including identity designation) and foreign keys, views, stored procedures, constraints and triggers.
If I use SSMS, I have to use a combination of data imports and scripting the objects. One issue with this is that I have many tables, and manually enabling identity inserts is a hassle. Maybe a diff tool could do all this work for me, or there may be a way to script the identity properties across the tables.
Is there a simpler, more straightforward way to copy a database?
Replication. There are different types. Transactional will keep your copy updated with any changes. Snapshot will not, etc.
Without access to the server, I'm not sure what you can do at all.
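If the blocker is mainly the identity handling mentioned in the question, a rough sketch like the following (run against the source database; the CASTs assume integer identities) pulls the identity properties out of sys.identity_columns so they can be recreated on the target:
-- Rough sketch: list every identity column with its seed and increment so the
-- IDENTITY designation can be recreated (or SET IDENTITY_INSERT wrappers
-- generated) on the target.
SELECT SCHEMA_NAME(t.schema_id)          AS schema_name,
       t.name                            AS table_name,
       c.name                            AS column_name,
       CAST(c.seed_value AS bigint)      AS seed_value,
       CAST(c.increment_value AS bigint) AS increment_value
FROM sys.identity_columns AS c
JOIN sys.tables AS t ON t.object_id = c.object_id
ORDER BY schema_name, table_name;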

Exporting data from MS Access to MS SQL with schema and table changes

I'm working on an old C++ MFC project (> 10 years old). I'm migrating the application's database from MS Access (2007) to MS SQL Server (2008 R2), and I've faced some hurdles on the way. For exporting the data I used MS SQL Management Studio (the "Import" option in the menu).
As is well known, there are some differences in data types between Access and MS SQL, and they caused some trouble.
"ID" columns from Access (AutoNumber, not NULL, primary key) become plain columns in SQL Server (int, NOT NULL, without any autoincrement), so I got lots of errors while inserting new rows into the tables.
The Yes/No type in Access (-1/0; NULL is not allowed) becomes bit (1/0/NULL). The application logic shouldn't break, since in most places the check is a comparison against 0:
// Project's fluent query builder; roughly: SELECT ID FROM %Table_Name% WHERE Aktiv <> 0
query.Select()
    .Buff("ID", &code)
    .FromS("%Table_Name%", NULL)
    .Where().Str("Aktiv <> 0")
    .Execute();
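One caveat worth checking after the migration: unlike the Access Yes/No type, the migrated bit column can hold NULL, and NULL satisfies neither Aktiv <> 0 nor Aktiv = 0, so such rows silently vanish from both branches. A quick sanity check (table name is the same placeholder as above):
-- NULL bits fail both "Aktiv <> 0" and "Aktiv = 0"; count them explicitly.
SELECT COUNT(*) AS null_aktiv_rows
FROM [%Table_Name%]
WHERE Aktiv IS NULL;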
Looking for a solution, I saw advice to use SSMA (SQL Server Migration Assistant) for Access. It's much better and smarter, as it recreates primary/foreign keys and creates CHECK constraints and indexes. But unfortunately, for a lot of the FOREIGN KEYs the Update/Delete action becomes No Action instead of Cascade. Warning message after schema import:
FOREIGN KEY constraint "Reference77" on MS Access table %Table1% may cause circular or multiple cascade paths. The cascade option from table %Table2% to table %Table1% was set to No option in SQL Server.
And it's no surprise that the application gets errors while deleting objects, though it all worked in Access. For testing I selected one delete operation (in the application) that failed, read the error messages, and changed No Action -> Cascade for the involved FOREIGN KEYs via SSMS (SQL Server Management Studio). After that, the delete operation in the application succeeded.
My questions are:
Am I right that I only need to change No Action -> Cascade for the FOREIGN KEYs to get the database application working completely properly? Or could other issues appear that I don't know about?
How can this be done? I would like a solution that can be applied cleanly on clients' SQL Servers.
Thanks for help, I really appreciate it!
Thanks for your answer. The solution for my problem is ... exporting data directly from Access (2010) to SQL Server.
I tried:
"SQL Server Import and Export Data", result - copying of only data from Access database, no any primary oк foreign keys, no transformation of autonumber to a column with IDENTITY and autoincrement.
SQL Server Migration Assistant for Access: a lot of foreign keys lost their CASCADE property for update/delete operations, but everything else was OK.
Access 2010! Database Tools -> SQL Server -> ... using the wizard -> everything is OK with both schema and data. The application works fine with the SQL Server database imported from Access.
So direct export from Access to SQL Server gave the required result.
Probably, but you will still need to test.
For a reusable solution, I would script the database that SSMA created (checking that all the types and foreign keys are correct). Having this script you can create an empty SQL Server database on any number of servers.
To populate these databases I'd use an Integration Services package. It's very easy to create using the Import wizard: go through all the steps, but save the package instead of running it immediately. Then you can open the package and edit it (adding data conversions or any other logic if necessary).
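If you go the scripted route, the No Action -> Cascade change from the question can also be applied as T-SQL rather than by hand in SSMS. A hedged sketch with placeholder table/constraint/column names:
-- Sketch: re-create one affected foreign key with cascading actions.
-- All names here are placeholders; the real list of downgraded constraints
-- can be generated from sys.foreign_keys
-- (WHERE delete_referential_action_desc = 'NO_ACTION').
ALTER TABLE dbo.Table2 DROP CONSTRAINT Reference77;

ALTER TABLE dbo.Table2
    ADD CONSTRAINT Reference77
    FOREIGN KEY (Table1_ID) REFERENCES dbo.Table1 (ID)
    ON UPDATE CASCADE
    ON DELETE CASCADE;
Note that SQL Server itself rejects cascades that would create cycles or multiple cascade paths (error 1785), which is why SSMA downgraded them in the first place, so some of these constraints may have to stay No Action and be handled in the application or with triggers.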

What is the best way to move data between postgresql and SQL Server databases

If we have the same database schema in a database on PostgreSQL and on SQL Server (tables, primary keys, indexes and triggers are all the same), what would be the best way to move data from one database to the other? Currently we have an in-house .NET program that does the following through two ODBC connections:
1. Read a row from source database table 1.
2. Construct an insert statement.
3. Write the row into destination database table 1.
4. Go to 1 if there are more rows in the table.
5. Move to the next table in the database and go to 1.
Needless to say, this is a very slow process and I would be interested to hear if there is a better/faster solution.
If it's a "one off" migration, there's a tool you get with SQL Server which allows you to move data around between databases (I'm not on a Windows machine right now, so can't tell you what it's called - something like import/export tool).
If it's an ongoing synchronisation, you can look at the MS Sync framework, which plays nice with SQL Server and Postgres.
The answer is bulk export and bulk loading. You can go much faster by using the COPY command in PostgreSQL (https://www.postgresql.org/docs/current/static/sql-copy.html) to dump the table data in CSV format, and then using bulk insert to import the CSV files into SQL Server. A rule of thumb is to harness parallelism for the process: check whether you can load the CSV data into SQL Server in parallel, and if you have many tables you can also parallelize at the level of separate tables. By the way, loading or migrating data row by row is one of the slowest possible approaches.
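A minimal sketch of that pipeline, assuming one table named table1 and illustrative file paths (BULK INSERT's FORMAT = 'CSV' option requires SQL Server 2017 or later):
-- PostgreSQL side: dump a table to CSV (server-side COPY writes on the DB
-- server; from a client, psql's \copy does the same thing).
COPY table1 TO '/tmp/table1.csv' WITH (FORMAT csv);

-- SQL Server side: bulk-load the file. FORMAT = 'CSV' needs SQL Server 2017+;
-- on older versions spell out FIELDTERMINATOR/ROWTERMINATOR instead.
BULK INSERT dbo.table1
FROM 'C:\data\table1.csv'
WITH (FORMAT = 'CSV', TABLOCK);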

MaxDB Data and Schema Export to SQL Server 2005/8

I am tasked with exporting the data contained inside a MaxDB database to SQL Server 200x. I was wondering if anyone has gone through this before and what your process was.
Here is my idea, but it's not automated:
1) Export data from MaxDB for each table as a CSV.
2) Clean the CSV to remove ? (which it uses for nulls) and fix the date strings.
3) Use SSIS to import the data into tables in SQL Server.
I was wondering if anyone has tried linking MaxDB to SQL Server or what other suggestions or ideas you have for automating this.
Thanks.
AboutDev.
I managed to find a solution to this. There is an open-source MaxDB library that will allow you to connect to it through .NET, much like the SQL provider. You can use it to get schema information and data, then write a little code to generate scripts to run in SQL Server to create the tables and insert the data.
MaxDb Data Provider for ADO.NET
If this is a one time thing, you don't have to have it all automated.
I'd pull the CSVs into SQL Server tables and keep them forever; they will help with any questions a year from now. You can prefix them all the same way, "Conversion_" or whatever. Put no constraints or FKs on these tables. You might consider using varchar for every column (or just the ones that cause problems, or none at all if the data is clean), just to be sure there are no data type conversion issues.
Then pull the data from these conversion tables into the proper final tables. I'd use a single conversion stored procedure to do everything (but I like T-SQL). If the data isn't that large (millions and millions of rows or less), just loop through and build out all the tables, printing log info as necessary, or inserting into exception/bad-data tables as necessary.
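A hedged sketch of one table's round trip along those lines (all table, column and file names are placeholders; the NULLIF calls handle the '?' null marker mentioned in the question):
-- Staging table is all varchar so the CSV loads without type errors.
CREATE TABLE dbo.Conversion_Customer (
    Id        varchar(20),
    Name      varchar(200),
    CreatedAt varchar(30)
);

BULK INSERT dbo.Conversion_Customer
FROM 'C:\export\customer.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- Conversion step: turn MaxDB's '?' null marker into real NULLs and cast
-- the cleaned strings to the proper types in the final table.
INSERT INTO dbo.Customer (Id, Name, CreatedAt)
SELECT CAST(NULLIF(Id, '?') AS int),
       NULLIF(Name, '?'),
       CONVERT(datetime, NULLIF(CreatedAt, '?'), 120)  -- style 120: yyyy-mm-dd hh:mi:ss
FROM dbo.Conversion_Customer;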
