I am having problems moving a database to another computer, having tried both a manual file copy with attach in SSMS and backup/restore in SSMS (Moving SQL Server 2019 database to another computer).
However, the entire database is only a few tables and 15 megabytes.
So I am thinking: why not simply export it to plain SQL? That would work for me in this case. So in SSMS I right-click the database, go to Tasks, and select Export Data...
It appears the closest I can get in SSMS is picking "Flat File", which is essentially a CSV file and can only contain a single table... I want a SQL file that can create the database and tables, add the data, etc.
What am I overlooking? Is this not possible in SSMS?
I need to export 700,000 records from a SQL Server 2008 R2 table to a Microsoft Access database in 2002-2003 format. I am using the SQL Server Import and Export Wizard. This is currently taking over 2.5 hours. Because this is all taking place on a highly secure server, I am limited in my choice of tools. I could export to a text file, but that loses some of the formatting.
I need a copy of one table from the database in either Access or Excel with formatting preserved. Exporting to text/CSV is not an option, as some of the fields may contain commas. Also, I cannot use Excel as the target because 2008 R2 does not support more than 64K rows for that destination.
Are there any ways to speed this up?
Using Access, it should be a snap:
Link the table via ODBC, and create an empty table in Access with the structure you want.
Then run an append query that uses the linked table as the source and writes the data to the local table. The query can also rename the fields (via aliases) and make minor modifications to suit your needs.
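As a rough sketch, the append query in Access SQL looks something like this (dbo_Orders stands in for the ODBC-linked table, tblOrdersLocal for the empty local table, and the field names are placeholders):

    INSERT INTO tblOrdersLocal ( OrderID, CustomerName, OrderDate )
    SELECT dbo_Orders.OrderID, dbo_Orders.CustName AS CustomerName, dbo_Orders.OrdDate AS OrderDate
    FROM dbo_Orders;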
If you don't have Access (Office) 2016 installed, I believe a 30-day evaluation version is available for download.
I've created the structure of my database first in PhpMyAdmin and exported it to a .sql file.
Now I'm looking everywhere in SQL Server Management Studio for where I can import/add that data into a new database.
Does anybody know where to look or what to click?
I'm using the 2014 version (CTP2)
If you have a .sql file which contains SQL statements, you can just copy and paste the contents (or open the file in a query window) and run it. This assumes it has all of the create table etc. statements to create the schema/structure and not just insert statements for the data.
Check the top of the file to make sure it first selects the correct database; if not, add a USE statement to select the correct database.
You didn't say how big the file is, but if it is quite large and has the insert statements (data as well as schema), then you'll probably want to run it from the command line with the sqlcmd utility. That is much faster, and SSMS won't freak out.
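For example, something along these lines from a command prompt (the server name, database name, and file path are placeholders; -E uses Windows authentication, and -d picks the target database so the script doesn't strictly need its own USE statement):

    sqlcmd -S .\SQLEXPRESS -d MyDatabase -E -i C:\scripts\structure_and_data.sql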
Another option, instead of running the .sql file, is to set up a data source for MySQL and just use ODBC to access the database itself.
Bear in mind that there are real and very annoying differences between MySQL and T-SQL that can make migration a pain. If you're just creating a few tables, it may not be an issue, but if there are a ton of tables with lots of fields of different data types, you may run into issues.
If you are looking to import the table structure, you can copy and paste the content into a query window in SSMS and run it. Beware of syntax differences between MySQL and SQL Server: you will most likely get errors, and you will need to convert your SQL script from the MySQL dialect to the SQL Server dialect (or just create the objects manually if there are not too many). If you put the databases into a SQL standard-compatibility mode from the very beginning, you will have much less trouble.
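As a small illustration of the kind of conversion involved (table and column names here are made up), a MySQL CREATE TABLE usually needs edits like these before it will run in SSMS:

    -- MySQL dialect
    CREATE TABLE `users` (
      `id` INT NOT NULL AUTO_INCREMENT,
      `name` VARCHAR(100),
      PRIMARY KEY (`id`)
    ) ENGINE=InnoDB;

    -- SQL Server (T-SQL) equivalent
    CREATE TABLE dbo.users (
      id INT NOT NULL IDENTITY(1,1) PRIMARY KEY,
      name NVARCHAR(100)
    );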
If you are only looking to import the data into existing tables in SQL Server, you can do the same (i.e. copy, paste, and run in a query window). You will have less trouble with that.
Expand the server, open "Databases", right-click the database, go to "Tasks" and then Import Data...
I have had the most trouble-free success importing to SQL Server via the flat file method (a comma-delimited .txt file). The only stipulation when creating the flat file (e.g. from Access) is to make sure the text qualifier is set to {none} and not "".
To import the file: in SQL Server Management Studio, right-click on Databases and create a new database. Then right-click on the new database -> Tasks -> Import Data... The import wizard opens: in the Data Source option, select Flat File Source and pick the .txt file... click Next. In the Destination field, select SQL Server Native Client 11.0 and go through the import process. This worked very well for me.
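If you would rather script the flat-file load than click through the wizard, BULK INSERT does the same job; a minimal sketch, assuming a comma-delimited file and an existing target table (the names and path are placeholders):

    BULK INSERT dbo.ImportTarget
    FROM 'C:\import\data.txt'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
        -- add FIRSTROW = 2 if the file has a header row
    );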
Can SQL Server or DB2 do entire-database exports like Oracle (using the exp command)?
I've searched the internets and found bcp for SQL Server. But it seems I would have to iterate over all the tables to get what I want.
For DB2 it looks to be roughly the same. Is there something I'm missing? Anyone have any suggestions and/or opinions? Thanks ahead of time.
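To illustrate what that iteration over bcp would look like (server name and export path are placeholders, and sys.tables assumes SQL Server 2005 or later), T-SQL can generate one bcp command per table, and the output can then be run from a command prompt:

    SELECT 'bcp ' + DB_NAME() + '.' + s.name + '.' + t.name
         + ' out C:\export\' + t.name + '.dat -S MyServer -T -n'
    FROM sys.tables AS t
    JOIN sys.schemas AS s ON s.schema_id = t.schema_id;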
This is for SQL Server
Backup & Restore
To take a copy of an entire database with SQL Server, you can do a BACKUP and RESTORE:
BACKUP: http://msdn.microsoft.com/en-us/library/ms186865.aspx
RESTORE: http://msdn.microsoft.com/en-us/library/ms186858.aspx
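A minimal T-SQL sketch (the database name, paths, and logical file names below are placeholders; use RESTORE FILELISTONLY to find the real logical names):

    -- On the source server
    BACKUP DATABASE MyDatabase
    TO DISK = 'C:\Backups\MyDatabase.bak'
    WITH INIT;

    -- On the target server
    RESTORE DATABASE MyDatabase
    FROM DISK = 'C:\Backups\MyDatabase.bak'
    WITH MOVE 'MyDatabase' TO 'D:\SQLData\MyDatabase.mdf',
         MOVE 'MyDatabase_log' TO 'D:\SQLData\MyDatabase_log.ldf',
         REPLACE;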
Export and Import
You can right-click a database in SQL Server Management Studio and, under Tasks, click on Export Data. Follow the wizard to choose the objects you want to export and put them into the appropriate location.
Custom SSIS for Raw format
Build an SSIS package that reads data from the source table and puts it into a RAW file on disk for later use. Raw files hold the structure of the table as well as the data.
DB2 for Linux, UNIX, and Windows has a utility called db2move, which iterates through all the tables and dumps their contents to flat files via the EXPORT command; paired with db2look to generate the DDL, it lets you rebuild the database from scratch.
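A rough sketch of that pairing from the command line (MYDB is a placeholder database name):

    db2move MYDB export
    db2look -d MYDB -e -o mydb_ddl.sql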
How can I export table data with column names to a text file in SQL Server 2005?
The safest way to do that is to back up the data and then restore it on the other server.
Right-click the database in SQL Server Management Studio > Tasks > Back Up, and choose a directory for the backup file. Zip it (saves a lot of space), move it to the other location, create a database with the same name, then right-click that database > Tasks > Restore.
In backup and restore, always choose the overwrite option on the Options page; otherwise the backup will be appended to the existing backup set.
Another way to do it is to right-click the database, Tasks, Export Data, and move the data to the other location. That is more involved, though; backup works faster, and you should use it unless you want a partial move, such as some tables and some stored procedures.
If it's a small amount of data, you can upgrade your Management Studio to SSMS 2008 (you do not need to upgrade your server). The Results tab has the ability to copy data with headers, so you can cut and paste into Excel.
A larger amount of data can be extracted in several ways, such as an SSIS package, the Import/Export wizard, or a pull through ODBC from Excel.
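Another scriptable route, just as a sketch with placeholder names, is sqlcmd, which writes the column header row by default when sending query results to a file (it also emits a dashed separator line under the header, which you may want to strip afterwards):

    sqlcmd -S MyServer -d MyDatabase -E -Q "SELECT * FROM dbo.MyTable" -s "," -W -o C:\export\MyTable.txt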
You can do this using an SSIS package.
I am replacing an Access application with a web app, but the client is using SQL Server 2000, and I am using SQL Server 2008.
So, I have the database redesigned, with foreign keys, but now I need to get the data on the client's system.
Part of the problem is that they have images that are over 32k, so osql failed as the command buffer filled up.
I should be able to use osql to import the new schema at least, and perhaps all of the data except for the images.
The Export wizard just wouldn't work, even though I tried the Native SQL Driver and the OLE DB Sql Driver.
Flat files seem like a bad choice, as I don't know if they can handle the images.
So, what is a good way to copy a 330M database from 2008 -> 2000?
Not sure about performance or time needed, but you could always try a tool like
Red-Gate SQL Compare / SQL Data Compare
Apex SQL Diff / SQL Data Diff
These will allow you to compare both the schema and the data of two databases, and to create synchronization scripts or synchronize online.
Marc
I set the image column to null, which reduced the size of the insert statements.
This enabled me to import the data into the target database.
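For reference, that nulling step is just a plain update; the table and column names here are placeholders, not from the original database:

    UPDATE dbo.Documents SET ImageData = NULL;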