I've created the structure of my database first in PhpMyAdmin and exported it to a .sql file.
Now I'm looking everywhere in SQL Server Management Studio where I can import/add the data in a new database.
Does anybody know where to look or what to click?
I'm using the 2014 version (CTP2).
If you have a .sql file which contains SQL statements, you can just copy and paste the contents (or open the file in a query window) and run it. This assumes it has all of the create table etc. statements to create the schema/structure and not just insert statements for the data.
Check the top of the file to make sure that it is first selecting the correct database; if not, add a USE statement to select the correct database.
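For example, a minimal sketch (MyDatabase is a placeholder for your database name):

USE MyDatabase;   -- switch to the target database before the CREATE TABLE / INSERT statements run
GO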
You didn't say how big the file is, but if it is quite large and has the insert statements (data as well as schema), then you'll probably want to run it from the command line with the sqlcmd utility. Much faster, and SSMS won't freak out.
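A rough sketch of such an invocation (the server name, database name, and file name are placeholders):

sqlcmd -S localhost -d MyDatabase -E -i script.sql

Here -E uses Windows authentication; use -U and -P instead if you connect with a SQL login.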
An alternative to running the .sql file is to set up an ODBC data source for MySQL and just access the database itself over ODBC.
Bear in mind that there are real and very annoying differences between mysql and t-sql that can make migration a pain. If you're just creating a few tables, it may not be an issue, but if there are a ton of tables with lots of fields of different data types, you may run into issues.
If you are looking to import the table structure, you can copy and paste the content and run it in a query window inside SSMS. Beware of syntax differences between MySQL and SQL Server: you will most likely get errors, and you will need to convert your SQL script from the MySQL dialect to the SQL Server dialect (or just fix the offending statements manually if there are not too many). If you set the databases to a SQL standard-compatibility mode at the very beginning, you will have much less trouble.
If you are only looking to import the data into existing tables in SQL Server, you can do the same (i.e. copy-paste and run in a query window). You will have less trouble with that.
Open the server, expand "Databases", right-click the database, go to "Tasks" and then "Import Data...".
I have had the most trouble-free success importing to SQL Server via a flat file (comma-delimited .txt file); the only stipulation when creating the flat file (e.g. from Access) is to make sure the text qualifier is set to {none} and not "".
To import the file: in SQL Server Management Studio, right-click on Databases and create a new database. Then right-click on the new database -> Tasks -> Import Data... The import window opens: in the Data Source option select Flat File Source and select the .txt file, then click Next. In the Destination field select SQL Server Native Client 11.0 and go through the import process. This worked very well for me.
I accidentally deleted years of data from a table in SQL Server Management Studio. Unfortunately the person in this position before me didn't back anything up, and neither did I before I tried to fix an issue. I understand that it cannot be retrieved from SQL Server, but I have all the data I need in a separate file on my desktop. Is there any way to get that data back into the table in SQL Server? Or is there a query I can run to input the data into the table again? I'm not sure if I am making any sense :/
You can also use Management Studio without SSIS. Right-click on the database in Management Studio and select Tasks -> Import Data. You should then be able to select the type of source (flat file) and the format. The rest of the wizard is pretty self-explanatory.
If it is a flat file like .txt or .csv, or even an Excel file (.xls), you can build an SSIS package and dump the data into a new table. It depends on what kind of data you have in hand.
What is the best method to import data from an Excel worksheet? As of now I am using SSMS Express, so I don't have access to the SQL Import Wizard. I also don't have permission to execute the BULK INSERT command.
My current workflow is as follows: clean up the Excel file, save it as CSV, and import it into a SQLite database. Then use an IDE like RazorSQL to generate SQL INSERT statements.
This worked nicely until I hit an Excel file of about 75,000 rows. SSMS just gives an error saying "query finished with errors" or something like that; no error message is shown. I tried adding GO at the end of each line, but then I got an out-of-memory error.
What are my options?
To answer your question, in my experience the best method to import data from Excel has been to read the Excel data into C#, do any clean-up and formatting as necessary (since Excel likes to mess with the data), then use SqlBulkCopy (you only need select/insert permissions) to insert into SQL Server. See this SO answer if you need help reading Excel from C#.
Update: Given that you're not a dev, try using the bcp utility instead (you should only need select/insert permission). You may need to save the Excel file as CSV first, then import it directly into SQL Server; see this SO answer.
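A rough sketch of such a bcp import (database, table, server, and file names are placeholders; -c treats the data as character data and -t sets the field terminator):

bcp MyDatabase.dbo.MyTable in C:\data\mydata.csv -S localhost -T -c -t,

-T uses Windows authentication; swap in -U and -P if you connect with a SQL login.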
You can use the following:
bcp utility (between a file-system data dump and the database),
OPENQUERY (can be used from SSMS, works between an external data source like Excel/CSV and the database),
BULK INSERT (can be used from SSMS, works between an external file with user-defined structure and the database; see the sketch after this list),
SSIS (usually as a dtsx package, has its own GUI, works with various sources and destinations),
Set of INSERT statements (all of them one after another, optionally sliced with GO or packed with UNION ALL),
Set of records serialized into an XML variable (can be used from SSMS only; you have to serialize/deserialize it yourself using FOR XML and the XML functions).
There are surely other possibilities, but these are perhaps the most used ones.
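As an illustration of the BULK INSERT route, a minimal sketch (the table name, file path, and delimiters are placeholders, and the target table is assumed to exist already):

BULK INSERT dbo.ImportedData
FROM 'C:\data\import.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2   -- skip the header row
);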
EDIT: It seems to me that you could try adding GO after every 5-10K lines in your script.
If that doesn't work, XML serialization/deserialization could be the way to go.
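For instance, the generated INSERT script might be sliced like this (table and values are placeholders):

INSERT INTO dbo.ImportedData (Id, Name) VALUES (1, 'a');
INSERT INTO dbo.ImportedData (Id, Name) VALUES (2, 'b');
-- ... several thousand more INSERTs ...
GO
INSERT INTO dbo.ImportedData (Id, Name) VALUES (5001, 'x');
-- ... and so on, with GO every 5-10K statements so SSMS sends smaller batches.
GO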
Could you use a linked server to connect to the Excel Document? How to use Excel with SQL Server linked servers and distributed queries
A quick and dirty workaround: pull the rows in batches of 50k.
select * from employee limit 50000          -- first 50,000 rows
select * from employee limit 50000, 50000   -- next 50,000 rows (offset, row count)
From http://www.razorsql.com/articles/mysql_limit_query.html
I want to create a script that exports data, tables, and views to a SQL script.
I have SQL Server 2008 R2.
So far I've only been able to automatically generate a SQL script for all tables and views, but the data wasn't included.
Or is there an easier way to export data, tables, and views from one SQL Server to my ISP's SQL Server?
Regards
Tea
If for some reason a backup/restore won't work for you, SSMS's Generate Scripts tool includes an advanced scripting option to include data.
Here are some options to think over (prioritised in terms of what I would recommend):
A simple backup and restore will be the easiest and quickest solution;
Using a data scripting tool (like Red Gate's Data Compare) could solve your needs;
Use the database comparison tools that ship with Visual Studio;
An SSIS package could be developed to pump data back and forth between the two instances; or
Write your own script, using SET IDENTITY_INSERT ON/OFF for the identity-seeded tables (a sketch follows below this list).
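A minimal sketch of that last option (the table and column names are placeholders; it assumes CustomerID is an identity column and that an explicit column list is supplied):

SET IDENTITY_INSERT dbo.Customers ON;

INSERT INTO dbo.Customers (CustomerID, CustomerName)
VALUES (1, 'Alice'), (2, 'Bob');

SET IDENTITY_INSERT dbo.Customers OFF;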
The easiest way to do this is to create a backup, copy the .bak file to the other server, and restore the backup there.
Like #jhewlett said, that will be the best way to do it. To answer the question in the comment section: no, it shouldn't be a problem. Just make sure that the SQL Server versions are the same. I had a bit of an issue not too long ago where two PCs had different releases of R2 installed and I couldn't restore the backup. Another thing you can do is script the entire database with data, but this isn't recommended as it could take a long time to generate the script and for it to finish running on the other computer.
Or you can simply stop the SQL Server instance, copy the database files onto an external hard drive, and re-attach them on the other server. Just remember to start the instances again after doing this.
I use Navicat Premium for this kind of thing with MySQL. It generates SQL from data, tables, views, and anything else, and it provides tools to copy or synchronize tables from one database to another across servers or platforms. For example, I use it all the time to transfer my tables from MySQL to a SQLite database; it's easy and fast. Otherwise I would have to transfer them manually, which is a lot of trouble.
It's a very good tool for any DB admin or programmer. It supports MySQL, Oracle, MS SQL Server, PostgreSQL, and SQLite.
To generate a script with schema and data, follow these steps.
Select the database to script > right-click > Tasks > Generate Scripts > click Next in the popup window >
select the DB objects to script and click Next >
go to the Advanced options and scroll down >
find "Types of data to script" and select the option you need >
then click Next, Next, and finish it.
Enjoy it.
If you don't want to port all of the tables' data (for example you only need to port some base data in particular tables), the scripting options are not useful for you. In that case you have two choices. The first is to use a third-party tool such as Red Gate's; the second is to write the script yourself. I prefer the second option because, quite apart from the expensive price of most of those tools, I usually just want to run a small script for a little deleting, updating, and inserting. The remaining problem is that there may be far too many records to script record by record. I think a linked server is a good way to solve that: it's enough to declare a linked server as you see in the images, and then from your source DB you can write scripts that reach both the source and destination databases. The attached images should make this clear.
Create New Linked Server:
Write Destination SQL Server Address:
Fill Login Info:
Now you have Linked Server:
Write script and enjoy:
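As an illustration, a minimal sketch of such a script (the linked server, database, table, and column names are all placeholders):

-- Runs on the source server; DESTSERVER is the linked server created above.
INSERT INTO [DESTSERVER].[DestinationDb].[dbo].[BaseData] (Id, Name)
SELECT Id, Name
FROM [SourceDb].[dbo].[BaseData]
WHERE Id <= 100;   -- only the base rows you want to port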
Hope this helps.
I have a set of large CSV files with many columns each that I need to import into a SQL Azure database. Ordinarily I would use the import wizard in SQL Server Management Studio. However, the wizard does not appear to be an option when connecting to SQL Azure in SSMS. Is that correct? And if so, what is the recommended tool for accomplishing this task? I'm looking for a tool that will infer from the data what the columns should be allowing me to override the data type as needed. Since I have a lot of columns in each of the files I'd like to avoid the tedious work of manually writing the SQL code to generate the tables.
This worked for me:
Open SQL Server Management Studio
Connect to Azure
Right-click the database
Go to Tasks > Import Data
Select your flat file(s)
Set the Azure SQL database as the destination; the wizard can also save this workflow as an SSIS package
I sometimes get errors with CSV files this way, but either using an Excel file or inspecting the options of the CSV data columns in the Import Wizard should suffice.
Make sure you have appropriate permissions assigned to your user account.
They could've/should've made this easier, like an SFTP upload + insert, or a GUI import directly into Azure SQL like in Hue.
When you are transferring any data to SQL Database, the data should be structured. The process is to convert your CSV to a table structure and then migrate it directly to SQL Azure. You can actually write a stored procedure in SSMS to do it all in one go.
Because a CSV file could be tab-delimited, comma-delimited, or use any other character, you can do a BULK INSERT into a local DB first, as described here, and then sync the table to SQL Azure.
I would like to copy a table from one database to another. I know you can easily do the following if the databases are on the same SQL Server.
SELECT * INTO NewTable FROM existingdb.dbo.existingtable;
Is there any easy way to do this if the databases are on two different SQL Servers, without having to loop through every record in the original table and insert it into the new table?
Also, this needs to be done in code, outside of SQL Server Management Studio.
Yes. Add a linked server entry, and use SELECT INTO with the four-part database object naming convention.
Example:
SELECT * INTO targetTable
FROM [sourceserver].[sourcedatabase].[dbo].[sourceTable]
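If the linked server entry doesn't exist yet, here is a minimal sketch for creating it (the server name and login are placeholders, and this is only one of several ways to configure it):

EXEC sp_addlinkedserver
    @server = N'sourceserver',          -- network name of the remote SQL Server instance
    @srvproduct = N'SQL Server';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'sourceserver',
    @useself = N'FALSE',
    @rmtuser = N'remote_login',         -- hypothetical remote credentials
    @rmtpassword = N'remote_password';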
If it's only copying tables, then linked servers or generated scripts will work fine; but if the destination table already contains some data, then I'd suggest using a third-party comparison tool.
I'm using Apex Diff, but there are also a lot of other tools out there, such as those from Red Gate or Devart...
Third-party tools are not necessary of course, and you can do everything natively; they're just more convenient. Even if you're on a tight budget you can use them in trial mode to get things done.
Here is a good thread on a similar topic with a lot more examples of how to do this in pure SQL.
SQL Server (2012) provides another way to generate a script for a database with its objects and data. In our case, this script can be used to copy the tables' schema and data from the source database to the destination one.
Using the SQL Server Management Studio, right-click on the source database from the object explorer, then from Tasks choose Generate Scripts.
In the Choose Objects window, choose Select Specific Database Objects to specify the tables that you will generate the script for, then choose the tables by ticking the box beside each one. Click Next.
In the Set Scripting Options window, specify the path where you will save the generated script file, and click Advanced.
From the appeared Advanced Scripting Options window, specify Schema and Data as Types of Data to Script. You can decide from here if you want to script the indexes and keys in your tables. Click OK.
Back in the Set Scripting Options window, click Next.
Review the Summary window and click Next.
You can monitor the progress from the Save or Publish Scripts window. If there is no error click Finish and you will find the script file in the specified path.
The SQL scripting method is useful for generating one single script for the tables' schema and data, including the indexes and keys. But again, this method doesn't generate the tables' creation scripts in the correct order if there are relations between the tables.
Microsoft SQL Server Database Publishing Wizard will generate all the necessary insert statements, and optionally schema information as well if you need that:
http://www.microsoft.com/downloads/details.aspx?familyid=56E5B1C5-BF17-42E0-A410-371A838E570A
Generate the scripts?
Generate a script to create the table, then generate a script to insert the data.
Check out sp_generate_inserts for generating the data insert script.
Create the database with "Script Database as... CREATE To".
Within SSMS on the source server, use the export wizard with the destination server database as the destination.
Source instance > YourDatabase > Tasks > Export data
Data Source = SQL Server Native Client
Validate/enter Server & Database
Destination = SQL Server Native Client
Validate/enter Server & Database
Follow the wizard through to the end