I'm not a good SQL programmer; I've only got the basics. But I've heard of this BCP thing for fast data loading. I've searched the internet, and it seems to be a command-line-only utility, not something you can use in code.
The thing is, I want to be able to make very fast inserts and updates in a SQL Server 2008 database. I would like to have a function in the database that would accept:
The name of the table I want to execute an insert/update operation against
The names of the columns I'll be feeding data to
The data in a CSV format or something that SQL can read stupid-fast
A flag indicating whether the function should perform an insert or update operation
This function would then read this CSV string and generate the necessary code for inserting/updating the table.
I would then write code in C# to call that function passing it the table name, column names, a list of objects serialized as a CSV string and the insert/update flag.
As you can see, this is intended to be both fast and generic, suitable for any project dealing with large amounts of data, and thus a candidate for my company's framework.
Am I thinking right? Is this a good idea? Can I use that BCP thing, and is it suitable for every case?
As you can see, I need some directions on this... thanks in advance for any help!
In C#, look at SqlBulkCopy. It's what SSIS uses in the background.
For true bcp/BULK INSERT, you'd need bulkadmin rights, which may not be allowed.
Have you considered using SQL Server Integration Services (SSIS)? It's designed to do exactly what you describe. It is very fast. You can insert data on a transactional basis. And you can set it up to run on a schedule. And much more.
I have an SQLite3 database. I also have an SQL Server database with the same structure. I need to export the data from SQLite and insert it into the SQL Server database.
The export from SQLite and the modification of the generated export needs to be 100% scripted. Inserting into the SQL Server database will be done manually through SQL Server Management Studio.
I have a mostly good dump of the database through this answer here. I can modify most of the script as needed with sed.
The one thing I'm stuck on right now is that the SQLite database stores timestamps as number of seconds since UNIX epoch. The equivalent column in SQL Server is DATETIME. As far as I know, inserting an integer into a DateTime won't work.
Is there a way to specify that certain fields be converted a certain way upon dumping from SQLite? Meaning, specify that the integer fields be dumped as proper DateTime strings that SQL Server will understand?
Or, is there something I can run on the Linux command line that will somehow find these Integer timestamps and convert them?
EDIT: Anything that runs in a Bash script on Ubuntu is acceptable.
Three basic solutions: (1) modify the data before the dump; (2) manipulate the file after the dump; or (3) modify the data on import. Which you choose will depend on how much freedom you have to modify schemas.
If you wish to do it in SQLite, I'd suggest adding text columns with the dates stored as needed for import to SQL Server, then ignore or remove the original columns on dump. The SQLite doc page for datetime() may help, as might answers to this question.
Or, you can write a function in SQL Server that handles the import. Perhaps set it on an insert trigger.
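A minimal sketch of such a conversion function, assuming the raw epoch values are first loaded into an integer column (the table and column names here are made up):

    -- Hypothetical helper: convert seconds-since-UNIX-epoch to DATETIME.
    CREATE FUNCTION dbo.EpochToDateTime (@epochSeconds INT)
    RETURNS DATETIME
    AS
    BEGIN
        -- DATETIME has no time zone; this yields the UTC timestamp.
        RETURN DATEADD(SECOND, @epochSeconds, '1970-01-01');
    END;
    GO

    -- Example usage while copying from a staging table (names are made up):
    -- INSERT INTO dbo.Events (EventDate)
    -- SELECT dbo.EpochToDateTime(EventEpoch) FROM dbo.Events_Staging;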
Otherwise, a script that manipulates your dump file would work too. It sounds like you have a good handle on how to do this.
I want to create a script to export data, tables, and views to a SQL script.
I have SQL Server 2008 R2.
So far I've only been able to automatically generate a SQL script for all tables and views, but the data wasn't included.
Or is there any easier way to export data, tables, and views from one SQL Server to my ISP's SQL Server?
Regards
Tea
If for some reason a backup/restore won't work for you, SSMS' Generate Scripts tool includes an advanced scripting option to include data.
Here are some options to think over (prioritised in terms of what I would recommend):
A simple backup and restore will be the easiest and quickest solution;
Using a data scripting tool (like Red-Gate's Data Compare) could solve your needs;
Use the database comparison as part of Visual Studio.
A SSIS package could be developed to pump data back and forth between the two instances; or
Write your own script using the SET IDENTITY_INSERT ON/OFF command for the identity-seeded tables (see the sketch after this list).
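For that last option, a minimal sketch of what such a script looks like (the table and column names are made up):

    -- Allow explicit values to be inserted into the identity column.
    SET IDENTITY_INSERT dbo.Customers ON;

    INSERT INTO dbo.Customers (CustomerID, Name)
    VALUES (1, 'Acme'),
           (2, 'Contoso');

    -- Switch it back off; only one table per session can have it ON.
    SET IDENTITY_INSERT dbo.Customers OFF;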
The easiest way to do this is to create a backup, copy the .bak file to the other server, and restore the backup there.
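A rough sketch of the T-SQL involved, with placeholder paths and database name (the logical file names can be checked with RESTORE FILELISTONLY):

    -- On the source server:
    BACKUP DATABASE MyDatabase
    TO DISK = 'C:\Backups\MyDatabase.bak'
    WITH INIT;

    -- Copy the .bak file to the target server, then restore it there:
    RESTORE DATABASE MyDatabase
    FROM DISK = 'C:\Backups\MyDatabase.bak'
    WITH MOVE 'MyDatabase'     TO 'C:\Data\MyDatabase.mdf',
         MOVE 'MyDatabase_log' TO 'C:\Data\MyDatabase_log.ldf';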
As @jhewlett said, that will be the best way to do it. To answer the question in the comment section: no, it shouldn't be a problem. Just make sure the SQL Server versions are the same. I had a bit of an issue not too long ago where two PCs had different releases of R2 installed and I couldn't restore the backup. Another thing you can do is script the entire database with data, but I wouldn't recommend it, as it can take a long time to generate the script and for it to finish running on the other computer.
Or you can simply stop the SQL Server instance, copy the database files onto an external hard drive, and re-attach them on the other server. Just remember to start the instances again after doing this step.
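If you'd rather not stop the whole instance, the scripted equivalent of that file-level copy is detach/attach; roughly (database name and paths are placeholders):

    -- On the source server: detach so the files can be copied safely.
    EXEC sp_detach_db @dbname = N'MyDatabase';

    -- Copy the .mdf/.ldf files to the other server, then attach them there:
    CREATE DATABASE MyDatabase
    ON (FILENAME = 'C:\Data\MyDatabase.mdf'),
       (FILENAME = 'C:\Data\MyDatabase_log.ldf')
    FOR ATTACH;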
I use Navicat Premium for this kind of thing in MySQL. It generates SQL from data, tables, views, and anything else. It also provides tools to copy or synchronize tables from one database to another across different servers or platforms. For example, I use it a lot to transfer my tables from MySQL to a SQLite database; so easy and fast. Otherwise I would have had to transfer them manually, with much more trouble.
It's a very good tool and worth having for any DB admin or programmer. It supports MySQL, Oracle, MS SQL Server, PostgreSQL, and SQLite.
To generate a script with both schema and data, follow these steps.
Select the database > right-click > Tasks > Generate Scripts > click Next in the wizard >
select the DB objects to script and click Next >
go to the Advanced options and scroll down >
find 'Types of data to script' and select the option you need (schema only, data only, or schema and data) >
and then Next, Next, and finish it.
Enjoy it.
If you don't want to port all of the table data (for example, you only need to port some base data in a few special tables), the scripting options aren't useful for you. In that case you have two options: use a third-party tool such as Red-Gate's, or write the script yourself. I prefer the second option because, apart from the high price of most of those tools, I usually just want to run a small script for a little deleting, updating, and inserting. But here is the important problem: the record count may be too large to write scripts record by record. I think a linked server is a good way to solve that. Just declare a linked server as you see in the images below, then write a new script in your source DB with access to both the source and destination databases. The attached images should make it clear.
Create New Linked Server:
Write Destination SQL Server Address:
Fill Login Info:
Now you have Linked Server:
Write script and enjoy:
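A rough example of such a script, with made-up server, database, and table names:

    -- Copy rows from the local (source) table to the destination table on
    -- the linked server, skipping rows that already exist there.
    INSERT INTO [DESTSERVER].[DestDb].[dbo].[Customers] (CustomerID, Name)
    SELECT s.CustomerID, s.Name
    FROM dbo.Customers AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM [DESTSERVER].[DestDb].[dbo].[Customers] AS d
        WHERE d.CustomerID = s.CustomerID
    );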
Hope this helps.
Is there any handy tool that can make updating tables easier? Usually I get an Excel file with the original value in one column and the new value in another column. Then I write a formula in Excel to create the 'update' statement. Is there any way to simplify the updating task?
I believe the approach in SQL Server 2000 and 2005 would be different, so could we discuss them both? Thanks.
In addition, these updates are usually requested by "non-programmers" (meaning they don't understand SQL, so it may not be feasible to let them write queries). Is there any tool that would let them update the table directly without having DBAs do this task? Also, that tool needs to limit their privileges to modifying only certain tables, and ideally it should have a way to roll back changes.
Create a DTS package that will import a CSV file, make the updates, and then archive the file. The user can drop the file in a specific folder designated for the task, or this can be done by an ops person. Schedule the DTS package to run every hour, day, etc.
If your users insist on keeping Excel, you've got several different ways of getting the data transferred to SQL Server. My preferred one would be to use DTS/SSIS, as mentioned by buckbova.
However, another method is using OPENROWSET(), which makes it possible to query your Excel file as if it were a table. I wrote a small article about it here: http://blog.hoegaerden.be/2010/03/29/retrieving-data-from-excel/
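A rough example of what such a query can look like (the provider name, file path, and sheet name depend on your setup, and Ad Hoc Distributed Queries must be enabled on the server):

    -- Read a worksheet from an Excel file as if it were a table.
    SELECT *
    FROM OPENROWSET(
        'Microsoft.ACE.OLEDB.12.0',
        'Excel 12.0;Database=C:\Updates\NewValues.xlsx;HDR=YES',
        'SELECT * FROM [Sheet1$]'
    );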
Another approach that hasn't been mentioned yet (I'm not a big fan of letting regular users edit data directly in the DB): is there any possibility of creating a small custom application for them?
There you go, a couple more possible solutions :-)
Valentino.
I think the best approach is to expose a view on your data accessible to users who are allowed to do updates, and set up triggers on the view to perform the actual updates on the underlying data. Restrict change to only the columns they should be changing.
This technique can work on SQL Server 2000 and 2005.
I would add audit triggers on the underlying tables so you can always track changes.
You'll have complete control, and they can connect to it with Access or whatever and perform their maintenance.
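A minimal sketch of that view-plus-trigger setup, with made-up table and column names:

    -- Expose only the columns users are allowed to change.
    CREATE VIEW dbo.ProductPrices
    AS
        SELECT ProductID, Price
        FROM dbo.Products;
    GO

    -- Perform the real update on the underlying table.
    CREATE TRIGGER dbo.trg_ProductPrices_Update
    ON dbo.ProductPrices
    INSTEAD OF UPDATE
    AS
    BEGIN
        UPDATE p
        SET p.Price = i.Price
        FROM dbo.Products AS p
        JOIN inserted AS i ON i.ProductID = p.ProductID;
    END;
    GO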
You could create some accounts in SQL Server for these users and limit their access to only certain tables and columns, with only SELECT / UPDATE / INSERT privileges. Then you could create an Access database with linked tables to these.
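A sketch of the SQL Server side of that, with made-up login, user, and object names:

    -- A login/user limited to one object, with no other rights.
    CREATE LOGIN DataEditor WITH PASSWORD = 'use-a-strong-password-here';
    CREATE USER DataEditor FOR LOGIN DataEditor;

    GRANT SELECT, INSERT, UPDATE ON dbo.ProductPrices TO DataEditor;

    -- Column-level permissions are also possible, e.g.:
    -- GRANT UPDATE (Price) ON dbo.Products TO DataEditor;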
I am tasked with exporting the data contained inside a MaxDB database to SQL Server 200x. I was wondering if anyone has gone through this before and what your process was.
Here is my idea, but it's not automated.
1) Export data from MaxDB for each table as a CSV.
2) Clean the CSV to remove ? (which it uses for nulls) and fix the date strings.
3) Use SSIS to import the data into tables in SQL Server.
I was wondering if anyone has tried linking MaxDB to SQL Server or what other suggestions or ideas you have for automating this.
Thanks.
AboutDev.
I managed to find a solution to this. There is an open source MaxDB library that will allow you to connect to it through .Net much like the SQL provider. You can use that to get schema information and data, then write a little code to generate scripts to run in SQL Server to create tables and insert the data.
MaxDb Data Provider for ADO.NET
If this is a one time thing, you don't have to have it all automated.
I'd pull the CSVs into SQL Server tables and keep them forever; they will help with any questions a year from now. You can prefix them all the same, "Conversion_" or whatever. There are no constraints or FKs on these tables. You might consider using varchar for every column (or just the ones that cause problems, or not at all if the data is clean), just to be sure there are no data type conversion issues.
Then pull the data from these conversion tables into the proper final tables. I'd use a single conversion stored procedure to do everything (but I like T-SQL). If the data isn't that large (millions and millions of rows or less), just loop through and build out all the tables, printing log info as necessary, or inserting into exception/bad-data tables as necessary.
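A sketch of that pattern, with made-up table and column names; everything is loaded as varchar first and converted on the way into the real table:

    -- Staging table: everything as varchar, no constraints.
    CREATE TABLE dbo.Conversion_Orders (
        OrderID   VARCHAR(50),
        OrderDate VARCHAR(50),
        Amount    VARCHAR(50)
    );

    -- After loading the CSVs, convert and copy into the real table,
    -- keeping anything suspicious out (it can go to an exception table instead).
    INSERT INTO dbo.Orders (OrderID, OrderDate, Amount)
    SELECT CAST(OrderID AS INT),
           CAST(OrderDate AS DATETIME),
           CAST(Amount AS DECIMAL(18, 2))
    FROM dbo.Conversion_Orders
    WHERE ISNUMERIC(OrderID) = 1
      AND ISDATE(OrderDate) = 1;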
Via a web service, remote computers will be sending a set of rows to insert into our central SQL Server.
What is the best way (performance wise) to insert these rows? There could be anywhere from 50-500 rows to insert each time.
I know I can do a bulk insert or format the data as XML and insert it that way, but I've never done this in an enterprise setting before.
Update:
Using WCF web services (or maybe WSE, not sure yet) and SQL Server 2008 Standard.
Unless you're running on a 10-year-old computer, 50-500 rows isn't very many; you could literally send over SQL statements and pipe them directly into the database and get great performance. Assuming you trust the services sending you data, of course :-)
If performance really is an issue sending over a bcp file is absolutely the fastest way to jam data in the database. It sounds from your question that you already know how to do this.
A mere 50-500 records does not constitute a "bulk insert". The bulk insert mechanism is designed for really massive imports of data which are to be immediately followed up with a backup.
In the web service I would simply pass the XML into SQL Server. The specifics would be version dependent.
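On SQL Server 2005/2008 that typically means an xml parameter shredded with nodes(); a rough sketch, with made-up element, table, and column names:

    -- @rows is the XML sent by the web service, e.g.
    -- <rows><row id="1" name="Acme"/><row id="2" name="Contoso"/></rows>
    CREATE PROCEDURE dbo.InsertRowsFromXml
        @rows XML
    AS
    BEGIN
        INSERT INTO dbo.Customers (CustomerID, Name)
        SELECT r.value('@id',   'INT'),
               r.value('@name', 'NVARCHAR(100)')
        FROM @rows.nodes('/rows/row') AS t(r);
    END;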
What kind of web service is this?
If it's .NET, usually the best way is to load the input into a DataTable, then shoot it up to SQL Server using the SqlBulkCopy class.
50-500 rows shouldn't be a problem! There is no need to do performance tuning! Use normal (prepared) SQL statements in your application.
Don't kill it with complexity and over-engineering.
When you need to insert more than 250,000 rows, you should think about scaling!
Don't turn off the constraints; you might kill the DB.
To echo all the other answers, 500 rows is no issue for SQL Server. If you do need to insert a large number of records, the fastest way is with the built-in BULK INSERT statement, which (I believe) is the T-SQL counterpart of a SQL Server utility designed specifically for doing this, called bcp.exe.
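A minimal BULK INSERT example with a made-up file path and table (the file must be visible from the server, and you need appropriate bulk-load rights):

    -- Load a comma-separated file straight into a table.
    BULK INSERT dbo.ImportedRows
    FROM 'C:\Imports\rows.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2   -- skip the header line
    );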