Import data from Excel/CSV to SQL Server

I have some questions about importing data from Excel/CSV files into SQL Server. Let me first explain the overall scenario.
We receive data from multiple sources, in either Excel or CSV format. This data is to be imported into a table in SQL Server. Because we receive the data from multiple sources, we have a requirement to map the columns in the Excel files to the columns in our SQL Server table.
I understand that either DTS or the Import/Export Wizard is the way to import this data if we were to do the import manually, but I have the following questions:
Are there alternatives available to DTS/the Import/Export Wizard?
If I were to write an application for importing data, what are the .NET Framework classes that I would or could use? For some reason I don't want to use or build a SQL script within the application. What would be the best way of going about doing this?
Is there any way we can reduce the effort involved in mapping data?
Any suggestions or help would be most welcome.
Regards
Romi

Are there alternatives available to DTS/the Import/Export Wizard?
-- BULK INSERT.
If I were to write an application for importing data, what are the .NET Framework classes that I would or could use? For some reason I don't want to use or build a SQL script within the application. What would be the best way of going about doing this?
-- SSIS.
Is there any way we can reduce the effort involved in mapping data?
-- ?
SSIS is a very powerful tool. You may want to explore that option first. You can even build custom components using .NET.
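As a rough illustration of the .NET route without building SQL scripts inside the application, the sketch below uses the SqlBulkCopy class (System.Data.SqlClient) with ColumnMappings to push rows from a parsed CSV file into a SQL Server table. The connection string, file path, table and column names are made-up examples, and the CSV parsing is deliberately naive; driving the mappings from a per-source configuration file is one way to reduce the manual mapping effort.

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class CsvImporter
{
    static void Main()
    {
        // Hypothetical connection string and file path -- adjust for your environment.
        const string connectionString = "Server=.;Database=Staging;Integrated Security=true";
        const string csvPath = @"C:\imports\customers.csv";

        // Load the CSV into a DataTable (naive split; a real parser should handle quoting).
        var table = new DataTable();
        using (var reader = new StreamReader(csvPath))
        {
            string[] headers = reader.ReadLine().Split(',');
            foreach (string header in headers)
                table.Columns.Add(header.Trim());

            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line.Split(','));
        }

        // Bulk-load into SQL Server, mapping source columns to destination columns.
        using (var connection = new SqlConnection(connectionString))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "dbo.Customers";

            // These mappings could be read from a configuration file per data source,
            // which cuts down the manual mapping effort mentioned in the question.
            bulkCopy.ColumnMappings.Add("CustName", "CustomerName");
            bulkCopy.ColumnMappings.Add("CustEmail", "Email");

            bulkCopy.WriteToServer(table);
        }
    }
}
```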

Related

Ways to do Oracle DB to Cassandra DB migration or Import without local files

I'm trying to figure out a way to do a DB-to-DB migration without using local files (CSV). Is there any direct way to go from a SQL/Oracle DB to a Cassandra DB? Mostly I'm just going to import a table from Oracle to Cassandra. I have checked the methods I'm aware of, and they all use a temporary file to extract the data and then load it into Cassandra.
Can someone suggest any alternative methods, ideally something that can be carried out on Windows with a free ETL tool, or any other suggestions on how to do it?
Thanks
You used to be able to use Sqoop to import data, but it is deprecated now, so depending on which version of Cassandra you use you might still be able to employ it.
I would recommend using the DataStax Bulk Loader, but you will have to export from Oracle to JSON or CSV first. You'd probably also have to do some transformation on your data anyway, since it's unlikely you'll have the same data model in Cassandra as you did in Oracle.
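To illustrate the "export to CSV first, then bulk load" approach, here is a rough sketch that streams an Oracle table out to a CSV file using the Oracle.ManagedDataAccess ADO.NET provider; the resulting file can then be fed to the DataStax Bulk Loader (dsbulk). The connection string, table name and the minimal formatting are illustrative assumptions, not a complete solution.

```csharp
using System.IO;
using Oracle.ManagedDataAccess.Client; // Oracle's managed ADO.NET provider

class OracleToCsv
{
    static void Main()
    {
        // Hypothetical connection details and table -- replace with your own.
        const string connectionString = "User Id=scott;Password=secret;Data Source=ORCL";
        const string outputPath = "customers.csv";

        using (var connection = new OracleConnection(connectionString))
        using (var command = new OracleCommand("SELECT * FROM customers", connection))
        using (var writer = new StreamWriter(outputPath))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // Header row built from the result set's column names.
                var columns = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    columns[i] = reader.GetName(i);
                writer.WriteLine(string.Join(",", columns));

                // Data rows (naive formatting; quote/escape values as needed).
                while (reader.Read())
                {
                    var values = new object[reader.FieldCount];
                    reader.GetValues(values);
                    writer.WriteLine(string.Join(",", values));
                }
            }
        }
        // The file can then be loaded with the DataStax Bulk Loader, roughly:
        //   dsbulk load -url customers.csv -k my_keyspace -t customers -header true
        // (shown for illustration only; check the dsbulk documentation for your version)
    }
}
```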

Importing data from multiple SQL servers

We are looking at collecting data from partners' Microsoft SQL Servers and importing it into our own SQL Server. Part of what we want to do is to take all of their data separately and then combine it all together so that we can create baselines on how they are performing against one another comparatively. I am curious to learn what best practices or recommendations there might be to achieve this.
The easiest approach I can think of is to set them up as linked servers on our SQL Server and then write stored procedures (and automate a schedule using SQL Server Agent) to import the data from each into local tables. I've also started looking at 3rd-party systems to do this (e.g. stitchdata), but I'm not seeing ones that will import data back locally; most of them appear to import data into a cloud DB solution.
Has anyone done something similar before and can help steer us in the right direction?
Thank you!
To solve this problem using SQL tools, one approach is to create a staging database to load all the external information into.
To gather the data you can use SSIS packages to connect directly to the sources, and schedule the packages with SQL Server Agent.
I avoid using linked servers for ETL purposes for many reasons, but the most important to me are:
If the remote server is unavailable, the whole ETL process can break.
The process becomes strongly coupled to the source, and if the source changes you will need to reconstruct many things.
You may or may not use stored procedures to load and compare the tables between the staging database and the final database; it will depend on whether the databases are on the same server, on performance, etc.
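As a rough sketch of pulling from each partner server into a local staging table without a linked server (using plain ADO.NET rather than SSIS), you can stream a SqlDataReader from each source straight into SqlBulkCopy. The server names, credentials, table names and PartnerId layout below are made-up examples.

```csharp
using System.Data.SqlClient;

class PartnerStagingLoad
{
    static void Main()
    {
        // Hypothetical partner connection strings -- one per source server.
        string[] partnerConnections =
        {
            "Server=partner1.example.com;Database=Sales;User Id=etl;Password=...",
            "Server=partner2.example.com;Database=Sales;User Id=etl;Password=..."
        };
        const string stagingConnection = "Server=.;Database=Staging;Integrated Security=true";

        foreach (string partner in partnerConnections)
        {
            using (var source = new SqlConnection(partner))
            using (var destination = new SqlConnection(stagingConnection))
            {
                source.Open();
                destination.Open();

                // Stream rows from the partner server...
                using (var select = new SqlCommand("SELECT * FROM dbo.Orders", source))
                using (var reader = select.ExecuteReader())
                using (var bulkCopy = new SqlBulkCopy(destination))
                {
                    // ...straight into the local staging table, no linked server required.
                    bulkCopy.DestinationTableName = "staging.Orders";
                    bulkCopy.WriteToServer(reader);
                }
            }
        }
        // A scheduled job (SQL Server Agent, Windows Task Scheduler, etc.) can run this per
        // partner, and a later step can combine the staging tables to build the baselines.
    }
}
```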

XML: save data from SQL Server to file

I'd like to save data from a SQL Server database table to a file, then load it into another database that already has the same table created in it.
How can I do this? I know there should be some simple way of doing it, but searching Stack Overflow and Google isn't yielding good answers (or I'm not asking a good question).
SQL Server Import and Export Wizard
It's located under the SQL Server folder in the Start menu.
As far as I know there is no simple way to do this in SQL only.
Probably the best way to handle this is to create your own simple application, in whatever programming language, that will query the first database, write the file where it needs to be written, and also import the data into the other database.
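Since the question mentions XML, one simple way to do this from a small .NET app (along the lines suggested above) is to fill a DataSet from the source table, write it to an XML file with WriteXml, then read the file back and bulk-load it into the target table. This is only a sketch; the connection strings and table name are placeholders.

```csharp
using System.Data;
using System.Data.SqlClient;

class XmlTransfer
{
    static void Main()
    {
        // Hypothetical source/target connection strings and table name.
        const string sourceConn = "Server=SourceSrv;Database=Db1;Integrated Security=true";
        const string targetConn = "Server=TargetSrv;Database=Db2;Integrated Security=true";
        const string xmlFile = "customers.xml";

        // 1. Export the table to an XML file (schema included so column types survive).
        var data = new DataSet();
        using (var adapter = new SqlDataAdapter("SELECT * FROM dbo.Customers", sourceConn))
        {
            adapter.Fill(data, "Customers");
        }
        data.WriteXml(xmlFile, XmlWriteMode.WriteSchema);

        // 2. Load the file and bulk-insert into the identical table in the other database.
        var loaded = new DataSet();
        loaded.ReadXml(xmlFile, XmlReadMode.ReadSchema);

        using (var connection = new SqlConnection(targetConn))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "dbo.Customers";
            bulkCopy.WriteToServer(loaded.Tables["Customers"]);
        }
    }
}
```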

What is the best way to import data into SQL Server Express?

I want to import data into SQL Server Express from Access, Excel and txt files. I'm creating a decent database, and I must import this old, differently formatted data. When working with a few records, I copy and paste directly through the Visual Web Developer DB Explorer.
But now I'm dealing with quite a few more records (40k). I think copy/paste is unsafe, slow and unprofessional. I don't have any other interfaces to control SQL Server. How can I do this?
Thanks!
There is an "Import and Export Wizard" that comes with SQL Express. It allows you to import from Access, Excel, ODBC, SQL Client etc.
I don't think there's a clear answer, but I really think MS Access 2000 or higher is a very versatile tool for doing this.
Linking in tables and using append queries to other linked tables works really well, plus utilizing the power of VBA helps in some cases too, like calling a VBA function (such as InStr or Mid) from the query designer, if you're familiar with this.
Does anyone else agree?
The BCP (Bulk Copy) works well for importing into SQL Server: http://msdn.microsoft.com/en-us/library/ms162802.aspx
There is also the "bulk insert" command: http://msdn.microsoft.com/en-us/library/ms188365.aspx which has the caveat that the file must be physically accessible from the server.
Both of these methods can import comma delimited files, so you'd need to be able to create those from your data source.
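For the text/CSV files, a small .NET helper can run the BULK INSERT command mentioned above. The sketch below assumes a comma-delimited file with a header row and a hypothetical ImportedData table; note that the file path must be visible to the SQL Server service itself, per the caveat above.

```csharp
using System.Data.SqlClient;

class BulkInsertRunner
{
    static void Main()
    {
        // Hypothetical connection string for a local SQL Server Express instance.
        const string connectionString =
            "Server=.\\SQLEXPRESS;Database=MyDb;Integrated Security=true";

        // The path is resolved on the server, not on the machine running this program.
        const string sql = @"
            BULK INSERT dbo.ImportedData
            FROM 'C:\imports\data.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```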
I recommend loading all the objects from one SQL table into a JSON object, then indexing through the array of objects and translating them into the new table. I have some open-source MySQL-to-JavaScript bridge code that can help with this if you need it.
In case you have not found a solution to this yet, try http://www.razorsql.com/download_win.html
I am not affiliated with them, but I was looking for this same solution and this is working.

Programmatically export SQL Server views to Excel

I posed this question about 8 months ago, and the fact that we were running SQL Server 2000 seemed to be the limiting factor. We recently upgraded to SQL Server 2008 and I still can't find a solid solution to this problem.
We have an Access application interfacing with a SQL Server database, and we need to find a way to programmatically export a given view to an Excel spreadsheet, or at least an Excel-compatible format (CSV, tab-delimited, etc.). I can use bcp; however, several of the views contain fields with linefeeds in them, which proves troublesome when importing into Excel. These views are also varied and have unpredictable columns, so to the best of my knowledge using OPENROWSET is also not an option, as you need to have an Excel template with rows predefined.
Any help here would be appreciated. I know my way around Access and SQL Server, but my knowledge is somewhat limited.
When you say "programmatically", which language were you hoping to implement the solution in?
I would suggest SQL Server SSIS as a good starting point. If you need to code this dynamically, rather than using the BIDS/Visual Studio designer, there is plenty of support for this in the .NET libraries, allowing you to do it in C#/VB.
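If you go the SSIS route and want to kick a package off from code rather than from the designer, the Microsoft.SqlServer.Dts.Runtime namespace (in the ManagedDTS assembly) lets you load and execute a package from C#. A minimal sketch, assuming a hypothetical package file path; error handling and package variables are omitted.

```csharp
using System;
using Microsoft.SqlServer.Dts.Runtime; // from the Microsoft.SqlServer.ManagedDTS assembly

class RunExportPackage
{
    static void Main()
    {
        // Hypothetical path to a package built in BIDS/Visual Studio.
        const string packagePath = @"C:\packages\ExportViewToExcel.dtsx";

        var app = new Application();                  // SSIS runtime application object
        Package package = app.LoadPackage(packagePath, null);

        DTSExecResult result = package.Execute();     // run the package synchronously
        Console.WriteLine("Package finished with result: " + result);
    }
}
```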
If you want to export data into CSV you may write your own function to escape special characters in fields (it is a pain in one place; a rough sketch of that approach is shown at the end of this answer).
If you use SSIS to export your data into Excel it will put an apostrophe at the beginning of every cell.
If you are willing to spend some money you may consider using our Advanced ETL Processor.
First of all, it works correctly with Excel all the time. Plus, it has a data export component which allows you to select the tables/views to export using a mask.
You can use the scheduler to automate it, or you can run a package from the command line.
This online tutorial gives you a quick introduction to data export:
http://www.dbsoftlab.com/online-tutorials/advanced-etl-processor/advanced-etl-pro-exporting-data-from-mysql-database-into-text-files.html
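Following up on the "write your own escaping function" suggestion above, here is a rough C# sketch of a hand-rolled CSV export: it reads an arbitrary view and quotes any field containing commas, quotes or line breaks, which is exactly what trips up plain bcp output when the result is opened in Excel. The view name and connection string are placeholders.

```csharp
using System.Data.SqlClient;
using System.IO;

class ViewToCsv
{
    // Quote a value if it contains a delimiter, quote or line break (RFC 4180 style).
    static string Escape(object value)
    {
        string text = value?.ToString() ?? "";
        if (text.Contains(",") || text.Contains("\"") || text.Contains("\n") || text.Contains("\r"))
            return "\"" + text.Replace("\"", "\"\"") + "\"";
        return text;
    }

    static void Main()
    {
        // Hypothetical view and connection string.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";
        const string viewName = "dbo.vw_SalesReport";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT * FROM " + viewName, connection))
        using (var writer = new StreamWriter("export.csv"))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // Header row: works even for views with unpredictable columns.
                var header = new string[reader.FieldCount];
                for (int i = 0; i < reader.FieldCount; i++)
                    header[i] = Escape(reader.GetName(i));
                writer.WriteLine(string.Join(",", header));

                // Data rows, with each field escaped so embedded linefeeds survive.
                while (reader.Read())
                {
                    var fields = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = Escape(reader.GetValue(i));
                    writer.WriteLine(string.Join(",", fields));
                }
            }
        }
    }
}
```

Excel handles linefeeds inside quoted CSV fields, so this avoids the problem described in the question without needing an Excel template.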
