I posed this question about 8 months ago, and the fact that we were running SQL Server 2000 seemed to be the limiting factor. We recently upgraded to SQL Server 2008 and I still can't find a solid solution to this problem.
We have an Access application interfacing with a SQL Server database, and we need a way to programmatically export a given view to an Excel spreadsheet -- or at least an Excel-compatible file (CSV, tab-delimited, etc.). I can use bcp, but several of the views contain fields with linefeeds in them, which proves troublesome when importing into Excel. These views are also varied and have unpredictable columns, so to the best of my knowledge OPENROWSET is not an option either, as it requires an Excel template with the columns predefined.
Any help here would be appreciated. I know my way around Access and SQL Server, but my knowledge is somewhat limited.
When you say "programmatically", which language were you hoping to implement the solution in?
I would suggest SQL Server SSIS as a good starting point. If you need to code this dynamically rather than using the BIDS/Visual Studio designer, there is plenty of support for this in the .NET libraries, allowing you to do it in C# or VB.
If you want to export data to CSV, you may have to write your own function to escape special characters in fields (it is a pain, admittedly -- see the sketch below). Note that if you use SSIS to export your data into Excel, it will put an apostrophe at the beginning of every cell.
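As a rough illustration, here is a minimal sketch of such an escaping function in T-SQL (the name dbo.CsvQuote is made up): it wraps each value in double quotes and doubles any embedded quotes, which is the standard CSV convention Excel understands for fields containing commas or linefeeds.

    -- Minimal CSV-escaping helper (hypothetical name).
    -- Doubling embedded quotes and wrapping the value in quotes lets
    -- commas and linefeeds inside a field survive the trip into Excel.
    CREATE FUNCTION dbo.CsvQuote (@value NVARCHAR(MAX))
    RETURNS NVARCHAR(MAX)
    AS
    BEGIN
        RETURN '"' + REPLACE(@value, '"', '""') + '"';
    END;
    GO

    -- Usage against a view (view and column names are invented):
    SELECT dbo.CsvQuote(CustomerName) + ',' + dbo.CsvQuote(Notes)
    FROM dbo.SomeView;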
If you are willing to spend some money, you may consider using our Advanced ETL Processor.
First of all, it works correctly with Excel all the time.
Plus, it has a data export component which allows you to select the tables/views to export using a mask.
You can use the scheduler to automate it, or you can run a package from the command line.
This online tutorial gives you a quick introduction to data export:
http://www.dbsoftlab.com/online-tutorials/advanced-etl-processor/advanced-etl-pro-exporting-data-from-mysql-database-into-text-files.html
I want to import data into SQL Server Express from Access, Excel, and txt files. I'm creating a decent database, and I must import this old formatted data. When working with a few records, I copy and paste directly through the Visual Web Developer DB Explorer.
But now I'm dealing with a few more records (40k). I think copy/paste is unsafe, slow, and unprofessional. I don't have any other interfaces to control SQL Server. How can I do that?
Thanks!
There is an "Import and Export Wizard" that comes with SQL Express. It allows you to import from Access, Excel, ODBC, SQL Client, etc.
I don't think there's a clear answer, but I really think MS Access 2000 or higher is a very versatile tool for doing this.
Linking in tables and using append queries to other linked tables works really well, and utilizing the power of VBA helps in some cases too, such as calling a VBA function (like InStr or Mid) from the query designer, if you're familiar with that.
Does anyone else agree?
The bcp (bulk copy) utility works well for importing into SQL Server: http://msdn.microsoft.com/en-us/library/ms162802.aspx
There is also the BULK INSERT command: http://msdn.microsoft.com/en-us/library/ms188365.aspx which has the caveat that the file must be physically accessible from the server.
Both of these methods can import comma-delimited files, so you'd need to be able to create those from your data source; see the sketch below.
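For example, a minimal BULK INSERT sketch might look like the following (the table name and file path are hypothetical, and per the caveat above the file must be reachable from the server):

    -- Load a comma-delimited file into an existing table.
    BULK INSERT dbo.ImportTarget
    FROM 'C:\imports\data.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 2  -- skip the header row
    );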
I recommend loading all the objects from one SQL table into a JSON object and then indexing through the array of objects, translating them into the new table. I have some open-source MySQL-to-JavaScript bridge code that can help with this if you need it.
In case you have not found a solution to this yet, try http://www.razorsql.com/download_win.html
I am not affiliated with them, but I was looking for this same solution and it is working for me.
I would like to transfer a whole database from Informix to Oracle. We have an application which works on both databases; one of our customers is moving from Informix to Oracle and needs to transfer the whole database (the structure is the same).
We also often need to transfer data between Oracle/MSSQL/Informix, sometimes only one table rather than the whole database.
Does anybody know of a good program which does this kind of job?
The Pentaho Data Integration ETL tools (also known under the former name "Kettle") are available as open source, for cross-database migration and many other use cases.
From their data sheet:
Common Use Cases
- Data warehouse population with built-in support for slowly changing dimensions and junk dimensions
- Export of database(s) to text file(s) or other databases
- Import of data into databases, ranging from text files to Excel sheets
- Data migration between database applications
- ...
A list of input / output data formats can be found in the accepted answer of this question: Does anybody know the list of Pentaho Data Integration (Kettle) connectors list?
It supports all databases with a JDBC driver, which means most of them.
Check this question of mine; it includes some very good ideas: Searching for (freeware) database migration tool
You could give the Oracle Migration Workbench a try; see http://download.oracle.com/docs/html/B15858_01/toc.htm If you want to read Informix data into Oracle on a regular basis, using Heterogeneous Services might be a better option. Check for hs4odbc or dg4odbc, depending on the Oracle release you have; a sketch is below.
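For the Heterogeneous Services route, the rough shape, once the gateway and listener are configured, is a database link from Oracle to Informix. A minimal sketch with hypothetical names:

    -- Create a database link through the configured gateway
    -- (the TNS alias INFX and the credentials are placeholders).
    CREATE DATABASE LINK infx_link
      CONNECT TO "informix_user" IDENTIFIED BY "password"
      USING 'INFX';

    -- Informix tables can then be read (and copied) from the Oracle side:
    INSERT INTO my_table
      SELECT * FROM my_table@infx_link;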
I hope this helps,
Ronald.
I have done this in the past and it is not a trivial task. We ended up writing each table out to a pipe-delimited flat file and reloading each table into Oracle with Oracle SQL*Loader. There were a ton of Perl scripts to scrub the source data, and shell scripts to automate the process as much as possible and run things in parallel.
Gotchas that can come up:
1. Pick a delimiter that is as unique as possible.
2. Try to find data types that match the Informix ones as closely as possible, e.g. date vs. timestamp.
3. Try to get the data as clean as possible before dumping out the flat files.
4. HS will most likely be too slow.
This was done years ago. You may want to investigate GoldenGate (now owned by Oracle), which may help with the process (GoldenGate did not exist when I did it).
Another idea is to use an ETL tool to read Informix and dump the data into Oracle (Informatica comes to mind).
Good luck :)
sqlldr - Oracle's import utility
Here's what I did to transfer 50 TB of data from MySQL to Oracle: I generated CSV files from MySQL and used the sqlldr utility to load all the data from the files into the Oracle DB. It is the fastest way to import data. I researched this for a few weeks and ran a lot of benchmark test cases, and sqlldr is hands down the best and fastest way to import into Oracle.
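As an illustration, a minimal SQL*Loader control file for one such CSV might look like this (every name here is hypothetical, a sketch rather than a ready-made file):

    -- load_my_table.ctl: load one CSV file exported from MySQL.
    LOAD DATA
    INFILE 'my_table.csv'
    APPEND INTO TABLE my_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (
      id,
      name,
      created_at DATE "YYYY-MM-DD HH24:MI:SS"
    )

It would then be run with something like sqlldr user/password control=load_my_table.ctl.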
We have several SQL Server databases containing measurements from generators that we build. However, this useful data is only accessible to a few engineers, since most are unfamiliar with SQL (including me). Are there any tools that would allow an engineer to extract chosen subsets of the data in order to analyze it in Excel or another environment? The ideal tool would
protect the database from any accidental changes,
require no SQL knowledge to extract data,
be very easy to use, for example with a GUI to select fields and the chosen time range,
allow export of the data values into a file that could be read by Excel,
require no participation/input from the database manager for the extraction task to run, and
be easy for a newbie database manager to set up.
Thanks for any recommendations or suggestions.
First off, I would never let users run their own queries on a production machine. They could run table scans or some other performance killer all day.
We have a similar situation, and we generally create custom stored procedures for the users to "call", and only allow access to a backup server running "almost live" data.
Our users are familiar with Excel, so I create a stored procedure with ample parameters for filtering/customization, which they can easily call with something like:
EXEC YourProcedureName '01/01/2010','12/31/2010','Y',null,1234
I document exactly what the parameters do, and they generally are good to go from there.
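To make that concrete, the EXEC call above would match a procedure shaped roughly like this; every name and column here is made up, but the pattern is the same: plenty of optional filters, read-only logic inside.

    -- Hypothetical read-only reporting procedure with ample filter parameters.
    CREATE PROCEDURE dbo.GetMeasurements
        @StartDate  DATETIME,
        @EndDate    DATETIME,
        @ActiveOnly CHAR(1)     = 'N',
        @SerialNo   VARCHAR(50) = NULL,
        @UnitId     INT         = NULL
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT m.UnitId, m.SerialNo, m.ReadingTime, m.Value
        FROM dbo.Measurements AS m
        WHERE m.ReadingTime BETWEEN @StartDate AND @EndDate
          AND (@UnitId   IS NULL OR m.UnitId   = @UnitId)
          AND (@SerialNo IS NULL OR m.SerialNo = @SerialNo)
          AND (@ActiveOnly = 'N' OR m.IsActive = 1);
    END;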
To set up an Excel query, you'll need to set up the data sources on the user's PC (Control Panel - Data Sources - ODBC), which will vary slightly depending on your version of Windows.
From within Excel, you need to set up the "query", which is just the EXEC command from above. Depending on the version of Excel, it should be something like: menu - Data - Import External Data - New Database Query. Then choose the data source, connect, skip the table diagram maker, and enter the above SQL. Also, don't try to make one procedure do everything; make different ones based on what they do.
Once the data is on the excel sheet, our users pull it to other sheets and manipulate it at will.
Some users are a little advanced and "try" to write their own SQL, but that is a pain. I end up debugging and fixing their incorrect queries. Also, once you do correct the query, they always tinker with it and break it again. Using a stored procedure means they can't change it, and I can keep it with our other procedures in the source code repository.
I would recommend you build your own in Excel. Excel can make queries to your SQL Server Database through an ODBC connection. If you do it right, the end user has to do little more than click a "get data" button. Then they have access to all the GUI power of Excel to view the data.
Excel allows you to load the output of stored procedures directly into a tab. That, IMO, is the best way: users need no knowledge of SQL, they just invoke a procedure, and there are no extra moving parts besides Excel and your database.
Depending on your version of SQL Server, I would look at some of the excellent self-service BI tools that come with the later editions, such as Report Builder. This is like a stripped-down version of Visual Studio with all the complex bits taken out and just the simple reporting bits left in.
If you set up a shared data source that logs into the server with quite low access rights, then the users can build reports but not edit anything.
I would echo the comments by KM: letting the great unwashed run queries on a production system can lead to some interesting results, with the wrong query being used, massive table scans, Cartesian joins, etc.
I've looked around and can't seem to find anything that answers this specific question.
What is the simplest way to move data from an MS SQL Server 2005 DB to a Postgres install (8.x)?
I've looked into several utilities like "Full Convert Enterprise", etc, and they all fail for one reason or another, ranging from strange errors that make it blow up to inserting nulls rather than actual data (wth?).
I'm looking at a DB that is all tables except for a single view; no stored procs, functions, etc.
At this point I'm about to write a small utility to do it for me, I just can't believe that's necessary. Surely there's something somewhere that can do this? I'm not even too worried about cost, although free is preferable :)
I don't know why nobody has mentioned the simplest and easiest way: using the robust SQL Server Management Studio.
You just need the built-in SSIS import/export feature. Follow these steps:
Firstly, you need to install the PostgreSQL ODBC Driver for Windows. It's very important to install the correct version in terms of CPU arch (x86/x64).
Inside Management Studio, right-click on your database: Tasks -> Export Data.
Choose SQL Server Native Client as the data source.
Choose .Net Framework Data Provider for ODBC as the destination driver.
Set the Connection String to your database in the following form:
Driver={PostgreSQL ODBC Driver(UNICODE)};Server=;Port=;Database=;UID=;PWD=
On the next page, you just need to select which tables you want to export. SQL Server will generate a default mapping, and you are free to edit it. You'll probably encounter some type-mismatch problems, which take some time to solve. For example, if you have a boolean column in SQL Server, you should export it as int4.
Microsoft Docs hosts a detailed description of connecting to PostgreSQL through ODBC.
PS: if you want to see your installed ODBC drivers, check the ODBC Data Source Administrator.
Take a look at the Software Catalogue. Under Administration/development tools I see DBConvert for MS SQL & PostgreSQL. Probably there are other similar tools listed.
You can use the MS DTS functionality (renamed to SSIS in the latest version, I think). One issue with DTS is that I've been unable to make it do a commit after each row when loading the data into pg, which is fine if you only have a couple of 100k rows or so, but beyond that it's really very slow.
I usually end up writing a small script that dumps the data out of SQL Server in CSV format, and then use COPY WITH CSV on the PostgreSQL side (a sketch is below).
Both of those only take care of the data, though. Taking care of the schema is a bit harder, since data types don't necessarily map straight over, but it can easily be scripted together with a static load of the schema. If the schema is simple (just varchar/int data types, for example), that part can also easily be scripted off the data in INFORMATION_SCHEMA.
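For reference, the PostgreSQL side of that is a one-liner. A minimal sketch (the table name and path are hypothetical, and the file must be readable by the server process):

    -- Load a CSV dump, treating the first line as a header row.
    COPY my_table FROM '/tmp/my_table.csv' WITH CSV HEADER;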
Well, there are .NET bindings for MS SQL Server 2005 (obviously) and also for PostgreSQL, so it would only take a few lines of code to write a program that could transfer data safely from one to the other. The view would probably have to be ported manually, as Postgres doesn't use quite the same syntax for views as SQL Server.
This answer is to help summarize the current connection strings, since someone may have overlooked the comments.
The current versions of the ODBC connection strings are:
For 32-bit system
Driver={PostgreSQL UNICODE};Server=192.168.1.xxx;Port=5432;Database=yourDBname;Uid=postgres;Pwd=admin;
For 64-bit system
Driver={PostgreSQL UNICODE(x64)};Server=192.168.1.xxx;Port=5432;Database=yourDBname;Uid=postgres;Pwd=admin;
You can check the driver name by typing ODBC into the Windows search and opening the ODBC Data Source Administrator.
What is the best way to import highly formatted data from Excel to SQL Server?
Basically, I have 250+ Excel files that have been exported from a reporting tool in a format that our business users prefer. This is a third-party tool that cannot export data in any other format. I need to "scrub" these files on a monthly basis and import them into a database. I want to use SQL Server 2005.
File formats look like this:
    Report Name
    Report Description
                   MTH/DEC/2003   MTH/JAN/2004   MTH/FEB/2004
                   Data Type      Data Type      Data Type
    Grouping 1     1900           1700           2800
    Grouping 2     1500            900           1300
    Detail          300            500           1000
    Detail         1100            200            200
    Detail          100            200            100
You could write a simple parser application; there are many APIs that will handle reading Excel files.
I have written one in Java and it only took a day or two.
Here is one API.
Good luck!
EDIT: Forgot to mention you will also need a SQL API such as JDBC. Again, we use JDBC for the majority of our applications and it works great.
Personally, I would do it using SSIS. It might not be trivial to set up, as the file format looks relatively complex (though I suspect that will be true no matter what tool you use), but as long as the format stays consistent, it will run quickly each month, and SSIS packages are easy to put under source control. Since SSIS is part of SQL Server, it is easy to make sure all the servers have it available. The key is to have a good understanding of how that format relates to how you store data in the database. That's the hard part, no matter what tool you use.
Assuming that you have Microsoft Excel, you can also use Excel's own exposed ActiveX interface. More information here:
http://msdn.microsoft.com/en-us/library/wss56bz7(VS.80).aspx
You could use that in anything that can use ActiveX (C++, VB6, VB.NET, etc.) to create a parser as well, to follow up on what Berek said.
I have done this before with Perl and MySQL. I wrote a simple Perl script which parsed the file and output the contents to a .sql file. Then, either manually or from within the Perl script, open MySQL and run the .sql file.
This may seem a bit simplistic, but you could simply dump the data in CSV format and do some parsing of the output to convert it into INSERT statements for SQL, as sketched below.
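For instance, each row of the sample report above would become a statement along these lines (the table and column names are invented):

    -- One generated INSERT per data row of the report.
    INSERT INTO dbo.ReportData (RowLabel, Dec2003, Jan2004, Feb2004)
    VALUES ('Grouping 1', 1900, 1700, 2800);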
For a Java-based application, POI (http://poi.apache.org/) is pretty good for Excel integration.
You might want to look at CLR procedures and functions in SQL Server. With a CLR procedure, you could do all of your scrubbing in a VB or C# .NET application but still run the jobs from SQL Server just like any other stored procedure or UDF; the registration side is sketched below.
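The SQL Server side of that wiring is short. A minimal sketch, assuming a hypothetical .NET assembly ScrubberLib.dll that exposes a static Scrub method:

    -- CLR integration is off by default and must be enabled once.
    EXEC sp_configure 'clr enabled', 1;
    RECONFIGURE;
    GO

    -- Register the assembly and expose one of its static methods as a proc
    -- (assembly, class, and method names are all placeholders).
    CREATE ASSEMBLY ScrubberLib FROM 'C:\assemblies\ScrubberLib.dll'
        WITH PERMISSION_SET = EXTERNAL_ACCESS;  -- needed to read files from disk
    GO

    CREATE PROCEDURE dbo.ScrubReportFile @path NVARCHAR(260)
    AS EXTERNAL NAME ScrubberLib.[Scrubber.FileScrubber].Scrub;
    GO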