I've looked around and can't seem to find anything that answers this specific question.
What is the simplest way to move data from an MS SQL Server 2005 DB to a Postgres install (8.x)?
I've looked into several utilities like "Full Convert Enterprise", etc., and they all fail for one reason or another, ranging from strange errors that make them blow up to inserting nulls rather than actual data (wth?).
I'm looking at a DB that is all tables except for a single view; there are no stored procs, functions, etc.
At this point I'm about to write a small utility to do it for me, I just can't believe that's necessary. Surely there's something somewhere that can do this? I'm not even too worried about cost, although free is preferable :)
I don't know why nobody has mentioned the simplest and easiest way: using the robust MS SQL Server Management Studio.
You simply need to use the built-in SSIS Import/Export feature. You can follow these steps:
Firstly, you need to install the PostgreSQL ODBC Driver for Windows. It's very important to install the correct version in terms of CPU arch (x86/x64).
Inside Management Studio, right-click on your database: Tasks -> Export Data
Choose SQL Server Native Client as the data source.
Choose .Net Framework Data Provider for ODBC as the destination driver.
Set the Connection String to your database in the following form:
Driver={PostgreSQL ODBC Driver(UNICODE)};Server=;Port=;Database=;UID=;PWD=
On the next page you just need to select which tables you want to export. SQL Server will generate a default mapping, which you are free to edit. You'll probably encounter some type-mismatch problems that take some time to solve. For example, if you have a boolean column in SQL Server, you should export it as int4.
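One way around such mismatches is to choose the wizard's option to write a query (rather than copying tables directly) and cast explicitly on the SQL Server side. A hypothetical example (table and column names are made up):

SELECT OrderID,
       CAST(IsShipped AS int) AS IsShipped,              -- bit in SQL Server -> int4 in Postgres
       CONVERT(varchar(30), OrderDate, 126) AS OrderDate -- datetime exported as ISO 8601 text
FROM dbo.Orders;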
Microsoft Docs hosts a detailed description of connecting to PostgreSQL through ODBC.
PS: if you want to see your installed ODBC drivers, check the ODBC Data Source Administrator.
Take a look at the Software Catalogue. Under Administration/development tools I see DBConvert for MS SQL & PostgreSQL. Probably there are other similar tools listed.
You can use the MS DTS functionality (renamed to SSIS in the latest version, I think). One issue with DTS is that I've been unable to make it do a commit after each row when loading the data into pg, which is fine if you only have a couple of hundred thousand rows or so, but beyond that it's really very slow.
I usually end up writing a small script that dumps the data out of SQLServer in CSV format, and then use COPY WITH CSV on the PostgreSQL side.
Both of those only take care of the data, though. Taking care of the schema is a bit harder, since datatypes don't necessarily map straight over, but it can easily be scripted together with a static load of the schema. If the schema is simple (just varchar/int datatypes, for example), that part can also easily be scripted off the data in INFORMATION_SCHEMA.
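A rough sketch of both halves of that approach (file path, table and column names are illustrative):

-- PostgreSQL side: bulk-load the CSV dump produced from SQL Server
COPY customers FROM '/tmp/customers.csv' WITH CSV HEADER;

-- SQL Server side: column metadata from INFORMATION_SCHEMA, to script a matching schema
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
ORDER BY TABLE_NAME, ORDINAL_POSITION;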
Well, there are .NET bindings for MS SQL Server 2005 (obviously) and also for PostgreSQL, so it would only take a few lines of code to write a program that transfers data safely from one to the other. The view would probably have to be done manually, as Postgres doesn't use the same language for views as SQL Server.
This answer is here to summarize the current connection strings, because someone may have overlooked the comments.
The current ODBC connection strings are:
For 32-bit system
Driver={PostgreSQL UNICODE};Server=192.168.1.xxx;Port=5432;Database=yourDBname;Uid=postgres;Pwd=admin;
For 64-bit system
Driver={PostgreSQL UNICODE(x64)};Server=192.168.1.xxx;Port=5432;Database=yourDBname;Uid=postgres;Pwd=admin;
You can check the driver name by typing "ODBC" into Windows search and opening the ODBC Data Source Administrator.
I have a scenario where I get queries on a webservice that need to be executed on a database.
The source for these queries is a physical device, so I can't really change the input to my queries.
I get the queries from the device in MSSQL (T-SQL) syntax. Earlier the backend was SQL Server, so things were pretty straightforward: queries would come in and get executed as-is on the DB.
Now we have migrated to Postgres, and we don't have the option to modify the input data (the SQL queries).
What I want to know is: is there any library that will do this SQL Server/T-SQL to Postgres translation for me, so I can run the SQL Server queries through it and execute the resulting Postgres query on the database? I searched a lot but couldn't find much that would do this. (There are libraries that convert a schema from one to the other, but what I need is to translate SQL Server queries to Postgres on the fly.)
I understand there are quite a few nuances that differ between T-SQL and Postgres, so a translator will be needed in between. I am open to libraries in any language (preferably one that runs on Linux :) ), and any other suggestions on how to go about this would also be welcome.
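To give a flavour of the kind of differences I mean (the table and columns here are made up):

-- T-SQL as it arrives from the device
SELECT TOP 10 ISNULL(name, '') AS name, GETDATE() AS ts FROM readings;

-- What a translator would have to produce for Postgres
SELECT COALESCE(name, '') AS name, now() AS ts FROM readings LIMIT 10;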
Thanks!
If I were in your position I would have a look at upgrading your SQL Server to 2019 ASAP (as of today, you can find on Twitter that the officially supported, production-ready version is available on request). Then have a look at the Polybase feature they (re)introduced in this version. In short, it allows you to connect your MSSQL instance to other data sources (like Postgres) and query the data as if it were a "normal" SQL Server DB (via T-SQL); in the background your queries are translated into native pgsql and executed against the real source.
There are not many resources on this feature (in its 2019 incarnation) yet, but it seems to be one of the most powerful capabilities coming with this release.
This is what BOL says about it (unfortunately, it mostly covers the older 2016 version).
There is an excellent, yet very short, presentation by Bob Ward (Principal Architect @ Microsoft) from SQL Bits 2019 on this topic.
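For a flavour of what the setup looks like, here is a rough sketch based on the 2019 generic ODBC support; the server, database, table and credential names are illustrative, and the exact options may differ in your environment:

-- Requires a database master key and the PostgreSQL ODBC driver on the SQL Server box
CREATE DATABASE SCOPED CREDENTIAL pg_cred WITH IDENTITY = 'postgres', SECRET = 'your_password';

CREATE EXTERNAL DATA SOURCE pg_src WITH (
    LOCATION = 'odbc://pg-host:5432',
    CONNECTION_OPTIONS = 'Driver={PostgreSQL Unicode(x64)};Database=devicedb',
    CREDENTIAL = pg_cred,
    PUSHDOWN = ON
);

CREATE EXTERNAL TABLE dbo.readings (
    id   int,
    name nvarchar(100)
) WITH (LOCATION = 'devicedb.public.readings', DATA_SOURCE = pg_src);

-- Plain T-SQL from here on; PolyBase translates and pushes the work down to Postgres
SELECT id, name FROM dbo.readings WHERE id > 100;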
The only thing I can think of that might be worth trying is SQL::Translator. It's a set of Perl modules that have been around for ages but seem to be still maintained. Whether it does what you want will depend on how detailed those queries are.
The no-brainer solution is to keep a SQL Server Express in place and introduce Triggers that call out to the Postgres database.
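A very rough sketch of the trigger idea, assuming a linked server named POSTGRES has already been set up over the PostgreSQL ODBC driver (table and column names are made up):

CREATE TRIGGER trg_forward_readings ON dbo.readings
AFTER INSERT
AS
BEGIN
    -- Forward the newly inserted rows to the corresponding Postgres table
    INSERT INTO OPENQUERY(POSTGRES, 'SELECT id, name FROM readings')
    SELECT id, name FROM inserted;
END;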
If this is too heavy, you can look at creating a Tabular Data Stream (TDS is SQL Server network transport) gateway with limited functionality and map each possible incoming query with any parameters to a static Postgres query. This limits any testing to a finite, small, number of cases.
This way, there is no SQL Server, and you have more control than with the trigger option.
If your terminals only demand a limited dialect, this may be practical. Attempting a general translation is very likely to cost more than the devices would cost to replace (unless you have zillions already deployed).
There is an open implementation, FreeTDS, that you could use if you are happy with C or Java.
My question is this: Are there alternatives to ODBC that would allow us to connect our SQL Server to MS Access?
Here's the situation: My company works with a proprietary, SQL database (ProVenue) that up and decided to "no longer support ODBC" to MS Access, our front-end tool, without telling us.
We are currently migrating away from ProVenue, but in the meantime we're stuck with a vendor which "no longer supports" our ODBC connection(s). The vendor also has no incentive to help, since we're leaving in several months.
I've devised a workaround where I manually export the ProVenue tables (ASCII), proof them (yes, the export utility pulls data unreliably), convert them, and upload them into Access on a daily basis. That said, it is unreasonably time-consuming given the number of tables; this workaround could be a full-time job.
Do you know of any alternatives?
Do NOT consider using ADP. It has been dropped from Access 2013 and hence is a technology with no future.
From what you're saying, you don't "own" your own MSSQL database - you're simply connecting to an instance that the provider manages, correct? I would guess that they've disabled ODBC connections to MSSQL because they don't like the load placed on their servers and/or that they've decided they want to change some underlying structures and don't want to have to cope with anybody whining about those changes.
That said, do they allow direct MSSQL connections? Via SQL Management Studio, for example? If so, you should be able to define an export & import process which is less buggy than theirs, and simply re-point your Access database to the local copy of data. True, this would still require some (possibly automated) import process, so you'd be out of synch with the server, but it'd give you the solution.
You might try connecting an .adp file to the server, to see if they'll still let you access things in that manner. That would possibly require significant modifications to your Access solution, but would also be a bit easier on their servers than linked tables via ODBC.
You could have a look at Access Data Projects (ADP) which are tied directly to one SQL Server database. I don't think they use ODBC at all, but they have their own limitations, and of course, aren't available in older Access versions.
I'm looking for the best approach (or a couple of good ones to choose from) for extracting from a Progress database (v10.2b). The eventual target will be SQL Server (v2008). I say "eventual target", because I don't necessarily have to connect directly to Progress from within SQL Server, i.e. I'm not averse to extracting from Progress to a text file, and then importing that into SQL Server.
My research on approaches came up with scenarios that don't match mine:
Migrating an entire Progress DB to SQL Server
Exporting entire tables from Progress to SQL Server
Using Progress-specific tools, something to which I do not have access
I am able to connect to Progress using ODBC, and have written some queries from within Visual Studio (v2010). I've also done a bit of custom programming against the Progress database, building a simple web interface to prove out a few things.
So, my requirement is to use ODBC, and build a routine that runs a specific query on a daily basis. The results of this query will then be imported into a SQL Server database. Thanks in advance for your help.
Update
After some additional research, I did find that a Linked Server is what I'm looking for. Some notes for others working with SQL Server Express:
If it's SQL Server Express that you are working with, you may not see a program on your desktop or in the Start Menu for DTS. I found DTSWizard.exe nested in my SQL Server Program Files (for me, C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn), and was able to simply create a shortcut.
Also, because I'm using the Express version of SQL Server, I wasn't able to save the package I'd created. So, after creating the package and running it once, I simply re-ran the package and saved off my SQL for use in the future.
Bit of a late answer, but in case anyone else was looking to do this...
You can use a linked server, but you will find that the performance won't be as good as connecting directly via the ODBC drivers, and the translation of the data types may mean that you cannot access some tables. The linked server might be handy for exploring the data, though.
If you use SSIS with the ODBC drivers (you will have to use ADO.NET data sources) then this will perform the most efficiently, and as well you should get more accurate data types (remember that the data types within progress can change dynamically).
If you have to extract a lot of tables, I would look at BIML to help you achieve this. BIML (Business Intelligence Markup Language) can help you create dynamically many SSIS packages on the fly which can be called from a master package. This master package can then be scheduled or run ad-hoc and so can any of the child packages as needed.
Can you connect to the Progress DB using OLE? If so, you could use SQL Server Linked Server to bypass the need for extracting to a file which would then be loaded into SQL Server. Alternately, you could extract to Excel and then import from Excel to SQL Server.
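Since the question is constrained to ODBC, a minimal T-SQL sketch of the linked-server route might look like this (the DSN, login and Progress query are illustrative):

EXEC sp_addlinkedserver
     @server     = N'PROGRESS',
     @srvproduct = N'',
     @provider   = N'MSDASQL',
     @datasrc    = N'ProgressDSN';      -- a system ODBC DSN pointing at the Progress DB

EXEC sp_addlinkedsrvlogin
     @rmtsrvname  = N'PROGRESS',
     @useself     = 'FALSE',
     @rmtuser     = N'progress_user',
     @rmtpassword = N'progress_password';

-- The daily extract: run the query remotely and land the results in a staging table
-- (SELECT ... INTO creates the table; use INSERT ... SELECT on subsequent runs)
SELECT *
INTO   dbo.ProgressDailyExtract
FROM   OPENQUERY(PROGRESS, 'SELECT * FROM PUB.customer');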
I'm looking for a way that I can keep a database in one single file, no server hosting it, and with the ability to use ADO (In delphi, specifically TADOConnection and/or TADOQuery). Please pardon my lack of terminology on this one. I'm only familiar with SQL Server databases, and nothing about any others. In fact, the only other ways I know to read/write files are Plain Text, INI, and XML. As for any official "databases", I know nothing.
So what I would like to do is keep a single file as a database, similar to how QuickBooks has a single "Company File". I should not need anything to host the data, such as SQL Server. And it needs to be compatible with ADO, so I can use simple select, update, delete, etc. It doesn't need to be so complex as to have relations, security, etc., but it does need to support some of the same syntax as SQL Server, like commands such as join, alter, distinct, etc.
I'm looking for the lightest-weight method to do so. The files need to be flexible enough to copy/paste (so long as the application isn't using them), similar to an Excel file. In fact, my original idea was to use Excel, as I know I can use ADO, but I also don't want to require Microsoft's Excel drivers (it would presume that MS Office / Excel is installed on the user's computer). It's obviously going to need some drivers, but I need the most standard method, one that is compatible everywhere.
You can use MS-Access MDB files. They can be used via the Microsoft Jet 4.0 OLE DB engine (which has been built into Windows since at least Windows XP) and are perfect for local desktop DB applications, with the ability to create tables, PKs, indexes, queries/views, transactions, multi-user access, replication, compact/repair and much more, with close compatibility to MS SQL Server SQL syntax.
The MS-Access product (i.e. MS Office) does not have to be installed on the client machine.
There are no extra drivers or files to install, and it integrates completely with existing MS-Office products.
Edit: MDB files can also be protected/encrypted.
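For reference, a typical ADO connection string for an MDB file looks roughly like this (the path is illustrative):
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\MyCompany.mdb;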
You have several options for storing your data in a single database file:
SQLite
Firebird
Interbase
All of them can be accessed via ADO using an ODBC or OLE DB driver. My personal recommendation is Firebird, because it is free, fast, stable and has an embedded version.
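For example, an ADO connection through the Firebird ODBC driver can use a connection string roughly like this (the driver name is the one registered by the Firebird ODBC installer; path and credentials are illustrative):
Driver=Firebird/InterBase(r) driver;Dbname=C:\Data\MyApp.fdb;Uid=SYSDBA;Pwd=masterkey;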
This is a pretty useful comparison of a number of embedded databases. Of the ones tested these ones support (odbc), (oledb) or (both) and use a (single) file for the database:
Accuracer (odbc) (single)
NexusDB (odbc) (single v4 and newer)
Firebird (both) (single) - multiple odbc implementations and the commercial IBProvider supports three different ways to connect to the ADO components.
TurboDB (odbc) (single v4 and newer)
Note: Most of these also supply ADO.Net Providers as well.
The others in the comparison (Advantage, ElevateDB, DBISAM and Apollo) use a file-per-table/index scheme.
This should be simple. I'm trying to import data from Access into SQL Server. I don't have direct access to the SQL Server database - it's on GoDaddy and they only allow web access. So I can't use the Management Studio tools, or other third-party Access upsizing programs that require remote access to the database.
I wrote a query on the Access database and I'm trying to loop through and insert each record into the corresponding SQL Server table. But it keeps erroring out. I'm fairly certain it's because of the HTML and God knows what other weird characters are in one of the Access text fields. I tried using CFQUERYPARAM but that doesn't seem to help either.
Any ideas would be helpful. Thanks.
Try using the GoDaddy SQL backup/restore tool to get a local copy of the database. At that point, use the SQL Server DTS tool to import the data. It's an easy to use, drag-and-drop graphical interface.
What error(s) get(s) thrown? What odd characters are you using? Are you referring to HTML markup, or extended (e.g. UTF-8) characters?
If possible, turn on Robust Error Reporting.
If the problem is the page timing out, you can increase the timeout in the Administrator or with the cfsetting tag, or rewrite your script to process a certain number of rows and then forward to itself at the next start point.
You should be able to execute saved DTS packages in MS SQL Server from the application server's command line. Since this is the case, you can use <cfexecute> to issue a request to DTSRUNNUI.EXE. This is, of course, assuming you are on a server where the command is available.
It's never advisable to loop through records when a SQL Update can be used.
It's not clear from your question what database interface layer you are using, but it is possible with the right interfaces to insert data from a source outside a database if the interface being used supports both types of databases. This can be done in the FROM clause of your SQL statement by specifying not just the table name, but the connect string for the database. Assuming that your web host has ODBC drivers for Jet data (you're not actually using Access, which is the app development part -- you're only using the Jet database engine), the connect string should be sufficient.
EDIT: If you use the Jet database engine to do this, you should be able to specify the source table something like this (where tblSQLServer is a table in your Jet MDB that is linked via ODBC to your SQL Server):
INSERT INTO tblSQLServer (ID, OtherField)
SELECT ID, OtherField
FROM [c:\MyDBs\Access.mdb].tblSQLServer
The key point is that you are leveraging the Jet db engine here to do all the heavy lifting for you.