We have around 75 tables and around 100 stored procedures. We have created a custom NodeJS app with Sequelize to migrate the tables and their data, but we want to migrate the stored procedures too.
The only option we seem to have is to manually convert every stored procedure.
Manually converting each stored procedure is a tedious task, so is there any way other than converting the code by hand? I hope someone can guide/help me with this.
FYI:
SQL Server version: 16+
Postgres version: 12+
There soon will be: Amazon is launching an open-source tool under the Apache license that acts as a translation layer between traditional SQL Server applications and a Postgres database. The translation layer allows your code to keep running against its current SQL setup while the queries get translated for the Postgres DB. It's called Babelfish for PostgreSQL. It's slated for 2021 but is not currently available. https://babelfish-for-postgresql.github.io/babelfish-for-postgresql/
There is no possibility of automatically converting Transact-SQL procedures to PG PL/pgSQL functions, because of the many missing or differing features:
PG does not use the pessimistic locking that SQL Server applies by default.
PG does not support the nested transactions that SQL Server supports, so in those cases the behaviour will be different and the results will not be the same.
String data uses CI/AS collations by default in SQL Server, which PG does not fully support (for example, ICU collations are not supported for LIKE and raise an error).
PG does not conform to the SQL standard for string datatypes: PG has only CHAR/VARCHAR and no NCHAR/NVARCHAR, although strings in PG are effectively NCHAR/NVARCHAR.
PG supports function overloading, which SQL Server does not; a function written as generic code with the sql_variant datatype must be translated into overloaded functions.
PG does not really distinguish between functions and procedures (which is a lack of security), whereas SQL Server does.
There are many other features that are completely different. I am writing a series of papers about the differences between PG and SQL Server: the first one is about the performance of DBA queries, the second about COUNT performance, and the third is a complete panorama of the functional differences...
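To make the gap concrete, here is a deliberately trivial, made-up example (table and column names are invented): a T-SQL procedure and one possible hand translation to PL/pgSQL. Even at this size, the transaction handling, error handling and basic syntax all have to be rewritten by hand.

```sql
-- SQL Server (T-SQL) version: @variables, GETDATE(), explicit BEGIN TRAN / TRY...CATCH.
CREATE PROCEDURE dbo.add_order
    @customer_id INT,
    @amount      DECIMAL(10,2)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;
        INSERT INTO dbo.orders (customer_id, amount, created_at)
        VALUES (@customer_id, @amount, GETDATE());
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;
    END CATCH
END;
GO

-- One possible hand translation to PostgreSQL (PL/pgSQL): parameters instead of
-- @variables, now() instead of GETDATE(), and no nested BEGIN TRAN; the procedure
-- runs in the caller's transaction and errors are trapped with an EXCEPTION block.
CREATE OR REPLACE PROCEDURE add_order(p_customer_id integer, p_amount numeric)
LANGUAGE plpgsql
AS $$
BEGIN
    INSERT INTO orders (customer_id, amount, created_at)
    VALUES (p_customer_id, p_amount, now());
EXCEPTION
    WHEN others THEN
        RAISE;  -- re-raise and let the caller decide what to roll back
END;
$$;
```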
I have a scenario where I get queries on a webservice that need to be executed on a database.
The source for these queries is a physical device, so I can't really change the input to my queries.
I get the queries from the device in MSSQL. Earlier the backend was SQL Server, so things were pretty straightforward: queries would come in and get executed as-is on the DB.
Now we have migrated to Postgres and we don't have the option to modify the input data (the SQL queries).
What I want to know is: is there any library that will do this SQL Server/T-SQL to Postgres translation for me, so I can run the incoming SQL Server queries through it and execute the resulting Postgres query on the database? I searched a lot but couldn't find much that would do this. (There are libraries that convert a schema from one platform to the other, but what I need is to translate SQL Server queries to Postgres on the fly.)
I understand there are quite a few nuances that differ between T-SQL and Postgres, so a translator will be needed in between. I am open to libraries in any language (preferably one that runs on Linux :) ), and any other suggestions on how to go about this would also be welcome.
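For illustration, here is a made-up query of the kind the device might send, and what it would have to become on Postgres:

```sql
-- A (made-up) query as the device sends it, in T-SQL:
SELECT TOP 10 [Name], ISNULL([LastSeen], GETDATE()) AS [LastSeen]
FROM [dbo].[Terminals]
WHERE [Status] = 1
ORDER BY [LastSeen] DESC;

-- What would actually have to run on Postgres: LIMIT instead of TOP, COALESCE
-- instead of ISNULL, now() instead of GETDATE(), double quotes instead of brackets.
SELECT "Name", COALESCE("LastSeen", now()) AS "LastSeen"
FROM "Terminals"
WHERE "Status" = 1
ORDER BY "LastSeen" DESC
LIMIT 10;
```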
Thanks!
If I were in your position I would look at upgrading your SQL Server to 2019 ASAP (as of today, you can find on Twitter that the officially supported, production-ready version is available on request). Then have a look at the PolyBase feature they (re)introduced in this version. In short, it allows you to connect your MSSQL instance to other data sources (like Postgres) and query the data as if it were a "normal" SQL Server DB (via T-SQL); in the background your queries are transformed into native pgsql and served from your real source.
There are not many resources on this feature (in its 2019 form) yet, but it seems to be one of the most powerful features coming with this release.
This is what BOL says about it (unfortunately, it mostly covers the older 2016 version).
There is an excellent, yet very short, presentation by Bob Ward (Principal Architect @ Microsoft) that he gave at SQLBits 2019 on this topic.
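As a very rough sketch of what that setup looks like in T-SQL (server names, the ODBC driver string and the exact LOCATION format are placeholders that depend on your environment and driver):

```sql
-- A database master key must already exist before creating the scoped credential.
CREATE DATABASE SCOPED CREDENTIAL PgCredential
WITH IDENTITY = 'pg_user', SECRET = 'pg_password';

-- PolyBase (SQL Server 2019) generic ODBC external data source pointing at Postgres.
CREATE EXTERNAL DATA SOURCE PgSource
WITH (
    LOCATION = 'odbc://pg-host:5432',
    CONNECTION_OPTIONS = 'Driver={PostgreSQL Unicode};Database=reporting',
    PUSHDOWN = ON,
    CREDENTIAL = PgCredential
);

-- An external table that maps onto the remote Postgres table.
CREATE EXTERNAL TABLE dbo.pg_orders (
    order_id   INT,
    amount     DECIMAL(10,2),
    created_at DATETIME2
)
WITH (
    LOCATION = 'reporting.public.orders',
    DATA_SOURCE = PgSource
);

-- From here it is queried with plain T-SQL.
SELECT TOP 10 * FROM dbo.pg_orders ORDER BY created_at DESC;
```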
The only thing I can think of that might be worth trying is SQL::Translator. It's a set of Perl modules that have been around for ages but seem to be still maintained. Whether it does what you want will depend on how detailed those queries are.
The no-brainer solution is to keep a SQL Server Express instance in place and introduce triggers that call out to the Postgres database.
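A minimal sketch of that approach, assuming a linked server to Postgres via the ODBC driver (all names and the connection string are placeholders; for writes, OPENQUERY or EXEC ... AT is often more reliable than four-part names):

```sql
-- One-time setup: a linked server that points at the Postgres database through ODBC.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'PGLINK',
    @srvproduct = N'PostgreSQL',
    @provider   = N'MSDASQL',
    @provstr    = N'Driver={PostgreSQL Unicode};Server=pg-host;Port=5432;Database=reporting;Uid=pg_user;Pwd=pg_password;';
GO

-- Forward every insert on the local table to the Postgres copy.
CREATE TRIGGER trg_orders_to_pg ON dbo.orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO PGLINK.reporting.public.orders (order_id, amount, created_at)
    SELECT order_id, amount, created_at
    FROM inserted;
END;
```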
If this is too heavy, you can look at creating a Tabular Data Stream gateway (TDS is SQL Server's network transport) with limited functionality and map each possible incoming query, with any parameters, to a static Postgres query. This limits any testing to a finite, small number of cases.
This way there is no SQL Server at all, and you have more control than with the trigger option.
If your terminals demand only a limited dialect then this may be practical. Attempting a general translation is very likely to cost more than the devices would cost to replace (unless you have zillions already deployed).
There is an open implementation, FreeTDS, that you could use if you are happy with C or Java.
I am generally a Sql Server coder, but we have a client who wants to move a system from Sql Server to ORACLE due to the new licensing model of Sql Server.
I know that historically ORACLE has had no logical grouping of objects within a db/schema along the lines of a Sql Server schema. It's been a while since I've done any real ORACLE work though, so I'm just wondering if somewhere along the line they may have added such a construct?
The version of ORACLE we are porting the Sql Server database into is ORACLE 11g (11.2).
Traditionally, I've seen Oracle developers do this using just a prefix on table/view/object names. So for example a Sql Server object users.OPTIONS might become USR_OPTIONS in ORACLE. This works, to be sure, but it just feels really kludgey to me, as it's not so much an actual hierarchy as "forcing" one in by using contorted names.
Oracle has schema support in Oracle 11gR2. Oracle schemas are tied to a user. You'll have to (somewhat confusingly) create a user for each schema that you're creating. This isn't a big deal, but some people find it distasteful.
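A minimal sketch of what that looks like (user name, password and tablespace are placeholders):

```sql
-- In Oracle a schema is simply the set of objects owned by a user,
-- so "creating a schema" really means creating a user.
CREATE USER usr IDENTIFIED BY some_password
    DEFAULT TABLESPACE users
    QUOTA UNLIMITED ON users;

GRANT CREATE SESSION, CREATE TABLE, CREATE VIEW, CREATE PROCEDURE TO usr;

-- Objects owned by that user are then addressed much like SQL Server's schema.object:
CREATE TABLE usr.options (
    option_id   NUMBER(10)    PRIMARY KEY,
    option_name VARCHAR2(100) NOT NULL
);

SELECT option_name FROM usr.options;
```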
Oracle 12c Enterprise Edition has a feature called Multitenant that allows for multiple databases on the same Oracle server in much the same way that SQL Server allows out of the box.
For a new project we have to export data from a SQL Server 2012 database to a PostgreSQL database. We have the SQL Server schema but have to create one for PostgreSQL, and as far as possible we would like the schemas to match. Can anyone give any advice on the best way of converting a SQL Server schema to a PostgreSQL one? Are there any tools or scripts which will help? I have seen a PostgreSQL function for this, but to be honest I have no PostgreSQL experience, and our remit stops at the data being imported into PostgreSQL, so I would like to do everything from the SQL Server side (we are planning to use SSIS with the 64-bit ODBC driver for PostgreSQL to export the data once we have the schema created).
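To give an idea of the sort of mapping we are after, here is a made-up table as it exists on the SQL Server side and roughly what we would expect it to become in PostgreSQL (the types and defaults are the part that needs translating):

```sql
-- SQL Server 2012 definition:
CREATE TABLE dbo.Customers (
    CustomerID INT IDENTITY(1,1) PRIMARY KEY,
    FullName   NVARCHAR(200) NOT NULL,
    CreatedAt  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    IsActive   BIT           NOT NULL DEFAULT 1
);

-- Rough PostgreSQL equivalent (identity columns need Postgres 10+,
-- otherwise serial; NVARCHAR becomes plain varchar, BIT becomes boolean):
CREATE TABLE customers (
    customer_id integer GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
    full_name   varchar(200) NOT NULL,
    created_at  timestamp    NOT NULL DEFAULT (now() AT TIME ZONE 'utc'),
    is_active   boolean      NOT NULL DEFAULT true
);
```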
Although not free, I've used Toad Data Modeler for this in the past. We never used it on any particularly complex schemas, but it did do a good job of keeping schemas in sync between various DB platforms.
Your mileage may vary, but it's worth a look.
I don't know of a direct schema converter, but most data modeling tools offer such conversion functionality. We use Dezign for Databases; this tool has a "switch target DBMS" function. It is a data modeling tool just like the Toad Data Modeler mentioned before. With its database-independent modeling functionality you can keep schemas on different DB platforms in sync. For data synchronization (data pump) between different database platforms you can use DataDiff CrossDB.
I will have a Postgres database in production but want to use MS SQL (whatever edition) for reporting. So I would like to have replication set up where MS SQL subscribes to the Postgres database. Is this possible?
All heterogeneous replication scenarios are deprecated by Microsoft, and they now recommend building solutions using SSIS and CDC instead.
We load data from PostgreSQL into our SQL Server reporting database using SSIS and it works well, although we had to use a commercial OLE DB provider because of limitations (at that time) in the open-source one.
Actually copying the data is usually the easy part; most of the work comes in gathering requirements, understanding the data, transforming it, implementing logging and error handling, etc. SSIS can do some things for you right away (e.g. logging), but my general advice would be to use it primarily as a workflow tool and for simple data copying with minimal transformation logic (e.g. data type conversion). If something seems too difficult or clumsy in SSIS, you can put it into a stored procedure or script and call that from SSIS instead.
I've been using and following PostgreSQL for several years and am not aware of such a solution. If one exists, I'm concerned it might be complex or fragile. I would recommend regular export/imports via cron; between the export and the import you would need to take care of translating the formats.
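A minimal sketch of the two halves of such a job (paths, table and columns are placeholders; date and boolean formatting is exactly the translation step mentioned above):

```sql
-- PostgreSQL side, run from cron via psql: dump the rows to a CSV file.
COPY (SELECT order_id, customer_id, amount, created_at FROM orders)
TO '/var/exports/orders.csv'
WITH (FORMAT csv);

-- SQL Server side, run from a scheduled job: load the same file.
BULK INSERT dbo.orders
FROM 'C:\imports\orders.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```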
If your reporting actually happens in MS Excel or MS Access, I recommend looking into connecting them directly to PostgreSQL via ODBC.
Is it possible, using SQL Server Replication, to replicate data to AND from (bi-directionally) both Oracle and SQL Server? The schemas are completely different. Real-time would be a bonus.
I have already investigated Oracle GoldenGate, which seemed to do the job, although the licence cost is not insignificant!
I wondered if anyone has had any experience in replicating data across different schemas, and what other tools they employed? I realise this is a bit of an open-ended question but any advice and previous experiences would be most useful.
Thanks
Duncan
I recently had to create a solution to periodically import lots of data from different databases (most of the time Oracle databases) into a SQL Server database (a data warehouse). To do so, I used SQL Server Integration Services to create a package able to import, transform and insert the data as I wanted (since it came from heterogeneous sources too). This software comes with SQL Server, and the 2005 and later versions are really easy to use (graphical programming). In your case, you could trigger the packages you create when needed. I am not sure it is the best solution, since you would need to create an SSIS package for each direction (from Oracle to SQL Server and from SQL Server to Oracle).