Sails DB tables to models export

I was wondering if there is any way to export DB tables into the Sails model structure, so I can make an instant replica of an existing database in Sails and start using that database with Sails right away.
It's a kind of reverse migration (DB -> Sails).

Yes, as of just a couple of weeks ago a module has been created to do this for you. At present it is limited to a couple of database types, e.g. PostgreSQL and MySQL, but the publisher seems to be actively working on further database conversions as well.
To find all the info, check here:
https://www.npmjs.com/package/sails-inverse-model
I have used it to convert 53 PostgreSQL tables to Sails models.

Related

.Net Core Identity - migrate from Postgresql to SQL Server

I have a .NET Core 3.0 MVC website with Identity. The database is PostgreSQL (mainly because of better performance with geographic data). But since the only other person who can work with PostgreSQL has quit, I have to migrate to SQL Server (because of internal policy).
There isn't much information on the web about this specific migration.
I have a few ideas, but since setting up a test takes quite some time, I wanted to check here first.
Is it just a matter of copying all tables between the databases (copy/export the data, change the connection string, and people won't even notice the change)? Alternatives:
- write a small script using Entity Framework that copies all users to the new database with a default password; users then have to change their password on first login
- make people re-register
- a combination of the above
EDIT: the problem is not the tables and data types; my concern is the passwords and the hashes. Can I just copy all the values over to the SQL Server database, and will people be able to log in?
There is a password hash in the table, and I was thinking it might use other variables, such as the database engine, to create the hash.
If the application is built on the same stack, in your case .NET Core with ASP.NET Identity, then the hashes can be migrated with no issue at all. Hashing is handled entirely by .NET and is not bound to the underlying datastore.
Create the schema, populate it, and you will be good to go. No need to rehash or make your users change their passwords; just move the data.
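To see why the hash survives the move, note that ASP.NET Core Identity's V3 format packs everything needed for verification (a format marker, the PRF id, the iteration count, and the salt) into the stored Base64 string itself. Here is a minimal Python re-implementation of the verification step as a sketch; the function name is mine, and the field layout follows the open-source PasswordHasher:

```python
# Sketch: verify an ASP.NET Core Identity V3 password hash outside .NET.
# The stored value is self-contained, so it verifies identically no matter
# which database the column was copied from.
import base64
import hashlib
import hmac
import struct

_PRFS = {0: "sha1", 1: "sha256", 2: "sha512"}  # KeyDerivationPrf values

def verify_identity_v3_hash(stored_b64: str, password: str) -> bool:
    raw = base64.b64decode(stored_b64)
    if raw[0] != 0x01:                       # 0x01 marks the V3 format
        return False
    # PRF id, iteration count, salt length: three big-endian uint32 fields
    prf, iterations, salt_len = struct.unpack(">III", raw[1:13])
    salt = raw[13:13 + salt_len]
    subkey = raw[13 + salt_len:]
    derived = hashlib.pbkdf2_hmac(_PRFS[prf], password.encode("utf-8"),
                                  salt, iterations, dklen=len(subkey))
    return hmac.compare_digest(derived, subkey)
```

A hash produced while the app ran on PostgreSQL therefore verifies byte-for-byte identically after being copied into SQL Server; the database engine never enters the calculation.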
You will need to figure out which data types you are using in PostgreSQL and what their equivalents are in MSSQL. Most data types are the same or similar, while there might be a few with no direct equivalent.
There are lots of ways to move data between databases. One simple option in this case is to dump your Postgres DB using pg_dump. This gives you a text file of SQL statements that recreate the database; you can then modify those statements as necessary to run against your MSSQL database.

Connect with more than two databases using Django ORM, one database is legacy database without migrate into app

We want to connect to an existing remote database from settings.py. Can we use those tables directly through models, without running migrations from the app?
We know about legacy database connections, but the inspectdb command always asks us to migrate the connected database.
Is using the MySQL connector preferable, or is it out of standard? Please advise.
Thanks, your help is appreciated!
I think what you are looking for is the managed Meta option on your models. When you define a model, managed=True is the default. If you want to use an existing database without Django interfering with migrations, set managed=False.
See this part of the doc:
[...] If False, no database table creation or deletion operations will be performed for this model. This is useful if the model represents an existing table or a database view that has been created by some other means. This is the only difference when managed=False. All other aspects of model handling are exactly the same as normal. [...]
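As a minimal sketch of the quoted option (the model name, table name, and the "legacy" database alias are my own illustrative names):

```python
# Sketch: pointing a Django model at an existing legacy table.
from django.db import models

class LegacyCustomer(models.Model):
    name = models.CharField(max_length=200)
    email = models.CharField(max_length=254)

    class Meta:
        managed = False          # Django never creates/drops this table
        db_table = "customers"   # exact name of the existing table

# Query it like any other model, routed to the extra connection defined in
# settings.DATABASES["legacy"]:
# LegacyCustomer.objects.using("legacy").filter(name__startswith="A")
```

With managed=False, makemigrations will still record the model's state, but no CREATE/DROP TABLE is ever emitted for it, so the existing database is left untouched.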

Keep two different databases synchronized

I'm modeling a new microservice architecture migrating some part of a monolithic software to microservices.
I'm adding a new PostgreSQL database. The idea is to use that database in the future, but for now I still need to keep the old SQL Server database up to date, and also synchronize the PostgreSQL database whenever something new appears in the old one.
I've searched for ETL tools, but they are meant for moving data into a data warehouse, which is not what I need. I can't simply replicate the data, because the DB models are not the same.
Basically I need a way to detect new rows inserted into the SQL Server database, transform that information, and insert it into my PostgreSQL database.
Any suggestions?
PostgreSQL's foreign data wrappers might be useful. My approach would be to change the frontend to use PostgreSQL and let PostgreSQL handle the split via its various features (triggers, rules, ...).
Take a look at StreamSets Data Collector. It can detect changes in SQL Server and insert/update/delete to any DB that has a JDBC driver including Postgres. It is open source but you can buy support. You can also make field changes/additions/removals/renaming to the data stream so that the fields match the target table.
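If you end up rolling your own instead, a simple watermark poller captures the idea in the question: remember the highest key copied so far, fetch anything newer, transform it, and write it to the new schema. This is a sketch with assumed table and column names; it accepts any DB-API connections (e.g. pyodbc for SQL Server and psycopg2 for PostgreSQL, where the placeholder style is %s rather than ?):

```python
# Sketch: watermark-based change detection between two databases.
# Table/column names ("orders", "orders_v2") are illustrative assumptions.
def sync_new_rows(src_conn, dst_conn, last_id, transform):
    """Copy rows with id > last_id from source to target; return new watermark."""
    cur = src_conn.cursor()
    cur.execute(
        "SELECT id, name, amount FROM orders WHERE id > ? ORDER BY id",
        (last_id,))
    for row in cur.fetchall():
        doc = transform(row)  # remap the old schema to the new one here
        dst_conn.cursor().execute(
            "INSERT INTO orders_v2 (legacy_id, label, total) VALUES (?, ?, ?)",
            (doc["legacy_id"], doc["label"], doc["total"]))
        last_id = row[0]
    dst_conn.commit()
    return last_id
```

Run it on a timer (or from a SQL Server trigger writing to a queue table) and persist the returned watermark so a restart resumes where it left off.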

Streaming data from SQL to Mongo

I am working with industrial equipment that inserts some text data into a SQL Server 2008 database every time it cycles (about every 25 seconds). I am looking to forward that data to a Mongo database in real time for use with an internal Meteor application.
Would there be any obvious starting point? The closest answer I have found is at: https://github.com/awatson1978/meteor-cookbook/blob/master/cookbook/datalayer.md
Q: Well, how am I supposed to use the data in my SQL database then?
Through REST interfaces and/or exposing the SQL database as a JSON stream. We put the ORM outside of Meteor. So, the trick is to move your data from your SQL database into Meteor's Mongo database, and have Mongo act as an object store or caching layer.
Apologies, if it is something obvious.
You need to use Mongo, but as a simple repository for your MySQL database.
This maintains all of Meteor's characteristics and uses Mongo as a temporary repository for your MySQL or PostgreSQL databases.
A brilliant attempt at that is mysql-shadow by @perak (https://github.com/perak/mysql-shadow). It does what it says: it keeps Mongo synchronized both ways with MySQL and lets you work with your data in MySQL.
The bad news is that the developer will not continue maintaining it, but what is done is enough for simple scenarios where you don't have complex triggers that update other tables or the like.
This works with MySQL, of course, but if you look at the code, an MS SQL implementation would not be hard.
For fully featured synchronization you can use SymmetricDS (http://www.symmetricds.org), a very well tested database replicator. This involves setting up a new Java server, of course, but it is by far the best way to be sure you will be able to turn your Mongo database into a simple repository of your real MySQL, PostgreSQL, SQL Server, or Informix database. I have yet to check it myself.
For now MySQL Shadow seems like a good enough solution.
One advantage of this approach is that you can still use all the standard Meteor features, packages, Meteor deployment, and so on. You don't have to do anything but set up the sync mechanism, and you are not breaking anything.
Also, if someday the Meteor team spends some of the dollars it raised on SQL integration, your app is more likely to work as-is.
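If a simple one-way forwarder is enough for the 25-second cycle data, the core of it is just a row-to-document mapping plus an upsert keyed on the SQL primary key. A sketch in Python (the table, column, and collection names are assumptions):

```python
# Sketch: map one SQL cycle-data row to a Mongo document.
def cycle_row_to_doc(row):
    """Map a (cycle_id, machine, payload, recorded_at) row to a Mongo document."""
    cycle_id, machine, payload, recorded_at = row
    return {
        "_id": cycle_id,          # reuse the SQL key so re-runs upsert cleanly
        "machine": machine,
        "payload": payload,
        "recordedAt": recorded_at,
    }

# Forwarding loop (requires pyodbc and pymongo; shown for shape only):
# for row in sql_cursor.execute(
#         "SELECT cycle_id, machine, payload, recorded_at "
#         "FROM cycles WHERE cycle_id > ?", (last_id,)):
#     mongo_db.cycles.replace_one({"_id": row[0]},
#                                 cycle_row_to_doc(row), upsert=True)
```

Because Meteor observes the Mongo collection, documents upserted this way flow to connected clients without any extra plumbing.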

Backup and Restore through the Entity Framework

I'm working with a SQL Server database with about 50 tables and plenty of relationships between them. I have already written a backup and restore function which retrieves all data from the model, exports it to XML, and can then import it again into a clean database. But maintaining this import/export is a lot of work when there are major structural changes to the entity model. I want a more dynamic solution.
Is there a more dynamic way to export data from an entity model and import it back into a clean database?
Oh, before I forget... I don't have direct access to the database itself, nor to its connection. All I get, and all I can use, is this Entity Framework object...
I would pass this back to your server support team. It seems a bit strange for an app developer to have to worry about data backup/restore.
Have you tried the SQL Server Publishing Wizard? It creates a nicely formatted SQL file that can easily be moved between applications. I've also used it (with some search & replace based on regular expressions) to move data from SQL Server to Oracle...
Regards, Massimo
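As an illustration of the "dynamic" idea the question asks about, here is a language-neutral sketch in Python: instead of mapping each entity type by hand, walk each object's attributes by reflection, so the exporter keeps working after the model changes. An EF version would enumerate entity sets from the context metadata instead; all names here are mine:

```python
# Sketch: schema-agnostic export/import by reflection over plain objects.
import json

def export_entities(entities):
    """Serialize a list of plain objects to JSON using their attributes."""
    return json.dumps([vars(e) for e in entities], default=str, indent=2)

def import_entities(payload, factory):
    """Rebuild objects from the JSON produced by export_entities."""
    return [factory(**attrs) for attrs in json.loads(payload)]
```

Because nothing is hard-coded per table, adding or renaming a property changes the output automatically; only type-specific conversions (dates, blobs) would ever need hand-written handling.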
