I have an existing Oracle database with production data. I am planning to move to MongoDB, so I want to migrate the existing data from Oracle to MongoDB. The data models used in Oracle and MongoDB will be different.
I am planning to export all the data from the Oracle database as JSON using https://blogs.oracle.com/jsondb/generating-json-data. Once I have the JSON file with all the data, I will import it into MongoDB. In case the data extracted from Oracle is not in the shape I need, I will create a utility to split it into a separate JSON file for each collection (a rough sketch of that utility is below).
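Something like this minimal sketch is what I have in mind, assuming the Oracle export is a single JSON array of row objects each tagged with its source table name, and using pymongo; the file names, table-to-collection mapping, and connection URI are all placeholders:

```python
import json
from pymongo import MongoClient

# Assumption: the Oracle export is one JSON array of row objects,
# each carrying its source table name under a "table" key.
with open("export.json") as f:          # placeholder file name
    rows = json.load(f)

# Placeholder mapping from source table to target MongoDB collection.
TABLE_TO_COLLECTION = {"ORDERS": "orders", "CUSTOMERS": "customers"}

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
db = client["mydb"]                                # placeholder db name

# Bucket the rows per target collection.
buckets = {}
for row in rows:
    coll = TABLE_TO_COLLECTION.get(row.pop("table", None))
    if coll:
        buckets.setdefault(coll, []).append(row)

# Write one JSON file per collection (usable with mongoimport)
# and also insert the documents directly.
for coll, docs in buckets.items():
    with open(f"{coll}.json", "w") as out:
        json.dump(docs, out)
    db[coll].insert_many(docs)
```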
I need some suggestions: are there better ways to do this, and is my solution the right approach for the problem?
I want to reproduce the database schema of a private server based on the data returned by an AJAX response. My plan: convert the JSON to XML and use it as a database schema.
I want to use SQL Server 2012 as the RDBMS but don't know how to import XML as a database schema. (I know PostgreSQL works perfectly with XML, but I need SQL Server.)
Are there any tools for this in SQL Server, or external ones? Or any ideas on how to create a database schema based on AJAX response data?
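For what it's worth, here is a minimal sketch of the kind of utility I'm imagining: it infers a crude CREATE TABLE statement directly from the JSON, skipping the XML step entirely. The file name, table name, and type guesses are all placeholder assumptions:

```python
import json

# Crude mapping from Python JSON types to T-SQL column types (assumption).
TYPE_MAP = {int: "INT", float: "FLOAT", bool: "BIT", str: "NVARCHAR(255)"}

# Assumption: the AJAX response is a JSON array of flat objects,
# saved locally as "response.json" (placeholder file name).
with open("response.json") as f:
    records = json.load(f)

# One column per key seen in the data; first-seen type wins.
columns = {}
for rec in records:
    for key, value in rec.items():
        columns.setdefault(key, TYPE_MAP.get(type(value), "NVARCHAR(MAX)"))

ddl = "CREATE TABLE AjaxData (\n    " + ",\n    ".join(
    f"[{name}] {sqltype}" for name, sqltype in columns.items()
) + "\n);"
print(ddl)  # run the generated DDL in SSMS against SQL Server 2012
```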
I have a database called transportdb that contains several data tables with some data. I want to export this complete MySQL database, with all its data, using a SQL query in MySQL.
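Not a pure SQL-query answer, but the standard tool for a complete schema-plus-data export is mysqldump; here is a minimal sketch wrapping it from Python (the database name comes from the question, while the user and password are placeholders, and the mysqldump client is assumed to be installed):

```python
import subprocess

# Dump the complete transportdb database (schema + data) to a .sql file.
with open("transportdb_dump.sql", "w") as out:
    subprocess.run(
        ["mysqldump", "-u", "root", "-pYOUR_PASSWORD", "transportdb"],
        stdout=out,
        check=True,
    )
```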
Architectural/perf question here.
I have an on-premises SQL Server database with ~200 tables totaling ~10 TB.
I need to make this data available in Azure in Parquet format for Data Science analysis via HDInsight Spark.
What is the optimal way to copy/convert this data to Azure (Blob storage or Data Lake) in Parquet format?
Due to the manageability aspect of the task (~200 tables), my best shot was: extract the data locally to a file share via sqlcmd, compress it as csv.bz2, and use Data Factory to copy the file share (with 'PreserveHierarchy') to Azure. Finally, run PySpark to load the data and save it as .parquet (a rough sketch of that last step is below).
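The final PySpark step would look roughly like this sketch; the storage account, container names, paths, and CSV options are placeholder assumptions, and Spark is assumed to read the bz2-compressed CSVs transparently:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

# Placeholder account/containers; one folder per table under the copied share.
src = "wasbs://staging@myaccount.blob.core.windows.net/tables/ORDERS/*.csv.bz2"
dst = "wasbs://curated@myaccount.blob.core.windows.net/parquet/ORDERS"

# Spark decompresses .bz2 CSVs on read; header/schema options depend on
# how the sqlcmd extract was generated.
df = spark.read.option("header", "true").option("inferSchema", "true").csv(src)
df.write.mode("overwrite").parquet(dst)
```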
Given the table schema, I can auto-generate the SQL data-extract and Python scripts from the SQL database via T-SQL.
Are there faster and/or more manageable ways to accomplish this?
ADF matches your requirement perfectly, supporting both one-time and schedule-based data movement.
Try the Copy Wizard of ADF. With it, you can move on-prem SQL data directly to Blob/ADLS in Parquet format with just a couple of clicks.
Copy Activity Overview
I am a newbie and need guidance or resources to read. I have two databases: one in Azure SQL Server 2012 and the other in MongoDB at a remote location. I access the Azure SQL Server data using SQL Server Management Studio (SSMS) from my PC, and the MongoDB data in a browser using a REST API. The retrieved data is in JSON format.
For analysis I want to merge the data from MongoDB into SQL Server, but I don't know how to store the results of the REST API query as a table in SQL Server 2012. Note that the columns I want to retrieve from MongoDB are not sub-structured, so they fit easily in a relational database.
Azure SQL Database supports the OPENJSON function, which can parse JSON text and transform it into a table; see https://azure.microsoft.com/en-us/updates/public-preview-json-in-azure-sql-database/
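A minimal sketch of what that could look like from Python via pyodbc, where the JSON text stands in for your REST API result; the connection string, target table, and column shape are all placeholder assumptions:

```python
import pyodbc

# Placeholder connection string to Azure SQL Database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;UID=me;PWD=secret"
)

# Stand-in for the JSON returned by the MongoDB REST API.
json_text = '[{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]'

# OPENJSON with an explicit WITH schema turns the JSON array into rows;
# the INSERT persists them. Table and columns are placeholder assumptions.
sql = """
INSERT INTO dbo.MongoData (id, name)
SELECT id, name
FROM OPENJSON(?) WITH (id INT '$.id', name NVARCHAR(100) '$.name');
"""
cursor = conn.cursor()
cursor.execute(sql, json_text)
conn.commit()
```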
I'm new to Elasticsearch and I have a basic question.
I want to load data from a database and search it using Elasticsearch in an MVC.NET project, but because of the data in my database's tables I can't convert all of it to JSON and search it with Elasticsearch. How should I populate Elasticsearch with data from the database in an MVC.NET project? I don't want the whole solution, since that is impossible; just a general, brief explanation. Thank you very much.
First of all, you should be able to model your data from SQL to Elasticsearch, since Elasticsearch is a NoSQL, document-oriented database/search engine.
You need an indexer to index the SQL data into Elasticsearch.
Get all the columns associated with one record that you want to search in Elasticsearch from your SQL database (use joins if the data is spread across multiple tables).
Use a dedicated stored procedure to fetch only the needed data, construct a document class, serialize it to JSON, and index it in your Elasticsearch cluster.
Use the Elasticsearch.Net client, as it very neatly exposes the bulk index APIs (a sketch of this flow follows below).
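As an illustration only (the answer recommends the .NET client), here is the same flow sketched with the official Python elasticsearch client and pyodbc; the connection details, stored-procedure name, index name, and Id column are all placeholder assumptions:

```python
import pyodbc
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

conn = pyodbc.connect("DSN=mydb")            # placeholder SQL connection
es = Elasticsearch("http://localhost:9200")  # placeholder cluster URL

# Run the dedicated stored procedure that joins everything one record needs.
cursor = conn.cursor()
cursor.execute("EXEC dbo.GetSearchDocuments")  # placeholder procedure name
cols = [c[0] for c in cursor.description]

def actions():
    # One bulk action per SQL row; the row becomes the document source.
    # Assumes the procedure returns an "Id" column to use as the doc id.
    for row in cursor:
        doc = dict(zip(cols, row))
        yield {"_index": "products", "_id": doc["Id"], "_source": doc}

bulk(es, actions())  # the bulk helper batches the index requests for you
```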
Hope this gets you started. Have fun!