I'm currently doing business intelligence research on connecting Microsoft SQL Server to a NoSQL database.
My goal is to import data from a NoSQL store into a relational DWH based on SQL Server.
I found the following approaches:
Microsoft Hadoop Connector
Hadoop Cloudera
Building a custom script that generates an XML file and including it via Integration Services (not really satisfying)
Has somebody done something like this before, or does anyone know some kind of "best practices"? It doesn't matter which NoSQL system is used.
NoSQL, by "definition", does not have a standard structure. So, depending on what NoSQL backend you are trying to import from, you will need some custom code to translate that into whatever structured format your data warehouse expects.
Your code does not have to generate XML; it could directly use a database connection (e.g., JDBC, if you are using Java) to make SQL queries to insert the data.
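For example, here is a minimal sketch of that idea in Python (the same logic applies to JDBC from Java), assuming a MongoDB source via pymongo, pyodbc on the SQL Server side, and hypothetical collection/table/column names:

```python
# Minimal sketch: flatten documents from a NoSQL store (MongoDB here,
# via pymongo) into a relational table on SQL Server (via pyodbc).
# Collection, table, and column names are hypothetical.
import pymongo
import pyodbc

mongo = pymongo.MongoClient("mongodb://localhost:27017")
orders = mongo["shop"]["orders"]  # hypothetical source collection

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=dwh;Trusted_Connection=yes;"
)
cur = conn.cursor()

for doc in orders.find():
    # Translate the document's fields into the columns the DWH expects;
    # nested structures must be flattened or split into child tables.
    cur.execute(
        "INSERT INTO dbo.FactOrders (OrderId, CustomerId, Amount) VALUES (?, ?, ?)",
        str(doc["_id"]), doc.get("customer_id"), doc.get("amount"),
    )

conn.commit()
conn.close()
```

The translation step in the loop is where the real work lives: it encodes your mapping from the schemaless documents to the warehouse's structured format.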
Our organization uses Elastic Logstash & Kibana (ELK) and we use a SQL Server data warehouse for analysis and reporting. There are some data items from ELK that we want to copy into the data warehouse. I have found many websites describing how to load SQL Server data into ELK. However, we need to go in the other direction. How can I transfer data from ELK to SQL Server, preferably using SSIS?
I have implemented a similar solution in Python, where we ingest data from an Elastic cluster into our SQL DWH. You can use the Elasticsearch package for Python, which allows you to do exactly that.
You can find more information here:
https://elasticsearch-py.readthedocs.io/en/master/
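For illustration, a minimal sketch of that approach, assuming pyodbc on the SQL Server side and hypothetical index, field, and table names:

```python
# Minimal sketch of the approach described above: pull documents out of
# Elasticsearch with the official Python client and insert them into a
# SQL Server DWH via pyodbc. Index, field, and table names are hypothetical.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan
import pyodbc

es = Elasticsearch(["http://localhost:9200"])
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=dwh;Trusted_Connection=yes;"
)
cur = conn.cursor()

# scan() wraps the scroll API so large result sets stream in batches.
for hit in scan(es, index="logstash-*", query={"query": {"match_all": {}}}):
    src = hit["_source"]
    cur.execute(
        "INSERT INTO dbo.LogEvents (EventTime, Level, Message) VALUES (?, ?, ?)",
        src.get("@timestamp"), src.get("level"), src.get("message"),
    )

conn.commit()
conn.close()
```

If you must stay inside SSIS, a Script Task can host equivalent logic, but a standalone script like this is usually easier to develop and test first.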
I have a .GDB database (an old one) and the data in it is very important.
I need to convert that .gdb database to a SQL Server database - can anyone help me?
Create connections to both the source GDB and the destination SQL Server in ArcCatalog, then copy everything from the source and paste it into the destination. You won't be able to do it with SQL tools alone.
Lacking ESRI software, for simple cases, my workflow is to use the GDAL C++ API to read the GDB. This requires the GDAL File GDB driver. Then I will use Microsoft.SqlServer.Types to transfer to SQL Server. This involves low-level APIs and you need to understand the spatial types in the respective libraries. It gets complex if you have polygons with rings, for example.
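For illustration, a rough sketch of that same route using GDAL's Python bindings instead of the C++ API; the layer, table, and column names are hypothetical, and the SRID must match your data:

```python
# A sketch of the GDAL route: the OpenFileGDB driver reads the .gdb,
# geometries go over as WKT, and SQL Server parses them with
# geometry::STGeomFromText. Layer, table, and column names are
# hypothetical; the SRID (4326 here) must match your data.
from osgeo import ogr
import pyodbc

gdb = ogr.Open("data.gdb")             # requires GDAL's OpenFileGDB driver
layer = gdb.GetLayerByName("parcels")  # hypothetical layer name

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=gis;Trusted_Connection=yes;"
)
cur = conn.cursor()

for feature in layer:
    wkt = feature.GetGeometryRef().ExportToWkt()
    cur.execute(
        "INSERT INTO dbo.Parcels (ParcelId, Shape) "
        "VALUES (?, geometry::STGeomFromText(?, 4326))",
        feature.GetField("parcel_id"), wkt,
    )

conn.commit()
conn.close()
```

WKT round-tripping sidesteps the low-level binary spatial types, though the same caveat applies: complex geometries (polygons with rings, multipart features) still need careful validation on the SQL Server side.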
I'm not aware of a tool that will automatically convert between these database types. You'll need an application that can read the old database type (Firebird, in the case of a legacy .gdb file), learn the table design, create a similar table design in SQL Server, and use the application to load the data from Firebird into SQL Server.
Typically, this kind of work is called ETL (Extract/Transform/Load) and is done with migration tools like SQL Server Integration Services (SSIS). SSIS ships free with SQL Server, and there are a lot of books available on how to use it - but like learning to develop software, this isn't a small task.
The easiest way to export Esri File Geodatabase FGDB (.gdb) data to MS SQL Server is with ArcGIS for Desktop at the Standard or Advanced level.
You may also want to try exporting to shapefile (SHP) format (an open transitional format) and then importing that into your MS SQL Server. A tool I found online called Shape2SQL has worked for me.
Esri also has an open File Geodatabase API that you can use to write your own tool.
I highly recommend FME Workbench for GIS data conversion. It's like SQL Server Integration Services (an ETL tool) but for GIS: a graphical interface where you connect data readers to data writers, insert transforms, run the workflows, etc.
What kind of database does the SQL Server Migration Assistant use as an internal data repository and store in the source-metabase.mb file?
I am guessing it is some standard format that I could open and edit some entries in (I need to automatically add some custom scripts for tables with BLOB data migration).
You could also just suggest a way to check the most popular database formats: SQL Server Compact, MySQL, Access...
> it is some standard format that I could open and edit some entries in
I would not count on it :) It is a proprietary metadata format that has nothing to do with the DB products SSMA supports. It can store metadata representing Oracle and also SQL Server, among others; obviously the format is not connected with the file structures the actual DBs use. The SSMA format has no public documentation, and it may fail to synchronize your changes after manual intervention if you reverse engineer it (it was designed purely as a migration tool targeting SQL Server, and was mostly supposed to create new objects there based on their source database counterparts).
Could you just write some stored procedures or triggers in your database instead? For most DBs, metadata is exposed as special tables/views anyway. You probably only need to do this for SQL Server, as it's your target DB after the migration, right? Looking for ways to directly parse or manipulate files managed by a "big" DB (like SQL Server or Oracle) doesn't seem to be a good idea in most scenarios (digital forensics being one exception).
SQL Server exposes this metadata through its catalog views (e.g., sys.tables, sys.columns) and built-in metadata functions (e.g., OBJECT_DEFINITION). You can also profile your SQL Server instance while SSMA connects to it, just to get a feel for what it does to extract metadata (object names, table columns, source of stored procedures, etc.).
Manipulating the data itself is equally straightforward from the DB side, if you need that too.
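To illustrate the catalog-view route, a minimal sketch that lists the BLOB-ish columns you might want to attach custom migration scripts to; the connection details are placeholders:

```python
# A sketch of reading the same kind of metadata SSMA extracts, straight
# from SQL Server's catalog views with pyodbc; no parsing of the .mb file
# required. Connection details are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=target_db;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Tables and their columns, filtered to the BLOB-like types that
# typically need special handling during migration.
cur.execute("""
    SELECT t.name AS table_name, c.name AS column_name, ty.name AS type_name
    FROM sys.tables t
    JOIN sys.columns c ON c.object_id = t.object_id
    JOIN sys.types ty ON ty.user_type_id = c.user_type_id
    WHERE ty.name IN ('varbinary', 'image', 'text', 'ntext')
    ORDER BY t.name, c.column_id
""")
for table_name, column_name, type_name in cur.fetchall():
    print(f"{table_name}.{column_name}: {type_name}")
```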
For a new project we have to export data from a SQL Server 2012 database to a PostgreSQL database. We have the SQL Server schema but have to create one for PostgreSQL, and as far as possible we would like the schemas to match. Can anyone give any advice on the best way of converting a SQL Server schema to a PostgreSQL one? Are there any tools or scripts which will help? I have seen a PostgreSQL function, but to be honest I have no PostgreSQL experience, and our remit stops at the data being imported into PostgreSQL, so I would like to do everything from the SQL Server side (we plan to use SSIS with the 64-bit ODBC driver for PostgreSQL to export the data once the schema is created).
Although not free, I've used Toad Data Modeler for this in the past. We never used it on any particularly complex schemas, but it did do a good job of keeping schemas in sync between various DB platforms.
Your mileage may vary, but it's worth a look.
I don't know of a direct schema converter, but most data modeling tools offer such conversion functionality. We use Dezign for Databases, which has a "switch target DBMS" function; it is a data modeling tool just like the Toad Data Modeler mentioned before. With its database-independent modeling functionality you can keep schemas on different DB platforms in sync. For data synchronization (data pump) between different database platforms you can use DataDiff CrossDB.
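If a modeling tool isn't an option, a rough first pass can also be scripted from the SQL Server side. Below is a minimal sketch that reads INFORMATION_SCHEMA and emits PostgreSQL DDL from a hand-maintained type map; it is illustrative only and ignores constraints, defaults, and indexes:

```python
# Rough schema translation sketch: read column metadata from SQL Server's
# INFORMATION_SCHEMA and emit PostgreSQL CREATE TABLE statements using a
# partial, hand-maintained type map. Constraints, defaults, and indexes
# are deliberately out of scope.
import pyodbc

TYPE_MAP = {  # SQL Server type -> PostgreSQL type (partial, illustrative)
    "int": "integer", "bigint": "bigint", "bit": "boolean",
    "datetime": "timestamp", "nvarchar": "varchar", "varchar": "varchar",
    "uniqueidentifier": "uuid", "varbinary": "bytea",
}

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=source_db;Trusted_Connection=yes;"
)
cur = conn.cursor()
cur.execute("""
    SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
    FROM INFORMATION_SCHEMA.COLUMNS
    ORDER BY TABLE_NAME, ORDINAL_POSITION
""")

tables = {}
for table, column, data_type, max_len in cur.fetchall():
    pg_type = TYPE_MAP.get(data_type, "text")  # fall back to text
    if pg_type == "varchar" and max_len and max_len > 0:
        pg_type = f"varchar({max_len})"        # max_len is -1 for (max)
    tables.setdefault(table, []).append(f"    {column} {pg_type}")

for table, cols in tables.items():
    print(f"CREATE TABLE {table} (\n" + ",\n".join(cols) + "\n);")
```

The generated DDL would still need review by someone who knows the target schema, but it gets the mechanical bulk of the conversion done before the SSIS data load.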
We have a SharePoint 2007 site. It is supported by two back-end databases - one hosted on SQL Server, another on an open-source RDBMS. We issue CAML queries to retrieve data from SQL Server, and ADO.NET queries to retrieve data from the other server. Our architect says we would be better off if we used the same approach (namely, CAML) to get data from the both databases.
Is it possible to use CAML queries to retrieve data from any RDBMS other than SQL Server?
If so, please suggest any web resources, docs, anything you find appropriate.
CAML (at least the part used for SPList.GetItems queries) seems to be quite simple, so translating it into valid SQL statements should not be too complex. That means you could create a "translator module" and issue your queries against it. For instance, you can follow the guidelines published in the article "[Implementing a .Net Framework Data Provider](http://msdn.microsoft.com/en-us/magazine/aa720164(VS.71).aspx)".
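To illustrate the translator idea, here is a toy sketch that converts one simple CAML <Where> clause into a SQL predicate. Real CAML has many more operators (And, Or, Contains, OrderBy, ...), and the naive quoting here would need proper parameterization in practice:

```python
# Toy translation of a CAML <Where> clause into a SQL predicate, as a
# proof of concept for a "translator module". Only Eq/Gt/Lt are handled.
import xml.etree.ElementTree as ET

OPS = {"Eq": "=", "Gt": ">", "Lt": "<"}

def caml_where_to_sql(caml: str) -> str:
    root = ET.fromstring(caml)          # expects a <Where> element
    op = root[0]                        # e.g. <Eq>
    field = op.find("FieldRef").get("Name")
    value = op.find("Value").text
    # Naive quoting: a real translator must parameterize values
    # to avoid SQL injection.
    return f"WHERE {field} {OPS[op.tag]} '{value}'"

caml = """
<Where>
  <Eq>
    <FieldRef Name='Status'/>
    <Value Type='Text'>Open</Value>
  </Eq>
</Where>
"""
print(caml_where_to_sql(caml))   # WHERE Status = 'Open'
```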