Load bulk JSON data into SQL Server table

What is the standard way of loading bulk JSON data from a file into a database table in a SQL Server version earlier than 2016?
I am aware of the OPENJSON() function that was introduced in SQL Server 2016. However, I am limited to SQL Server 2014.
The preferred way of doing this is to import the data using SQL Server Integration Services (SSIS), but there is no built-in connector or source component to start with.
I tried using the SSIS JSON Component for this purpose. Apparently, it is not good from a performance point of view.
Is there any other way of bulk loading JSON into SQL Server?

I've done this in SSIS 2012 / SQL Server 2012. The requirement was to consume a RESTful API with various endpoints that returned JSON.
1. In your Data Flow, create a Script Component. Most likely it will be of Type = Source, since you will be generating outputs with output columns from it.
2. Create the various outputs and output columns for the various fields / data items you will extract from the JSON. Usually this is several tables.
3. In the Script Component's CreateNewOutputRows() override method, deserialize the JSON data into C# classes (see the sketch after this list). I use http://json2csharp.com/ to stub out the classes and then fix them if needed in special cases, for example when the JSON uses ID values as keys instead of key names. I use Newtonsoft.Json to deserialize the JSON into the class instances / lists, etc. Add rows to the relevant output buffer you set up in step 2.
4. Connect the outputs from the Script Component to the various destination tables.
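For illustration, here is a minimal sketch of what the CreateNewOutputRows() override can look like. The Order class, the JSON file path, and the OrdersBuffer output name are hypothetical examples; in a real Script Component the buffer class is generated by SSIS from the output you defined in step 2, and the method lives inside the generated ScriptMain class.

```csharp
// Sketch only: the override below lives inside the SSIS-generated ScriptMain
// class of the Script Component. "Order" and "OrdersBuffer" are hypothetical
// names for an output defined with OrderId and CustomerName columns.
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

// Stub class generated from the JSON (e.g. via json2csharp.com).
public class Order
{
    public int OrderId { get; set; }
    public string CustomerName { get; set; }
}

public override void CreateNewOutputRows()
{
    // Read the JSON payload (from a file here; a RESTful API call works the same way).
    string json = File.ReadAllText(@"C:\Data\orders.json");

    // Deserialize into the stub classes.
    var orders = JsonConvert.DeserializeObject<List<Order>>(json);

    // Add one row per deserialized object to the output buffer created in step 2.
    foreach (var order in orders)
    {
        OrdersBuffer.AddRow();
        OrdersBuffer.OrderId = order.OrderId;
        OrdersBuffer.CustomerName = order.CustomerName;
    }
}
```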
That's it! You are done.

Related

ASP.net MVC - get data from SQL Server without knowing the data-structure

I would like to know how I can get data from SQL Server tables into my ASP.net MVC application without knowing the data-structure in advance.
Example:
a user uploads a .csv file into the application with an unknown data structure (it can be 3 fields or 50 fields with varying data types)
the .csv file gets stored into a SQL Server table
now I want the application to be able to display the data from these uploads in, e.g., an HTML table without having to use a hardcoded model
Is it possible to display the data using a connection string and, e.g., LINQ to SQL or EF? The best case would be one where I can dynamically assign table names etc. in queries.
The models will still be used to access data belonging to the application logic, it's just the displaying of data from user-uploads that is not clear to me at this time.
EF and LINQ to SQL will always require you to have an existing model associated with a database table.
If you really need that "dynamic" query, you can use a micro-ORM like Dapper.NET to return your query as a dynamic type.
This solution won't save you from having to generate the SELECT SQL query by retrieving the list of fields from the table. Maybe you can use the sys tables from SQL Server (if that is the database) for that. Like in here
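A minimal sketch of that approach is below. The helper name and the caller-supplied table name are illustrative only; the table name should be validated against the database's own metadata (for example the sys tables mentioned above) before being concatenated, since it cannot be passed as a SQL parameter.

```csharp
// Minimal sketch: read an arbitrarily shaped table with Dapper's dynamic API.
// Validate tableName (e.g. against sys.tables) before building the SQL string.
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public static class UploadReader
{
    public static List<IDictionary<string, object>> ReadUpload(
        string connectionString, string tableName)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // The non-generic Query() returns dynamic rows; each row also
            // implements IDictionary<string, object>, so a view can enumerate
            // column names and values it did not know at compile time.
            return connection.Query("SELECT * FROM [" + tableName + "]")
                             .Cast<IDictionary<string, object>>()
                             .ToList();
        }
    }
}
```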

Bulk export of data from SQL Server 2008 into XML file (without using BCP)

I have a requirement whereby I need to refresh databases on Server A with subsets of data from Server B (the schemas will always be the same).
Unfortunately, due to the way the servers have been set up and are run (hosted), the option of using BCP is not available to me. Also, the servers may not be able to see each other or be linked, so a stored procedure on Server B will not be able to access Server A directly.
Because of this, my plan is to run an export procedure on Server A to create a file (XML?) which is placed in a location available to Server B. I would then create a procedure on Server B to consume the data into the database.
My question is: without the use of BCP, what options are available to me for bulk exporting data (with selection criteria) from my source server? And does my plan sound sensible? Am I missing any obvious approach? Have others solved this problem before?
Hello Mike,
Have you considered using SQL Server Integration Services? You could export your data with a dynamic SELECT statement, save it into a binary file, and transfer it (FTP) to the second location for loading.
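The question's own plan (an XML file produced by a SELECT with criteria) can also be sketched outside SSIS with plain ADO.NET. The table name, filter, and output path below are placeholders for the real selection criteria.

```csharp
// Minimal sketch of the export side on the source server, using plain ADO.NET.
// dbo.Customers, the @region filter, and the output path are placeholders.
using System.Data;
using System.Data.SqlClient;

public static class XmlExporter
{
    public static void Export(string connectionString, string outputPath)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(
            "SELECT * FROM dbo.Customers WHERE Region = @region", connection))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@region", "EMEA");

            var data = new DataSet("Export");
            adapter.Fill(data, "Customers");

            // Write the rows plus their schema to an XML file for transfer (e.g. via FTP),
            // so the consuming procedure on the other server knows the structure.
            data.WriteXml(outputPath, XmlWriteMode.WriteSchema);
        }
    }
}
```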

Integrate Excel with SQL Server

I am quite new to SQL Server, but I'm looking for a tool that integrates Excel with SQL Server and provides a two-way read/write connection.
I want to be able to pull data from SQL Server, perform some evaluation/data manipulation, and then write the data back to the server.
Basically, my client receives raw Excel data from vendors, performs some validation on the spreadsheet, and then sends the spreadsheet back, but a copy of the data needs to be kept in some sort of data management system. I have tested MDS and I'm not fully satisfied. The functionality I'm looking for is:
Data validation
Data match - match and merge / consolidate two or more worksheets into one
Read/write to SQL Server
I do not want the import/export wizard and don't want to use SSIS; they are both not suitable.
There's Google, like there has never been before. And there are quite a number of subject experts posting on their blogs for the love of helping people like you and me.
So check out here multiple ways you can import data into SQL Server without using SSIS, such as:
The bcp utility
e.g.
`bcp dbo.ImportTest in 'C:\ImportData.txt' -T -SserverName\instanceName`
Bulk Insert using T-SQL
e.g.
`BULK INSERT dbo.ImportTest`
`FROM 'C:\ImportData.txt'`
`WITH ( FIELDTERMINATOR =',', FIRSTROW = 2 )`
Note that the article was published and last updated in 2012, so you may want to further check compatibility with older versions if you are using one.
PS: I still believe you could be using SSIS, so as not to re-invent the wheel...

How do I convert my SQL SERVER data into a SAS Table?

I am using Enterprise Miner 6.2 and want to create a data source, but my only option is a SAS Table. How do I go about exporting SQL Server or Excel data into a SAS table?
SAS has many ways of connecting to and/or reading data from disparate sources. I haven't used Enterprise Miner, so I'm not sure which of SAS' methods are available to you directly from within EM, but it's likely there will be someone at your site who has some interface to Base SAS and who can help you/advise what data access products are installed and how you can use them.
For SQL Server data, SAS/Access to SQL Server or SAS/Access to OLE DB will allow you to read directly from SQL Server tables in place. Alternatively, someone could provide you with a dump of the data you need from the SQL Server database.
For Excel data, there are also SAS/Access products, but SAS also has native capabilities to read in the data if saved as, for example, a .csv or .txt file.
To help answer you further, perhaps you can come back with some details about what SAS products/interfaces are available to you?

How to build a database from an XSD schema and import XML data

I have a complex XSD schema and hundreds of XML files conforming to the schema.
How do I automate the creation of related SQL Server tables to store the XML data?
I've considered creating C# classes from the XSD schema using the xsd.exe tool and letting something like Subsonic figure out how to make a shiny database out of it, but not sure if it's the best way to approach it.
Has anyone managed to elegantly import XSD files into SQL Server?
A similar question with good answers: How can I create database tables from XSD files?
I suggest you use SQL Server Integration Services, which comes with SQL Server 2008 or 2005 (or Data Transformation Services if you're stuck with 2000).
Unfortunately, it doesn't come with the free "Express" edition of SQL Server; however, SQL Server Developer Edition can be had for less than $100, has the full SQL Server Standard functionality, and would suit your needs.
SSIS is a big topic and I'm not going to go over all of the bells and whistles here, but basically you:
Create a new SSIS project using BIDS (Business Intelligence Development Studio, a modified Visual Studio that comes with SSIS).
Drag a new Data Flow Task onto the Control Flow surface, then click the Data Flow tab.
Drag an "XML Source" from the toolbox into the Data Flow panel, and then configure the XSD and XML file locations.
Drag an ADO.NET destination from the toolbox onto the data flow and connect one of the outputs from the XML Source to the input of the ADO.NET destination. If you want to create a new table based on the data output from the XML schema, as opposed to using an existing one, click "New" when specifying the Connection Manager settings in the ADO.NET destination, and it will generate and execute the appropriate CREATE TABLE statement. Repeat this for any other outputs from the XML Source (there will be one for each logical flat table generated from the schema).
You will most probably need to use other data transformation objects first to transform the data before it is loaded into SQL Server, but that is the general gist of it. If you need to run the process for a large number of XML files, you could put the task in a control loop and use a variable to set the XML file location.
The MS Documentation on using an XML source in SSIS is here: http://msdn.microsoft.com/en-us/library/ms140277(v=SQL.100).aspx
Just found XSD2DB on Sourceforge, according to the site:
XSD2DB is a command line tool written in C#, that will read a Microsoft ADO.NET compatible DataSet Schema File (XSD) and generate a database.
Checking it out.
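For reference, the ADO.NET DataSet route that both the question and XSD2DB hint at can be sketched roughly as below: read the XSD into a DataSet, load a conforming XML file, and bulk copy each resulting table. This assumes the destination tables already exist (or were generated by a tool such as XSD2DB); the file paths and connection string are placeholders.

```csharp
// Minimal sketch: XSD -> DataSet -> conforming XML -> SQL Server tables.
// Destination tables matching the DataSet table names are assumed to exist.
using System.Data;
using System.Data.SqlClient;

public static class XmlLoader
{
    public static void LoadXmlIntoSqlServer(
        string xsdPath, string xmlPath, string connectionString)
    {
        var dataSet = new DataSet();

        // Build the structure: one DataTable per logical flat table in the schema.
        dataSet.ReadXmlSchema(xsdPath);

        // Populate the tables from an XML document conforming to that schema.
        dataSet.ReadXml(xmlPath);

        // Bulk copy each table into its matching destination table.
        foreach (DataTable table in dataSet.Tables)
        {
            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = table.TableName;
                bulkCopy.WriteToServer(table);
            }
        }
    }
}
```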
