I have a JSON file of data which I have pulled from an API and I would very much like to just dump this data into an SQL Server.
The reason it's SQL Server specifically is that the database is already in place for the current project. I have spent time googling this and searching on here but was unable to find anything useful thus far. I'm familiar with Python but I'm open to any solution.
TL;DR: I'm interested in which languages and packages make it easy to automate loading JSON into an SQL Server table. Do you have any suggestions, or know of any packages that already achieve this?
You can use something like SSIS to accomplish this (you may already have it) by writing a script task that does the custom parsing and then loads the data into the correct table. This is easy to automate, and I mention SSIS because it's very easy to add further tasks to the same package if you're ever required to.
Alternatively, you could create a script outside of the database (e.g. in Python) that parses the JSON, connects to the database through ODBC/OLE DB, and writes the records. This can be automated using Task Scheduler or something similar. An example implementation of this could use pyodbc.
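A minimal sketch of that pyodbc approach, assuming the API data sits in a file called data.json as a list of flat objects and that a target table dbo.ApiData with matching columns already exists (all connection details and names below are placeholders):

```python
import json

import pyodbc

# Load the JSON pulled from the API (assumed to be a list of flat objects).
with open("data.json", encoding="utf-8") as f:
    records = json.load(f)

# Placeholder connection string; adjust driver, server and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # noticeably faster for many-row inserts

# dbo.ApiData and its columns are assumed to exist and match the JSON keys.
cursor.executemany(
    "INSERT INTO dbo.ApiData (id, name, value) VALUES (?, ?, ?)",
    [(r["id"], r["name"], r["value"]) for r in records],
)
conn.commit()
conn.close()
```

Pointing Task Scheduler (or cron) at a script like this is usually enough to automate the whole load.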
You can use a WCF web service to send the JSON data to SQL Server.
Refer to the links below; I hope they are helpful:
http://www.codeproject.com/Articles/167159/How-to-create-a-JSON-WCF-RESTful-Service-in-sec
http://mikesknowledgebase.azurewebsites.net/pages/Services/WebServices-Page2.htm
You can't fetch JSON data directly in SQL Server; instead you can use a WCF service.
I'm exploring options for a one-way sync from a table available via an API to an SQL database. Does anyone have any suggestions on how to achieve this?
The data from the "Source" is often updated and should be copied to the "Destination" as the changes happen (live).
Source
Read Only table from an ERP available via an API. Webhooks on the source are not possible. Entries to this table may be created, updated or deleted. There would be approximately 150,000 entries in the table with about 1000 changes per day.
Destination
Azure MS SQL database which I have full control over.
I'm looking for best practice or any ideas on how to achieve this. There seem to be very few articles out there with anything helpful.
I'm open to using any tool on Azure including Logic Apps and Azure Functions but want to stay away from using 3rd party tools.
If you are trying to achieve this through Logic Apps, below is the flow that you can follow.
Note: make sure you preprocess the data before sending it to the SQL database, using appropriate actions based on the type of data you are receiving.
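If you end up polling the ERP API from an Azure Function or a scheduled script instead, a rough sketch of the poll-and-upsert pattern looks like the following. It stages the current snapshot and then MERGEs it into the destination so inserts, updates and deletes all propagate; the endpoint, tables and columns below are all placeholders, not the real ERP schema:

```python
import pyodbc
import requests

# Placeholder ERP endpoint; the real API will differ.
API_URL = "https://erp.example.com/api/items"

rows = requests.get(API_URL, timeout=30).json()

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;UID=myuser;PWD=mypassword"
)
cur = conn.cursor()

# Stage the snapshot, then MERGE so created, updated and deleted rows all carry over.
cur.execute("TRUNCATE TABLE dbo.ItemsStaging")
cur.fast_executemany = True
cur.executemany(
    "INSERT INTO dbo.ItemsStaging (Id, Name, Qty) VALUES (?, ?, ?)",
    [(r["id"], r["name"], r["qty"]) for r in rows],
)
cur.execute("""
    MERGE dbo.Items AS target
    USING dbo.ItemsStaging AS source ON target.Id = source.Id
    WHEN MATCHED THEN
        UPDATE SET Name = source.Name, Qty = source.Qty
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Name, Qty) VALUES (source.Id, source.Name, source.Qty)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;
""")
conn.commit()
conn.close()
```

Since webhooks aren't available on the source, running this on a short recurrence is about as close to "live" as the sync can get.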
I have a task to get Jira data into our SQL Server via an automated process. There are two options that I can think of using SSIS (it is the only approved tool offered by my company):
Make a REST API (GET request) call in SSIS. To do this, I believe I will need to write a script task, which is very challenging for me because I am not a code person. There are third-party plug-in tools (e.g. ZappySys) to call REST APIs from SSIS where all you need is the URI and authentication, but I don't think my company is going to approve another paid license just for this job.
Since our on-prem Jira is connected to Postgres, I was wondering if it is a valid option to extract Jira data from Postgres into SQL Server using SSIS? Something like: https://www.mssqltips.com/sqlservertip/2619/export-data-from-postgres-to-sql-server-using-ssis/
I am still very new to databases, ETL and technology in general. I'd much appreciate it if someone could let me know whether option 2 is valid to try, or offer any guidance on writing the script task code for option 1.
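For option 1, the SSIS script task itself would be written in C#, but the shape of the REST call is the same in any language. Here is a rough sketch in Python against the standard Jira search endpoint, assuming basic authentication; the base URL, project key and credentials are placeholders:

```python
import requests

# Placeholder Jira base URL and credentials (an API token works as the password).
JIRA_URL = "https://jira.example.com"
AUTH = ("my_user", "my_api_token")

resp = requests.get(
    f"{JIRA_URL}/rest/api/2/search",
    params={"jql": "project = ABC", "maxResults": 100},
    auth=AUTH,
    timeout=30,
)
resp.raise_for_status()
issues = resp.json()["issues"]  # each issue is a JSON object you can flatten into table rows
```

On option 2, one thing to keep in mind is that the Jira back-end schema is not a supported interface and can change between versions, which is part of why the REST API tends to be the preferred route.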
Problem:
I receive multiple sets of flat files on a weekly basis that need to be imported into my database. The flat files I receive are not in a conventional format to import, so they need to be run through a script and parsed into a more SQL-friendly format. These flat files are usually JSON, TXT, XML, LOG, etc.
Current Solutions
Currently I have a Windows Forms application and another GUI to transform the files and bulk import them into SQL tables. However, I'm finding it unreliable to ask users to import data, and I would much rather automate the task.
More recently, I have been creating SSIS packages. This proves to be much faster and more useful, since I can add script components; it lets me manually parse whatever flat files I throw at it. My issue is finding a way to automate this. I have no control over the server where my database is hosted, so I'm unable to deploy the packages there to bring in the files. Currently, I'm just running the packages on my local machine manually to get the data in.
Needed Solution
I need a way to automate getting these flat files in. Originally I wanted to request an FTP server for the files to be dumped into; the files would then be picked up by my packages and imported into the SQL Server DB. However, since I have no control over any of the local folders on that server, it seems to be impossible for me to automate this. Is there a better way for me to find a solution? Could I build something custom in C#, Python, PowerShell, etc.? I'm very new to the scene and trying to find a solution for this problem has been a nightmare.
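As a rough sketch of the custom-script route (Python here, but C# or PowerShell would follow the same pattern), the idea is a scheduled script that polls a drop folder you can read, parses each file, loads the rows, and moves the file aside. Everything below is a placeholder, including the parse_file function you would fill in per file format:

```python
import glob
import os
import shutil

import pyodbc

DROP_FOLDER = r"\\fileshare\incoming"    # placeholder share where the files are dumped
PROCESSED = r"\\fileshare\processed"     # placeholder archive folder


def parse_file(path):
    """Placeholder: turn one JSON/TXT/XML/LOG file into a list of row tuples."""
    raise NotImplementedError


conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword"
)
cur = conn.cursor()
cur.fast_executemany = True

for path in glob.glob(os.path.join(DROP_FOLDER, "*")):
    rows = parse_file(path)
    cur.executemany("INSERT INTO dbo.StagingTable (col1, col2) VALUES (?, ?)", rows)
    conn.commit()
    shutil.move(path, os.path.join(PROCESSED, os.path.basename(path)))

conn.close()
```

Scheduling a script like this with Task Scheduler on any machine that can see both the share and the database removes the dependency on deploying packages to the database server.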
Sorry, this might be a very simple question, but I can't find a definitive answer anywhere, and I'm very new to database management systems, so please give me a short pointer.
Does SQL Server support transactional consistency for XML data? (I'm using ASP.NET Web API.)
In my use case, the entity Project has an attribute TaskTree,
which saves the concrete task structure of the Project.
The datatype of TaskTree is XML. (I'm also wondering how I should pass XML within a JSON response...)
So the main problem is:
In short: every modification request to the TaskTree takes a long time to modify the XML. If many people request to modify the TaskTree at the same time, will all the reads and writes to the XML stay consistent in SQL Server?
Thanks.
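In case it helps, an xml column value is stored as part of the row like any other column, so updates to it participate in normal transactions and row locking; concurrent writers to the same row are serialized rather than interleaved. Below is a minimal sketch with pyodbc, reusing the Project/TaskTree names from the question and assuming an Id key column (the XQuery path is a placeholder for your real task structure):

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;UID=myuser;PWD=mypassword",
    autocommit=False,  # statements run inside an explicit transaction until commit()
)
cur = conn.cursor()

# Modify the XML in place; the UPDATE takes a row lock, so concurrent
# modifications of the same Project row are applied one after another.
cur.execute(
    """
    UPDATE dbo.Project
    SET TaskTree.modify('insert <task id="42"/> into (/tasks)[1]')
    WHERE Id = ?
    """,
    7,
)
conn.commit()
conn.close()
```

For the Web API side, one common approach is simply to return the XML serialized as a string field inside the JSON response.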
I need to be able to extract and transform data from a data source on a client machine and ship it off via a web service call to be loaded into our data store. I would love to be able to leverage SSIS, but the SQL Server licensing agreement prevents me from installing Integration Services on a client machine. Can I just provide the client with copies of the Integration Services assemblies to be referenced by my app? Does anyone have any ideas on how best to implement a solution to this problem, apart from building a custom solution from the ground up? Ideally the solution would leverage an existing ETL tool.
Thanks for your suggestions.
If you are providing your client with a service around their data, you should develop a standard format that they need to deliver their data in, and negotiate a delivery method for that file well before you ever consider what to do with SSIS. Since from the comments it appears that your data is on a machine at a client's remote location, the most common method I have seen is either having the client secure-FTP a file into your network for processing, or having a job on your end that fetches the file over secure FTP. Once you have the file on your network, writing the SSIS to process it is trivial.
If the server can reach out to the client machine, then you can just run the SSIS package on the server. What kind of data are you moving? If it's a flat file, you could FTP it to the server.
Another way to go about this is to use BCP. I'm not a big fan of this approach (SSIS is much faster, more robust, etc), but it can work in a pinch.
http://msdn.microsoft.com/en-us/library/ms162802.aspx
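A rough sketch of the bcp route, wrapped in Python for consistency with the rest of this page: it assumes a comma-delimited flat file has already been transferred to a machine that can reach the server, and that the target table exists (server, database, table and file path are placeholders):

```python
import subprocess

# bcp ships with the SQL Server command-line tools.
# -c = character mode, -t, = comma field terminator, -T = Windows authentication.
subprocess.run(
    [
        "bcp", "mydb.dbo.ClientData", "in", r"C:\incoming\clientdata.csv",
        "-S", "myserver",
        "-T",
        "-c",
        "-t,",
    ],
    check=True,
)
```

The same command works directly from a scheduled batch file or a SQL Agent job if you'd rather not involve Python at all.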