Best method to copy a file to the SQL Server machine

I am looking into copying a file from the client computer to the server computer. One path I've looked into was creating a CLR method which accepts a stream as input. Another suggestion I've had is to use the BCP utility, though I have been unsuccessful in finding any examples in which this was done with BCP.
Is it possible to pass a blob to BCP and import to a table, or would there be more steps involved to make this work?
Which method would be best for a file copy functionality?

You can BCP blobs in and out of the database. Also, I found a reference to scripts for importing and exporting files as blobs which might be helpful.
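If a plain client-side insert would also satisfy the file-copy requirement, here is a minimal sketch using Python and pyodbc instead of BCP; the FileStore table, its columns, and the connection string are assumptions for illustration:

    import pyodbc

    # Read the file on the client as raw bytes.
    with open(r"C:\data\report.pdf", "rb") as f:
        blob = f.read()

    # Connection string is an assumption; adjust server/database/driver.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
    )
    cur = conn.cursor()
    # FileStore(name NVARCHAR(260), data VARBINARY(MAX)) is a hypothetical table.
    cur.execute(
        "INSERT INTO FileStore (name, data) VALUES (?, ?)",
        "report.pdf", pyodbc.Binary(blob),
    )
    conn.commit()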

Related

Importing directory of SAS7BDAT files into SQL Server

I have a directory of SAS7BDAT files - about 300 of them - which I need to import into a SQL Server table. Unfortunately, the date field is not part of the dataset but is in the filename, so I need to parse the filename, get the date, and append it to each dataset at the time of import.
Is SSIS a good candidate for this? If so, do I use a Foreach Loop for this? How do I parse the filename and append the date?
For individual files, I can easily use SQL Server Management Studio and import them. I could do the same for this exercise and then handle the date when loading to the final table, but I am hoping there is a much cleaner solution.
Is there any other backend way of handling this without SAS installed? Python or otherwise?
TIA
[Solved]
Came across an article which mentioned R's sas7bdat library.
Using that, I could successfully load all the files along with the filename into an R list using "ldply".
After some data frame manipulation, I could load all the files into SQL Server using sqlSave.
The files are very small in size. So performance wasn't much of an issue, although I suspect it can be for larger volumes.
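For the "without SAS installed? Python or otherwise?" part of the question, pandas can read SAS7BDAT files natively via pandas.read_sas. A sketch along the same lines as the R solution, where the directory, the filename date pattern, the target table, and the connection string are all assumptions:

    import glob
    import os
    import re

    import pandas as pd
    from sqlalchemy import create_engine

    # Connection string and target table are assumptions; adjust as needed.
    engine = create_engine(
        "mssql+pyodbc://myserver/mydb?driver=ODBC+Driver+17+for+SQL+Server"
    )

    frames = []
    for path in glob.glob(r"C:\sasdata\*.sas7bdat"):
        df = pd.read_sas(path, format="sas7bdat")
        # Hypothetical filename pattern: an 8-digit date somewhere in the
        # name, e.g. sales_20150131.sas7bdat -> 2015-01-31.
        match = re.search(r"(\d{8})", os.path.basename(path))
        df["file_date"] = pd.to_datetime(match.group(1), format="%Y%m%d")
        frames.append(df)

    pd.concat(frames).to_sql("sas_import", engine, if_exists="append", index=False)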

Can you automatically parse JSON into an SQL Server?

I have a JSON file of data which I have pulled from an API and I would very much like to just dump this data into an SQL Server.
The reason it's SQL Server specifically is that the database is already in place for the current project. I have spent time googling this and searching on here but was unable to find anything useful thus far. I'm familiar with Python but I'm open to any solution.
TLDR: I'm interested in which languages and packages provide easy solutions to automate JSON to an SQL Server table, do you have any suggestions or know of any packages that already achieve this?
You can use something like SSIS to accomplish this (you may already have it) by writing a script task. This could do custom parsing and then load the data into the correct table. This can be easily automated. I mention SSIS because it's very easy to add future tasks to this if you're ever required to.
Alternatively, you could create a script outside of the database (e.g. Python) that parses the JSON, connects to the database through ODBC/OLEDB, and writes the records. This can be automated using Task Scheduler or something similar. An example implementation of this could use pyodbc.
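To make the pyodbc suggestion concrete, a minimal sketch; the JSON shape, the ApiData table, and the connection string are assumptions:

    import json
    import pyodbc

    # Hypothetical JSON shape: a list of {"id": ..., "name": ..., "price": ...}
    # objects; the ApiData table and connection string are also assumptions.
    with open("api_data.json") as f:
        records = json.load(f)

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
    )
    cur = conn.cursor()
    cur.fast_executemany = True  # batch the parameterized inserts
    cur.executemany(
        "INSERT INTO ApiData (id, name, price) VALUES (?, ?, ?)",
        [(r["id"], r["name"], r["price"]) for r in records],
    )
    conn.commit()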
You can use a WCF web service to send JSON data to SQL Server.
Refer to the links below; I hope they will be helpful for you.
http://www.codeproject.com/Articles/167159/How-to-create-a-JSON-WCF-RESTful-Service-in-sec
http://mikesknowledgebase.azurewebsites.net/pages/Services/WebServices-Page2.htm
You can't ingest JSON data directly in SQL Server; instead, you can use a WCF service.

How to read the metadata out of SQL Server backup files directly?

Generally, to get the metadata from SQL Server backup files, we need to use T-SQL commands like RESTORE HEADERONLY or RESTORE FILELISTONLY. However, there are some third-party tools that can read this information directly from the backup files, like this one: http://www.yohz.com/sqlbakreader_details.htm. Since that tool doesn't have a command-line version, it is less useful to me. I want to know whether there are some ways that I can read this data directly.
Thanks.
The .bak file is a Microsoft Tape Format file. Here's a PDF that contains the format.
For a quick dump (if you are on the SQL Server instance that created the backup), you can SELECT from the [msdb].[dbo].[backup*] tables.
See this article for more info.
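The backupset and backupmediafamily tables in msdb record every backup taken on the instance; a sketch querying them from Python with pyodbc (the connection string is an assumption):

    import pyodbc

    # Point the connection at the instance that created the backups,
    # since msdb only knows about its own backup history.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=msdb;Trusted_Connection=yes"
    )
    # backupset and backupmediafamily are standard msdb tables,
    # joined on media_set_id.
    rows = conn.execute("""
        SELECT bs.database_name, bs.backup_start_date, bs.backup_finish_date,
               bs.type, bmf.physical_device_name
        FROM dbo.backupset AS bs
        JOIN dbo.backupmediafamily AS bmf
            ON bs.media_set_id = bmf.media_set_id
        ORDER BY bs.backup_finish_date DESC
    """).fetchall()
    for row in rows:
        print(row.database_name, row.backup_finish_date, row.physical_device_name)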
I think you may be able to use SQL Server's SMO libraries and write your own in .NET or PowerShell. If you take a look at this page http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.restore.readbackupheader.aspx there are code samples in VB.NET and PowerShell.

Accessing a file in SQL Server

I have a data log file that needs to be saved to a database after parsing the data. What are the possible and best ways to access the file if I am hosting it online? It is easy to access a file on a local system, but the case is different if it is hosted online.
Well, you can do it using SSIS as in this link, or Transact-SQL as in this other one.
Hope that helps,
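If SSIS is not available, a plain script can also parse the log and write the rows. A minimal Python/pyodbc sketch, assuming a comma-delimited log with a hypothetical timestamp,sensor,value layout and an equally hypothetical DataLog table:

    import csv
    import pyodbc

    # Connection string is an assumption; adjust for your server.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
    )
    cur = conn.cursor()
    # Hypothetical log layout: one timestamp,sensor,value record per line.
    with open("datalog.csv", newline="") as f:
        for ts, sensor, value in csv.reader(f):
            cur.execute(
                "INSERT INTO DataLog (ts, sensor, value) VALUES (?, ?, ?)",
                ts, sensor, float(value),
            )
    conn.commit()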

What is the best way to import data into SQL Server Express?

I want to import data into SQL Server Express from Access, Excel, and txt files. I'm creating a decent database, and I must import this data from the old formats. When working with few records, I copy and paste directly through the Visual Web Developer DB Explorer.
But now I'm dealing with a few more records (40k). I think copy/paste is unsafe, slow, and unprofessional. I don't have any other interfaces to control SQL Server. How can I do that?
Thanks!
There is an "Import and Export Wizard" that comes with SQL Express. It allows you to import from Access, Excel, ODBC, SQL Client etc.
I don't think there's a clear answer, but I really think MS Access 2000 or higher is a very versatile tool for doing this.
Linking in tables and using append queries to other linked tables works really well, plus utilizing the power of VBA helps in some cases too, like calling a VBA function (InStr, Mid, etc.) from the query designer, if you're familiar with that.
Does anyone else agree?
The BCP (bulk copy) utility works well for importing into SQL Server: http://msdn.microsoft.com/en-us/library/ms162802.aspx
There is also the "bulk insert" command: http://msdn.microsoft.com/en-us/library/ms188365.aspx which has the caveat that the file must be physically accessible from the server.
Both of these methods can import comma delimited files, so you'd need to be able to create those from your data source.
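For example, the "bulk insert" route can be driven from any client connection; a sketch via Python and pyodbc, where the table, the file path (which must be visible to the server, per the caveat above), and the CSV layout are assumptions:

    import pyodbc

    # Connection string, table, and file layout are assumptions; note the
    # path is resolved on the SERVER, not the client (the caveat above).
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=.\\SQLEXPRESS;DATABASE=mydb;Trusted_Connection=yes",
        autocommit=True,
    )
    conn.execute("""
        BULK INSERT dbo.ImportTarget
        FROM 'C:\\loads\\data.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2)
    """)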
I recommend loading all the objects from one SQL table into a JSON object, then indexing through an array of objects and translating them into the new table. I have some open-source MySQL-to-JavaScript bridge code that can help with this if you need it.
In case you have not found a solution to this yet, try http://www.razorsql.com/download_win.html
I am not affiliated with them, but I was looking for this same solution and it is working for me.
