Viewing a file stored in Varbinary(MAX) in SQL Server - sql-server

Let's say I inserted a file into a varbinary(max) field like so:
CREATE TABLE myTable
(
FileName nvarchar(60),
FileType nvarchar(60),
Document varbinary(max)
)
GO
INSERT INTO myTable (FileName, FileType, Document)
SELECT
'Text1.txt' AS FileName,
'.txt' AS FileType,
*
FROM
OPENROWSET(BULK N'C:\Text1.txt', SINGLE_BLOB) AS Document
GO
Of course my file now looks like this:
0xFFD8FFE000104A46494600010101004800....
Is there a simple and elegant way to retrieve this file?
My preference is to open it right away in a temp folder rather than saving it, then viewing and deleting it. In MS Access, this is as simple as using an Attachment field and double-clicking to upload/download.

Since there is no built-in functionality in SSMS for this task, I usually write a simple LINQPad script that extracts the varbinary column and writes it to the file system.
Something like this:
var results = from p in myTable
where p.ID == ... //your condition here
select p;
foreach (var item in results)
{
File.WriteAllBytes("C:\\" + item.FileName + item.FileType, item.Document.ToArray());
}
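If, as in the question, you would rather open the file straight from a temp folder, a plain ADO.NET variation is sketched below. It assumes the myTable schema from the question and a placeholder connection string; Process.Start hands the temp file to whatever application is associated with its extension.
using System;
using System.Data.SqlClient;
using System.Diagnostics;
using System.IO;

class BlobViewer
{
    static void Main()
    {
        const string connString = "...";   // your connection string here
        const string sql =
            "SELECT TOP (1) FileName, Document FROM myTable WHERE FileName = @name";

        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@name", "Text1.txt");
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                {
                    // Write the blob to %TEMP% and open it with the default viewer.
                    string tempFile = Path.Combine(Path.GetTempPath(), (string)reader["FileName"]);
                    File.WriteAllBytes(tempFile, (byte[])reader["Document"]);
                    Process.Start(tempFile);
                }
            }
        }
    }
}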

I am working with C# and ASP.NET, and I ended up doing this using a Generic Handler, later linking to it elsewhere in my website project:
public class ImageProvider : IHttpHandler
{
    public string connString = "...";

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "image/jpeg";
        string sqlSelectQuery = "select img from table";
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(sqlSelectQuery, conn))
        {
            conn.Open();
            byte[] img = (byte[])cmd.ExecuteScalar();
            context.Response.BinaryWrite(img);
        }
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
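You can then reference the handler from markup elsewhere in the project, for example <img src="ImageProvider.ashx" /> (the .ashx file name here is just an assumption based on the class name).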

Related

How do we extract blob data type column's data from h2 database?

We've deployed nodes and posted a transaction from one node to another using Corda, and it got stored in the H2 database in the NODE_TRANSACTIONS table.
The TRANSACTION_VALUE column in the NODE_TRANSACTIONS table is of BLOB data type.
Please suggest how to extract the data from this column in a readable format.
We've tried extracting the data using resultset.getBinaryStream in Java, but we are not sure which file type it should be read as. We tried image, txt, pdf, etc., but none of the resulting files were readable.
static String url = "jdbc:h2:tcp://localhost:12345/node";
static String username = "sa";
static String password = "";
Class.forName("org.h2.Driver");
Connection conn = DriverManager.getConnection(url, username, password);
System.out.println("getting connection: " + conn);
String sql = "SELECT TX_ID, TRANSACTION_VALUE FROM NODE_TRANSACTIONS where rownum<2";
PreparedStatement stmt = conn.prepareStatement(sql);
ResultSet rs = stmt.executeQuery();
while (rs.next()) {
InputStream data=rs.getBinaryStream(2);
File file = new File("D:\\blob.txt");
FileOutputStream fos = new FileOutputStream(file);
byte[] buffer = new byte[4096];
int length;
while ((length = data.read(buffer)) > 0) {
    fos.write(buffer, 0, length);
}
fos.close();
}
conn.close();
Also, please suggest any other way to read the column data using H2 database functions or Oracle functions.
I expect the output to be in a readable format.
If you're after having state data stored in a readable format in the database following a transaction, the state must implement the QueryableState interface.
See the docs at https://docs.corda.net/api-persistence.html and the example at https://github.com/corda/cordapp-example/blob/release-V4/java-source/src/main/java/com/example/state/IOUState.java

SQL Server Output multiple CSV files from one query

I am trying to get SQL Server to create multiple CSV files from one query. At this time we have 7 separate long-running (2+ hours each) queries that need to be output to separate CSV files. I can create one query that calls all of them, but that generates one giant CSV. Is there a way to tell SQL Server to create a separate CSV after each internal query has completed?
This question becomes more important as our next production run will have 52 of those long-running queries, and my boss does not want to have to run each of them separately.
EXEC dbo.Get_Result_Set1;
EXEC dbo.Get_Result_Set2;
EXEC dbo.Get_Result_Set3;
EXEC dbo.Get_Result_Set4;
EXEC dbo.Get_Result_Set5;
EXEC dbo.Get_Result_Set6;
EXEC dbo.Get_Result_Set7;
You may want to create an SSIS package as the wrapper around executing these stored procedures, rather than your current query.
Each stored procedure can then be linked to a flat-file output.
This has the advantage that you should be able to express any required dependencies between the different invocations and (if possible) exploit some parallelism (if there are no dependencies between some/all of the invocations).
Could you create an Agent Job to do it? You could do a separate job step for each one of the queries. Under the advanced tab in the step section, there is an output option.
Not the answer I was looking for, but I wound up having someone help me write a C# program in Visual Studio that calls each of my SQL procedures and outputs each result set to a separate CSV file. It works and I can reuse it in the future.
using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Text;
namespace StoredProcedureRunner
{
class Program
{
public static void Main(string[] args)
{
var storedProcs = new List<string>();
storedProcs.Add("dbo.Get_Result_Set1");
storedProcs.Add("dbo.Get_Result_Set2");
storedProcs.Add("dbo.Get_Result_Set3");
storedProcs.Add("dbo.Get_Result_Set4");
storedProcs.Add("dbo.Get_Result_Set5");
storedProcs.Add("dbo.Get_Result_Set6");
storedProcs.Add("dbo.Get_Result_Set7");
foreach (var storedProc in storedProcs)
{
var table = GetDataTable(storedProc);
WriteDataTableToCSV(storedProc + ".csv", table);
}
}
public static DataTable GetDataTable(string storedProc)
{
DataTable table = new DataTable();
using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["ConStrg"].ConnectionString))
{
using (var command = new SqlCommand(storedProc, connection))
{
using (var adapter = new SqlDataAdapter(command))
{
command.CommandType = CommandType.StoredProcedure;
command.CommandTimeout = 0; // no timeout for the long-running procedures
adapter.Fill(table);
}
}
}
return table;
}
public static void WriteDataTableToCSV(string filename, DataTable table)
{
StringBuilder sb = new StringBuilder();
var columnNames = table.Columns.Cast<DataColumn>().Select(col => col.ColumnName);
sb.AppendLine(string.Join(",", columnNames));
foreach(DataRow row in table.Rows)
{
var fields = row.ItemArray.Select(field => field.ToString());
sb.AppendLine(string.Join(",", fields));
}
File.WriteAllText(filename, sb.ToString());
}
}
}
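One design note on WriteDataTableToCSV: the fields are joined with commas but never quoted or escaped, so any value that itself contains a comma, quote, or newline will shift the CSV columns; wrap such fields in quotes if your data can contain them.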

How to insert images into SQL Server database table

I have created a new SQL Server local database with a table called drink.
I use Microsoft Visual Studio 2008.
Inside the table I defined the following columns:
id [int], kind [varchar], year [datetime], image [image]
I would like to insert images into the image column but I don't know how to do it.
I need this column because I want to display all the data in a DataGridView using VB.NET.
Thanks for help!
Using OPENROWSET you can insert an image into the database:
insert into tableName (id,kind,ImageColumn)
SELECT 1,'JPEG',BulkColumn
FROM Openrowset( Bulk '<Path of the image>', Single_Blob) as img
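Note that the path in OPENROWSET(BULK ...) is read by the SQL Server service itself, so the file must be accessible from the server machine, and the login needs bulk-import permission (ADMINISTER BULK OPERATIONS).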
This worked for me...
Imports System.Data
Imports System.Data.SqlClient
Imports System.Drawing
Imports System.IO
Module Module1
    Private Const _ConnectString As String = "your connection string here"

    Sub Main()
        Dim MyImage As Image = Image.FromFile("RandomImage.jpg")
        Dim Id As Long = 1
        SaveDrinkImage(MyImage, Id)
    End Sub

    Sub SaveDrinkImage(MyImage As Image, Id As Long)
        Dim ImageBytes() As Byte
        Using mStream As New MemoryStream()
            MyImage.Save(mStream, MyImage.RawFormat)
            ImageBytes = mStream.ToArray()
        End Using
        Dim adoConnect = New SqlConnection(_ConnectString)
        Dim adoCommand = New SqlCommand("UPDATE [drink] SET [image]=@MyNewImage WHERE [id]=@id", adoConnect)
        With adoCommand.Parameters.Add("@MyNewImage", SqlDbType.Image)
            .Value = ImageBytes
            .Size = ImageBytes.Length
        End With
        With adoCommand.Parameters.Add("@id", SqlDbType.BigInt)
            .Value = Id
        End With
        adoConnect.Open()
        adoCommand.ExecuteNonQuery()
        adoConnect.Close()
    End Sub
End Module
There are two approaches. You can store the image as a blob in the database or you can store the path (string) to the image file. There is an answer that discusses this: Storing images in SQL Server?
Try this:
Sub SaveDrinkToDB(name As String, imageFilePath As String)
    Using cn = New SqlConnection("connection string here")
        cn.Open()
        Using cmd = New SqlCommand("INSERT INTO Drink (Name, Image) VALUES (@Name, @Image)", cn)
            cmd.Parameters.AddWithValue("@Name", name)
            cmd.Parameters.AddWithValue("@Image", File.ReadAllBytes(imageFilePath))
            cmd.ExecuteNonQuery()
        End Using
    End Using
End Sub
Usage:
SaveDrinkToDB("Milk", "c:\drink.png")
If you instead store the file path rather than the image bytes (the second approach mentioned above, with the Image column as a character column), a plain INSERT of the path is enough:
USE [database_name]
GO
INSERT INTO [schema].[tablename] ([property1], [property2], [Property3], [Image])
VALUES ('value', 'value', 'value', 'C:\pathname.jpg')
GO

Update database table column values with new Excel column values

I have a table in the database.
id : Name : Category
--------------------
1 : jake : admin
I have used SqlBulkCopy to insert new rows into the database table from an Excel sheet containing two columns, 'Name' and 'Category'.
Now I have another requirement: updating the values of the Name and Category columns from the same Excel sheet, but with newly updated values. My question is, is this possible? I just need a way to update the column values. I went through this reference:
update SQl table from values in excel
but I don't know how to implement it.
Any help or suggestion will be greatly appreciated.
Thanks.
The code for uploading and importing the Excel file using SqlBulkCopy is below.
protected void Button1_Click(object sender, EventArgs e)
{
if (FileUpload1.HasFile)
{
FileUpload1.SaveAs(Server.MapPath("~/" + Path.GetFileName(FileUpload1.FileName)));
string file = Server.MapPath("~/"+Path.GetFileName(FileUpload1.FileName));
string constr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + file + @";Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1"";";
using (OleDbConnection olecon = new OleDbConnection(constr))
{
OleDbCommand olecmd = new OleDbCommand("select Name, Category FROM [Sheet1$]", olecon);
olecon.Open();
using (DbDataReader dbrdr = olecmd.ExecuteReader())
{
string sqlcon = "Data Source=matty2011-PC\\SQLEXPRESS;Initial Catalog=mydb; Integrated Security=True";
using (SqlBulkCopy sqlbulkcopy = new SqlBulkCopy(sqlcon))
{
sqlbulkcopy.ColumnMappings.Add(0, 1);
sqlbulkcopy.ColumnMappings.Add(1, 2);
sqlbulkcopy.DestinationTableName = "exceldata";
sqlbulkcopy.WriteToServer(dbrdr);
}
}
}
}
else
{
Response.Write("Please select a file to upload!"); return;
}
}
If possible, use SSIS to solve this, simply with a DTS package.
Note: SSIS/DTS will append to the existing data rather than update it.
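Another way to do the update from the same spreadsheet, sketched below under assumed names (a staging table called exceldata_staging and a join on Name), is to bulk copy the Excel rows into the staging table first and then run a single UPDATE joined against the real table. The excelReader parameter is the OleDbDataReader produced by the upload code above.
using System.Data.Common;
using System.Data.SqlClient;

static void UpdateFromExcel(DbDataReader excelReader, string sqlConnectionString)
{
    using (var conn = new SqlConnection(sqlConnectionString))
    {
        conn.Open();

        // 1. Load the spreadsheet rows into the (assumed) staging table.
        using (var bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "exceldata_staging";
            bulk.WriteToServer(excelReader);
        }

        // 2. Update the existing rows by joining on Name.
        const string updateSql =
            @"UPDATE t
                 SET t.Category = s.Category
                FROM exceldata t
                JOIN exceldata_staging s ON s.Name = t.Name;";
        using (var cmd = new SqlCommand(updateSql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}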

Sql SMO: How to get path of database physical file name?

I am trying to return the physical file path of a database's mdf/ldf files.
I have tried using the following code:
Server srv = new Server(connection);
Database database = new Database(srv, dbName);
string filePath = database.PrimaryFilePath;
However, this throws an exception: "'database.PrimaryFilePath' threw an exception of type 'Microsoft.SqlServer.Management.Smo.PropertyNotSetException'", even though the database I'm running this against exists and its mdf file is located in C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL.
What am I doing wrong?
Usually the problem is the DefaultFile property being null. The default data file location is where data files are stored on the instance of SQL Server unless a different location is specified in the FileName property. If no other default location has been specified, the property returns an empty string.
So this property returns nothing (an empty string) if you never set a default location.
A workaround is to check the DefaultFile property; if it returns an empty string, use SMO to get the master database and read its Database.PrimaryFilePath, which still reflects the default data file location (since it hasn't changed).
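A minimal sketch of that workaround, assuming the property in question is Server.Settings.DefaultFile and that connection is an open ServerConnection:
Server srv = new Server(connection);

// Ask for the configured default data file location first.
string defaultDataPath = srv.Settings.DefaultFile;

if (string.IsNullOrEmpty(defaultDataPath))
{
    // No explicit default has been configured, so fall back to the
    // folder where the master database keeps its primary file.
    defaultDataPath = srv.Databases["master"].PrimaryFilePath;
}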
Since you say the problem is with your PrimaryFilePath:
Confirm that your connection is open
Confirm that other properties are available
This is how I do it, prepared for multiple file names. Access database.LogFiles to get the same list of log file names:
private static IList<string> _GetAttachedFileNames(Database database)
{
var fileNames = new List<string>();
foreach (FileGroup group in database.FileGroups)
foreach (DataFile file in group.Files)
fileNames.Add(file.FileName);
return fileNames;
}
Server srv = new Server(connection);
DatabaseCollection dbc = srv.Databases;
Database database = dbc["dbName"];
string filePath = database.PrimaryFilePath;
I think the easiest approach would be to run a SQL script against your SQL Server instance, which will always return the correct data and log file paths. The following SQL will do the trick:
SELECT
db.name AS DBName,
(select mf.Physical_Name FROM sys.master_files mf where mf.type_desc = 'ROWS' and db.database_id = mf.database_id ) as DataFile,
(select mf.Physical_Name FROM sys.master_files mf where mf.type_desc = 'LOG' and db.database_id = mf.database_id ) as LogFile
FROM sys.databases db
order by DBName
You can still execute this SQL using SMO if you want to; it returns a DataSet from which you can extract the information.
var result = new List<DatabaseInfo>();
var server = new Server(serverInstanceName);
var data = server.Databases["master"].ExecuteWithResults(sql); // sql holds the SELECT statement shown above
foreach (DataRow row in data.Tables[0].Rows)
    result.Add(new DatabaseInfo(row["DBName"].ToString(), row["DataFile"].ToString(), row["LogFile"].ToString()));
return result;
If you use this snippet, make sure to create a DatabaseInfo class to hold the information returned from the SQL Server instance.
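A minimal sketch of such a class (the property names are assumptions chosen to match the snippet above):
public class DatabaseInfo
{
    public string Name { get; private set; }
    public string DataFile { get; private set; }
    public string LogFile { get; private set; }

    public DatabaseInfo(string name, string dataFile, string logFile)
    {
        Name = name;
        DataFile = dataFile;
        LogFile = logFile;
    }
}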
using Microsoft.SqlServer.Management.Common; // ServerConnection
using Microsoft.SqlServer.Management.Smo;    // Server, Database
using Smo = Microsoft.SqlServer.Management.Smo;
public string GetDataBasePath(string strDatabaseName)
{
ServerConnection srvConn = new ServerConnection();
srvConn.ConnectionString = "<your connection string goes here>";
Server srv = new Server(srvConn);
foreach (Smo.Database db in srv.Databases)
{
if (string.Compare(strDatabaseName, db.Name, true) == 0)
{
return db.PrimaryFilePath;
}
}
return string.Empty;
}
