Losing one byte when storing data in SQL Server Image column

We have a legacy Classic ASP app that uses ADODB to store a document in an image column in SQL Server. The problem we are running into is that a single byte is getting lost from most files. We have one instance of a file that doesn't lose a byte, so I'm thinking it's potentially size related.
So far we have:
Verified that the correct byte length is being passed to the ADODB CreateParameter call.
Used the SQL Server BCP tool to verify that the file is corrupt in SQL Server and not corrupted by our extraction. When we extracted the file with BCP it was missing the byte, and we did use the -w parameter.
Does anyone have any advice on what we could try to do next?
UPDATE:
We have done some more research and it appears the byte is disappearing in ADODB. We do this:
cmd.Parameters.Append cmd.CreateParameter("SupportDocImage",adLongVarBinary,adParamInput,lSize,sFilePath)
where lSize would be 123,139 and sFilePath is the binary data (yes, bad naming).
This actually calls off to a stored procedure, and in that stored procedure we do a DATALENGTH(@Variable) and instead get 123,138.
Also from what we have seen it is the last byte that gets lost.
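To help pin down where the byte goes missing, here is a minimal diagnostic sketch in T-SQL (the procedure and parameter names are hypothetical, and varbinary(max) assumes SQL Server 2005 or later; an image parameter works similarly on older versions). Call it with the same ADO parameter the real procedure receives and compare the server-side length against lSize on the ASP side:
CREATE PROCEDURE dbo.CheckSupportDocLength
    @SupportDocImage varbinary(max),   -- the adLongVarBinary parameter as received
    @ExpectedLength  int               -- pass lSize from the ASP side
AS
BEGIN
    SELECT
        DATALENGTH(@SupportDocImage) AS ReceivedBytes,
        @ExpectedLength              AS ExpectedBytes,
        -- the last byte that actually arrived, for comparison with the source file
        SUBSTRING(@SupportDocImage, DATALENGTH(@SupportDocImage), 1) AS LastReceivedByte;
END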

Check out this article.
The solution is for Oracle, but it is using ADO all the same.
http://forums.devshed.com/visual-basic-programming-52/inserting-blob-s-via-adodb-stream-using-a-stored-procedure-103030.html
Try adding +1 to the size.
Also, did you set this property when uploading the image?
cmdSQL.Properties("SPPrmsLOB") = True
Regardless, you still have to write a stored procedure that takes a varbinary(max) or image data type and inserts it into the table.
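As a rough illustration of that last point, a minimal sketch of such a procedure (the procedure, table, and column names are hypothetical; varbinary(max) assumes SQL Server 2005 or later, and an image parameter works the same way on older versions):
CREATE PROCEDURE dbo.InsertSupportDoc
    @SupportDocImage varbinary(max)
AS
BEGIN
    SET NOCOUNT ON;
    -- DocImage is the image (or varbinary(max)) column on the target table
    INSERT INTO dbo.SupportDocs (DocImage)
    VALUES (@SupportDocImage);
END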

Related

Read/write a large text/comments cell (6k+ chars) in Excel from/to SQL Server

I am trying to alter an existing Excel 2010 workbook so that the data is hosted on MSSQL, since the data is getting too large and the workbook too slow. I'm using ADO. My issue here is I don't know how best to handle the cells that contain a large amount of comments, which also include carriage returns/line feeds and possibly other special characters. The largest cell contains approximately 6,000 characters so far. I don't really expect the text within a given cell to get much bigger than that.
Questions:
1) What data type should I use within SQL Server to store this data? I'm concerned about special characters like carriage returns.
2) What is the best method to transfer data back and forth between Excel and MSSQL? I could probably use a hidden ListObject to read the data, but I'm more concerned about writing any edits back. The cell contents are too long for a SQL string and I don't know how to handle the carriage returns; I keep getting an Application Object error. I don't have any problem with most cells and their data, just these large text cells that represent comment descriptions.
3) I don't know how to handle the initial large data dump into MSSQL. The SQL Server Import Wizard keeps failing, stating there are characters not within the assigned code page, with no indication of the row it failed on or which characters are causing the issue. Is that down to the data type I've chosen? It is currently varchar, hence my first question. Should I just use text or ntext? Won't they make the database massive? SSIS uses the import wizard, so that will still fail. Anything that requires a SQL statement, such as ADO or OPENROWSET, won't like the length of the data unless I'm missing something.
Any suggestions/help would be much appreciated.
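On the data type part of this, a minimal T-SQL sketch under assumed names (the table and columns below are hypothetical): nvarchar(max) is the current replacement for the deprecated text/ntext types, stores Unicode, only consumes the space each row actually needs, and treats embedded carriage returns/line feeds as ordinary characters when the value is sent as a parameter rather than built into a SQL string.
CREATE TABLE dbo.WorkbookComments (
    CommentId   int IDENTITY(1,1) PRIMARY KEY,
    CommentText nvarchar(max) NOT NULL   -- nvarchar(max) supersedes text/ntext
);

-- embedded CR/LF round-trips unchanged when passed as data rather than inline SQL
INSERT INTO dbo.WorkbookComments (CommentText)
VALUES (N'first line' + NCHAR(13) + NCHAR(10) + N'second line');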

ColdFusion 8 + MSSQL 2005 and CLOB datatype on resultset

The environment I am working with is CF8 and SQL Server 2005, and the CLOB datatype is disabled in the CF Administrator. My concern is: will there be a performance ramification from enabling the CLOB datatype in the CF Administrator?
The reason I want/need to enable it is, SQL is building the AJAX XML response. When the response is large, the result is either truncated or returned with multiple rows (depending on how the SQL developer created the stored proc). Enabling CLOB allows the entire result to be returned. The other option I have is to have SQL always return the XML result in multiple rows and have CF join the string for each result row.
Does anyone have experience with this approach, or any thoughts?
Thanks!
I really think that returning CLOB data is likely to be less expensive than concatenating multiple rows of data into an XML string and then parsing it (ick!). What you are trying to do is what CLOB is designed for. JDBC handles it pretty well. The performance hit is probably negligible. After all, you have to return the same amount of character data either way, whether in multiple rows or a single field. And having to "break it up" on the SQL side and then "reassemble" it on the CF side seems like reinventing the wheel, to be sure.
I would add that questions like this sometimes mystify me. A modest amount of testing would seem to be able to answer this question to your own satisfaction - no?
I would just have the StoredProc return the data set, or multiple data sets, and just build the XML the way you need it via CF.
I've never needed to use CLOB. I almost always stick to the varchar datatype, and it seems to do the job just fine.
There are also options where you could call the stored proc, which triggers MSSQL to generate an actual XML file (not just a string) and simply returns you the file name. Then you can use CFFILE action="read" to grab the XML string and parse it accordingly. This assumes your web server and DB server have a common file storage area.
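For the SQL side of that idea, a minimal sketch (the table and column names are hypothetical): FOR XML lets SQL Server 2005 return the whole response as a single XML value rather than many rows that have to be concatenated, which is exactly the kind of large single field the CLOB setting exists to let CF read in full.
SELECT
    o.OrderId  AS '@id',
    o.Customer AS 'customer',
    o.Total    AS 'total'
FROM dbo.Orders AS o
FOR XML PATH('order'), ROOT('orders');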

Passing huge amounts of data as an hexadecimal (0x123AB...) parameter of a clr stored procedure in sql server

I post this question as a follow-up of this question, since the thread is not receiving more answers.
I'm trying to understand if it is possible to pass a large amount of data, as "0x5352532F...", as a parameter of a CLR stored procedure.
This is so I can send the data directly to the CLR stored procedure, instead of sending it to a temporary DB field and from there passing it as a varbinary(max) parameter to the CLR stored procedure.
I have three questions:
1) Is it possible, and if yes, how? Let's say I want to pass a PDF file to the CLR stored procedure (not the path, the full bits that make up the file). Something like:
exec MyCLRStoredProcs.dbo.insertfile
    @file_remote_path = 'c:\temp\test_file.txt',
    @file_contents = 0x4D5A90000300000004000.... --(this long list is the file content)
where insertfile is a stored proc that writes the binary data I pass as @file_contents to the server path given in @file_remote_path.
2) Is there a corruption risk in adopting this approach (or is it the same approach that SQL Server uses behind the scenes)?
3) How do I convert the content of a file into the "0x23423..." hexadecimal representation?
What is your goal? Are you trying to transfer a file from the client filesystem to the server filesystem? If so, you might want to look at a web service file transfer mechanism.
Do you want to persist the data into the database? If so, and you have access to SQL Server 2008, I recommend looking at the new FILESTREAM type. This type maintains the link between the database and the file system for you.
Alternatively, if you don't have SQL Server 2008, you get to choose between saving it as a file and maintaining a string path to it in the database or storing the contents of a file in a VARBINARY(MAX) column.
If all you want is to get the data into the database, you don't need a CLR proc. You can save it directly to the database, or you can code a SQL stored proc to do so.
Assuming you keep the approach of sending this to a CLR proc:
1) Is it possible, and if yes, how?
Sure, why not. The code you wrote looks like a good example. The stored proc will need to convert the string into bytes.
2) Is there a corruption risk in adopting this approach?
I'm not sure what you mean here. Will SQL Server randomly replace characters in your string? No. Might you accidentally hit some sort of limit? Yes, possibly; the maximum size of NVARCHAR(MAX) is 2^31-1 bytes (2,147,483,647), or roughly a billion characters. But I doubt you'd have a PDF that size. Might you lose the link between the file on disk and the database path to it? Yes, though FILESTREAM should take care of that for you.
3) How do I convert the content of a file into the "0x23423..." hexadecimal representation?
There are many examples on the Internet on how to do this. Here's one:
How do you convert Byte Array to Hexadecimal String, and vice versa?
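If the conversion can be done on the SQL side instead, a minimal T-SQL sketch (this relies on CONVERT style 1, which handles binary-to-hex-string conversion in both directions on SQL Server 2008 and later; the sample bytes are arbitrary):
-- binary value to its '0x...' text form
DECLARE @bytes varbinary(max);
SET @bytes = 0x4D5A9000;                         -- e.g. the first few bytes of a file
SELECT CONVERT(varchar(max), @bytes, 1);         -- returns '0x4D5A9000'

-- '0x...' text form back to binary
SELECT CONVERT(varbinary(max), '0x4D5A9000', 1); -- returns 0x4D5A9000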

SQL Server 2005 - How do I convert image data type to character format

Background: I am a software tester working with a test case management database that stores data using the deprecated image data type. I am relatively inexperienced with SQL Server.
The problem: Character data with rich text formatting is stored as an image data type. Currently the only way to see this data in a human-readable format is through the test case management tool itself, which I am in the process of replacing. I know that there is no direct way to convert an image data type to character, but clearly there is some way this can be accomplished, given that the test case management software performs that task. I have searched this site and have not found any hits. I have also not yet found any solutions by searching the net.
Objective: My goal is to export the data out of the SQL Server database into an Access database. There are fewer than 10,000 rows in the database. At a later stage in the project, the Access database will be upsized to SQL Server.
Request: Can someone please give me a method for converting the image data type to a character format?
You presumably want to convert to byte data rather than character. This post at my blog, Save and Restore Files/Images to SQL Server Database, might be useful. It contains code for exporting to a byte array and to a file. The entire C# project is downloadable as a zip file.
One solution (for human readability) is to pull it out in chunks that you convert from binary to character data. If every byte is valid ASCII, there shouldn't be a problem (although legacy data is often not what you expect).
First, create a table like this:
create table Nums(
n int primary key
);
and insert the integers from 0 up to at least (maximum image column length in bytes)/8000; a quick way to populate it is sketched after the query below. Then the following query (untested, so think it through) should get your data out in a relatively useful form. Be sure whatever client you're pulling it into won't truncate strings at smaller than 8000 bytes. (You can use smaller chunks if you want to be able to open the result in Notepad or something.)
SELECT
    yourTable.keycolumn,
    Nums.n AS chunkPosition,
    CAST(SUBSTRING(yourTable.imageCol, Nums.n * 8000 + 1, 8000) AS VARCHAR(8000)) AS chunk
FROM yourTable
JOIN Nums
    ON Nums.n <= (DATALENGTH(yourTable.imageCol) - 1) / 8000
ORDER BY yourTable.keycolumn, Nums.n
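If you need a quick way to fill Nums, here is one untested sketch that works on SQL Server 2005 and later: cross-joining a ten-row set of digits yields 0 through 9,999, which covers image values up to roughly 80 MB at 8,000 bytes per chunk.
WITH digits AS (
    SELECT 0 AS d UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3
    UNION ALL SELECT 4 UNION ALL SELECT 5 UNION ALL SELECT 6
    UNION ALL SELECT 7 UNION ALL SELECT 8 UNION ALL SELECT 9
)
INSERT INTO Nums (n)
SELECT a.d + b.d * 10 + c.d * 100 + e.d * 1000
FROM digits a CROSS JOIN digits b CROSS JOIN digits c CROSS JOIN digits e;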

Problem inserting data into a varchar field using a stored procedure

I am uploading some data from DB2 to SQL2005. The table contains one text field that is 2000 characters long in DB2 but is set up as a varchar(2000) in SQL.
When I run my insert in the query browser it processes fine and the data is copied correctly; however, when the same statement runs in my stored procedure, the data for this field only is copied incorrectly and shows up as a series of strange characters.
The field can occasionally contain control characters; however, the problem occurs even when it doesn't.
Is there a setting I need to apply to my stored procedure in order to get the insert to work correctly?
Or do I need to use a cast or convert on the data in order to get it appearing correctly?
Thanks.
Update: Thanks for your suggestions. It now appears that the problem was caused by the software that we used to create the linked server containing the DB2 database. It could not handle a field 2000 characters long when run via a stored procedure, but could when run interactively.
I ultimately got around the problem by splitting the field into ten 200-character-long fields for the upload and then re-joining them once they were in the SQL database.
It sounds like the data coming from db2 is in a different character set. You should make sure it isn't EBCDIC or something like that.
Also, try calling your stored procedure from the SQL Server Management Studio (query browser), to see if it works at all.
You might want to change your varchar(2000) to an nvarchar(2000), in the stored procedure as well as the table (I assume it exists as a parameter). This would allow them to hold two-byte characters. It'll depend on the DB2 configuration, but it may be that it's exporting UTF-16 (or similar) rather than UTF-8.
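A minimal sketch of that suggestion (the table, column, and procedure names here are hypothetical):
-- widen the column so two-byte characters coming from DB2 are preserved
ALTER TABLE dbo.Db2Import
    ALTER COLUMN LongText nvarchar(2000) NULL;
GO

-- widen the matching stored procedure parameter as well (was varchar(2000))
ALTER PROCEDURE dbo.InsertDb2Row
    @LongText nvarchar(2000)
AS
BEGIN
    INSERT INTO dbo.Db2Import (LongText) VALUES (@LongText);
END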
