How to convert binary data to text?
I have a column called "File names" in a test table with the image datatype. When I select the data from the test table, the File names column comes back as binary data, e.g. ('0x433A5C55736535').
Regards
Anji
I can't believe nobody answered this. If it's image data you won't get anything readable, but in case it is text data stored in a binary field, you can do this:
select cast(DataColumn as varchar(MAX)) as DataAsText from [TableWithData];
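As a quick sanity check, the sample value from the question decodes to readable text when cast this way, which suggests the column actually holds file paths stored as bytes (a hedged guess, not something the question confirms):
select cast(0x433A5C55736535 as varchar(max));  -- returns 'C:\Use5', which looks like the start of a file path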
'0x433A5C55736535' is not raw binary data; it is the hexadecimal representation of a binary value (binary data itself is composed only of 0s and 1s).
I think you need to use the functions defined for your test table to get the data you want. If you are in MySQL, it would be something like "SELECT * FROM file names".
I have a table in SQL Server, with binary data stored as a string in a varchar(max) field.
The table name is attachment, and the field is named "documentbody".
select id, mimetype, documentbody
from attachment
The files stored in the table are mostly PDF, but also include JPG and PNG and probably some other file types too.
Here is a sample of what one of the "files" looks like, when queried (first 100 characters only):
JVBERi0xLjQKJeLjz9MNCjEgMCBvYmoKPDwvVHlwZSAvUGFnZQovUGFyZW50IDIgMCBSCi9NZWRpYUJveCBbIDAgMCA2MTIuMDAw
How can I convert this data into actual binary data?
When one wishes to convert data from one datatype to another and no implicit conversion exists, one uses either CAST or CONVERT.
e.g.
select cast(MyColumn as varbinary(max)), convert(varbinary(max), MyColumn)
from MyTable;
CAST is ANSI SQL, for what it is worth, whereas CONVERT is SQL Server specific. However, CONVERT handles many other cases, including specific formatting styles, which CAST doesn't handle.
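For instance, the style argument changes how CONVERT interprets a string that holds hexadecimal text (a small generic illustration, not specific to the asker's table):
-- style 1 treats the string as a hex literal (leading 0x required);
-- style 0 (the default) re-encodes the characters themselves as bytes
select convert(varbinary(max), '0x4A5642', 1);  -- 0x4A5642
select convert(varbinary(max), '0x4A5642', 0);  -- 0x3078344135363432 (the character codes of '0', 'x', '4', ...)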
OK, taking a total guess here: many people encode binary data as base64, so try this:
SELECT CAST(CAST(N'' AS XML).value('xs:base64Binary(sql:column("MyColumn"))', 'VARBINARY(MAX)') AS VARCHAR(MAX))
FROM MyTable;
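Applied to the attachment table from the question, a sketch of the same idea (assuming documentbody really does hold base64 text) would be:
SELECT id, mimetype,
       CAST(N'' AS XML).value('xs:base64Binary(sql:column("documentbody"))', 'VARBINARY(MAX)') AS documentbody_binary
FROM attachment;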
Every time I try to import an Excel file into SQL Server I get a particular error. When I try to edit the mappings, the default value for all numerical fields is float. None of the fields in my table have decimals in them and they aren't a money data type; they're only 8-digit numbers. However, since I don't want my primary key stored as a float when it's an int, how can I fix this? It gives me a truncation error of some sort; I'll post a screen cap if needed. Is this a common problem?
It should be noted that I cannot import Excel 2007 files (I think I've found the remedy for this), but even when I try to import .xls files, every value that contains numerals is automatically imported as a float, and when I try to change it I get an error.
http://imgur.com/4204g
SSIS doesn't implicitly convert data types, so you need to do it explicitly. The Excel connection manager can only handle a few data types and it tries to make a best guess based on the first few rows of the file. This is fully documented in the SSIS documentation.
You have several options:
Change your destination data type to float
Load to a 'staging' table with data type float using the Import Wizard, and then INSERT into the real destination table using CAST or CONVERT to convert the data (see the sketch after this list)
Create an SSIS package and use the Data Conversion transformation to convert the data
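A minimal sketch of the staging-table approach from option 2, using hypothetical table and column names (StagingOrders, Orders, OrderNumber):
-- staging table uses float so the Import Wizard is happy
CREATE TABLE StagingOrders (OrderNumber float NULL);
-- real destination table uses the type you actually want
CREATE TABLE Orders (OrderNumber int NOT NULL);
-- after the wizard loads StagingOrders, convert explicitly
INSERT INTO Orders (OrderNumber)
SELECT CAST(OrderNumber AS int)
FROM StagingOrders
WHERE OrderNumber IS NOT NULL;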
You might also want to note the comments in the Import Wizard documentation about data type mappings.
Going off of what Derloopkat said, which can still fail on conversion (no offense, Derloopkat) because Excel is terrible at this:
Paste from Excel into Notepad and save it as a normal .txt file.
From within Excel, open said .txt file.
Select Next, as it is obviously tab delimited.
Select "none" for the text qualifier, then Next again.
Select the first row, hold Shift, select the last row, and select the Text radio button. Click Finish.
It will open; check it to make sure it's accurate, and then save it as an Excel file.
There is a workaround.
Import the Excel sheet with numbers as float (the default).
After importing, go to the table's Design view.
Change the data type of the column from float to int or bigint.
Save the changes.
Change the data type of the column from bigint to any text type (varchar, nvarchar, text, ntext, etc.).
Save the changes.
That's it.
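The same two-step type change can be done in T-SQL instead of the table designer; a sketch with hypothetical table and column names (ImportedSheet, OrderNumber):
-- step 1: float -> bigint (drops the floating-point representation)
ALTER TABLE dbo.ImportedSheet ALTER COLUMN OrderNumber bigint;
-- step 2: bigint -> a text type, if that is what you ultimately need
ALTER TABLE dbo.ImportedSheet ALTER COLUMN OrderNumber varchar(50);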
When Excel finds mixed data types in the same column, it guesses what the right format for the column is (the majority of the values determines the type of the column) and dismisses all other values by inserting NULLs. But Excel does this quite badly (e.g. if a column is considered text and Excel finds a number, it decides that the number is a mistake and inserts a NULL instead; or if some cells containing numbers are "text" formatted, one may get NULL values in an integer column of the database).
Solution:
Create a new excel sheet with the name of the columns in the first row
Format the columns as text
Paste the rows without formatting (use CSV format or copy/paste through Notepad to get plain text only)
Note that formatting the columns on an existing Excel sheet is not enough.
There seems to be a really easy solution when dealing with data type issues.
Basically, at the end of the Excel connection string's Extended Properties, add IMEX=1:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=\\YOURSERVER\shared\Client Projects\FOLDER\Data\FILE.xls;Extended Properties="EXCEL 8.0;HDR=YES;IMEX=1";
This will resolve data type issues such as columns where values are mixed with text and numbers.
To get to the connection property, right-click the Excel connection manager below the control flow and hit Properties. It'll be on the right, under Solution Explorer. Hope that helps.
To avoid the float type field in a simple way:
Open your Excel sheet.
Insert a blank row after the header row and type any text in all of its cells.
Right-click the heading of each column that causes a float issue, select Format Cells, choose the Text category, and press OK.
Then export the Excel sheet to your SQL Server.
This simple way worked for me.
A workaround to consider in a pinch:
Save a copy of the Excel file and change the column's format type to 'text'.
Copy the column values and paste them into a text editor, then save the file (call it tmp.txt).
Modify the data in the text file so each value starts and ends with a character that the SQL Server import mechanism will recognize as text. If you have a fancy editor, use its included tools; I use awk in Cygwin on my Windows laptop. For example, I start and end the column value with a single quote, like "$ awk '{print "\x27"$1"\x27"}' ./tmp.txt > ./tmp2.txt"
Copy and paste the data from tmp2.txt over top of the necessary column in the Excel file, and save the Excel file.
Run the SQL Server import for your modified Excel file. Be sure to double-check that the data type chosen by the importer is not numeric; if it is, repeat the above steps with a different set of characters.
The data in the database will have the quotes once the import is done. You can update the data later on to remove the quotes, or use the "replace" function in your read query, such as "replace([dbo].[MyTable].[MyColumn], '''', '')".
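A hedged sketch of that cleanup step, using the same placeholder table and column names as above:
-- strip the wrapping single quotes after the import has finished
UPDATE [dbo].[MyTable]
SET [MyColumn] = REPLACE([MyColumn], '''', '');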
Table
id - int
file - varbinary(max)
Query
SELECT file
FROM Table
WHERE id = 1
Data
The row with id 1 has a file, and its binary length is 836,412 bytes. But when I run the query, I only see 43,680 bytes of binary data.
I tried to download the data as a CSV through the "Save Results As..." option in the results grid. But still, I couldn't get the full length of the binary data.
In the CSV, only 65,534 characters of data are available (the 16-bit unsigned limit). Unfortunately, I cannot request the data from my application at this moment; I have to pull it out of SSMS and convert it manually in my test code to see the file.
How can I get the full binary data from SSMS? Is there an option for displaying full length of binary data?
You can try casting the "file" field to XML using a query such as the following...
SELECT CAST(CONVERT(VARCHAR(MAX), [file], 1) AS XML)
FROM [Table]
WHERE id = 1
However, you'll need to confirm that your SSMS Query options are set appropriately:
Query | Query Options... | Results | Grid | XML Data: unlimited
This will provide your results in hex, like so:
0x47494638396150003200F70 ...
How does creating a large object work? Does there need to be a client? All I am hoping to do is have an image as one column.
I am typing the following commands after creating my table, but I just get an error about the path not being correct for the image (even though I have it starting right from the C drive).
CREATE TABLE image (name text,
raster oid);
INSERT INTO image (name, raster)
VALUES ('beautiful image', lo_import('C:Documents/etc/motd'));
I am not running any C code; am I supposed to do that, or does this automatically create the Large Object?
If I am supposed to run some C code, where would I do it with respect to PostgreSQL?
Can I do what I want with PostgreSQL syntax alone? Is there another way to approach including images as a field?
Any help will be greatly appreciated.
According to the PostgreSQL documentation, there are two ways to handle large objects (considering Java JDBC):
To use the bytea data type you should simply use the getBytes(), setBytes(), getBinaryStream(), or setBinaryStream() methods.
and
LargeObject API.
Also, you can convert your image to a base64 string and then insert it directly using, for instance, PgAdmin:
CREATE TABLE image_table (name varchar(255), DATA bytea);
INSERT INTO image_table
VALUES ('my_image.jpg',
decode('paste your byte array string here', 'base64'));
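To read the image back out as base64 text for a quick check, PostgreSQL's encode() does the reverse of decode(); a sketch using the same example table:
SELECT name, encode(data, 'base64') AS data_base64
FROM image_table
WHERE name = 'my_image.jpg';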
In Sybase 15.03 we have a table with a column named "content" of type TEXT.
It contains the compressed contents of a file; it looks something like this when doing a straight select:
H4sIAAAAAAAAAL2Y21IjOQyG7/cp+nJny9UjWbIszx2QQDIJhMmhanff/0H2d6cpSOKh2ZoMQFWS
How can I select/extract/uncompress/decompress its value, so I can examine its content?
Thank you.
Michael
Just do a SELECT of the column, and make sure the SET TEXTSIZE setting is set high enough to avoid the data being truncated.
To get the data into a file directly, do a bcp out (and use the -T flag, which serves the same purpose as SET TEXTSIZE).
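A minimal sketch of the first option, assuming the table is named file_contents and has an id column (neither name is given in the question):
-- raise the per-session TEXT limit (value is in bytes) before selecting the column
set textsize 2097152
select content from file_contents where id = 1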