How can I convert byte[] to BitmapImage? - wpf

I have a byte[] that represents the raw data of an image. I would like to convert it to a BitmapImage.
I tried several examples I found but I kept getting the following exception
"No imaging component suitable to complete this operation was found."
I think it is because my byte[] does not actually represent an encoded image but only the raw pixel bits.
So my question, as mentioned above, is how to convert a byte[] of raw bits to a BitmapImage.

The code below does not create a BitmapSource from a raw pixel buffer, as asked in the question.
But in case you want to create a BitmapImage from an encoded frame like a PNG or a JPEG, you would do it like this:
public static BitmapImage LoadFromBytes(byte[] bytes)
{
    using (var stream = new MemoryStream(bytes))
    {
        var image = new BitmapImage();
        image.BeginInit();
        // OnLoad decodes the image during EndInit, so the stream can safely be disposed afterwards.
        image.CacheOption = BitmapCacheOption.OnLoad;
        image.StreamSource = stream;
        image.EndInit();
        return image;
    }
}
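A hypothetical usage example (the file name and the MyImage control are placeholders, not taken from the question):
// Load any encoded image file and show it in an Image control.
byte[] encodedBytes = File.ReadAllBytes("photo.png");
MyImage.Source = LoadFromBytes(encodedBytes);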

When your byte array contains a bitmap's raw pixel data, you may create a BitmapSource (the base class of BitmapImage) with the static method BitmapSource.Create.
However, you need to specify a few parameters of the bitmap. You must know in advance the width and height and also the PixelFormat of the buffer.
byte[] buffer = ...;
var width = 100; // for example
var height = 100; // for example
var dpiX = 96d;
var dpiY = 96d;
var pixelFormat = PixelFormats.Pbgra32; // for example
var stride = (width * pixelFormat.BitsPerPixel + 7) / 8;
var bitmap = BitmapSource.Create(width, height, dpiX, dpiY,
pixelFormat, null, buffer, stride);
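If you specifically need a BitmapImage rather than any BitmapSource, one option is to re-encode the BitmapSource in memory and decode it again. A minimal sketch, assuming the bitmap variable from the snippet above and PNG as the intermediate format:
BitmapImage bitmapImage;
using (var stream = new MemoryStream())
{
    // Encode the raw-pixel BitmapSource to PNG in memory, then decode it as a BitmapImage.
    var encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bitmap));
    encoder.Save(stream);
    stream.Position = 0;

    bitmapImage = new BitmapImage();
    bitmapImage.BeginInit();
    bitmapImage.CacheOption = BitmapCacheOption.OnLoad;
    bitmapImage.StreamSource = stream;
    bitmapImage.EndInit();
    bitmapImage.Freeze(); // optional: makes the bitmap usable from other threads
}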

I ran across this same error, but it was because my array was not getting filled with the actual data. I had an array of bytes that was equal to the length it was supposed to be, but the values were all still 0 - they had not been written to!
My particular issue - and I suspect for others that arrive at this question, as well - was because of the OracleBlob parameter. I didn't think I needed it, and thought I could just do something like:
DataSet ds = new DataSet();
OracleCommand cmd = new OracleCommand(strQuery, conn);
OracleDataAdapter oraAdpt = new OracleDataAdapter(cmd);
oraAdpt.Fill(ds);
if (ds.Tables[0].Rows.Count > 0)
{
    byte[] myArray = (byte[])ds.Tables[0].Rows[0]["MY_BLOB_COLUMN"];
}
How wrong I was! To get the real bytes in that blob, I needed to read the result into an OracleBlob object. Instead of filling a DataSet/DataTable, I did this:
OracleBlob oBlob = null;
byte[] myArray = null;
OracleCommand cmd = new OracleCommand(strQuery, conn);
OracleDataReader result = cmd.ExecuteReader();
result.Read();
if (result.HasRows)
{
    oBlob = result.GetOracleBlob(0);
    myArray = new byte[oBlob.Length];
    oBlob.Read(myArray, 0, myArray.Length);
    oBlob.Erase();
    oBlob.Close();
    oBlob.Dispose();
}
Then, I could take myArray and do this:
if (myArray != null && myArray.Length > 0)
{
    MyImage.Source = LoadBitmapFromBytes(myArray);
}
And my revised LoadBitmapFromBytes function from the other answer:
public static BitmapImage LoadBitmapFromBytes(byte[] bytes)
{
    var image = new BitmapImage();
    using (var stream = new MemoryStream(bytes))
    {
        stream.Seek(0, SeekOrigin.Begin);
        image.BeginInit();
        image.StreamSource = stream;
        image.CreateOptions = BitmapCreateOptions.PreservePixelFormat;
        image.CacheOption = BitmapCacheOption.OnLoad;
        image.UriSource = null;
        image.EndInit();
    }
    return image;
}

Create a MemoryStream from the encoded bytes and pass that into the Bitmap constructor (note that this is System.Drawing.Bitmap, not WPF's BitmapImage).
Like this:
MemoryStream stream = new MemoryStream(bytes);
Bitmap image = new Bitmap(stream);
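If you then need the result as a WPF ImageSource, one way (a sketch, not the only option) is to save the System.Drawing.Bitmap back into a MemoryStream and decode it with BitmapImage:
public static BitmapImage ToBitmapImage(System.Drawing.Bitmap bitmap)
{
    using (var stream = new MemoryStream())
    {
        // Re-encode the GDI+ bitmap as PNG in memory, then decode it with WPF.
        bitmap.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
        stream.Position = 0;

        var image = new BitmapImage();
        image.BeginInit();
        image.CacheOption = BitmapCacheOption.OnLoad;
        image.StreamSource = stream;
        image.EndInit();
        return image;
    }
}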

Related

With HelixToolkit.SharpDX.Wpf how do I set the DiffuseMap on a PhongMaterial from an ImageSource?

The DiffuseMap property of a PhongMaterial accepts a Stream.
If I have an ImageSource, how do I convert it to something acceptable to the property? Note that I need to be able to do this fast, in memory.
In the source code I can only find examples of loading images from files:
var image = LoadFileToMemory(new System.Uri(@"test.png", System.UriKind.RelativeOrAbsolute).ToString());
this.ModelMaterial = new PhongMaterial
{
    AmbientColor = Colors.Gray.ToColor4(),
    DiffuseColor = Colors.White.ToColor4(),
    SpecularColor = Colors.White.ToColor4(),
    SpecularShininess = 100f,
    DiffuseAlphaMap = image,
    DiffuseMap = LoadFileToMemory(new System.Uri(@"TextureCheckerboard2.dds", System.UriKind.RelativeOrAbsolute).ToString()),
    NormalMap = LoadFileToMemory(new System.Uri(@"TextureCheckerboard2_dot3.dds", System.UriKind.RelativeOrAbsolute).ToString()),
};
LoadFileToMemory simply takes the bytes from a file and returns them as a MemoryStream.
By ImageSource you mean a BitmapSource or DrawingImage? ImageSource is the abstract base class for both of them.
If you have a BitmapSource you can convert it to a MemoryStream using:
private Stream BitmapSourceToStream(BitmapSource writeBmp)
{
    Stream stream = new MemoryStream();
    //BitmapEncoder enc = new PngBitmapEncoder();
    //BitmapEncoder enc = new JpegBitmapEncoder();
    BitmapEncoder enc = new BmpBitmapEncoder();
    enc.Frames.Add(BitmapFrame.Create(writeBmp));
    enc.Save(stream);
    stream.Position = 0; // rewind so the consumer reads from the start of the stream
    return stream;
}
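Hypothetical usage with the PhongMaterial from the question (material and bitmapSource are assumed variables, not from the original code):
// DiffuseMap accepts a Stream, so the encoded stream can be assigned directly.
material.DiffuseMap = BitmapSourceToStream(bitmapSource);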

How can I get Image class from MemoryStream of jpg Image?

I am trying to get an Image object from a JPEG image.
I already tried to use BitmapSource as linked here.
The error message is not in English, but it means "the image header is broken, so it is impossible to decode".
Other formats like GIF, PNG and BMP have no problem; only the JPEG format shows this problem.
< Sequence >
A ZIP archive file (the JPEG file is inside it) -> unzip library -> MemoryStream (JPEG data) -> BitmapSource
imageSource.BeginInit();
imageSource.StreamSource = memoryStream;
imageSource.EndInit();
This code produces the error.
I think the reason is that the memory stream contains the raw binary of the JPEG, not a decoded bitmap, so BitmapSource cannot recognize the stream data as a bitmap image.
How can I solve this problem?
My goal: input "ZIP file (containing a JPEG)" -> output "Image object".
Thank you!
< My Code >
using (MemoryStream _reader = new MemoryStream())
{
    reader.WriteEntryTo(_reader); // <- writes the jpg data into _reader
    var bitmap = new BitmapImage();
    bitmap.BeginInit();
    bitmap.CacheOption = BitmapCacheOption.OnLoad;
    bitmap.StreamSource = _reader;
    bitmap.EndInit();
    bitmap.Freeze();
    Image tmpImg = new Image();
    tmpImg.Source = bitmap;
}
Rewind the stream after writing. While apparently only JpegBitmapDecoder is affected by a source stream's Position, you should generally do this for all kinds of bitmap streams.
var bitmap = new BitmapImage();
using (var stream = new MemoryStream())
{
    reader.WriteEntryTo(stream);
    stream.Position = 0; // here
    bitmap.BeginInit();
    bitmap.CacheOption = BitmapCacheOption.OnLoad;
    bitmap.StreamSource = stream;
    bitmap.EndInit();
    bitmap.Freeze();
}
var tmpImg = new Image { Source = bitmap };
And just in case you don't actually care about whether your Image's Source is a BitmapImage or a BitmapFrame, you may reduce your code to this:
BitmapSource bitmap;
using (var stream = new MemoryStream())
{
    reader.WriteEntryTo(stream);
    stream.Position = 0;
    bitmap = BitmapFrame.Create(stream, BitmapCreateOptions.None, BitmapCacheOption.OnLoad);
}
var tmpImg = new Image { Source = bitmap };

error in my byte[] to WPF BitmapImage conversion?

I'm saving a BitmapImage to a byte[] for saving in a DB. I'm pretty sure the data is being saved and retrieved accurately so it's not an issue there.
On my byte[] to BitmapImage conversion I keep getting an exception of "System.NotSupportedException: No imaging component suitable to complete this operation was found."
Can anyone see what I'm doing wrong with my two functions here?
private Byte[] convertBitmapImageToBytestream(BitmapImage bi)
{
    int height = bi.PixelHeight;
    int width = bi.PixelWidth;
    int stride = width * ((bi.Format.BitsPerPixel + 7) / 8);
    Byte[] bits = new Byte[height * stride];
    bi.CopyPixels(bits, stride, 0);
    return bits;
}
public BitmapImage convertByteToBitmapImage(Byte[] bytes)
{
    MemoryStream stream = new MemoryStream(bytes);
    stream.Position = 0;
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.StreamSource = stream;
    bi.EndInit();
    return bi;
}
Does this StackOverflow question help?
byte[] to BitmapImage in silverlight
EDIT:
Try this, not sure it will work:
public BitmapImage convertByteToBitmapImage(Byte[] bytes)
{
    MemoryStream stream = new MemoryStream(bytes);
    stream.Position = 0;
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.CacheOption = BitmapCacheOption.OnLoad;
    bi.DecodePixelWidth = ??; // Width of the image
    bi.StreamSource = stream;
    bi.EndInit();
    return bi;
}
UPDATE 2:
I found these:
Load a byte[] into an Image at Runtime
BitmapImage from byte[] on a non UIThread
Apart from that, I don't know.
How do you know that the byte[] format you are creating is what the BitmapImage expects in the Stream? Why don't you use the BitmapImage.StreamSource to create the byte[] that you save? Then you know the format will be compatible.
http://www.codeproject.com/KB/vb/BmpImage2ByteArray.aspx
http://social.msdn.microsoft.com/forums/en-US/wpf/thread/8327dd31-2db1-4daa-a81c-aff60b63fee6/
[I did not try any of this code, but you can]
Turns out the BitmapImage CopyPixels approach isn't right. Instead, I take the BitmapImage and encode it to something usable, in this case a JPEG:
public static Byte[] convertBitmapImageToBytestream(BitmapImage bi)
{
    MemoryStream memStream = new MemoryStream();
    JpegBitmapEncoder encoder = new JpegBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bi));
    encoder.Save(memStream);
    // ToArray returns only the bytes actually written; GetBuffer may include unused trailing bytes.
    byte[] bytestream = memStream.ToArray();
    return bytestream;
}

How to get Memory Stream/Base64 String from Image.Source?

I have a dynamically created Image control that is populated via an OpenFileDialog like:
OpenFileDialog dialog = new OpenFileDialog();
if (dialog.ShowDialog() == true)
{
    using (FileStream stream = dialog.File.OpenRead())
    {
        BitmapImage bmp = new BitmapImage();
        bmp.SetSource(stream);
        myImage.Source = bmp;
    }
}
I want to send the image back to the server in a separate function call, as a string, via a web service.
How do I get a memory stream / Base64 string from myImage.Source?
Here's an alternative which should work (without BmpBitmapEncoder). It uses the FileStream stream to create the byte array, which is then converted to a Base64 string. This assumes you do it within the scope of the code above, while the stream is still open.
stream.Position = 0; // make sure we read from the start of the stream
Byte[] bytes = new Byte[stream.Length];
stream.Read(bytes, 0, bytes.Length);
return Convert.ToBase64String(bytes);
Make sure you have the ImageTools library (http://imagetools.codeplex.com/).
Then you can do this:
ImageSource myStartImage; // assumed to actually be a WriteableBitmap
var image = ((WriteableBitmap)myStartImage).ToImage();
var encoder = new PngEncoder(false);
MemoryStream stream = new MemoryStream();
encoder.Encode(image, stream);
// ToArray returns only the bytes actually written; GetBuffer may include unused trailing bytes.
var myStartImageByteStream = stream.ToArray();
Then for Base64:
string encodedData = Convert.ToBase64String(myStartImageByteStream);
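On the receiving side, the Base64 string can be turned back into the original bytes with Convert.FromBase64String (a hypothetical counterpart, not part of the question):
// Decode the Base64 string back into the encoded image bytes.
byte[] receivedBytes = Convert.FromBase64String(encodedData);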

Creating WPF BitmapImage from MemoryStream png, gif

I am having some trouble creating a BitmapImage from a MemoryStream of PNG and GIF bytes obtained from a web request. The bytes seem to be downloaded fine and the BitmapImage object is created without issue; however, the image is not actually rendering in my UI. The problem only occurs when the downloaded image is a PNG or GIF (it works fine for JPEG).
Here is code that demonstrates the problem:
var webResponse = webRequest.GetResponse();
var stream = webResponse.GetResponseStream();
if (stream.CanRead)
{
    Byte[] buffer = new Byte[webResponse.ContentLength];
    stream.Read(buffer, 0, buffer.Length);
    var byteStream = new System.IO.MemoryStream(buffer);
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.DecodePixelWidth = 30;
    bi.StreamSource = byteStream;
    bi.EndInit();
    byteStream.Close();
    stream.Close();
    return bi;
}
To test that the web request was correctly obtaining the bytes, I tried the following, which saves the bytes to a file on disk and then loads the image using a UriSource rather than a StreamSource; it works for all image types:
var webResponse = webRequest.GetResponse();
var stream = webResponse.GetResponseStream();
if (stream.CanRead)
{
    Byte[] buffer = new Byte[webResponse.ContentLength];
    stream.Read(buffer, 0, buffer.Length);
    string fName = "c:\\" + ((Uri)value).Segments.Last();
    System.IO.File.WriteAllBytes(fName, buffer);
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.DecodePixelWidth = 30;
    bi.UriSource = new Uri(fName);
    bi.EndInit();
    stream.Close();
    return bi;
}
Anyone got any light to shine?
Add bi.CacheOption = BitmapCacheOption.OnLoad directly after your .BeginInit():
BitmapImage bi = new BitmapImage();
bi.BeginInit();
bi.CacheOption = BitmapCacheOption.OnLoad;
...
Without this, BitmapImage uses lazy initialization by default, and the stream will have been closed by the time the image is actually decoded. In the first example you try to read the image from a MemoryStream that has already been closed, disposed, or even garbage-collected. The second example uses a file, which is still available.
Also, instead of writing
var byteStream = new System.IO.MemoryStream(buffer);
prefer
using (MemoryStream byteStream = new MemoryStream(buffer))
{
    ...
}
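Putting both points together, a sketch of the download code from the question with BitmapCacheOption.OnLoad and the using blocks applied (variable names follow the question; Stream.CopyTo requires .NET 4 or later):
var webResponse = webRequest.GetResponse();
using (var stream = webResponse.GetResponseStream())
using (var byteStream = new MemoryStream())
{
    stream.CopyTo(byteStream); // avoids relying on a single Read call filling the whole buffer
    byteStream.Position = 0;   // rewind before decoding

    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.CacheOption = BitmapCacheOption.OnLoad; // decode now, before the stream is disposed
    bi.DecodePixelWidth = 30;
    bi.StreamSource = byteStream;
    bi.EndInit();
    bi.Freeze();
    return bi;
}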
I'm using this code:
public static BitmapImage GetBitmapImage(byte[] imageBytes)
{
    var bitmapImage = new BitmapImage();
    bitmapImage.BeginInit();
    bitmapImage.StreamSource = new MemoryStream(imageBytes);
    bitmapImage.EndInit();
    return bitmapImage;
}
Maybe you should delete this line:
bi.DecodePixelWidth = 30;
