WPF BitmapImage Serialization/Deserialization

I've been trying to serialize and deserialize BitmapImages, using methods that supposedly work, which I found in this thread: error in my byte[] to WPF BitmapImage conversion?
Just to reiterate what is going on, here is part of my serialization code:
using (MemoryStream ms = new MemoryStream())
{
    // This is a BitmapImage fetched from a dictionary.
    BitmapImage image = kvp.Value;
    PngBitmapEncoder encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(image));
    encoder.Save(ms);
    byte[] buffer = ms.GetBuffer();
    // Here I'm adding the byte[] array to SerializationInfo
    info.AddValue((int)kvp.Key + "", buffer);
}
And here is the deserialization code:
// While iterating over SerializationInfo in the deserialization
// constructor I pull the byte[] array out of a
// SerializationEntry
using (MemoryStream ms = new MemoryStream(entry.Value as byte[]))
{
    ms.Position = 0;
    BitmapImage image = new BitmapImage();
    image.BeginInit();
    image.StreamSource = ms;
    image.EndInit();
    // Adding the timeframe-key and image back into the dictionary
    CapturedTrades.Add(timeframe, image);
}
Also, I'm not sure if it matters, but earlier, when I populated my dictionary, I encoded Bitmaps with a PngBitmapEncoder to get them into BitmapImages. So I'm not sure if double-encoding has something to do with it. Here's the method that does that:
// Just to clarify, this is done before the BitmapImages are added to the
// dictionary that they are stored in above.
private BitmapImage BitmapConverter(Bitmap image)
{
    using (MemoryStream ms = new MemoryStream())
    {
        image.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
        BitmapImage bImg = new BitmapImage();
        bImg.BeginInit();
        bImg.StreamSource = new MemoryStream(ms.ToArray());
        bImg.EndInit();
        ms.Close();
        return bImg;
    }
}
So here is the problem: serialization and deserialization appear to work fine. There are no errors, and the dictionary has entries with what seem to be BitmapImages; however, their width/height and some other properties are all set to '0' when I look at them in the debugger. And of course, nothing is shown when I try to display the images.
So any ideas as to why they aren't properly deserialized?
Thanks!

1) You should not dispose the MemoryStream used to initialize the image. Remove the using from this line:
using (MemoryStream ms = new MemoryStream(entry.Value as byte[]))
2) After
encoder.Save(ms);
rewind the stream with
ms.Seek(0, SeekOrigin.Begin);
and use ms.ToArray() instead of ms.GetBuffer(), since GetBuffer() returns the whole internal buffer, including unused trailing bytes.
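Alternatively, you can keep the using block if you also set BitmapCacheOption.OnLoad, which forces the image to be decoded during EndInit instead of lazily. A minimal sketch of the deserialization side, assuming the same entry, timeframe, and CapturedTrades variables as in the question:
using (MemoryStream ms = new MemoryStream(entry.Value as byte[]))
{
    BitmapImage image = new BitmapImage();
    image.BeginInit();
    image.CacheOption = BitmapCacheOption.OnLoad; // decode now, so disposing ms afterwards is safe
    image.StreamSource = ms;
    image.EndInit();
    image.Freeze(); // optional: makes the image usable across threads
    CapturedTrades.Add(timeframe, image);
}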

Related

How to convert System.Windows.Media.DrawingImage into Stream?

I'm trying to convert a DrawingImage into a MemoryStream. My code looks like this:
public MemoryStream ImageStream(DrawingImage drawingImage)
{
    MemoryStream stream = new MemoryStream();
    ImageSource imageSource = drawingImage;
    if (imageSource != null)
    {
        BitmapSource bitmap = imageSource as BitmapSource;
        if (bitmap != null)
        {
            BitmapEncoder encoder = new PngBitmapEncoder();
            encoder.Frames.Add(BitmapFrame.Create(bitmap));
            encoder.Save(stream);
        }
    }
    return stream;
}
But the problem is that after casting the ImageSource to BitmapSource, bitmap is always null. Any suggestion how to fix that?
The reason your bitmap variable is always null is that DrawingImage does not derive from BitmapSource (or vice-versa), so the cast is guaranteed to fail. A DrawingImage does not contain any pixel data of any kind. It references a Drawing that is used whenever the image needs to be rasterized.
How did you find yourself in a situation where you want to rasterize a DrawingImage and serialize it into a stream? I get the feeling you are going about something in an unusual way if you have need of a function like this.
Nevertheless, you could implement this function by drawing the DrawingImage to a DrawingVisual, rendering it to a RenderTargetBitmap, and then passing the render target to the encoder to serialize the raster data to a stream.
public MemoryStream ImageStream(DrawingImage drawingImage)
{
    DrawingVisual visual = new DrawingVisual();
    using (DrawingContext dc = visual.RenderOpen())
    {
        dc.DrawDrawing(drawingImage.Drawing);
        dc.Close();
    }
    RenderTargetBitmap target = new RenderTargetBitmap(
        (int)visual.Drawing.Bounds.Right, (int)visual.Drawing.Bounds.Bottom,
        96.0, 96.0, PixelFormats.Pbgra32);
    target.Render(visual);
    MemoryStream stream = new MemoryStream();
    BitmapEncoder encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(target));
    encoder.Save(stream);
    return stream;
}
If you want something a little more generic, I would split this into two methods and change some of the types.
public BitmapSource Rasterize(Drawing drawing)
{
    DrawingVisual visual = new DrawingVisual();
    using (DrawingContext dc = visual.RenderOpen())
    {
        dc.DrawDrawing(drawing);
        dc.Close();
    }
    RenderTargetBitmap target = new RenderTargetBitmap(
        (int)drawing.Bounds.Right, (int)drawing.Bounds.Bottom,
        96.0, 96.0, PixelFormats.Pbgra32);
    target.Render(visual);
    return target;
}
public void SavePng(BitmapSource source, Stream target)
{
    BitmapEncoder encoder = new PngBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(source));
    encoder.Save(target);
}
Then you could use it with any kind of stream. For example, to save the drawing to a file:
using (FileStream file = File.Create("somepath.png"))
{
    SavePng(Rasterize(drawingImage.Drawing), file);
}

Cannot decode jpeg using JpegBitmapDecoder

I have the following two functions to convert bytes to an image and display it in an Image control in WPF:
private JpegBitmapDecoder ConvertBytestoImageStream(byte[] imageData)
{
    Stream imageStreamSource = new MemoryStream(imageData);
    JpegBitmapDecoder decoder = new JpegBitmapDecoder(imageStreamSource,
        BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.Default);
    BitmapSource bitmapSource = decoder.Frames[0];
    return decoder;
}
The above code does not work at all. I always get the exception "No imaging component found", and the image is not displayed.
private MemoryStream ConvertBytestoImageStream(int CameraId, byte[] ImageData, int imgWidth, int imgHeight, DateTime detectTime)
{
    GCHandle gch = GCHandle.Alloc(ImageData, GCHandleType.Pinned);
    int stride = 4 * ((24 * imgWidth + 31) / 32);
    Bitmap bmp = new Bitmap(imgWidth, imgHeight, stride, PixelFormat.Format24bppRgb, gch.AddrOfPinnedObject());
    MemoryStream ms = new MemoryStream();
    bmp.Save(ms, ImageFormat.Jpeg);
    gch.Free();
    return ms;
}
This function works, but is very slow. I wish to optimize my code.
Your ConvertBytestoImageStream works fine for me if I pass it a JPEG buffer. There are, however, a few things that could be improved. Depending on whether you really want to return a decoder or a bitmap, the method could be written this way:
private BitmapDecoder ConvertBytesToDecoder(byte[] buffer)
{
    using (MemoryStream stream = new MemoryStream(buffer))
    {
        return BitmapDecoder.Create(stream,
            BitmapCreateOptions.PreservePixelFormat,
            BitmapCacheOption.OnLoad); // enables closing the stream immediately
    }
}
or this way:
private ImageSource ConvertBytesToImage(byte[] buffer)
{
    using (MemoryStream stream = new MemoryStream(buffer))
    {
        BitmapDecoder decoder = BitmapDecoder.Create(stream,
            BitmapCreateOptions.PreservePixelFormat,
            BitmapCacheOption.OnLoad); // enables closing the stream immediately
        return decoder.Frames[0];
    }
}
Note that instead of using JpegBitmapDecoder this code utilizes a static factory method of the abstract base class BitmapDecoder which automatically selects the proper decoder for the provided data stream. Hence this code can be used for all image formats supported by WPF.
Note also that the Stream object is used inside a using block which takes care of disposing it when it is no longer needed. BitmapCacheOption.OnLoad ensures that the whole stream is loaded into the decoder and can be closed afterwards.
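As a usage sketch (the myImage control and the file path here are hypothetical, not from the question):
byte[] imageBytes = File.ReadAllBytes("frame.jpg"); // any format WPF can decode
myImage.Source = ConvertBytesToImage(imageBytes);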

error in my byte[] to WPF BitmapImage conversion?

I'm saving a BitmapImage to a byte[] for saving in a DB. I'm pretty sure the data is being saved and retrieved accurately so it's not an issue there.
On my byte[] to BitmapImage conversion I keep getting an exception of "System.NotSupportedException: No imaging component suitable to complete this operation was found."
Can anyone see what I'm doing wrong with my two functions here?
private Byte[] convertBitmapImageToBytestream(BitmapImage bi)
{
    int height = bi.PixelHeight;
    int width = bi.PixelWidth;
    int stride = width * ((bi.Format.BitsPerPixel + 7) / 8);
    Byte[] bits = new Byte[height * stride];
    bi.CopyPixels(bits, stride, 0);
    return bits;
}
public BitmapImage convertByteToBitmapImage(Byte[] bytes)
{
    MemoryStream stream = new MemoryStream(bytes);
    stream.Position = 0;
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.StreamSource = stream;
    bi.EndInit();
    return bi;
}
Does this StackOverflow question help?
byte[] to BitmapImage in silverlight
EDIT:
Try this, not sure it will work:
public BitmapImage convertByteToBitmapImage(Byte[] bytes)
{
    MemoryStream stream = new MemoryStream(bytes);
    stream.Position = 0;
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.CacheOption = BitmapCacheOption.OnLoad;
    bi.DecodePixelWidth = ??; // Width of the image
    bi.StreamSource = stream;
    bi.EndInit();
    return bi;
}
UPDATE 2:
I found these:
Load a byte[] into an Image at Runtime
BitmapImage from byte[] on a non UIThread
Apart from that, I don't know.
How do you know that the byte[] format you are creating is what the BitmapImage expects in the Stream? Why don't you use the BitmapImage's StreamSource to create the byte[] that you save? Then you know the format will be compatible.
http://www.codeproject.com/KB/vb/BmpImage2ByteArray.aspx
http://social.msdn.microsoft.com/forums/en-US/wpf/thread/8327dd31-2db1-4daa-a81c-aff60b63fee6/
[I did not try any of this code, but you can]
Turns out the BitmapImage CopyPixels approach isn't right. Instead I take the output of the BitmapImage and convert it to something usable, in this case a JPEG.
public static Byte[] convertBitmapImageToBytestream(BitmapImage bi)
{
    MemoryStream memStream = new MemoryStream();
    JpegBitmapEncoder encoder = new JpegBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bi));
    encoder.Save(memStream);
    byte[] bytestream = memStream.GetBuffer();
    return bytestream;
}
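One caveat: GetBuffer() returns the MemoryStream's whole internal buffer, which is usually longer than the encoded JPEG, whereas ToArray() returns exactly the bytes written. A minimal sketch of the difference, assuming the same bi variable as in the method above:
using (MemoryStream memStream = new MemoryStream())
{
    JpegBitmapEncoder encoder = new JpegBitmapEncoder();
    encoder.Frames.Add(BitmapFrame.Create(bi));
    encoder.Save(memStream);
    byte[] exactBytes = memStream.ToArray();  // length == encoded JPEG size
    byte[] rawBuffer = memStream.GetBuffer(); // may contain unused trailing bytes
    Console.WriteLine("{0} vs {1}", exactBytes.Length, rawBuffer.Length);
}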

How to get Memory Stream/Base64 String from Image.Source?

I have a dynamically created Image control that is populated via an OpenFileDialog like:
OpenFileDialog dialog = new OpenFileDialog();
if (dialog.ShowDialog() == true)
{
    using (FileStream stream = dialog.File.OpenRead())
    {
        BitmapImage bmp = new BitmapImage();
        bmp.SetSource(stream);
        myImage.Source = bmp;
    }
}
I want to send the image back to the server in a separate function call, as a string via a web service.
How do I get a memory stream / Base64 string from myImage.Source?
Here's an alternative which should work (without a BmpBitmapEncoder). It uses the FileStream stream to create the byte array that is then converted to a Base64 string. This assumes you want to do this within the scope of the current code.
Byte[] bytes = new Byte[stream.Length];
stream.Read(bytes, 0, bytes.Length);
return Convert.ToBase64String(bytes);
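One thing to watch: by the time you run this, SetSource has already read the FileStream to its end, so the read will return nothing unless you rewind first. A small sketch under that assumption:
stream.Position = 0;                    // rewind: SetSource left the stream at its end
Byte[] bytes = new Byte[stream.Length];
stream.Read(bytes, 0, bytes.Length);
return Convert.ToBase64String(bytes);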
Make sure you have the ImageTools library from http://imagetools.codeplex.com/
Then you can do this:
ImageSource myStartImage;
var image = ((WriteableBitmap) myStartImage).ToImage();
var encoder = new PngEncoder( false );
MemoryStream stream = new MemoryStream();
encoder.Encode( image, stream );
var myStartImageByteStream = stream.GetBuffer();
Then for Base64:
string encodedData = Convert.ToBase64String(myStartImageByteStream);
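For completeness, the server (or a quick local test) can reverse the conversion with Convert.FromBase64String; a tiny sketch using the encodedData string above and a hypothetical output path:
byte[] decoded = Convert.FromBase64String(encodedData);
File.WriteAllBytes("roundtrip.png", decoded); // verify the bytes survived the round trip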

Creating WPF BitmapImage from MemoryStream png, gif

I am having some trouble creating a BitmapImage from a MemoryStream containing PNG or GIF bytes obtained from a web request. The bytes seem to be downloaded fine and the BitmapImage object is created without issue, however the image is not actually rendered in my UI. The problem only occurs when the downloaded image is a PNG or GIF (it works fine for JPEG).
Here is code that demonstrates the problem:
var webResponse = webRequest.GetResponse();
var stream = webResponse.GetResponseStream();
if (stream.CanRead)
{
    Byte[] buffer = new Byte[webResponse.ContentLength];
    stream.Read(buffer, 0, buffer.Length);
    var byteStream = new System.IO.MemoryStream(buffer);
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.DecodePixelWidth = 30;
    bi.StreamSource = byteStream;
    bi.EndInit();
    byteStream.Close();
    stream.Close();
    return bi;
}
To test that the web request was correctly obtaining the bytes, I tried the following, which saves the bytes to a file on disk and then loads the image using a UriSource rather than a StreamSource; it works for all image types:
var webResponse = webRequest.GetResponse();
var stream = webResponse.GetResponseStream();
if (stream.CanRead)
{
    Byte[] buffer = new Byte[webResponse.ContentLength];
    stream.Read(buffer, 0, buffer.Length);
    string fName = "c:\\" + ((Uri)value).Segments.Last();
    System.IO.File.WriteAllBytes(fName, buffer);
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.DecodePixelWidth = 30;
    bi.UriSource = new Uri(fName);
    bi.EndInit();
    stream.Close();
    return bi;
}
Anyone got any light to shine?
Add bi.CacheOption = BitmapCacheOption.OnLoad directly after your .BeginInit():
BitmapImage bi = new BitmapImage();
bi.BeginInit();
bi.CacheOption = BitmapCacheOption.OnLoad;
...
Without this, BitmapImage uses lazy initialization by default, and the stream will have been closed by the time it is actually read. In the first example you try to read the image from a closed (and possibly already garbage-collected or disposed) MemoryStream; the second example uses a file, which is still available.
Also, don't write
var byteStream = new System.IO.MemoryStream(buffer);
when you can write
using (MemoryStream byteStream = new MemoryStream(buffer))
{
    ...
}
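Combining both points, the download code from the question could be reworked roughly like this (a sketch assuming the same webRequest variable and .NET 4+ for Stream.CopyTo):
var webResponse = webRequest.GetResponse();
using (var stream = webResponse.GetResponseStream())
using (var byteStream = new MemoryStream())
{
    stream.CopyTo(byteStream);                 // read the full response, regardless of ContentLength
    byteStream.Position = 0;
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    bi.CacheOption = BitmapCacheOption.OnLoad; // decode before the streams are disposed
    bi.DecodePixelWidth = 30;                  // optional, as in the question
    bi.StreamSource = byteStream;
    bi.EndInit();
    bi.Freeze();
    return bi;
}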
I'm using this code:
public static BitmapImage GetBitmapImage(byte[] imageBytes)
{
    var bitmapImage = new BitmapImage();
    bitmapImage.BeginInit();
    bitmapImage.StreamSource = new MemoryStream(imageBytes);
    bitmapImage.EndInit();
    return bitmapImage;
}
Maybe you should delete this line:
bi.DecodePixelWidth = 30;
