WPF & video processing frame by frame

Can anyone give me a sample of how to access a webcam in WPF and process the video frame by frame, in a fast way?
On each frame I have to find objects for further image processing.

Use WPFMediaKit (an open-source project) and its control
WPFMediaKit.DirectShow.Controls.VideoCaptureElement
Just set its VideoCaptureDevice attribute:
myVideoCaptureElementcontrol.VideoCaptureDevice = MultimediaUtil.VideoInputDevices[0];
0 is the index of the video input device.
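A minimal sketch of how the frame-by-frame part might look in code-behind, assuming the control is declared in XAML as myVideoCaptureElementcontrol. The EnableSampleGrabbing flag and the NewVideoSample event are how some versions of WPFMediaKit expose raw frames; verify those member names against the version you use.
// Select the first capture device (as in the answer above).
myVideoCaptureElementcontrol.VideoCaptureDevice = MultimediaUtil.VideoInputDevices[0];
// Assumption: some WPFMediaKit versions expose raw frames via sample grabbing.
myVideoCaptureElementcontrol.EnableSampleGrabbing = true;
myVideoCaptureElementcontrol.NewVideoSample += (sender, e) =>
{
    // e.VideoFrame is a System.Drawing.Bitmap in the versions I have seen.
    // Run your object detection here, ideally off the UI thread.
    System.Drawing.Bitmap frame = e.VideoFrame;
    DetectObjects(frame); // hypothetical routine; replace with your own image processing
};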

Related

SceneKit Style Transfer with coreML

Goal:
SceneKit Style Transfer with coreML
How:
Grabbing a CVPixelBuffer from a SceneKit frame, as in the WWDC video below:
https://developer.apple.com/videos/play/wwdc2020/10642/
Tried:
I tried accessing a SceneKit frame, but it does not seem to give access to a CVPixelBuffer.
Question:
How can I grab CVPixelBuffer from a SceneKit frame? Or is there a better way to do Style Transfer in SceneKit?
I found a description like this:
"Here I stylize the virtual object texture offline with a different style, but I want to do even more."
I think it renders the virtual object texture through Metal. I tried to get the CVPixelBuffer as a CIImage and process it through arView.renderCallbacks.postProcess to get a real-time processing effect, which is much more convenient than processing the captured video separately, but I am not sure how to process the data through Metal. I hope someone can provide more information.
You can refer to this session video:
https://developer.apple.com/videos/play/wwdc2021/10075/

Convert YUV to RGB on DeckLink using hardware

I'm currently ingesting HD1080p video at 59.94 FPS from a camcorder via the HDMI input on the DeckLink 4K Extreme.
My goal is to replicate the incoming image in a WPF UI element. To accomplish this I'm using the DeckLink SDK in a C# WPF application.
In this program I've implemented the VideoInputFrameArrived callback. In this callback I'm copying the bytes from each frame into a WriteableBitmap which I've set as the source for an Image.
All this works as it should, and when I run the program, the Image is indeed updated in real time as the frames arrive.
My problem then, is that the only two supported Pixel Formats for the video input are 8BitYUV and 10BitYUV, neither of which can be natively displayed on computer monitors.
The WriteableBitmap can only take in various RGB, Black and White, and CMYK formats.
Here is what I've tried so far.
I've tried to convert each frame using the IDeckLinkVideoConversion::ConvertFrame()
Problem: ConvertFrame() requires a destination frame to be rendered on the DeckLink using IDeckLinkOutput::CreateVideoFrame(). As I currently understand it, the DeckLink cannot act as both an input (to capture the video feed) and an output (to render the destination frame).
I've set the incoming stream to 8BitYUV, and copied each frame into the WriteableBitmap with a format of BGR32.
Problem: As I mentioned earlier, this will display an image, but the color is incorrect and the picture is only half the width that it needs to be.
The reason for this is that the incoming stream of 8BitYUV is 16 bits/pixel, whereas the Bitmap expects 32 bits/pixel, and so the Bitmap treats each incoming MacroPixel (4 bytes) as one pixel instead of the 2 pixels it really is.
Currently I'm using a pixel shader to fix the color and a RenderTransform to scale the Image horizontally by a factor of 2 to "fix" the aspect ratio. The problem is that I am left with only half of the original horizontal resolution.
I don't believe this is a hardware limitation, because when I hook up another monitor to the HDMI output on the DeckLink, the incoming picture displays in full 1080p in perfect color. Would it be possible to capture that outgoing stream somewhere in memory?
TL;DR
What is the best way to convert 4:2:2 YUV (UYVY) into an RGB or CMYK pixel format in real time? (1080p @ 59.94 FPS)
Preferably a hardware solution i.e. DeckLink or GPU.
You have several options here.
First of all, you can display UYVY directly. Most video adapters will accept UYVY data through DirectDraw, DirectShow, and DirectX APIs up to version 9, so you would not need real-time conversion of the video frames. Integrating this into a WPF application might require some effort; perhaps the most popular way is to use DirectShow through the DirectShow.NET library and WPF Media Kit. Going this route you could also capture video using DeckLink's video capture DirectShow filter, and you could connect all the parts together faster. However, you already capture using the DeckLink SDK, which gives you more control and flexibility over the capture process, so you might not want to go back to DirectShow.
The second option is to convert to RGB as you wanted. I don't think the DeckLink can do it for you. GPU-based conversion definitely exists (the conversion formula is well known, simple, and easy to parallelize), but it is hardware-dependent or otherwise not immediately available. Instead, Microsoft ships the Color Converter DSP, which can do the conversion (from 8 bits, though not 10) in a very efficient way. The API is native, and you might need Media Foundation .NET to access it from your app. An alternative efficient software conversion can also be done using FFmpeg's libswscale (for a managed app, through the respective wrappers).
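For completeness, here is a minimal CPU-only sketch of the well-known conversion that the second option refers to, in C#. It assumes bmdFormat8BitYUV delivers UYVY macropixels (U0 Y0 V0 Y1 for two pixels) and uses the common BT.601 integer formula; for HD1080 material BT.709 coefficients are strictly more correct, and the DSP, GPU or libswscale routes above will be faster than this loop.
static byte Clamp(int v) => (byte)(v < 0 ? 0 : v > 255 ? 255 : v);

// src: one UYVY frame (srcRowBytes per row); dst: width * height * 4 bytes of BGRA.
static void UyvyToBgra(byte[] src, int srcRowBytes, byte[] dst, int width, int height)
{
    for (int row = 0; row < height; row++)
    {
        int s = row * srcRowBytes;
        int d = row * width * 4;
        for (int col = 0; col < width; col += 2, s += 4, d += 8)
        {
            int u = src[s] - 128;      // chroma shared by both pixels of the macropixel
            int y0 = src[s + 1] - 16;
            int v = src[s + 2] - 128;
            int y1 = src[s + 3] - 16;
            WriteBgra(dst, d, y0, u, v);
            WriteBgra(dst, d + 4, y1, u, v);
        }
    }
}

static void WriteBgra(byte[] dst, int d, int c, int u, int v)
{
    dst[d] = Clamp((298 * c + 516 * u + 128) >> 8);               // B
    dst[d + 1] = Clamp((298 * c - 100 * u - 208 * v + 128) >> 8); // G
    dst[d + 2] = Clamp((298 * c + 409 * v + 128) >> 8);           // R
    dst[d + 3] = 255;                                             // A
}
In the VideoInputFrameArrived handler you would copy the frame's bytes (IDeckLinkVideoFrame::GetBytes), run the conversion, and push dst into the WriteableBitmap with WritePixels(new Int32Rect(0, 0, width, height), dst, width * 4, 0).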
I just did this with the DeckLink API, because the card I have can act as both an input and an output, and the output does not need to be in playback mode to access this part of the API:
com_ptr<IDeckLinkOutput> m_deckLinkOutput;
if (SUCCEEDED(m_deckLink->QueryInterface(IID_IDeckLinkOutput, (void **)&m_deckLinkOutput)))
{
    // Create a BGRA destination frame matching the captured frame's dimensions
    // (row bytes = width * 4 for bmdFormat8BitBGRA).
    IDeckLinkMutableVideoFrame *pRGBFrame;
    if (SUCCEEDED(m_deckLinkOutput->CreateVideoFrame(videoFrame->GetWidth(), videoFrame->GetHeight(),
                                                     videoFrame->GetWidth() * 4, bmdFormat8BitBGRA,
                                                     videoFrame->GetFlags(), &pRGBFrame)))
    {
        // videoFrame is the YUV frame delivered by VideoInputFrameArrived
        m_deckLinkVideoConversion->ConvertFrame(videoFrame, pRGBFrame);
        // use the RGB frame here
        pRGBFrame->Release();
    }
}

How can I check whether a MediaElement is playing a file or not?

I want to play an audio or video file, and I have this working, but I do not know how to check whether the MediaElement is currently playing a file or not.
For example, when using the same button for both Pause and Play, how can I tell whether media is playing?
I use the code below to play the file:
mediaElement1.LoadedBehavior = MediaState.Manual;
mediaElement1.Source = new Uri("C:/test.wma", UriKind.RelativeOrAbsolute);
mediaElement1.Play();
One more thing: I know how to play a file in a MediaElement from code, but I want to design the media element UI. How can I design this for both video and audio files, and which properties should be enabled and which disabled?
Last I checked you can't unless you are using Silverlight:
http://msdn.microsoft.com/en-us/library/cc189079(v=vs.95).aspx
I have an application using .NET 4, and as far as I can remember you cannot verify whether it is playing, stopped, paused, etc.
This is what you can do with .NET 4:
http://msdn.microsoft.com/en-us/library/system.windows.controls.mediaelement(v=vs.100).aspx
Compared with Silverlight:
http://msdn.microsoft.com/en-us/library/system.windows.controls.mediaelement(v=vs.95).aspx
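Since WPF's MediaElement exposes no equivalent of Silverlight's CurrentState, the usual workaround (my own suggestion, not something the answer above spells out) is to track the state yourself when you drive playback in manual mode, for example:
private bool isPlaying;

private void PlayPauseButton_Click(object sender, RoutedEventArgs e)
{
    // MediaElement has no playback-state property, so keep the flag
    // in sync with the calls you make yourself.
    if (isPlaying)
        mediaElement1.Pause();
    else
        mediaElement1.Play();
    isPlaying = !isPlaying;
}

private void mediaElement1_MediaEnded(object sender, RoutedEventArgs e)
{
    // MediaEnded is a standard MediaElement event; reset the flag when playback finishes.
    isPlaying = false;
}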

How can I show thumbnails from an AVI file like Explorer does

I need it to work in WPF.
Basically I am showing a list of all video files in a list control, and I want an image frame, or thumbnail if you will, for each one.
I was wondering how Explorer does it?
It probably does what you expect: it reads some of the file, uses the appropriate codec to decode some video, and then creates an image from one or more frames.
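If you literally want the thumbnail Explorer shows rather than decoding a frame yourself, one option (not mentioned in the answer above) is to ask the shell for it, for example through the Windows API Code Pack; the member names below are from memory, so verify them against the library version you reference.
using Microsoft.WindowsAPICodePack.Shell;
using System.Windows.Media.Imaging;

BitmapSource GetExplorerThumbnail(string path)
{
    // The shell hands back the same thumbnail Explorer displays,
    // produced by whichever thumbnail handler/codec is installed.
    ShellFile file = ShellFile.FromFilePath(path);
    return file.Thumbnail.BitmapSource;
}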

is it possible to get a frame from a video

Using WPF, is it possible to access a video frame by frame?
Ideally, given a video file and time, one would get an ImageSource.
Is it possible ?
You can capture frames with this library: http://videorendererelement.codeplex.com/
The system must have the proper codecs installed for that particular video format (e.g. DivX).
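If you prefer to stay with the stock WPF classes instead of that library, a common approach (a sketch, assuming the right codecs are installed and that you call it on the UI thread) is MediaPlayer with ScrubbingEnabled plus RenderTargetBitmap. The fixed delay below is a simplification; real code should wait for MediaOpened before drawing.
// Needs System.Windows, System.Windows.Media and System.Windows.Media.Imaging.
public static ImageSource GrabFrame(string path, TimeSpan position, int width, int height)
{
    var player = new MediaPlayer { ScrubbingEnabled = true, Volume = 0 };
    player.Open(new Uri(path, UriKind.RelativeOrAbsolute));
    player.Position = position;

    // Crude wait so the frame at Position gets decoded; hook MediaOpened and retry in real code.
    System.Threading.Thread.Sleep(500);

    // Draw the current video frame into a visual and render it to a bitmap.
    var visual = new DrawingVisual();
    using (DrawingContext dc = visual.RenderOpen())
    {
        dc.DrawVideo(player, new Rect(0, 0, width, height));
    }

    var bitmap = new RenderTargetBitmap(width, height, 96, 96, PixelFormats.Pbgra32);
    bitmap.Render(visual);
    player.Close();
    return bitmap;
}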
