This is one of those messy real-world problems. I have a device that provides two video streams of 1920 x 1080 at 30 FPS. I have the documentation and the SDK for the video device, which is unmanaged, naturally. It supports GPUDirect. We have tested an SDK that is essentially a SharpDX/SlimDX/DirectShow.Net style wrapper around DirectShow for our purposes. Licensing/legal issues may prevent the use of this SDK. Being new to this world, I am trying to piece together an alternative solution.
The question: in the sample I have, it appears that they take the frame buffer as a surface and apply it to the input pin of a filter. After that it is a mystery. I see the video in a WPF window, can overlay text, etc. Is DirectShow the best or most direct way to display streaming video, or will I find using DirectX and shaders (via, say, SharpDX) easier?
I understand the question is broad in nature, but I believe there are many others who are faced with this myriad of tools, do not have enough experience in this graphics world to make the best choice, and look to those who have gone before to nudge us in the right direction.
Opinion-based questions are generally considered off-topic, so I am skipping that part. Beyond that:
DirectShow and DirectX work together only at the point of video presentation. The Video Mixing Renderer 9 and Enhanced Video Renderer components use Direct3D to present video. That is, you choose whichever is most appropriate for you: either you work with DirectX directly, or you use DirectShow's "gateway" to Direct3D.
As for DirectShow itself, you typically use it when you have a source exposed as a DirectShow filter, when you need to apply other DirectShow filters for processing, when you would like to synchronize video and audio, or when you otherwise have a good reason to use DirectShow's capabilities in terms of available components. If you have a video feed coming from a non-DirectShow source and you need to deliver it to a non-DirectShow destination, then you might prefer not to use DirectShow at all. DirectShow and Media Foundation are the primary APIs for video streaming; the libraries you mentioned are merely wrappers.
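To make the trade-off concrete, here is a minimal sketch (mine, not from any vendor SDK) of the "DirectX directly" branch: the capture/decode pipeline, built on the device SDK plus something like SharpDX, is assumed to render each frame into an IDirect3DSurface9, and WPF displays it through a D3DImage assigned to an Image element, so ordinary WPF text and shapes can be overlaid on top. The surface pointer and helper names are assumptions about your pipeline, not anything from the SDK in question.

```csharp
using System;
using System.Windows;
using System.Windows.Interop;

// Sketch only: the capture/decode side (device SDK + SharpDX or similar)
// is assumed to render each frame into an IDirect3DSurface9 and hand us
// its native pointer. WPF shows that surface via a D3DImage that an
// Image element in XAML uses as its Source, so any WPF content can be
// layered on top of the video.
static class WpfVideoPresenter
{
    public static void PresentFrame(D3DImage target, IntPtr d3d9SurfacePtr,
                                    int width, int height)
    {
        target.Lock();

        // Point WPF at the Direct3D 9 surface. Re-assigning is only needed
        // when the surface is (re)created, e.g. after a device reset.
        target.SetBackBuffer(D3DResourceType.IDirect3DSurface9, d3d9SurfacePtr);

        // Mark the whole frame dirty so WPF recomposes it on this pass.
        target.AddDirtyRect(new Int32Rect(0, 0, width, height));

        target.Unlock();
    }
}
```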
Related
I've developed an HTML5 video player with dash.js that plays streaming MPEG-DASH content. It works great.
Now I have a requirement to run the same thing in WPF. Using a WebBrowser control to host my already developed HTML5 player sounds very dirty, but I can't figure out what I can use to make streaming work.
Any tips?
There exists no DASH player library for WPF that I am aware of.
@Sander is right; there doesn't appear to be a WPF or C# DASH implementation at this point.
Microsoft's documentation on building a player suggests using dash.js
I'm also not an expert with WPF and have no experience playing video of any kind within WPF. That said, I assume you should be able to do something similar to the way dash.js works with the HTML5 video element. This would require you to do a number of different tasks, like parsing the MPD file format and downloading the required segment(s).
The simplest implementation would be to parse the MPD, find the BaseURL for one specific bitrate/resolution, and pass that to a WPF element that can play MP4 files. However, to really get the benefits of DASH, you would need to fully parse the MPD and implement logic around bitrate switching, etc.
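As an illustration of that simplest approach (not from the original answer), here is a rough C# sketch that pulls the first BaseURL out of an MPD and hands it to a stock MediaElement. It only works if that BaseURL points at a progressive MP4 the MediaElement can play directly; segmented representations would need the full segment-download and buffering logic.

```csharp
using System;
using System.Linq;
using System.Windows.Controls;
using System.Xml.Linq;

static class NaiveDashStarter
{
    // "player" is a MediaElement declared in XAML with LoadedBehavior="Manual".
    // This ignores adaptation sets, segment templates and bitrate switching:
    // it just grabs the first BaseURL found in the manifest.
    public static void PlayFirstRepresentation(MediaElement player, Uri mpdUri)
    {
        XDocument mpd = XDocument.Load(mpdUri.AbsoluteUri);

        // Match on the local name so the sketch is not tied to one
        // particular MPD schema namespace version.
        string baseUrl = mpd.Descendants()
                            .Where(e => e.Name.LocalName == "BaseURL")
                            .Select(e => e.Value.Trim())
                            .FirstOrDefault();

        if (baseUrl == null)
            throw new InvalidOperationException("No BaseURL found in the MPD.");

        // Resolve a relative BaseURL against the manifest location.
        player.Source = new Uri(mpdUri, baseUrl);
        player.Play();
    }
}
```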
Quick Question
How do I show the webcam feed quickly in an Adobe AIR application with Stage3D?
Detailed question
About
My goal is to create a prototype of an AR (Augmented Reality) mobile application. I have chosen Adobe AIR for its good 3D graphics support on mobile and because AIR apps are easy to port to many mobile platforms (iOS, Android, BlackBerry PlayBook).
Purpose
I want to show a complex 3D model (so I need to use Stage3D) with video from the front camera underneath, as in a typical AR application.
Here are some examples (source: augmentedplanet.com).
Problem
Stage3D is not transparent at all, so I can't use StageVideo for fast display of the camera feed, because StageVideo cannot be seen under Stage3D.
So
The only solution I have found is to create a flat surface with dynamic texture updating.
Here is an example of integrating webcam video with the Starling Framework (Stage3D). But on many ordinary mobile devices the texture update is so large (almost the size of the screen resolution) that any app drops to low FPS or even crashes, which is what happened on my Galaxy Note, for example. With a 320x200 texture it has fairly good performance, but it looks ugly in an AR app.
So is there any clever solution for creating AR on AIR? Has anybody faced the same challenge?
This use case is unfortunately not well supported in AIR. Your best bet is really the manual upload. It might help to add votes to feature requests on Adobe forums for transparent Stage3D.
Now for why this feature was low priority: if you are doing AR, you are probably already doing CPU work on the video. That means you already read back the camera data for processing, either on the CPU or as a Stage3D texture. That's the expensive part, not uploading a texture back to Stage3D.
In order for this to be useful there would need to be a lot of complicated code paths working together flawlessly. On all supported devices:
Read back low resolution camera for CPU or GPU AR image processing
Show passthrough high resolution camera image
Overlay 3D with blending
This is unfortunately very hard. On many mobile chipsets the video/camera, CPU, and 3D are very separate units, so it is hard to share data between them without stalling or copying. It can be done very well IF you target specific hardware. I know this does not solve your problem, but I hope it explains why this use case does not work well in AIR yet. I think you have these options:
Go with AIR and readback/upload. It will be very slow on some HW, but it will work reliably.
Go native. It will be a huge win in the best case, but you need a lot of custom code and testing for every single target.
Go native on a single very narrow platform. Many very cool AR demos do this. Look at SDKs for AR from GPU vendors. Most of them have one.
Make the best of a bad situation: stick with low resolution and uploads, but add some interesting filters on the video. Once you have paid for the upload, doing Stage3D work with your texture is very cheap.
I hope this helps a bit. While developing Stage3D this exact use case came up every now and then. I still think it is really cool! Maybe this post explains why it has not made the top of the list yet.
I have a WPF 4/VB.NET project, and I'm trying to play video in it. I've been using a MediaElement, much to the detriment of the program's overall performance. Thus, I'm looking into the only other viable alternative I can find for my project: DirectX.
Which brings me to my question. Is it possible to play video (either Theora [.ogv] or Flash Video [.flv]) in DirectX? If so, how?
Thank you in advance!
The MediaElement's performance is not so bad and depends on the codec. MediaElement can also play all formats if you have the required codecs installed.
I have a DirectX video player written in VB.NET (without WPF). I used the DirectX SDK, but the methods are highly error prone.
To play your video formats, you can install a codec pack, e.g. K-Lite with FFDShow (highly recommended).
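For reference, here is a minimal sketch (mine, not from this answer) of driving a MediaElement entirely from code with manual control. Whether .ogv or .flv actually play still depends on the codecs installed on the machine, e.g. via K-Lite/FFDShow as suggested above.

```csharp
using System;
using System.Windows.Controls;

static class SimplePlayback
{
    // "player" is a MediaElement already placed in the visual tree
    // (in XAML or in code). Manual behavior stops WPF from auto-playing
    // on Source changes and lets you decide when decoding starts.
    public static void PlayFile(MediaElement player, string path)
    {
        player.LoadedBehavior = MediaState.Manual;
        player.UnloadedBehavior = MediaState.Manual;
        player.Source = new Uri(path, UriKind.Absolute);
        player.Play();
    }
}
```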
I would like to port an ActiveX control of mine to Silverlight that has the following features:
Embedding a logo image in any video file.
Embedding subtitles in any video file.
Cropping any video file (for example: cropping 10 seconds of a 1 min. video).
Save the video file result (by the current encoders of the client).
The current ActiveX uses DirectShow; unfortunately, it can't be used in Silverlight. How can I abandon the old ActiveX technology for the new Silverlight technology?
The simple answer is: you can't.
Silverlight is targeted at two main types of apps:-
Content presentation, be that Video, Audio and Images, all with a view of creating interesting and engaging ways to interact with this sort of content.
More recently, line-of-business apps, that is, data entry and data presentation, again with a view to making these at least a little more visually stimulating than prior technologies did.
Video editing doesn't really fall into either of these camps and is not catered for.
I'm not sure it's yet true to say that ActiveX is old; after all, what technology is used to host the Silverlight plugin in Internet Explorer? ActiveX.
I don't know if that's going to be easily doable. The various codecs natively available to Silverlight are all wrapped by the Silverlight MediaElement control, and so far as I can figure out, they're not directly exposed through an API, e.g., you can't get at the raw decoded RGBA bitstream. (If I'm wrong on this, I'd love to know, but I've poked around, and I can't figure out how to do it.) The Mono source tree has a decode-only implementation of the Dirac codec, but nothing that would easily let you decode, e.g., WMV or AVI files, so far as I'm aware.
And even if you could somehow grab the raw, decoded RGBA (or YCbCr) bitstream, so as to be able to insert whatever data you want into those frames, you'd still have to re-encode the video stream as well, and Silverlight doesn't provide any native support for that. You'd have to write your own encoders (not at all trivial), port them from the ffmpeg library (also not trivial), or wait for someone else to do it.
In short, my suspicion is that you're going to need to stick with your ActiveX solution for now -- though with some clever JavaScript coding, it might be possible to wrap that in a nice Silverlight UI.
Are there any controls that anyone is aware of that I can use to stream FireWire video into a WPF app? I do not need camera control or capture, just the video. I need WPF hosting because I'll be adding WPF content on top.
I was hoping that with the addition of DirectX surfaces in WPF, something like this might appear.
Ideally I'm looking for something relatively high level (not a DirectShow guy at all).
Thanks,
Brian
There are a couple really good video rendering packages for WPF. This guy Jeremiah Morrill has a blog where he discusses his numerous render projects. There's the WPF Win32 render project, and a number of low-level techniques he documents for how to access accelerated playback, Media Foundation .NET, DVD controls, etc... I believe his blog is titled "Jer's One Stop Shop".
Reading over his blog in general is a good idea if you are into video/WPF. Last I checked, "MediaKit", one of his more comprehensive projects, enables easy use of DirectShow (simple XAML and you're off and running, so don't worry) and other well-known native interfaces. It's very robust and actively maintained; if not that specific project, check into some of the recent APIs he's contributing to, including various Windows 7 media support.
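If you ever want to see what a library like MediaKit is wrapping for you, the underlying DirectShow plumbing looks roughly like the sketch below, written against the DirectShow.NET (DirectShowLib) wrapper. As written it previews into DirectShow's own video window rather than a WPF surface, which is exactly the gap MediaKit's WPF hosting fills; the choice of the first device is just a placeholder.

```csharp
using System;
using DirectShowLib; // DirectShow.NET wrapper library

static class DirectShowPreview
{
    // Builds a minimal capture/preview graph for the first video input
    // device (a FireWire camera shows up here if its driver exposes a
    // DirectShow capture filter) and starts it running.
    public static void Run()
    {
        var graph = (IGraphBuilder)new FilterGraph();
        var capture = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
        capture.SetFiltergraph(graph);

        // Bind the first video input device to a source filter.
        DsDevice[] devices = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice);
        Guid iid = typeof(IBaseFilter).GUID;
        object source;
        devices[0].Mon.BindToObject(null, null, ref iid, out source);
        var camera = (IBaseFilter)source;
        graph.AddFilter(camera, "Camera");

        // Let the graph builder pick and connect a default preview renderer.
        capture.RenderStream(PinCategory.Preview, MediaType.Video,
                             camera, null, null);

        ((IMediaControl)graph).Run();
    }
}
```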
The only reason I'd bring up this other project, Augmented Reality, is that you remarked about adding content "on top". You should definitely check out wpfAugRel if you're doing a lot of video production. Where to get an add-on for it eludes me, but I'm sure you can find it from that site; it allows you to script some fairly slick real-time video production in Python.
-- edit --
Right, look at this Google Code page; it has some videos (a picture's worth a thousand words, right?). Regardless, it allows you to mix 3D content into live action through the use of "marker" props, essentially bits of paper with easily machine-recognizable features, which let their underlying engine inject computer-rendered output into a real-world scene. It's highly dynamic, so you can toss these markers around and the 3D content moves fluidly... anyhow, good luck.
Check out this article by UberDemo. It captures video into a WMV file with Windows Media Encoder and WPF. There is a paragraph about how to do the preview in a WPF application.