I am looking to get the current timestamp and/or frame number for a streaming Netflix (or other Silverlight) video that is currently playing in my browser.
Is there a way to do this programmatically? (My current solution involves scraping and parsing the current time from the playhead area.)
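For reference, if the player were rendering through an HTML5 video element rather than a Silverlight plugin (which exposes no playback state to the DOM), the playhead could be read directly instead of scraped. A rough sketch to run in the browser console; the selector and the assumed frame rate are placeholders, not anything the page guarantees:

// Rough sketch, only applicable if the page uses an HTML5 <video> element;
// a Silverlight plugin exposes nothing to the DOM, so this would not apply there.
function getPlaybackPosition(): { seconds: number; approxFrame: number } | null {
  const video = document.querySelector<HTMLVideoElement>('video');
  if (!video) return null;            // no HTML5 player found on the page
  const fps = 23.976;                 // assumed frame rate; not exposed by the element
  return {
    seconds: video.currentTime,       // playhead position in seconds
    approxFrame: Math.floor(video.currentTime * fps),
  };
}
setInterval(() => console.log(getPlaybackPosition()), 1000);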
I'm looking for sample code, or a getting-started guide, for performing a "green screen" effect with the latest Azure Kinect DK.
How should I proceed to build and display a color stream containing ONLY the body area?
Is it possible to avoid using the body tracking stream? It requires a more powerful (NVIDIA) computer.
We just published a new green screen code sample as part of our open-source GitHub repo microsoft/Azure-Kinect-Sensor-SDK.
You can find more information in the green screen example.
If you have any questions about the code, you can open a GitHub issue.
I received my Azure Kinect device only recently and have not yet tried a few things I am interested in, so this reply does not come from direct experience; however, the sample code and the SDKs indicate there may be a viable approach.
You can always try to implement traditional color-recognition algorithms, but if your usage scenario allows it, you can use data from the depth camera to filter only data within a depth range, with the "green screen" being out of range. You can then correlate pixels from the depth camera to the RGB image data to pick out data from the color stream that is within a certain depth range. Also, the background does not have to be a real green screen, but rather just has to be outside of the filtered depth range.
This approach lets you use only the Sensor SDK, avoiding the Body Tracking SDK and its associated GPU requirements.
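To make the depth-filtering idea concrete, here is a sketch of the per-pixel filter (written in TypeScript purely for illustration; the actual Sensor SDK is C/C++ and provides the step that registers the depth image to the color camera). It assumes you already have a depth map aligned to the color image; the buffer names and the 500-1500 mm range are assumptions.

// Rough sketch of depth-range filtering, assuming a depth image already
// registered (aligned) to the color image so index i refers to the same pixel
// in both buffers. Buffer names and depth thresholds are illustrative only.
function maskByDepth(
  colorBGRA: Uint8Array,        // color image, 4 bytes per pixel
  alignedDepthMm: Uint16Array,  // depth in millimetres, one value per pixel
  minMm = 500,
  maxMm = 1500,
): Uint8Array {
  const out = new Uint8Array(colorBGRA.length);
  for (let i = 0; i < alignedDepthMm.length; i++) {
    const d = alignedDepthMm[i];
    if (d >= minMm && d <= maxMm) {
      out.set(colorBGRA.subarray(i * 4, i * 4 + 4), i * 4); // keep foreground pixel
    }
    // pixels outside the range stay zeroed, i.e. the background is removed
  }
  return out;
}

Anything at or beyond the far threshold simply disappears, which is why the physical background does not need to be an actual green screen.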
How can I implement a video player with a transparent background in React Native? The video is saved with the extension .mov (the only extension I found that supports a transparent background).
I used react-native-player, but it only renders a black screen.
I tried with https://www.w3schools.com/html/mov_bbb.mp4, both loaded locally and via URI, and it worked.
Platform target: iOS
I needed the same thing and tried many different methods; my conclusion is that a transparent movie file is not an accepted practice/standard. Only Apple QuickTime supports it.
Here are a few workarounds I tried that partially worked, with a lot of performance challenges.
The best working method is to export all the .mov frames (at whatever frame rate is acceptable for you; I was trying to render something that was going to be exported as a GIF, so even 15 fps worked for me). Export the movie frames as transparent PNGs; FFmpeg can easily do that.
Then, with a simple loop, load and update the frames from the JS code. This is still very challenging, because loading each frame into an image view from the JS bundle simply does not keep up with your frame-rate needs; in my experience I only saw low single-digit frame rates when loading resources from the JS bundle. There is a solution for this, though: you can put your PNG files in xcassets in iOS/Xcode and in drawables on Android, then use { uri: 'filename' } to load the resources from the native side. This gave me very good quality and speed on iOS, even for 20-30 second videos at 1080p. Unfortunately it suffered from memory issues and didn't work on Android, which only handled a hundred-something frames for me. If you have more than 150-200 frames in total, regardless of fps, you will hit the memory limits. I spent more than a week researching alternative ways to overcome the memory issue and keep many large bitmaps in memory for sequential display in the app. There are theoretical methods to use memory on Android without heap-size limitations, but I found them too low-level and beyond my Android/Java knowledge.
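A minimal sketch of that frame loop, assuming the PNGs were added to the native asset catalogs under names like frame_1 ... frame_150 (the names, the frame count, and the 15 fps rate are all assumptions):

import React, { useEffect, useState } from 'react';
import { Image } from 'react-native';

const FRAME_COUNT = 150;  // assumed number of exported PNG frames
const FPS = 15;           // assumed playback rate

// Frames are referenced by native asset name (xcassets on iOS, drawables on
// Android), so they are never loaded through the JS bundle.
export function FrameSequence() {
  const [frame, setFrame] = useState(1);

  useEffect(() => {
    const id = setInterval(() => setFrame(f => (f % FRAME_COUNT) + 1), 1000 / FPS);
    return () => clearInterval(id);
  }, []);

  return (
    <Image
      source={{ uri: `frame_${frame}` }}   // hypothetical asset names frame_1 ... frame_150
      style={{ width: 300, height: 300 }}
    />
  );
}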
Another path I experimented with, without success, was the WebM format. It's a format from Google that has had some push behind it, but I couldn't find enough resources online about WebM playback. I did manage to play a WebM file, but it behaved like the GIF players, almost none of which have clear ways to support transparent animated GIFs. So I wasn't able to get to a solution with WebP/WebM either.
Right now I am trying to fake the transparent portions of my video with other RN animated elements over the video.
By the way, I tried most of these solutions natively on iOS and Android separately, and each platform had its own problems with them.
We are using Rails as the backend and AngularJS on the front end in my app, where we need to show a video and the audio waveform of that video.
We are using wavesurfer.js to show the waveform on the front-end side and node-pcm to generate PCM data from the video file on the backend.
This works as expected, but for some videos, while creating the waveform from the PCM data, we get a flat line instead of small sine waves.
It also takes too much time to show the waveform on every page reload.
To overcome this issue we are planning to create a waveform image using FFmpeg:
ffmpeg -i 'https://s3.amazonaws.com/aadasdsadsadasdas/xyz.mp4' -filter_complex showwavespic -frames:v 1 output.png
This works fine, but it also takes too much time (of course, only once) to generate the image for a remote video (i.e. we are saving the videos on S3).
The problem with this is that I can't find any library to integrate the waveform image with the video.
Can someone suggest a better approach for this?
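For what it's worth, wavesurfer.js can also be fed pre-computed peak data so the browser doesn't decode the whole media file on every page load. A rough sketch, assuming the peaks are generated once on the backend and served as JSON; the /waveforms/xyz.json endpoint and the shape of the peaks array are assumptions, and the exact load() signature varies between wavesurfer.js versions:

import WaveSurfer from 'wavesurfer.js';

// Rough sketch: fetch pre-computed peaks so the browser never has to download
// and decode the whole media file just to draw the waveform. The JSON endpoint
// and the peak-array shape are assumptions.
async function showWaveform(mediaUrl: string, peaksUrl: string) {
  const peaks: number[] = await (await fetch(peaksUrl)).json();

  const wavesurfer = WaveSurfer.create({
    container: '#waveform',   // existing element on the page
    height: 64,
  });

  // Passing peaks makes wavesurfer draw immediately instead of decoding audio.
  wavesurfer.load(mediaUrl, peaks);
}

showWaveform(
  'https://s3.amazonaws.com/aadasdsadsadasdas/xyz.mp4',
  '/waveforms/xyz.json',      // hypothetical pre-generated peaks endpoint
);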
I'm studying the feasibility of using IIS Smooth Streaming to build a rich web application that displays time-synchronized audio/video and related textual data. The text data is a set of spacecraft telemetry that should be displayed outside of the video window.
I've seen some examples of how to display video captions at the correct time using this technology. I've also read Chapter 11 of Silverlight Recipes 2nd edition which again shows video captions. What I'm trying to do is a little bit more complex - show numerous pieces of data outside of the video window. The data should be in sync with the video playhead at all times.
It looks like I will encode my text data in tracks using the StreamIndex element type, with Type="text" and SubType="data". On the client, how do I display this in a separate panel next to the video?
From the marketese I know IIS Smooth Streaming must handle this scenario. I'm just having no luck finding examples of how to handle the data on the client side. Can someone point me to an example, or tell me why this is not possible?
I want to build a Silverlight live feed viewer for an IP camera with a proprietary RTP server, i.e. no IIS, no Smooth Streaming extension. Is SmoothStreamingClient (or the Microsoft Media Platform) the best place to start?
You definitely don't want the SmoothStreamingClient, as that assumes that you're using a SmoothStreaming media source. However, what you can do instead is use a MediaElement and implement your own MediaStreamSource. This requires that you know how to parse the data being spewed by your IP camera and turn it into valid video samples, which is non-trivial, but it's the only supplied mechanism for displaying video data for which there isn't already a built-in streaming source.
However, if the video format that your IP camera sends is already supported by Silverlight, then all you need to do is create a Stream that reads the camera data and pass that as the media source to a MediaElement.
The best way is to have some server-side app that gets the camera data and saves a picture at a certain location on the web server. Then you can have an HTML page refresh periodically to show the new image. The trick is to use a URL of the form http://someserver/someimage.jpg?dummy=i, where you replace i with a number that changes every time (a big random number or the current datetime), so that the browser doesn't cache and show the previously downloaded frame.
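A minimal sketch of the client side of that trick, assuming the server keeps overwriting someimage.jpg (the URL and the one-second refresh interval are placeholders):

// Rough sketch: periodically reload the server-side snapshot, appending a
// changing query parameter so the browser never serves a cached frame.
const img = document.createElement('img');
document.body.appendChild(img);

setInterval(() => {
  // Date.now() changes on each call, defeating the browser cache.
  img.src = `http://someserver/someimage.jpg?dummy=${Date.now()}`;
}, 1000); // refresh once per second (assumed interval)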