There are examples for switching bitrates ("Load higher quality Smooth Streaming bitrate on start", "IIS Smooth Streaming low quality on start") for SmoothStreamingMediaElement and SMFPlayer.
They work fine for .ism files, but they don't work for .csm files (composite manifests).
The Segment.AvailableStreams and Segment.SelectedStreams collections are empty, and SetVideoBitrateRange doesn't appear to do anything.
Is there a way for the user to select a bitrate while playing a composite manifest?
I'm using DatoCMS and NextJS to build a website. DatoCMS uses Mux behind the scenes to process the video.
The video that comes through is fairly well optimised for whatever browser is being used, and potentially for ABR with HLS; however, it can still take a fair bit of time on the initial load.
The JSON from Dato includes some potentially useful other things:
"video": {
"mp4Url": "https://stream.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu/medium.mp4",
"streamingUrl": "https://stream.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu.m3u8",
"thumbnailUrl": "https://image.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu/thumbnail.jpg"
},
"id": "44785585",
"blurUpThumb": "data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAoHBwgHBhUICAgLCgoXDhgVDhkNDh0VGhUeFxUZHSIfGxUmKzcjHh0oHRwWJDUlKDkvMjIyGSI4PTcwPCsxMi8BCgsLDg0OHBAQHDsoIig7NTs7Ozs7Ozs7Ozs7Ly8vOzs1Ozs7Ozs7Ozs1NTU7Ozs1Ozs7OzUvLzsvLy8vLy8vL//AABEIAA8AGAMBIgACEQEDEQH/xAAYAAACAwAAAAAAAAAAAAAAAAAEBQACBv/EAB4QAAICAgIDAAAAAAAAAAAAAAECABEEBQMGEiIy/8QAFwEAAwEAAAAAAAAAAAAAAAAAAgMFAP/EABsRAAIDAQEBAAAAAAAAAAAAAAECAAMRIUEE/9oADAMBAAIRAxEAPwDIYui48Y+saphApUQL2ZHNeJELTfALdGE943pl2m+gDFPJfP0qc/1JAMntKA0FYyTC9vIt2+JzrZP/2Q=="
}
With either next/image or the more proprietary react-datocms/image, that blurUpThumb can be used as a placeholder while the full image loads in the background, improving UX and (I believe) page-load speed / time to interactive.
Is there a way to achieve the same effect with the video instead of an image file?
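For reference, here is a rough sketch of the kind of effect I'm hoping for, done by hand with a plain video element: show the tiny blurUpThumb (the poster attribute accepts a data URI) and keep a blurred copy on top until playback actually starts. The component and prop names are mine, not anything DatoCMS or Mux provides, and non-Safari browsers would need hls.js or the mp4Url since they don't play HLS natively:

// Hypothetical sketch, not a DatoCMS/Mux API: blur-up placeholder for a video.
import { useState } from 'react';

type Props = { streamingUrl: string; blurUpThumb: string };

export function BlurUpVideo({ streamingUrl, blurUpThumb }: Props) {
  const [playing, setPlaying] = useState(false);

  return (
    <div style={{ position: 'relative' }}>
      {/* tiny base64 thumbnail, blurred and stretched, shown until playback starts */}
      {!playing && (
        <img
          src={blurUpThumb}
          alt=""
          style={{
            position: 'absolute',
            top: 0, left: 0, width: '100%', height: '100%',
            objectFit: 'cover',
            filter: 'blur(10px)',
          }}
        />
      )}
      <video
        src={streamingUrl}    // Safari plays the .m3u8 natively; elsewhere use hls.js or mp4Url
        poster={blurUpThumb}  // the poster attribute also accepts a data URI
        autoPlay
        muted
        playsInline
        onPlaying={() => setPlaying(true)}
        style={{ width: '100%', display: 'block' }}
      />
    </div>
  );
}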
The usual way an ABR video (HLS, DASH, etc.) can start faster is by beginning with one of the lower resolutions and stepping up to a higher resolution after the first couple of segments, once the video is playing and there is more time to buffer.
However, in your case the example video is very short (13 seconds), so the effect is pretty minimal. Playing it in Safari on a Mac, I saw the step up happen at 4 seconds, which is already almost a third of the way through.
Short of re-encoding at a lower resolution or with some special codecs, I think you may find this hard to beat - Mux is a pretty mature video streaming service.
The direct links to the videos above loaded and played quite quickly for me, even over a relatively low-speed connection. It might be worth looking at what else your page is loading at the same time, as that may be competing for bandwidth and slowing things down.
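If you do end up using a JavaScript player rather than the browser's native HLS support, you can also nudge the player towards that "start low, step up" behaviour. A minimal sketch with hls.js (which your page doesn't currently use, so this is an assumption on my part), forcing the first segments to come from the lowest rendition:

// Sketch only: assumes hls.js is installed and a <video id="player"> exists on the page.
import Hls from 'hls.js';

const video = document.getElementById('player') as HTMLVideoElement;
const src = 'https://stream.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu.m3u8';

if (Hls.isSupported()) {
  const hls = new Hls({
    startLevel: 0,              // start on the lowest-bitrate rendition for a fast first frame
    capLevelToPlayerSize: true, // never fetch renditions larger than the element needs
  });
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = src; // Safari: native HLS, the browser decides the ABR behaviour
}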
How can I implement a video player with a transparent background in React Native? The video is saved with the .mov extension (the only extension I found that supports a transparent background).
I used react-native-player, but it only renders a black screen.
I tried with https://www.w3schools.com/html/mov_bbb.mp4, both loaded locally and via URI, and it worked.
Platform target: iOS
I needed the same thing and tried many different methods, and my conclusion is that a transparent movie file is not an accepted practice/standard. Only Apple QuickTime supports it.
Here are a few workarounds I tried and got to work partially, with a lot of performance challenges.
The best working method is to export all of the .mov frames (at whatever frame rate is acceptable for you; I was rendering something that was going to be exported as a GIF, so even 15 fps worked for me) as transparent PNGs. FFmpeg can easily do that.
Then load and update the frames from the JS code with a simple loop (a rough sketch follows below). This is still very challenging, because loading each frame into an Image view from the JS bundle simply cannot keep up with your frame-rate needs; in my experience I only saw low single-digit frame rates when loading resources from the JS bundle.
There is a workaround, though: put your PNG files in xcassets in iOS/Xcode and in drawables on Android, then use { uri: 'filename' } to load the resources from the native side. This gave me very good quality and speed on iOS, even for 20-30 second videos at 1080p. Unfortunately it suffered from memory issues and didn't work on Android - Android only handled 100-something frames for me. If your movie has more than 150-200 frames in total, regardless of fps, it will hit the memory limits.
I spent more than a week researching alternative ways to overcome the memory issue and keep many large bitmaps in memory to display them in sequence. There are theoretical methods to use memory on Android without heap-size limitations, but I found them too low-level and beyond my Android/Java knowledge.
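Here is roughly what that loop looked like for me (a sketch from memory; the ffmpeg command, asset names and frame count are just examples). It assumes the frames were exported with something like ffmpeg -i overlay.mov -vf fps=15 frame_%03d.png and then added to Images.xcassets on iOS / res/drawable on Android as frame_001 ... frame_150, so they resolve natively via { uri }:

// Sketch of the frame-loop approach; names and counts are illustrative.
import React, { useEffect, useState } from 'react';
import { Image } from 'react-native';

const FRAME_COUNT = 150; // keep this low - more than ~150-200 frames hit memory limits on Android
const FPS = 15;

export function TransparentFrameLoop() {
  const [frame, setFrame] = useState(1);

  useEffect(() => {
    const id = setInterval(() => setFrame(f => (f % FRAME_COUNT) + 1), 1000 / FPS);
    return () => clearInterval(id);
  }, []);

  // Loading by bare asset name resolves against the native asset catalogs,
  // which is far faster than pulling each PNG through the JS bundle.
  const name = `frame_${String(frame).padStart(3, '0')}`;
  return <Image source={{ uri: name }} style={{ width: 300, height: 300 }} fadeDuration={0} />;
}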
Another path I experimented with, without success, was the WebM format. It's a Google format (a sibling of WebP) that has some backing, but I couldn't find enough resources online about WebM playback with transparency. I did manage to play a WebM file, but it just behaves like the GIF players, and almost none of those have a clear way to support transparent animated GIFs. So I wasn't able to get to a solution with WebP/WebM either.
Right now I am trying to fake the transparent portions of my video with other RN animated elements over the video.
By the way, I tried most of these solutions natively on iOS and Android separately, and both platforms had their own problems with them.
I am looking to get the current timestamp and/or frame number for a streaming Netflix (or other Silverlight) video that is currently playing in my browser.
Is there a way to do this programmatically? (My current solution involves scraping and parsing the current time from the playhead area.)
I have a couple of issues with it:
media fails to play and continues to fail until the application is restarted
audio plays normally but video is in slow-motion and will not play normally until the application is restarted.
there's no way to reinitialize other than an app restart (that I know of).
there's no real solid way to know if a video is rendering. I can observe Position to verify it is playing, but that's no guarantee there's any video output.
I run two instances of an extended version of MediaElement in my WPF app, which hinges on their stability. After many tests I've concluded that it's the highest-performing video player out there, above MediaKit and vlcdotnet, because it decodes solely on the GPU. I've encoded the videos in WVC1 (Windows Media Video 9 Advanced Profile, VC-1) for 1080p quality and relatively small file sizes. Eventually, after several plays, a video will fail to play, at which point the MediaFailed event fires and the player stops working altogether until the application is restarted.
It is my understanding that when it fails like this, it's a core failure within the underlying Windows Media Player OCX control and it cannot be fixed in any way other than a full application restart. Has anybody found any reliable workarounds for these issues? Mum's the word over at the Microsoft forums...
Answering my own question here to possibly help future SO'ers who run into these issues. The issue lies in WPF, and it is not planned to be improved. For a more reliable solution with minimal overhead, try WPFMediaKit.
I'm writing a pair of applications for distributing audio (among other features). I have a WPF program that allows an artist to record and edit audio. Clicking a button then uploads this to a Silverlight-powered website. A consumer visiting this website can then listen to the audio. Simple. It works. But I'd like it to be better: I need an audio format that works seamlessly on both the recording and playback sides.
I'm currently using the MP3 format, and I'm not happy with it. For the recording/editing I use the Alvas Audio C# library. It works OK, but MP3 recording requires that the artist go into the registry and change msacm.l3acm to l3codecp.acm. That's a lot to ask of an end user. Furthermore, MP3 recording seems rather fragile when I install on a new machine. (Sometimes it randomly just doesn't work until you've fiddled around for a while. I still don't know why.) I've been told that unless I want to pay royalties to the MP3 patent holders, I will always need to rely on this type of registry change.
So what other audio format could I use instead? I need something compressed. Alvas Audio can also record to GSM, for example, but that won't play back in Silverlight. Silverlight will play WMA, but I don't know how to record in that format - Alvas Audio won't. I'd be open to using another recording library instead, but I haven't managed to find one.
Am I missing something obvious, or is there really no user-friendly way to record audio in WPF and play it back in Silverlight? It seems like there should be...
Any suggestions greatly appreciated.
Thanks.
IMO, WMA would be your best bet. I'm not sure how your application is set up or how low-level you want to go, but the Windows Media Format SDK is a great way to encode WMA, and the runtimes come with Windows. There are .NET PIAs and samples for it here: http://windowsmedianet.sourceforge.net/
Given that Ogg Vorbis is being adopted for the new HTML audio tag in (cough) some browsers, it's probably worth checking it out. You won't get bitten by any licensing concerns if you follow this route. If ease of deployment is top of your list, then go with WMA.
[tries hard not to start ranting about the fragmented state of codec options in browsers and the commercial interests that scupper any consensus]