How to implement video with a transparent background in React Native

How can I implement a video player with a transparent background in React Native? The video is saved with the extension .mov (the only extension I found that supports a transparent background).
I used react-native-player, but it only renders a black screen.
I tried with https://www.w3schools.com/html/mov_bbb.mp4, both loaded locally and via uri, and it worked.
Platform target: iOS

I needed the same thing and tried many different methods; my conclusion is that a transparent movie file is not an accepted practice/standard. Only Apple QuickTime supports it.
Here are a few workarounds I tried and made partially work, with a lot of performance challenges.
The best working method is to export all the .mov frames as transparent PNGs (at whatever frame rate is acceptable for you; in my case I was rendering something that would be exported as a GIF, so even 15 fps worked). FFmpeg can easily do that.
Then, with a simple loop, load and update the frames from the JS code. This is still very challenging, because loading each frame into an image view from the JS bundle simply does not keep up with your frame-rate needs; in my experience, I only saw low single-digit frame rates when loading resources from the JS bundle.
There is a solution for this, though: put your PNG files in xcassets in iOS/Xcode and in drawables on Android, then use { uri: 'filename' } to load the resources from the native side. This gave me very good quality and speed on iOS, even for 20-30 second videos at 1080p.
Unfortunately this suffered from memory issues and didn't work on Android - Android only handled around 100 frames for me. If your movie has more than 150-200 frames in total, regardless of fps, it will hit the memory limits. I spent more than a week researching alternative ways to overcome the memory issue and keep many large bitmaps in memory to display them in sequence in the app. There are theoretical methods to use memory on Android without heap-size limitations, but I found them too low-level and beyond my Android/Java knowledge.
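To make the frame-sequence idea concrete, here is a minimal sketch (my own illustration, not the exact code described above). It assumes the frames were exported beforehand, e.g. with ffmpeg -i input.mov -vf fps=15 frame_%04d.png, and then added to the iOS asset catalog / Android drawables under sequential names (here assumed to be frame_0 ... frame_59):

import React, { useEffect, useState } from 'react';
import { Image } from 'react-native';

const FRAME_COUNT = 60; // assumed: 4 seconds at 15 fps
const FRAME_RATE = 15;  // frames per second

export default function TransparentFrameSequence() {
  const [frame, setFrame] = useState(0);

  useEffect(() => {
    // Advance the frame index on a timer at the desired frame rate.
    const id = setInterval(
      () => setFrame(f => (f + 1) % FRAME_COUNT),
      1000 / FRAME_RATE
    );
    return () => clearInterval(id);
  }, []);

  // { uri: 'frame_N' } resolves against xcassets / drawables, which keeps the
  // per-frame loading off the JS bundle (the slow path mentioned above).
  return (
    <Image
      source={{ uri: `frame_${frame}` }}
      style={{ width: 300, height: 300 }}
    />
  );
}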
Another path I experimented with, without success, was the WebM format. It's Google's format (related to WebP for images) and has some backing, but I couldn't find enough resources online about WebM playback with transparency. I did manage to play a WebM file, but the players behave like GIF players, and almost none of them have a clear way to support transparent animated content. So I wasn't able to get to a solution with WebP/WebM either.
Right now I am trying to fake the transparent portions of my video by layering other RN animated elements over the video.
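As an illustration of that last workaround (my own sketch, not the exact code I used): render the video with react-native-video and absolutely position animated elements over the regions that should read as transparent. The asset path, sizes, and overlay styling below are assumptions.

import React, { useEffect, useRef } from 'react';
import { Animated, StyleSheet, View } from 'react-native';
import Video from 'react-native-video';

export default function FakeTransparentVideo() {
  const opacity = useRef(new Animated.Value(0)).current;

  useEffect(() => {
    // Fade the overlay in so it tracks what the "transparent" part of the
    // video is supposed to do against the page background.
    Animated.timing(opacity, { toValue: 1, duration: 1000, useNativeDriver: true }).start();
  }, [opacity]);

  return (
    <View style={styles.container}>
      <Video
        source={require('./assets/clip.mp4')} // hypothetical local asset
        style={StyleSheet.absoluteFill}
        resizeMode="cover"
        repeat
        muted
      />
      {/* Covers the strip of the frame that should look transparent on a white screen. */}
      <Animated.View style={[styles.overlay, { opacity }]} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: { width: 300, height: 300 },
  overlay: { position: 'absolute', top: 0, left: 0, right: 0, height: 80, backgroundColor: 'white' },
});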
By the way, I tried most of these solutions natively on iOS and Android separately, and both platforms had their own problems with them.

Related

"progressively" load video in NextJS (from DatoCMS/mux)

I'm using DatoCMS and NextJS to build a website. DatoCMS uses Mux behind the scenes to process the video.
The video that comes through is fairly well optimised for whatever browser is being used, and potentially for ABR with HLS; however, it still can take a fair bit of time on the initial load.
The JSON from Dato includes some potentially useful other things:
"video": {
"mp4Url": "https://stream.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu/medium.mp4",
"streamingUrl": "https://stream.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu.m3u8",
"thumbnailUrl": "https://image.mux.com/6V48g3boltSf5uQRB8HnelvtPglzZzYu/thumbnail.jpg"
},
"id": "44785585",
"blurUpThumb": "data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEAAoHBwgHBhUICAgLCgoXDhgVDhkNDh0VGhUeFxUZHSIfGxUmKzcjHh0oHRwWJDUlKDkvMjIyGSI4PTcwPCsxMi8BCgsLDg0OHBAQHDsoIig7NTs7Ozs7Ozs7Ozs7Ly8vOzs1Ozs7Ozs7Ozs1NTU7Ozs1Ozs7OzUvLzsvLy8vLy8vL//AABEIAA8AGAMBIgACEQEDEQH/xAAYAAACAwAAAAAAAAAAAAAAAAAEBQACBv/EAB4QAAICAgIDAAAAAAAAAAAAAAECABEEBQMGEiIy/8QAFwEAAwEAAAAAAAAAAAAAAAAAAgMFAP/EABsRAAIDAQEBAAAAAAAAAAAAAAECAAMRIUEE/9oADAMBAAIRAxEAPwDIYui48Y+saphApUQL2ZHNeJELTfALdGE943pl2m+gDFPJfP0qc/1JAMntKA0FYyTC9vIt2+JzrZP/2Q=="
}
With either next/image or the more proprietary react-datocms/image, that blurUpThumb could be used as a placeholder while the full image loads in the background, to improve UX and (I believe) page-load speed / time to interactive.
Is there a way to achieve the same effect with the video instead of a still image?
The usual way an ABR video (HLS, DASH, etc.) can start faster is by starting with one of the lower resolutions and stepping up to a higher resolution after the first couple of segments, once the video is playing and there is more time to buffer.
However, in your case the example video is very short, 13 seconds, so the effect is pretty minimal. Playing it in Safari on a Mac, I saw the step happen at 4 seconds, which is already almost a third of the way through.
Short of re-encoding at a lower resolution or with special codecs, I think you may find this hard to beat - Mux is a pretty mature video streaming service.
The direct links to the videos above loaded and played quite quickly for me, even over a relatively low-speed internet connection. It might be worth looking at what else your page is loading at the same time, as this may be competing for bandwidth and slowing things down.
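If you do want to mimic the blur-up placeholder for the video itself (not something the answer above covers; just a sketch using the fields from the JSON in the question, with the component name and styling being my own assumptions), the base64 blurUpThumb can be used directly as the <video> poster and background while the real stream loads:

import React from 'react';

export function BlurUpVideo({ video, blurUpThumb }) {
  return (
    <video
      src={video.mp4Url}
      poster={blurUpThumb} // the data: URI renders immediately, no extra request
      style={{
        backgroundImage: `url(${blurUpThumb})`,
        backgroundSize: 'cover',
        width: '100%',
      }}
      preload="metadata" // only fetch metadata up front; the stream loads on play
      controls
      playsInline
    />
  );
}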

How to make images resize as per client device size

I have a React app that has many image references (both <img src=... /> tags and CSS background:url(...) rules).
These images are hosted on Azure Storage.
To speed up my app's loading time on various devices (desktop and mobile), I need to resize these images before they hit the client, i.e., on the server somewhere.
So far, I can think of the following options:
Pick each image and produce multiple versions of it for various standard device sizes. Then pick up each <img src=... /> tag and, using JS, alter the image name so that the right size of image gets served. This will not work with CSS.
Use Azure CDN to automatically resize images. I was hoping that resizing would happen automatically, as the CDN portal retrieves the user-agent from the device. Does anyone know if this is true?
Serve images through an Azure Function, resizing them on the fly (as suggested here); a rough sketch of this option follows below.
Can someone suggest other options, or the pros/cons of the above?
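For illustration only, option 3 could look roughly like the following (a hypothetical sketch, assuming a Node.js Azure Function, the sharp package for resizing, a blob container named "images", and a ?name=&width= query string; none of these details come from the question):

// HTTP-triggered Azure Function (Node.js): resize a stored image on request.
const sharp = require('sharp');
const { BlobServiceClient } = require('@azure/storage-blob');

module.exports = async function (context, req) {
  const width = parseInt(req.query.width, 10) || 800; // assumed query parameter
  const name = req.query.name;                        // e.g. "hero.jpg" (assumed)

  // Pull the original image from Azure Storage (connection string assumed to be
  // the function app's AzureWebJobsStorage setting).
  const service = BlobServiceClient.fromConnectionString(process.env.AzureWebJobsStorage);
  const blob = service.getContainerClient('images').getBlockBlobClient(name);
  const original = await blob.downloadToBuffer();

  // Resize to the requested width, preserving aspect ratio.
  const resized = await sharp(original).resize({ width }).toBuffer();

  context.res = {
    headers: { 'Content-Type': 'image/jpeg', 'Cache-Control': 'public, max-age=86400' },
    body: resized,
    isRaw: true
  };
};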
Since you're using JavaScript, use the window object. In the browser, window gives you the dimensions of the viewport, so you can set the height and width of your image to window.innerHeight and window.innerWidth. There are other ways to do this, but this is the easiest and leanest if you want the fewest lines of code.
More info about the window object here: https://www.w3schools.com/js/js_window.asp
P.S. This is mainly a solution for desktop; for mobile you can use screen.width and screen.height instead. That may not work everywhere on desktop, but it did work on a macOS Big Sur device when I tried it (possibly because Big Sur has a fairly mobile-like interface, given that it can even run iOS apps). If the screen object is available across all your devices, it is probably the better option.
More info about the screen object here: https://www.tutorialrepublic.com/faq/how-to-detect-screen-resolution-with-javascript.php
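As a small sketch of the idea above (my own illustration, assuming plain <img> elements in the DOM): read the screen object where it is available and fall back to window.inner*.

// Size an image to the viewport using the screen/window properties described above.
function sizeImageToViewport(img) {
  const width = window.screen ? window.screen.width : window.innerWidth;
  const height = window.screen ? window.screen.height : window.innerHeight;
  img.width = width;   // sets the HTML width attribute, in CSS pixels
  img.height = height; // sets the HTML height attribute, in CSS pixels
}

// Usage: apply to every <img> once the page has loaded.
window.addEventListener('load', () => {
  document.querySelectorAll('img').forEach(sizeImageToViewport);
});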
On the off-chance that neither object behaves consistently across all of your target devices, write a small detector that works out the device type, store it in a variable, and branch on it:
if (deviceType === 'iOS') {
  <img src=... width={screen.width} height={screen.height} />
} else if (deviceType === 'Windows') {
  <img src=... width={window.innerWidth} height={window.innerHeight} />
}
This snippet won't run as-is (the src is left out); it's only there to show the flow, and you'll need to integrate it your own way.
The best part of this approach is that instead of keeping pre-made copies of the image, you resize the single original. That saves storage space and copes with unexpected displays, like a 49" Samsung Odyssey G9 whose resolution is far from anything you might have pre-generated. It also means you don't need a separate file of image-resizing code, just the device-type detection (and not even that if the screen object works for you).
If you have any queries, please reply back.
Good luck!

Stability issues with WPF MediaElement (MediaFailed, slow-motion video), are there workarounds out there?

I have a couple of issues with it:
Media fails to play and continues to fail until the application is restarted.
Audio plays normally but video runs in slow motion, and it will not play normally until the application is restarted.
There's no way to reinitialize it other than an app restart (that I know of).
There's no solid way to know whether a video is actually rendering. I can observe Position to verify it is playing, but that's no guarantee there's any video output.
I run two instances of an extended version of MediaElement in my WPF app, which hinges on their stability. After many tests I've concluded that it's the highest-performing video player out there, above MediaKit and vlcdotnet, because it decodes solely on the GPU. I've encoded videos in WVC1 (Windows Media Video Advanced Profile, AP@L3 VC-1) for 1080p quality and relatively small file sizes. Eventually, after several plays, the videos fail to play; at that point the MediaFailed event fires, and the player stops working altogether until the application is restarted.
It is my understanding that when it fails like this, it's a core failure within the underlying Windows Media Player OCX control and it cannot be fixed in any way other than a full application restart. Has anybody found any reliable workarounds for these issues? Mum's the word over at the Microsoft forums...
Answering my own question here to possibly help future SO'ers who run into these issues: the problem lies in WPF itself, and it is not planned to be improved. For a more reliable solution with minimal overhead, try WPFMediaKit.

Silverlight player that supports "seeking" to a random position in the stream

Does anyone know of a Silverlight Video player that permits "seeking" to a particular offset of the stream without having to download the whole thing?
I'm a bit new to this Silverlight gig, so will appreciate anything you can point me to to get me up to speed.
I come from a Flash background, but I think this applies to Silverlight too.
Generally speaking, it's not the player that determines whether you can jump to a point in the video, but how the video is delivered.
If you are downloading the video progressively over HTTP, you normally have to wait until the section you want to skip to has been downloaded before you can jump to it. This is not always the case: YouTube, for example, serves its content progressively. Also, this guy came up with another solution - http://www.flashcomguru.com/index.cfm/2005/11/2/Streaming-flv-video-via-PHP-take-two - so you may be able to find a similar workaround using .NET rather than PHP.
An easier, though not as cheap, way to achieve your goal would be to use a streaming server. Dedicated streaming servers allow actual video streaming, where you can jump to various points in the video. I've found Wowza quite good - http://www.wowzamedia.com/ - or you could use a service like Limelight or StreamZilla. However, they can be very, very expensive.

What audio format works for Silverlight + WPF?

I'm writing a pair of applications for distributing audio (among other features). I have a WPF program that allows an artist to record and edit audio. Clicking a button then uploads this to a Silverlight-powered website, where a visiting consumer can listen to the audio. Simple. It works. But I'd like it to be better: I need an audio format that works seamlessly on both the recording and playback sides.
I'm currently using the MP3 format, and I'm not happy with it. For recording/editing I use the Alvas Audio C# library. It works OK, but MP3 recording requires the artist to go into the registry and change msacm.l3acm to l3codecp.acm. That's a lot to ask of an end user. Furthermore, MP3 recording seems rather fragile when I install on a new machine (sometimes it just randomly doesn't work until you've fiddled around for a while; I still don't know why). I've been told that unless I want to pay royalties to the MP3 patent holders, I will always need to rely on this kind of registry change.
So what other audio format could I use instead? I need something compressed. Alvas Audio can also record to GSM, for example, but that won't play back in Silverlight. Silverlight will play WMA, but I don't know how to record in that format - Alvas Audio won't. I'd be open to using another recording library instead, but I haven't managed to find one.
Am I missing something obvious, or is there really no user-friendly way to record audio in WPF and play it back in Silverlight? It seems like there should be...
Any suggestions greatly appreciated.
Thanks.
IMO, WMA would be your best bet. I'm not sure how your application is set up or how low-level you want to go, but the Windows Media Format SDK is a great way to encode WMA, and the runtimes come with Windows. There are .NET PIAs and samples for it here: http://windowsmedianet.sourceforge.net/
Given that Ogg Vorbis is being adopted for the new HTML audio tag in (cough) some browsers, it's probably worth checking out. You won't get bitten by any licensing concerns if you follow this route. If ease of deployment is top of your list, though, go with WMA.
[tries hard not to start ranting about the fragmented state of codec options in browsers and the commercial interests that scupper any consensus]
