Kinesis :: bounding boxes on an AWS Kinesis live video stream, playing the video from a front-end React.js app

I am working on an application that detects faces in a live stream. I have a front-end application in React.js; the backend is serverless on AWS and calls the Rekognition API to detect faces and return the coordinates of the bounding boxes of the detected faces. My video source is an AWS Kinesis video stream.
In the UI, ReactPlayer is used to play the video from the HLS URL. My question is: how can I draw bounding boxes over the detected faces on the real-time stream and show the video in the UI?
My approach: return the bounding boxes from a Lambda function and draw them using an HTML canvas, but due to latency this won't give the feel of real-time detection.
So, how can I show the Kinesis real-time video stream with bounding boxes? Is there any other approach to follow? I have gone through the documentation, but it covers static images only.
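A minimal sketch of the canvas-overlay approach, assuming the Lambda exposes an endpoint (the /api/latest-boxes URL here is hypothetical) that returns Rekognition-style normalized BoundingBox values ({Left, Top, Width, Height} as fractions of the frame):

import React, { useEffect, useRef, useState } from 'react';
import ReactPlayer from 'react-player';

// Hypothetical endpoint serving the latest detections from the Lambda:
// [{ Left, Top, Width, Height }] in Rekognition's normalized coordinates.
const BOXES_URL = '/api/latest-boxes';

export default function StreamWithBoxes({ hlsUrl }) {
  const canvasRef = useRef(null);
  const [boxes, setBoxes] = useState([]);

  // Poll for fresh bounding boxes; a WebSocket push would cut latency further.
  useEffect(() => {
    const id = setInterval(async () => {
      const res = await fetch(BOXES_URL);
      setBoxes(await res.json());
    }, 500);
    return () => clearInterval(id);
  }, []);

  // Redraw the overlay whenever new boxes arrive.
  useEffect(() => {
    const canvas = canvasRef.current;
    const ctx = canvas.getContext('2d');
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.strokeStyle = 'lime';
    ctx.lineWidth = 2;
    boxes.forEach(({ Left, Top, Width, Height }) => {
      ctx.strokeRect(Left * canvas.width, Top * canvas.height,
                     Width * canvas.width, Height * canvas.height);
    });
  }, [boxes]);

  // The canvas sits absolutely on top of the player, at the same size.
  return (
    <div style={{ position: 'relative', width: 640, height: 360 }}>
      <ReactPlayer url={hlsUrl} playing width="100%" height="100%" />
      <canvas
        ref={canvasRef}
        width={640}
        height={360}
        style={{ position: 'absolute', top: 0, left: 0, pointerEvents: 'none' }}
      />
    </div>
  );
}

To hide the detection latency rather than fight it, one option is to tag each detection with the producer timestamp of the frame it came from and only draw a box once the player's playhead reaches that timestamp; since HLS playback already lags the live edge by several seconds, the boxes can then line up with the frames they belong to.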

Related

Video streaming including application overlay components

I want to make a react-native app with the capability of streaming video from the mobile app to a connected browser user. On top of that, I want to overlay some application components, so connected users can see the video stream as well as some of the application UI.
As an example, take the image below as a reference: video streaming is running in a car showroom, and a few app components are shown as an overlay on the video, like an app menu and a car image.
I want to achieve the same functionality and am using the VideoSDK platform for the video streaming service.
So far I have created a react-native app and am able to stream video from the camera to the connected browser user.
Next, I want to add my app menu on top of the video as per the image, and therefore I am thinking screen share in combination with video sharing is the way to go.
The image above is the actual implementation using VideoSDK in the browser, but as you can see, the screen-share window opens in a totally different context, which is not the expected implementation.
Can someone suggest how I can achieve video streaming with the capability of app overlay components?
I have reviewed your requirement, and I am glad to inform you that we have an application with the same requirements. For further discussion and demos, can we connect over mail, i.e. karan10010#gmail.com?
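For the overlay itself, composing the UI on top of the video view on each client is usually simpler than screen sharing. A minimal react-native sketch of that layering; the VideoView import is a placeholder for whatever video surface component your streaming SDK provides (VideoSDK's actual component name differs):

import React from 'react';
import { View, Text, TouchableOpacity, StyleSheet } from 'react-native';
// Placeholder for the SDK's video surface; substitute VideoSDK's own view.
import VideoView from './VideoView';

export default function StreamWithMenu() {
  return (
    <View style={styles.container}>
      {/* The video fills the screen... */}
      <VideoView style={StyleSheet.absoluteFill} />
      {/* ...and the app menu is absolutely positioned above it. */}
      <View style={styles.menu}>
        {['Models', 'Pricing', 'Contact'].map((label) => (
          <TouchableOpacity key={label} style={styles.item}>
            <Text style={styles.label}>{label}</Text>
          </TouchableOpacity>
        ))}
      </View>
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1 },
  menu: { position: 'absolute', top: 40, left: 16 },
  item: { padding: 8, marginBottom: 8, borderRadius: 4, backgroundColor: 'rgba(0,0,0,0.5)' },
  label: { color: '#fff' },
});

Note that this overlay renders only on the device doing the layout. If remote browser viewers must see the same menu, the menu state would need to be sent over a data channel and rendered on their side as DOM on top of the video element, rather than baked into the outgoing stream via screen share.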

How to create a fast video loading feature like Instagram Reels with React Native?

How can I create a fast video consumption feature like Instagram Reels? I have created a simple video player with the help of react-native-video, but after every video it takes time to load the next one.
How can I do something like prefetching a video before displaying it? It should also take care of the user's mobile data consumption.
Is there any way to implement this feature in React Native?
You can use the device's storage to queue up the next few videos before they are shown.
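A minimal sketch of a related pattern with react-native-video: a paged FlatList that renders only a small window of items and keeps off-screen players paused, so the neighbouring videos can buffer without playing, while the small render window caps data usage. The videos array and URLs are placeholders:

import React, { useState, useCallback } from 'react';
import { FlatList, Dimensions } from 'react-native';
import Video from 'react-native-video';

const { height } = Dimensions.get('window');
const videos = [{ id: '1', uri: 'https://example.com/a.mp4' }]; // placeholder data
const viewabilityConfig = { itemVisiblePercentThreshold: 80 };

export default function Reels() {
  const [visibleId, setVisibleId] = useState(null);

  // Track which item fills the screen; everything else stays paused.
  const onViewableItemsChanged = useCallback(({ viewableItems }) => {
    if (viewableItems.length > 0) setVisibleId(viewableItems[0].item.id);
  }, []);

  return (
    <FlatList
      data={videos}
      keyExtractor={(item) => item.id}
      pagingEnabled
      windowSize={3}            // render current item plus one neighbour each way
      maxToRenderPerBatch={2}   // limit how many players mount at once
      onViewableItemsChanged={onViewableItemsChanged}
      viewabilityConfig={viewabilityConfig}
      renderItem={({ item }) => (
        <Video
          source={{ uri: item.uri }}
          style={{ height }}
          paused={item.id !== visibleId} // paused players can still buffer ahead
          repeat
          resizeMode="cover"
        />
      )}
    />
  );
}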

React-Native Audio Waveform editor

I am planning to build an audio editor app with react-native. The functionality includes a textbox where the user can provide the URL of any audio file. Once the file is loaded in the UI, it is played with a waveform UI. The user can select the start and end points of the audio by moving sliders on the waveform, and once they are fixed, the app gets the start time and end time of the selected section, which are then sent to the backend to cut the audio (probably using the FFmpeg library).
I can't seem to find any react-native library that allows the user to interact with the waveform.
The UI would be somewhat similar to the attached image.
I don't believe there is one that allows users to interact with the waveform out of the box.
You could use react-native-audiowaveform to show the waveform, and then capture the user's touches yourself.
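A minimal sketch of the capture-the-touches part using only core react-native: a PanResponder over the waveform area converts a drag into start/end times. The waveform rendering (e.g. react-native-audiowaveform) would sit underneath this view; WAVEFORM_WIDTH and DURATION are assumptions you would take from the layout and the loaded file:

import React, { useRef, useState } from 'react';
import { View, Text, PanResponder } from 'react-native';

const WAVEFORM_WIDTH = 300; // px, assumed width of the rendered waveform
const DURATION = 120;       // seconds, assumed length of the loaded audio

export default function TrimSelector() {
  const [sel, setSel] = useState({ start: 0, end: DURATION });
  const startX = useRef(0);

  const pan = useRef(
    PanResponder.create({
      onStartShouldSetPanResponder: () => true,
      // Remember where the drag began...
      onPanResponderGrant: (evt) => {
        startX.current = evt.nativeEvent.locationX;
      },
      // ...and map the dragged range from pixels to seconds.
      onPanResponderMove: (evt) => {
        const x = evt.nativeEvent.locationX;
        const toTime = (px) =>
          Math.max(0, Math.min(DURATION, (px / WAVEFORM_WIDTH) * DURATION));
        setSel({
          start: toTime(Math.min(startX.current, x)),
          end: toTime(Math.max(startX.current, x)),
        });
      },
    })
  ).current;

  return (
    <View>
      {/* The waveform component would render here, underneath the touch area. */}
      <View
        {...pan.panHandlers}
        style={{ width: WAVEFORM_WIDTH, height: 80, backgroundColor: '#eee' }}
      />
      <Text>{sel.start.toFixed(1)}s - {sel.end.toFixed(1)}s</Text>
    </View>
  );
}

The resulting sel.start and sel.end values are what you would send to the backend for the FFmpeg cut.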

Sync a label with angular.js video player's progress time

I have a web page that contains a video and a label that shows a series of data about certain parts of the video. For example, while the video is playing, at the 10th and 18th seconds the label must be updated automatically. I cannot use a separate timer, because factors such as a slow internet connection can break the synchronization. Is there any way / any component that can solve this issue?
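One way to avoid a separate timer is to drive the label from the video element's own timeupdate event, which fires only as playback actually progresses, so buffering pauses the updates too. A minimal AngularJS sketch; the markers table is an assumption standing in for your real data:

angular.module('app', []).directive('videoMarkers', function () {
  return {
    restrict: 'A',
    link: function (scope, element) {
      // Hypothetical markers: second of playback -> label text.
      var markers = { 10: 'First event', 18: 'Second event' };
      // timeupdate tracks the real playhead, so slow connections stay in sync.
      element.on('timeupdate', function () {
        var t = Math.floor(element[0].currentTime);
        if (markers[t] !== undefined) {
          scope.$apply(function () {
            scope.label = markers[t];
          });
        }
      });
    }
  };
});

Usage would be <video video-markers src="..."></video> with the label bound as <span>{{label}}</span>.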

How to store returned video capture devices

I am making a Silverlight application in which I have to show video from multiple webcams.
For example:
if I click button 1, it shows me the first webcam's video, and if I click
button 2, it shows me the second webcam's video.
'CaptureDeviceConfiguration.GetAvailableVideoCaptureDevices()' returns all available video devices, but the problem is: how do I store each returned webcam device in a variable so that I can perform the task mentioned above?
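A minimal C# sketch of one way to do it: keep the returned collection as a field (it is already indexable, so it serves as the "variables"), request device access from the button click, and rebuild the CaptureSource with the chosen device. The videoSurface Rectangle and the button handlers are assumptions about your XAML:

using System.Collections.ObjectModel;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;

public partial class MainPage : UserControl
{
    // The returned collection is indexable, so devices[0] is the first webcam.
    private readonly ReadOnlyCollection<VideoCaptureDevice> devices =
        CaptureDeviceConfiguration.GetAvailableVideoCaptureDevices();

    private CaptureSource captureSource;

    public MainPage()
    {
        InitializeComponent();
    }

    private void ShowCamera(int index)
    {
        if (index >= devices.Count) return;

        // Device access must be granted from a user-initiated event.
        if (!CaptureDeviceConfiguration.AllowedDeviceAccess &&
            !CaptureDeviceConfiguration.RequestDeviceAccess()) return;

        if (captureSource != null) captureSource.Stop();
        captureSource = new CaptureSource { VideoCaptureDevice = devices[index] };

        var brush = new VideoBrush();
        brush.SetSource(captureSource);
        videoSurface.Fill = brush; // videoSurface: a Rectangle in the XAML (assumed)
        captureSource.Start();
    }

    private void Button1_Click(object sender, RoutedEventArgs e) { ShowCamera(0); }
    private void Button2_Click(object sender, RoutedEventArgs e) { ShowCamera(1); }
}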
