Record audio and show waveform in React Native

I am trying to draw a waveform in React Native while recording audio. I looked at many packages, but they all need an audio URL, so they don't support realtime recording. I tried building one myself using a package that gives me the decibel value while recording, pushing each value into a state array, but it causes too much lag since I call setState every 0.5 seconds.
Any suggestions?

The package audio-react-recorder provides the recording interface as well as a somewhat customisable waveform. I think it's a good place to start. I've used it a few times and it works quite well.
Here's a demo
Let me know if it works out for you.
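
If you'd rather keep your own metering-based approach, the lag usually comes from calling setState on every sample. Here is a minimal sketch of one way around that: buffer the decibel values in a ref and flush them into a fixed-size state array at a lower frequency. The onMetering prop is a hypothetical stand-in for whatever callback your recording package exposes, and the db + 60 height mapping assumes metering values in the usual -60..0 dB range.

import React, { useEffect, useRef, useState } from 'react';
import { View } from 'react-native';

const MAX_BARS = 60; // keep the rendered array small and fixed-size

export function Waveform({ onMetering }: { onMetering: (cb: (db: number) => void) => void }) {
  const buffer = useRef<number[]>([]);
  const [bars, setBars] = useState<number[]>([]);

  useEffect(() => {
    // Collect samples in a ref, so no re-render happens per sample.
    onMetering(db => buffer.current.push(db));

    // Flush to state once per second instead of on every sample.
    const id = setInterval(() => {
      if (buffer.current.length === 0) return;
      setBars(prev => [...prev, ...buffer.current].slice(-MAX_BARS));
      buffer.current = [];
    }, 1000);
    return () => clearInterval(id);
  }, [onMetering]);

  return (
    <View style={{ flexDirection: 'row', alignItems: 'flex-end', height: 60 }}>
      {bars.map((db, i) => (
        // Assumes metering in dBFS (-60..0); map it to a 0..60 bar height.
        <View
          key={i}
          style={{ width: 2, marginRight: 1, backgroundColor: '#555', height: Math.max(2, db + 60) }}
        />
      ))}
    </View>
  );
}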

Related

Network Signal strength in React

I have tried many possible ways of detecting signal strength in React and showing it as signal bars like Weak, Medium, Excellent. The solution I built was in plain JavaScript: download an image from the server and measure the download time. But that needs an interval to run again and again, which puts load on the server. Is there any way to do the same in a React way?
You can try adding a change event listener to the experimental NetworkInformation API and deriving your signal level from its downlink attribute.
But it's not well supported yet.
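
A minimal sketch of that idea as a React hook, assuming a browser that exposes navigator.connection (TypeScript's DOM typings don't include it, hence the cast); the downlink thresholds are arbitrary examples:

import { useEffect, useState } from 'react';

type Signal = 'Weak' | 'Medium' | 'Excellent' | 'Unknown';

function classify(downlinkMbps: number): Signal {
  if (downlinkMbps < 1) return 'Weak';
  if (downlinkMbps < 5) return 'Medium';
  return 'Excellent';
}

export function useSignalStrength(): Signal {
  const [signal, setSignal] = useState<Signal>('Unknown');

  useEffect(() => {
    // navigator.connection is experimental and untyped in the DOM lib.
    const connection = (navigator as any).connection;
    if (!connection) return; // unsupported browser: stays 'Unknown'

    const update = () => setSignal(classify(connection.downlink));
    update();
    connection.addEventListener('change', update);
    return () => connection.removeEventListener('change', update);
  }, []);

  return signal;
}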

How to set a countdown timer displayed on the web camera component?

I'm working on a web application using React and TypeScript. Right now, I'm looking for the cleanest way to have a countdown timer fire when a user hits the shutter-release button. Is there any simple and flexible way to do this? The result I want is a countdown timer displayed on the web camera component.
*camera components are built using react-camera-pro
https://www.npmjs.com/package/react-camera-pro
I also want to have a sound that corresponds with the timer, so at first, I was thinking of making a video file that displays the number and plays the sound simultaneously.
<TakePhotoButton
  onClick={() => {
    if (camera.current) {
      // play the video here
      setImage(camera.current.takePhoto());
    }
  }}
/>
However, that doesn't sound like a very good solution, and I would lose flexibility if I changed my mind and wanted to set the timer to a different duration.
If anyone knows any solution or any libraries or packages, please help me. Thank you.
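
One library-free approach is a small countdown hook: render the remaining seconds as an overlay on the camera view, play a short beep on each tick, and only take the photo when the count reaches zero. This is a minimal sketch under some assumptions: useCountdown is a made-up name, '/beep.mp3' is a placeholder sound asset, and takePhoto() on the camera ref is as documented by react-camera-pro.

import { useRef, useState } from 'react';

export function useCountdown(onDone: () => void, seconds = 3) {
  const [count, setCount] = useState<number | null>(null);
  const beep = useRef<HTMLAudioElement | null>(null);

  const start = () => {
    beep.current ??= new Audio('/beep.mp3'); // placeholder sound asset
    let remaining = seconds;

    const tick = () => {
      if (remaining === 0) {
        setCount(null); // hide the overlay
        onDone();       // take the photo
        return;
      }
      setCount(remaining);  // render this number over the camera view
      beep.current?.play(); // one beep per displayed second
      remaining -= 1;
      setTimeout(tick, 1000);
    };
    tick();
  };

  return { count, start };
}

Wiring it up would then look roughly like:

const { count, start } = useCountdown(() => {
  if (camera.current) {
    setImage(camera.current.takePhoto());
  }
});
// ...
<TakePhotoButton onClick={start} />

Changing the seconds argument changes the duration without touching any video asset, which keeps the flexibility you mentioned.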

Initialize one or many expo-av Audio objects?

I have a list of audio files that I need to be able to play and I am using Audio from expo-av library in React Native. I am wondering what the best practices are for handling playback of a list of audio files. The requirement is that playback should be done from the list; in other words, we don't want to navigate to a different screen component to handle playback.
I would like to separate the concerns of the media list component from the actual player. So "MediaList" would be responsible for listing the audio files, and the "Media" component would be responsible for handling playback. In this case it makes sense to initialize an expo-av Audio object in each Media component. This separates the concerns of Media and MediaList; however, it also looks like a performance issue, since there would be so many Audio instances.
So my question is, does having an Audio sound object for each Media instance make sense from a performance/resources perspective? Or should I have only one Audio sound object and reuse it every time I want to play any file? The question is pretty broad without code, but I hope someone can provide some direction on the best approach.
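
A minimal sketch of the single shared-player variant: one Sound instance is kept at module level and unloaded before the next file is loaded, so only one native player is ever alive. Audio.Sound.createAsync and unloadAsync are expo-av's standard calls; playTrack is a name made up for the example.

import { Audio } from 'expo-av';

let current: Audio.Sound | null = null;

export async function playTrack(uri: string): Promise<void> {
  // Tear down whatever is playing before loading the next file.
  if (current) {
    await current.unloadAsync();
    current = null;
  }
  const { sound } = await Audio.Sound.createAsync(
    { uri },
    { shouldPlay: true },
  );
  current = sound;
}

Each item in MediaList can then call playTrack(item.uri), and the Media components stay purely presentational.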

Drone SDK supporting free-flight control

This is my first time getting into drones.
I am currently looking at DJI drones, as they seem the most promising from a documentation and reviews point of view.
Basically, I would like to program a drone (or drones) to fly a certain pattern and take pictures when certain criteria are met. For example, I would like the drone to take off and fly around a small park, stopping to take a picture of each tree it encounters, automatically (auto-piloted / driven by some "AI").
Now I glanced through the DJI SDK documentation, and so far it SEEMS this is possible (via the FlightControl class). But I'm not sure.
Question:
Can my requirements be met with current drone SDK technologies?
Yes, the current SDK (4.11.1) will do everything you mentioned. You will need to do some location calculations, but that's about it.
The sample will do almost everything you want as-is, with minor changes.
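
As an illustration of the kind of location math involved (shown in TypeScript purely for readability; the actual mission would be built with the DJI SDK's waypoint classes), here is a sketch that generates waypoint coordinates on a circle around a center point. circleWaypoints is a made-up name, and the 111,320 meters-per-degree-of-latitude figure is a rough spherical-earth approximation that is fine over park-sized distances.

interface Waypoint {
  latitude: number;
  longitude: number;
}

const METERS_PER_DEG_LAT = 111_320; // rough spherical-earth approximation

export function circleWaypoints(
  center: Waypoint,
  radiusMeters: number,
  count: number,
): Waypoint[] {
  // Longitude degrees shrink with latitude.
  const metersPerDegLon =
    METERS_PER_DEG_LAT * Math.cos((center.latitude * Math.PI) / 180);

  return Array.from({ length: count }, (_, i) => {
    const angle = (2 * Math.PI * i) / count;
    return {
      latitude: center.latitude + (radiusMeters * Math.sin(angle)) / METERS_PER_DEG_LAT,
      longitude: center.longitude + (radiusMeters * Math.cos(angle)) / metersPerDegLon,
    };
  });
}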
With the DJI Mobile SDK you can use the Mission classes to automatically fly a given set of coordinates (waypoints) and do some actions once you arrive at a waypoint, e.g. take a picture.
However the SDK has limitations:
The SDK cannot detect objects in the video stream, so you need to write your own code to do the object detection.
The way the drone flies to a waypoint is quite limited, e.g. the drone always faces in its direction of flight.
When using the DJI Mission classes, changing the route during execution is only possible with timeline missions, by adding timeline elements to the list.
As you already assumed in the comment: yes, the Mobile SDK is more advanced than the Windows SDK.

How to implement video with transparent background in react-native

How can I implement a video player with a transparent background in React Native? The video is saved with the extension .mov (the only extension I found that supports a transparent background).
I used react-native-player, but it only renders a black screen.
I tried with https://www.w3schools.com/html/mov_bbb.mp4, both loaded locally and via URI, and it worked.
Platform target: iOS
I needed the same thing and tried many different methods, and my conclusion is that a transparent movie file is not an accepted practice/standard. Only Apple QuickTime supports it.
Here are a few workarounds I tried and made partially work, with a lot of performance challenges.
The best working method is to export all the mov frames (at whatever frame rate is acceptable for you; I was rendering something that was going to be exported as a GIF, so even 15 fps worked for me). Export the movie frames as transparent PNGs; FFmpeg can easily do that (something like ffmpeg -i input.mov frame_%04d.png, which preserves alpha when the source codec carries it).
Then, with a simple loop, load and update the frames from the JS code. This is still very challenging, because loading each frame into an image view from the jsbundle simply does not keep up with your frame-rate needs; in my experience, I only saw low single-digit frame rates when loading resources from the jsbundle.
On the other hand, there is a solution for this. You can put your PNG files in xcassets in iOS/Xcode, and in drawables on Android, then use { uri: 'filename' } to load the resources from the native side. This gave me very good quality and speed on iOS, even for 20-30 second videos at 1080p.
Unfortunately, this suffered from memory issues and didn't work on Android; Android only handled 100-something frames for me. If you have more than 150-200 frames for your movie in total, regardless of fps, you will hit the memory limits. I spent more than a week researching alternative ways to overcome the memory issue and keep many large bitmaps in memory for sequential display in the app. There are theoretical methods to use memory on Android without heap-size limitations, but I found them too low-level and beyond my Android/Java knowledge.
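
A minimal sketch of that frame-sequence approach, assuming the exported frames are bundled natively under names frame_0 ... frame_N (in xcassets on iOS, drawables on Android) so that loading by bare name resolves on the native side:

import React, { useEffect, useState } from 'react';
import { Image } from 'react-native';

export function FrameSequence({ frameCount, fps = 15 }: { frameCount: number; fps?: number }) {
  const [frame, setFrame] = useState(0);

  useEffect(() => {
    // Advance to the next frame at the chosen frame rate, looping forever.
    const id = setInterval(() => setFrame(f => (f + 1) % frameCount), 1000 / fps);
    return () => clearInterval(id);
  }, [frameCount, fps]);

  // A bare name in `uri` resolves to xcassets on iOS / drawables on Android.
  return (
    <Image
      source={{ uri: `frame_${frame}` }}
      style={{ width: 300, height: 300 }}
    />
  );
}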
Another path I experimented with, without success, was the WebM format: Google's video format, a sibling of its WebP image format, which has some push behind it, but I couldn't find enough resources online about WebM playback. I managed to play a WebM file, but it just works like the GIF players, and almost none of them have a clear way to support transparent animation. So I wasn't able to get to a solution with WebP/WebM either.
Right now I am trying to fake the transparent portions of my video with other RN animated elements over the video.
By the way, I tried most of these solutions natively on iOS and Android separately, and both platforms had their own problems with them.
