JsSIP: How to switch from an audio call to a video call - reactjs

I am new to JsSIP. I need to switch an ongoing audio call to a video call.
const session = userAgent.call(destinationNumber, {
  mediaConstraints: {
    audio: true,
    video: false
  },
  pcConfig: {
    iceServers: [{ urls: Config.STUN_SERVER }]
  }
});
This is how I initiate an audio call. How can I switch to a video call in the middle of the call?

You can use a trick: initiate the call with video enabled, but instead of putting in a real video track, put in a dummy "silent" video track:
function createSilentVideoTrack() {
  const canvas = document.createElement("canvas");
  canvas.width = 50;
  canvas.height = 30;
  canvas.getContext("2d").fillRect(0, 0, canvas.width, canvas.height);
  // animateCanvas is assumed to redraw the canvas periodically;
  // captureStream only produces frames when the canvas changes.
  animateCanvas(canvas);
  const stream = canvas.captureStream(1); // 1 fps is enough for a dummy track
  const videoTrack = stream.getVideoTracks()[0];
  return videoTrack;
}
And when you need to enable video, you just replace the dummy video track with the real one:
navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
  const track = stream.getVideoTracks()[0];
  connection.getSenders()
    .filter((sender) => sender.track !== null && sender.track.kind === "video")
    .forEach((sender) => {
      sender.replaceTrack(track);
    });
});
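The sender-selection step above can be factored into small helpers. This is an illustrative sketch, not part of JsSIP; the helper names are mine, and the `RTCRtpSender` objects are duck-typed so the logic can be exercised with mocks:

```javascript
// Pick the senders that currently carry a video track.
// Works on any array of RTCRtpSender-like objects ({ track } shape).
function pickVideoSenders(senders) {
  return senders.filter((s) => s.track !== null && s.track.kind === "video");
}

// Replace the dummy video track on every video sender with the real one.
// RTCRtpSender.replaceTrack returns a Promise, so we aggregate them.
function swapVideoTrack(senders, realTrack) {
  return Promise.all(
    pickVideoSenders(senders).map((s) => s.replaceTrack(realTrack))
  );
}
```

Because replaceTrack does not renegotiate, the remote side keeps receiving the same video m-line; it simply starts getting real camera frames instead of the black canvas.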

Related

Choppy sound on MediaRecorder when playing video along side the recording

I'm trying to build a website with React that lets me record myself using MediaRecorder while another video plays in the background. When the other video's volume is up, the sound of the recording is choppy and scratchy; when I turn the video's volume down, the recording sounds fine. At first I thought the problem happened because the sound of the other video was being recorded and disturbing the actual recording, but the problem occurs even when I use a microphone and headphones, so I think it's something in my code.
This is the function that init the recorder:
const initVideoStream = async () => {
  const stream = await navigator.mediaDevices.getUserMedia(constraints).then(async function (stream) {
    const options = {
      mimeType: 'audio/webm'
    }
    recorder.current = new MediaRecorder(stream, options)
    recorder.current.ondataavailable = e => {
      recordedChunks.current.push(e.data)
    }
    recorder.current.onstop = async e => {
      setisBlob(true)
      setVideoBlob(new Blob(recordedChunks.current, { 'type': 'video/mp4' }))
      const videoUrlObj = window.URL.createObjectURL(new Blob(recordedChunks.current, { 'type': 'video/mp4' }))
      recorderRef.current.src = videoUrlObj
      recordedChunks.current = []
    }
    return stream
  }).catch(function (error) {
    console.log(error);
    setErrorMsg('Cannot find any camera, please check if your camera is connected and try again')
    setIsPopupOpen(true)
  });
  recordedChunks.current = []
  setStream(stream)
}
I've tried changing the mimeType to "video/mp4", and that didn't help.
I've uploaded my project to Netlify so you can see the problem:
https://recorder-poc.netlify.app/
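One thing worth checking (an assumption on my part, since the question has no accepted answer): the recorder is created with `mimeType: 'audio/webm'` while the chunks are later wrapped in a `video/mp4` Blob, and those two types should agree. A small sketch for choosing one consistent, supported type; `pickRecorderMimeType` is a hypothetical helper, and the support check is injected so the selection logic can run outside a browser:

```javascript
// Return the first candidate MIME type the environment supports.
// In a page you would pass (t) => MediaRecorder.isTypeSupported(t).
function pickRecorderMimeType(candidates, isSupported, fallback = "") {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return fallback;
}

// Browser usage sketch (not from the original post):
// const mimeType = pickRecorderMimeType(
//   ["video/webm;codecs=vp9,opus", "video/webm;codecs=vp8,opus", "video/webm"],
//   (t) => MediaRecorder.isTypeSupported(t)
// );
// recorder.current = new MediaRecorder(stream, { mimeType });
// ...and later build the Blob with the SAME type:
// new Blob(recordedChunks.current, { type: mimeType });
```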

How is the video stream still making it to the peerConnection even though I disabled it?

I'm working on a peer-to-peer WebRTC project using React, hooks, and redux. When I make modifications such as disabling the video, I don't see it on the local video, but I continue to see the stream on the remote video as if I never disabled anything. If I disable my video, then my screen should be black, and so should my peer's screen; instead, my screen is black, but I'm still streaming video from the camera to the peer. What am I doing wrong?
useEffect(() => {
  (async () => {
    let stream = await navigator.mediaDevices.getUserMedia({ audio: false, video: true })
    console.log('RAN!');
    // {hide} being handled by redux
    if (hide) stream.getVideoTracks()[0].enabled = false;
    // LOCAL VIDEO BLANK AS IT SHOULD BE
    if (localRef?.current) localRef.current.srcObject = stream;
    stream.getTracks().forEach(track => {
      peerConnection.addTrack(track, stream)
    })
    peerConnection.ontrack = (e) => {
      // REMOTE VIDEO IS NOT BLANK. But I'm disabling it using {hide}
      if (remoteVideoRef) remoteVideoRef.current.srcObject = e.streams[0]
    }
  })()
}, [hide])
Thanks Philipp. That suggestion worked wonders. I had to set the stream outside and make changes to it inside the hook. Here's the working version for anyone that may need it in the future:
const [mediaStream, setMediaStream] = useState<MediaStream | null>(null);

useEffect(() => {
  // peerConnection.getSenders()[0].replaceTrack(track)
  navigator.mediaDevices
    .getUserMedia({ video: true, audio: false })
    .then((stream) => {
      setMediaStream(stream);
    });
  // needed to trigger a re-render so we get the new stream
}, [hide, muted]);

useEffect(() => {
  if (mediaStream) mediaStream.getVideoTracks()[0].enabled = !hide;
  // handle hiding
  mediaStream?.getVideoTracks().forEach((track) => {
    if (peerConnection.getSenders().length) {
      peerConnection.getSenders()[0].replaceTrack(track);
    } else {
      peerConnection.addTrack(track, mediaStream);
    }
  });
  localRef.current.srcObject = mediaStream;
  peerConnection.ontrack = (e) => {
    let [remoteStream] = e.streams;
    remoteVideoRef.current.srcObject = remoteStream;
  };
});
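One fragile spot in the working version is `getSenders()[0]`, which is only correct while the connection has a single sender. Below is a hedged sketch of a kind-aware "replace or add" step; the helper name is mine, and the peer connection is duck-typed so the decision logic can be exercised with a mock:

```javascript
// Replace the track on the existing video sender if there is one,
// otherwise add the track to the connection.
function upsertVideoTrack(pc, track, stream) {
  const sender = pc
    .getSenders()
    .find((s) => s.track !== null && s.track.kind === "video");
  if (sender) {
    return sender.replaceTrack(track); // no renegotiation needed
  }
  pc.addTrack(track, stream); // triggers negotiationneeded on a real connection
  return Promise.resolve();
}
```

Finding the sender by `track.kind` avoids silently replacing the audio sender's track when audio happens to be added first.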

How to get the Blob image preview in my Uppy Custom setup

I'm learning React, and I use Uppy so the user can select files for upload.
Once the user has selected files, they are hidden by setting showSelectedFiles={false}.
I use my own component to show the selected files, and I get the files using this:
.on("file-added", (file) => {
  const { setFile } = props;
  setFile(file);
  const newList = this.state.files.concat({ file });
  this.setState({ files: newList });
});
For each file added to the Dashboard, setFile(file) sends the file object to my custom view. The problem is that the preview image Blob that is auto-created by the Dashboard is not present at this stage.
How can I get the files, including the image preview Blob, into my custom GUI to show them?
I'm new to React and JavaScript, so please be gentle :)
Complete code:
import React from "react";
import "@uppy/status-bar/dist/style.css";
import "@uppy/drag-drop/dist/style.css";
import "@uppy/progress-bar/dist/style.css";
import "./styles.css";
import "@uppy/core/dist/style.css";
import "@uppy/dashboard/dist/style.css";

const Uppy = require("@uppy/core");
// const Dashboard = require("@uppy/dashboard");
const GoogleDrive = require("@uppy/google-drive");
const Dropbox = require("@uppy/dropbox");
const Instagram = require("@uppy/instagram");
const Webcam = require("@uppy/webcam");
const Tus = require("@uppy/tus");
const ThumbnailGenerator = require("@uppy/thumbnail-generator");
const {
  Dashboard,
  DashboardModal,
  DragDrop,
  ProgressBar,
} = require("@uppy/react");
class DashboardUppy extends React.Component {
  constructor(props) {
    super(props);
    this.form = React.createRef();
    this.state = {
      showInlineDashboard: false,
      open: false,
      files: [],
    };
    this.uppy = new Uppy({
      id: "uppy1",
      autoProceed: false,
      debug: true,
      allowMultipleUploads: true,
      proudlyDisplayPoweredByUppy: true,
      restrictions: {
        // maxFileSize: 1000000,
        maxNumberOfFiles: 100,
        minNumberOfFiles: 1,
        allowedFileTypes: null,
      },
      onBeforeFileAdded: (currentFile, files) => {
        console.log(files);
        const modifiedFile = Object.assign({}, currentFile, {
          name: currentFile.name + Date.now(),
        });
        if (!currentFile.type) {
          // log to console
          this.uppy.log(`Skipping file because it has no type`);
          // show error message to the user
          this.uppy.info(`Skipping file because it has no type`, "error", 500);
          return false;
        }
        return modifiedFile;
      },
    })
      .use(Tus, { endpoint: "https://master.tus.io/files/" })
      .use(GoogleDrive, { companionUrl: "https://companion.uppy.io" })
      .use(Dropbox, {
        companionUrl: "https://companion.uppy.io",
      })
      .use(Instagram, {
        companionUrl: "https://companion.uppy.io",
      })
      .use(Webcam, {
        onBeforeSnapshot: () => Promise.resolve(),
        countdown: false,
        modes: ["video-audio", "video-only", "audio-only", "picture"],
        mirror: true,
        facingMode: "user",
        locale: {
          strings: {
            // Shown before a picture is taken when the `countdown` option is set.
            smile: "Smile!",
            // Used as the label for the button that takes a picture.
            // This is not visibly rendered but is picked up by screen readers.
            takePicture: "Take a picture",
            // Used as the label for the button that starts a video recording.
            // This is not visibly rendered but is picked up by screen readers.
            startRecording: "Begin video recording",
            // Used as the label for the button that stops a video recording.
            // This is not visibly rendered but is picked up by screen readers.
            stopRecording: "Stop video recording",
            // Title on the “allow access” screen
            allowAccessTitle: "Please allow access to your camera",
            // Description on the “allow access” screen
            allowAccessDescription:
              "In order to take pictures or record video with your camera, please allow camera access for this site.",
          },
        },
      })
      .use(ThumbnailGenerator, {
        thumbnailWidth: 200,
        // thumbnailHeight: 200 // optional, use either width or height
        waitForThumbnailsBeforeUpload: true,
      })
      .on("thumbnail:generated", (file, preview) => {
        const img = document.createElement("img");
        img.src = preview;
        img.width = 100;
        document.body.appendChild(img);
      })
      .on("file-added", (file) => {
        const { setFile } = props;
        setFile(file);
        const newList = this.state.files.concat({ file });
        this.setState({ files: newList });
      });
  }
  componentWillUnmount() {
    this.uppy.close();
  }

  render() {
    const { files } = this.state;
    this.uppy.on("complete", (result) => {
      console.log(
        "Upload complete! We’ve uploaded these files:",
        result.successful
      );
    });
    return (
      <div>
        <div>
          <Dashboard
            uppy={this.uppy}
            plugins={["GoogleDrive", "Webcam", "Dropbox", "Instagram"]}
            metaFields={[
              { id: "name", name: "Name", placeholder: "File name" },
            ]}
            open={this.state.open}
            target={document.body}
            onRequestClose={() => this.setState({ open: false })}
            showSelectedFiles={false}
          />
        </div>
      </div>
    );
  }
}

export default DashboardUppy;
Ran into this problem as well because I wanted to use the image preview to figure out the aspect ratio of the underlying image.
If you're using Dashboard or ThumbnailGenerator for Uppy, an event is emitted for every upload:
uppy.on('thumbnail:generated', (file, preview) => {
  const img = new Image();
  img.src = preview;
  img.onload = () => {
    const aspect_ratio = img.width / img.height;
    // Remove the image if the aspect ratio is too weird.
    // TODO: notify user.
    if (aspect_ratio > 1.8) {
      uppy.removeFile(file.id);
    }
  };
});
I realize, though, that you are already listening for this event in your code. So, to answer your question: just put your logic there instead of in file-added.
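To connect the two events in the question, one option is to keep the previews in component state keyed by file id. This is a sketch with hypothetical helper names; the state-update functions are pure, which makes them easy to test:

```javascript
// Called from uppy.on("file-added", ...): track the file with no preview yet.
function addFile(list, file) {
  return list.concat({ id: file.id, file, preview: null });
}

// Called from uppy.on("thumbnail:generated", ...): attach the preview URL
// to the matching entry without mutating the old list.
function attachPreview(list, fileId, preview) {
  return list.map((entry) =>
    entry.id === fileId ? { ...entry, preview } : entry
  );
}

// In the component, roughly:
//   .on("file-added", (file) =>
//     this.setState((s) => ({ files: addFile(s.files, file) })))
//   .on("thumbnail:generated", (file, preview) =>
//     this.setState((s) => ({ files: attachPreview(s.files, file.id, preview) })));
// The custom view can then render the entry's preview once it is non-null.
```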

Adding mediastream.clone() to a video element srcObject does not fire loadedmetadata event if video track is not enabled

I have a react app configured as such:
// landingpage.js
const { mediaStream, setMediaStream } = useContext(StreamContext);

useEffect(() => {
  navigator.mediaDevices.getUserMedia({ video: true, audio: true })
    .then((stream) => {
      const video = document.querySelector('video');
      video.srcObject = stream;
      video.onloadedmetadata = () => {
        video.play();
        setMediaStream(stream.clone());
      };
    }).catch(/* ... */);
}, []);

const toggleVideo = () => {
  ...
  const video = document.querySelector('video');
  const stream = video.srcObject;
  tracks.forEach((track) => {
    if (track.kind === 'video') {
      track.enabled = !track.enabled;
      setMediaStream(stream.clone());
    }
  });
  ...
};

// same thing for toggleAudio
The landing page seems to work fine: the onloadedmetadata event fires and the video plays with audio. But when I take the stream.clone(), go to another page, and assign it to the srcObject of a video element there, weird behavior happens:
// Redirected to a new page, newpage.js
const { mediaStream, setMediaStream } = useContext(StreamContext);

useEffect(() => {
  if (mediaStream) {
    const video = document.querySelector('video');
    video.srcObject = mediaStream;
    video.onloadedmetadata = () => {
      // This event isn't captured if the video track is not enabled.
      // I also confirmed that this block runs if the video track is enabled.
      // I've tried video.addEventListener with 'loadedmetadata' and 'loadeddata' instead, but it didn't work for me.
      video.play();
    };
  }
}, []);
On this page, if I had set enabled = false on the video track, then the audio does not autoplay: no sound, even though the tracks are not muted. If the video track was enabled, then both audio and video work. Here's what the media stream tracks look like:
MediaStreamTrack
  contentHint: ""
  enabled: true
  id: ...
  kind: "audio"
  label: ...
  muted: false
  onended: null
  onmute: null
  onoverconstrained: null
  onunmute: null
  readyState: "live"
MediaStreamTrack
  contentHint: ""
  enabled: false
  id: ...
  kind: "video"
  label: ...
  muted: false
  onended: null
  onmute: null
  onoverconstrained: null
  onunmute: null
  readyState: "live"
I expected the clone to autoplay seamlessly on the new page; however, if video was toggled off, the audio doesn't autoplay either.
What I'm noticing is that the onloadedmetadata event doesn't fire on the new page if the video track is not enabled.
This problem seems to occur in Safari, but not in Chrome and Firefox.
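A possible workaround (an assumption on my part, given that Safari is the only browser withholding the event here): don't gate play() solely on loadedmetadata. If metadata is already available, try to play immediately, and keep the event handler only as a fallback. The helper name is mine:

```javascript
// Try to start playback right away if metadata is already available,
// and also hook loadedmetadata once as a fallback.
function playWhenPossible(video) {
  const tryPlay = () => video.play().catch(() => {
    // Autoplay may be blocked; a real app would surface a play button here.
  });
  video.addEventListener("loadedmetadata", tryPlay, { once: true });
  if (video.readyState >= 1 /* HAVE_METADATA */) {
    tryPlay();
  }
}
```

A cloned, already-live stream often has readyState at or above HAVE_METADATA by the time the effect runs, so this path sidesteps the missing event.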

ReactJS: Resize image before upload

In my ReactJS project, I need to resize images before uploading them.
I am using the react-image-file-resizer library, which has a simple example, but it's not working for me.
I have tried this, but it shows me a blank result. What am I doing wrong?
var imageURI = '';

const resizedImg = await Resizer.imageFileResizer(
  fileList.fileList[0].originFileObj,
  300,
  300,
  'JPEG',
  100,
  0,
  uri => {
    imageURI = uri;
    console.log(uri); // this shows the correct result I want, but outside of this function
  },
  'blob'
);

console.log(resizedImg);
console.log(imageURI);
// upload new image
...uploading image here..
If I do imgRef.put(uri) inside the uri callback, the image upload works, but I need to do it outside of that function.
How can I get the result into the imageURI variable and reuse it later?
First, wrap the resizer in a Promise:
const resizeFile = (file) => new Promise((resolve) => {
  Resizer.imageFileResizer(file, 300, 300, 'JPEG', 100, 0,
    uri => {
      resolve(uri);
    }, 'base64');
});
And then use it in your async function:
const onChange = async (event) => {
  const file = event.target.files[0];
  const image = await resizeFile(file);
  console.log(image);
};
OK, I figured it out using the compress.js library.
async function resizeImageFn(file) {
  const resizedImage = await compress.compress([file], {
    size: 2,        // the max size in MB, defaults to 2MB
    quality: 1,     // the quality of the image, max is 1
    maxWidth: 300,  // the max width of the output image, defaults to 1920px
    maxHeight: 300, // the max height of the output image, defaults to 1920px
    resize: true    // defaults to true, set false if you do not want to resize the image width and height
  });
  const img = resizedImage[0];
  const base64str = img.data;
  const imgExt = img.ext;
  const resizedFile = Compress.convertBase64ToFile(base64str, imgExt);
  return resizedFile;
}
It returns a file to be uploaded to the server.
Image resizing in the browser should be pain-free, but it is not. You can use a package, but they are often poorly written and poorly maintained.
For that reason, I wrote my own code using several JavaScript APIs: FileReader, Image, canvas, and context. However, this code produces resizes with some pixelation. If you want even higher-quality resizes, I would recommend the Pica package, which uses web workers.
Javascript
const uploadImage = (event) => {
  const [imageFile] = event.target.files;
  const { type: mimeType } = imageFile;

  const fileReader = new FileReader();
  fileReader.readAsDataURL(imageFile);
  fileReader.onload = (fileReaderEvent) => {
    const imageAsBase64 = fileReaderEvent.target.result;
    const image = document.createElement("img");
    // Wait for the data URL to be decoded before reading image.width/height
    // and drawing; drawing immediately after setting src is not reliable.
    image.onload = () => {
      const imageResizeWidth = 100;
      // if (image.width <= imageResizeWidth) {
      //   return;
      // }
      const canvas = document.createElement('canvas');
      canvas.width = imageResizeWidth;
      canvas.height = ~~(image.height * (imageResizeWidth / image.width));
      const context = canvas.getContext('2d', { alpha: false });
      // if (!context) {
      //   return;
      // }
      context.drawImage(image, 0, 0, canvas.width, canvas.height);
      // const resizedImageBinary = canvas.toBlob();
      const resizedImageAsBase64 = canvas.toDataURL(mimeType);
    };
    image.src = imageAsBase64;
  };
};
HTML
<form>
  <input type="file" accept="image/jpeg"
    onchange="uploadImage(event)"/>
</form>
The library you are using will not resize the image as a file for upload.
It returns the new image's base64 URI or a Blob. The URI can be used as the
source of an `<img>` component.
To resize the image:
You can refer to the script here
or a working code sample demo here