I am trying to build a web app with audio/video calling using WebRTC.
The problem is that local audio/video works properly in my web app, but the remote audio/video stream never shows up on the other side. There are no errors in the console. You can join a room, but you can't hear the other participants' audio or see their video.
Here's the code:
useEffect(() => {
const initRoom = async () => {
socket.current = socketInit();
//Get User Audio
await captureLocalMedia();
socket.current.emit(ACTIONS.JOIN, {roomId, user});
socket.current.on(ACTIONS.ADD_PEER, handleNewPeerConnection);
async function captureLocalMedia() {
localMediaStream.current =
await navigator.mediaDevices.getUserMedia({
video: true,
audio: true,
});
}
async function handleNewPeerConnection({peerId, createOffer, user: newUser}) {
if(peerId in connections.current) {
return console.warn(`You are already joined with ${user.username}`)
}
var configuration = {
offerToReceiveAudio: true
}
connections.current[peerId] = new RTCPeerConnection({
iceServers: [
{
urls: "stun:stun.l.google.com:19302"
},
{
urls: "stun:stun1.l.google.com:19302"
},
{
urls: "stun:stun2.l.google.com:19302"
},
{
urls: "stun:stun3.l.google.com:19302"
},
{
urls: "stun:stun4.l.google.com:19302"
}
],
configuration: configuration
})
connections.current[peerId].ontrack = (event) => {
addNewClients(newUser, () => {
if(audioElements.current[newUser.id]) {
audioElements.current[newUser.id].srcObject = event.streams[0];
} else {
let settled = false;
const interval = setInterval(() => {
if(audioElements.current[newUser.id]) {
const [remoteStream] = event.streams;
audioElements.current[newUser.id].srcObject=remoteStream
settled = true;
}
if (settled) {
clearInterval(interval)
}
}, 600)
}
})
}
localMediaStream.current.getTracks().forEach((track) => {
connections.current[peerId].addTrack(
track,
localMediaStream.current
)
});
if(createOffer) {
const offer = await connections.current[peerId].createOffer()
await connections.current[peerId].setLocalDescription(offer)
socket.current.emit(ACTIONS.RELAY_SDP, {
peerId,
sessionDescription: offer
})
}
}
}
initRoom();
return () => {
localMediaStream.current
.getTracks()
.forEach((track) => track.stop());
socket.current.emit(ACTIONS.LEAVE, { roomId });
for (let peerId in connections.current) {
connections.current[peerId].close();
delete connections.current[peerId];
delete audioElements.current[peerId];
}
socket.current.off(ACTIONS.ADD_PEER);
}
}, [])
This is the socketInit function:
import {io} from 'socket.io-client';
const socketInit = () => {
const options = {
'force new connection': true,
reconnectionAttempts: 'Infinity',
timeout: 10000,
transports: ['websocket'],
};
return io('http://localhost:5500', options)
};
export default socketInit;
You should check whether the offer's SDP contains information about the media tracks, in particular the a=msid and a=ssrc lines. For example:
v=0
o=- 4748410946812024893 2 IN IP4 127.0.0.1
............
a=sendrecv
a=msid:Eei3sKzfsiJybxa4TYhANjGsFMuWe2lAxadS f798f673-566e-4a8e-9760-8d657d031acf
............
a=rtpmap:126 telephone-event/8000
a=ssrc:3563088629 cname:0j/yv49mmBxgcAbW
a=ssrc:3563088629 msid:Eei3sKzfsiJybxa4TYhANjGsFMuWe2lAxadS f798f673-566e-4a8e-9760-8d657d031acf
a=ssrc:3563088629 mslabel:Eei3sKzfsiJybxa4TYhANjGsFMuWe2lAxadS
a=ssrc:3563088629 label:f798f673-566e-4a8e-9760-8d657d031acf
............
a=max-message-size:262144
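If you prefer to check this programmatically rather than reading the SDP by hand, a rough sketch could be (it reuses connections.current and peerId from the question's handleNewPeerConnection, and the m= string checks are only a heuristic):

const offer = await connections.current[peerId].createOffer();
console.log(offer.sdp); // inspect the full SDP
// If addTrack() ran before createOffer(), the SDP should contain media sections:
const hasAudio = offer.sdp.includes('m=audio');
const hasVideo = offer.sdp.includes('m=video');
if (!hasAudio || !hasVideo) {
  console.warn('Offer has no media sections; local tracks were probably not added before the offer was created');
}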
If the remote peer receives the information about the media tracks and it still doesn't work, then the problem is probably with playback of the HTMLMediaElement. Try adding the line:
audioElements.current[newUser.id].autoplay = true
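A slightly fuller sketch of the same idea (the play() call and its catch are additions beyond the original suggestion, useful for surfacing autoplay-policy failures):

const el = audioElements.current[newUser.id];
el.autoplay = true;
el.srcObject = event.streams[0];
// play() returns a promise; it rejects if the browser's autoplay policy blocks playback
el.play().catch((err) => console.warn('Playback blocked:', err));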
Related
I am implementing audio/video calling with SIP.js and an Asterisk server in React. I was successful in creating the WebRTC audio/video calling, but I am facing an issue with storing the Invitation or Session object from SIP.js, because circular JSON data can't be stringified for storage.
Assume someone has started a call and the other end got a notification of the incoming call; if the page is refreshed or reloaded at that point, I am unable to recover the call session to take any action (answer/decline).
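To illustrate the constraint: JSON.stringify throws on objects that contain circular references, so the Invitation/Session object itself cannot be stored; only plain serializable fields can. A minimal sketch (the stored key and field names are just examples):

const session = { id: 42 };
session.self = session; // circular reference, as in the SIP.js Invitation object
try {
  JSON.stringify(session); // throws TypeError: Converting circular structure to JSON
} catch (err) {
  console.log(err.message);
}
// Store only plain, serializable metadata instead
localStorage.setItem('incomingCall', JSON.stringify({ callerName: 'alice', roomId: 'room-1' }));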
/**
* The following code is inside useState and the dependencies are handled properly.
* To keep it simple and short, I have only copied the required parts. */
const simpleUserDelegate = {
onCallAnswered: (session) => {
console.log(` Call answered`);
if (simpleUser) {
let remoteVideoTrack = simpleUser.getRemoteVideoTrack(session);
if (remoteVideoTrack) {
} else {
setIsAudioCall(true);
}
}
setIsCallAnswered(true);
setIsCallRecieved(false);
localStorage.setItem('isCallRecieved',null);
localStorage.setItem('callerName',null);
localStorage.setItem('callerImage',null);
setIsCallling(false);
},
onCallCreated: (session) => {
setCallSession(session);
console.log(session,` Call created`);
//console.log('session====>',JSON.stringify(session))
// localStorage.setItem('callerUserAgent',JSON.stringify(session._userAgent));
setIsCallling(true);
localStorage.getItem('callerUserAgent')
},
onCallReceived: (invitation) => {
console.log('invitation',invitation);
console.log('invitationSession',invitation.session);
setCallerActiveRoom(invitation._userAgent.options.displayRoomId);
setCallerName(invitation._userAgent.options.displayName);
setCallerImage(invitation._userAgent.options.displayImage);
localStorage.setItem('callerUserAgent',JSON.stringify(invitation.request));
console.log(` Call received`);
// dispatch(setActiveRoomId(invitation._userAgent.options.displayRoomId));
setIsCallRecieved(true);
localStorage.setItem('isCallRecieved',true);
localStorage.setItem('callerName',invitation._userAgent.options.displayName);
localStorage.setItem('callerImage',invitation._userAgent.options.displayImage);
},
onCallHangup: () => {
console.log(` Call hangup`);
setIsCallling(false);
setIsCallRecieved(false);
localStorage.setItem('isCallRecieved',null);
localStorage.setItem('callerName',null);
localStorage.setItem('callerImage',null);
setIsCallAnswered(false);
},
onCallHold: () => {
console.log(` Call hold`);
},
onRegistered: () => {
//console.log('session',session);
console.log(` Call registered`);
},
onUnregistered: () => {
console.log(` Call unregistered`);
},
onServerConnect: () => {
console.log(` server connect`);
},
onServerDisconnect: () => {
console.log(` server disconnect`);
}
};
let simpleUserOptions = {
// traceSip: false,
// logBuiltinEnabled: false,
delegate: simpleUserDelegate,
media: {
constraints: {
audio: true,
video: true
},
local: {
video: document.getElementById('localMedia')
},
remote: {
video: document.getElementById('remoteMedia'),
//audio: remoteAudioRef.current
}
},
userAgentOptions: {
logBuiltinEnabled: true,
logLevel: "debug",
authorizationPassword: password,
authorizationUsername: username,
uri: urI,
noAnswerTimeout : 30,
displayName: name,
displayImage: profileImage,
displayRoomId: `hi${displayRoomId}`
},
};
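// Note: Web is assumed to be imported from the sip.js package, e.g. import { Web } from 'sip.js'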
const simpleUserObj = new Web.SessionManager('wss://pbx.scinner.com:8089/ws', simpleUserOptions);
if(!simpleUserObj.isConnected()){
simpleUserObj
.connect()
.then(() => {
console.log(`${user.username} connected`);
simpleUserObj.register().then(() => {
console.log(`${user.username} registered`);
}).catch((error) => {
alert("Failed to register.\n" + error);
});
})
.catch((error) => {
alert("Failed to connect.\n" + error);
});
setIsSARegistered(true);
setSimpleUser(simpleUserObj);
setCallerUserAgent
}else{
console.log('isconnected');
setIsSARegistered(true);
}
/**
Set calling
*/
const setCalling = (name, target) => {
simpleUser
.call(target, {
sessionDescriptionHandlerOptions: {
constraints: {
audio: true,
video: true
}
},
inviteWithoutSdp: false
}).then(() => {
console.log(`anon placed a call`);
}).catch((error) => {
console.error(`[${simpleUser.id}] failed to place call`);
console.error(error);
alert("Failed to place call.\n" + error);
});
//setIsCallling(true);
// console.log('isCallling', isCallling)
}
}
const answerCall = () => {
//callSession stored in local state
if (callSession) {
simpleUser.answer(callSession).then(() => {
console.log(`call answered`);
}).catch((error) => {
console.error(`call answered failed`);
console.error(error);
// alert("Failed to place call.\n" + error);
});
}
};
I have a React context I am testing that runs a single function to check for an application update. The checkForUpdate function looks like this:
async function checkForUpdate() {
if (isPlatform('capacitor')) {
const maintanenceURL =
'https://example.com/maintenance.json';
const updateURL =
'https://example.com/update.json';
try {
const maintanenceFetch: AxiosResponse<MaintanenceDataInterface> =
await axios.get(maintanenceURL);
console.log('maintain', maintanenceFetch);
if (maintanenceFetch.data.enabled) {
setUpdateMessage(maintanenceFetch.data.msg);
return;
}
const updateFetch: AxiosResponse<UpdateDataInterface> = await axios.get(
updateURL
);
console.log('updateFetch', updateFetch);
if (updateFetch.data.enabled) {
const capApp = await App.getInfo();
const capAppVersion = capApp.version;
console.log('Thi is a thinkg', capAppVersion);
if (isPlatform('android')) {
console.log('hi');
const { currentAndroid, majorMsg, minorMsg } = updateFetch.data;
const idealVersionArr = currentAndroid.split('.');
const actualVersionArr = capAppVersion.split('.');
if (idealVersionArr[0] !== actualVersionArr[0]) {
setUpdateMessage(majorMsg);
setUpdateAvailable(true);
return;
}
if (idealVersionArr[1] !== actualVersionArr[1]) {
setUpdateMessage(minorMsg);
setUpdateAvailable(true);
return;
}
} else {
const { currentIos, majorMsg, minorMsg } = updateFetch.data;
const idealVersionArr = currentIos.split('.');
const actualVersionArr = capAppVersion.split('.');
if (idealVersionArr[0] !== actualVersionArr[0]) {
setUpdateMessage(majorMsg);
setUpdateAvailable(true);
return;
}
if (idealVersionArr[1] !== actualVersionArr[1]) {
setUpdateMessage(minorMsg);
setUpdateAvailable(true);
return;
}
}
}
} catch (err) {
console.log('Error in checkForUpdate', err);
}
}
}
For some reason, in the test I wrote for this, my axiosSpy reports that it has been called only 1 time instead of the expected 2 times, even though the console logs for both GET requests do run. I cannot figure out what I am doing wrong.
Here is the test:
it.only('should render the update page if the fetch call to update bucket is enabled and returns a different major version', async () => {
const isPlatformSpy = jest.spyOn(ionicReact, 'isPlatform');
isPlatformSpy.mockReturnValueOnce(true).mockReturnValueOnce(true);
const appSpy = jest.spyOn(App, 'getInfo');
appSpy.mockResolvedValueOnce({
version: '0.8.0',
name: 'test',
build: '123',
id: 'r132-132',
});
const axiosSpy = jest.spyOn(axios, 'get');
axiosSpy
.mockResolvedValueOnce({
data: {
enabled: false,
msg: {
title: 'App maintenance',
msg: 'We are currently solving an issue where users cannot open the app. This should be solved by end of day 12/31/2022! Thank you for your patience 😁',
btn: 'Ok',
type: 'maintenance',
},
},
})
.mockResolvedValueOnce({
data: {
current: '1.0.0',
currentAndroid: '1.0.0',
currentIos: '2.0.0',
enabled: true,
majorMsg: {
title: 'Important App update',
msg: 'Please update your app to the latest version to continue using it. If you are on iPhone, go to the app store and search MO Gas Tax Back to update your app. The button below does not work but will in the current update!',
btn: 'Download',
type: 'major',
},
minorMsg: {
title: 'App update available',
msg: "There's a new version available, would you like to get it now?",
btn: 'Download',
type: 'minor',
},
},
});
customRender(<UpdateChild />);
expect(axiosSpy).toHaveBeenCalledTimes(2);
});
I need to read localMediaStream in one effect while it is set in another effect. Please tell me why, in this context, it is always null (unless I also set it in the same effect); if I do set it there, I end up with a duplicate getUserMedia capture. The consequence is that the camera does not turn off when I call track.stop(). The code is based on this package.
const peerConnections = useRef({});
const localMediaStream = useRef(null);
const peerMediaElements = useRef({
[LOCAL_VIDEO]: null,
});
useEffect(() => {
async function handleNewPeer({peerID, createOffer}) {
if (peerID in peerConnections.current) {
return console.warn(`Already connected to peer ${peerID}`);
}
peerConnections.current[peerID] = new RTCPeerConnection({
iceServers: freeice(),
});
peerConnections.current[peerID].onicecandidate = event => {
if (event.candidate) {
socket.emit(ACTIONS.RELAY_ICE, {
peerID,
iceCandidate: event.candidate,
});
}
}
let tracksNumber = 0;
peerConnections.current[peerID].ontrack = ({streams: [remoteStream]}) => {
tracksNumber++
if (tracksNumber === 2) { // video & audio tracks received
tracksNumber = 0;
addNewClient(peerID, () => {
if (peerMediaElements.current[peerID]) {
peerMediaElements.current[peerID].srcObject = remoteStream;
} else {
// FIX LONG RENDER IN CASE OF MANY CLIENTS
let settled = false;
const interval = setInterval(() => {
if (peerMediaElements.current[peerID]) {
peerMediaElements.current[peerID].srcObject = remoteStream;
settled = true;
}
if (settled) {
clearInterval(interval);
}
}, 1000);
}
});
}
}
/*localMediaStream.current = await navigator.mediaDevices.getUserMedia({
audio: audio,
video: video
})*/
localMediaStream.current.getTracks().forEach(track => { // localMediaStream null
peerConnections.current[peerID].addTrack(track, localMediaStream.current);
});
if (createOffer) {
const offer = await peerConnections.current[peerID].createOffer();
await peerConnections.current[peerID].setLocalDescription(offer);
socket.emit(ACTIONS.RELAY_SDP, {
peerID,
sessionDescription: offer,
});
}
}
socket.on(ACTIONS.ADD_PEER, handleNewPeer);
return () => {
socket.off(ACTIONS.ADD_PEER);
}
}, []);
// Capture setup, everything as in the original source. It did not work until I added the workaround above, but when it came to stopping the video stream, a bug appeared where the camera stays on.
useEffect(() => {
async function startCapture() {
console.log('start capture');
localMediaStream.current = await navigator.mediaDevices.getUserMedia({
audio: audio,
video: video
}).catch(console.log);
addNewClient(LOCAL_VIDEO, () => {
const localVideoElement = peerMediaElements.current[LOCAL_VIDEO];
if (localVideoElement) {
localVideoElement.volume = 0;
localVideoElement.srcObject = localMediaStream.current;
}
});
}
startCapture().then((data) => socket.emit(ACTIONS.JOIN, {room: roomID})).catch((e) => console.error(e)).finally(() => console.log('finally'));
console.log(roomID);
return () => {
localMediaStream.current.getTracks().forEach(track => track.stop());
socket.emit(ACTIONS.LEAVE);
};
}, [roomID]);
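For reference, the camera staying on is consistent with two separate getUserMedia captures being alive at once: stopping the tracks of one stream does not release the tracks of the other. A minimal sketch of just that symptom, not of a fix (assumes it runs inside an async function):

const streamA = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
const streamB = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });

streamA.getTracks().forEach((track) => track.stop()); // releases only streamA's tracks
console.log(streamB.getVideoTracks()[0].readyState); // "live", so the camera indicator stays on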
Thank you very much.
I have hosted a peer-to-peer meeting React app on Netlify. I have used PeerJS for the video. Everything is working as expected except the video: for some networks the remote person's video works, and for others it does not. I looked it up and found that it may be a STUN/TURN issue, so I added all the STUN/TURN servers to my code. However, the video is still not getting set up in some cases; in others it works fine. Below is the code for the video and the link to the site.
import React,{useEffect,useState} from 'react';
import {io} from "socket.io-client";
import {useParams} from 'react-router-dom';
import {Grid} from "@material-ui/core";
import Peer from 'peerjs';
var connectionOptions = {
"force new connection" : true,
"reconnectionAttempts": "Infinity",
"timeout" : 10000,
"transports" : ["websocket"]
};
const Videobox = ({isVideoMute,isAudioMute}) => {
var myPeer = new Peer(
{
config: {'iceServers': [
{urls:'stun:stun01.sipphone.com'},
{urls:'stun:stun.ekiga.net'},
{urls:'stun:stun.fwdnet.net'},
{urls:'stun:stun.ideasip.com'},
{urls:'stun:stun.iptel.org'},
{urls:'stun:stun.rixtelecom.se'},
{urls:'stun:stun.schlund.de'},
{urls:'stun:stun.l.google.com:19302'},
{urls:'stun:stun1.l.google.com:19302'},
{urls:'stun:stun2.l.google.com:19302'},
{urls:'stun:stun3.l.google.com:19302'},
{urls:'stun:stun4.l.google.com:19302'},
{urls:'stun:stunserver.org'},
{urls:'stun:stun.softjoys.com'},
{urls:'stun:stun.voiparound.com'},
{urls:'stun:stun.voipbuster.com'},
{urls:'stun:stun.voipstunt.com'},
{urls:'stun:stun.voxgratia.org'},
{urls:'stun:stun.xten.com'},
{
urls: 'turn:numb.viagenie.ca',
credential: 'muazkh',
username: 'webrtc@live.com'
},
{
urls: 'turn:192.158.29.39:3478?transport=udp',
credential: 'JZEOEt2V3Qb0y27GRntt2u2PAYA=',
username: '28224511:1379330808'
},
{
urls: 'turn:192.158.29.39:3478?transport=tcp',
credential: 'JZEOEt2V3Qb0y27GRntt2u2PAYA=',
username: '28224511:1379330808'
}
]} /* Sample servers, please use appropriate ones */
}
);
const peers = {}
const [socket, setSocket] = useState()
const {id:videoId} = useParams();
const videoGrid = document.getElementById('video-grid')
useEffect(()=> {
const s=io("https://weconnectbackend.herokuapp.com",connectionOptions);
setSocket(s);
return () => {
s.disconnect();
}
},[])
// let myVideoStream;
const [myVideoStream, setmyVideoStream] = useState()
const muteUnmute = () => {
const enabled = myVideoStream.getAudioTracks()[0].enabled;
if (enabled) {
myVideoStream.getAudioTracks()[0].enabled = false;
//setUnmuteButton();
} else {
//setMuteButton();
myVideoStream.getAudioTracks()[0].enabled = true;
}
}
const playStop = () => {
//console.log('object')
let enabled = myVideoStream.getVideoTracks()[0].enabled;
if (enabled) {
myVideoStream.getVideoTracks()[0].enabled = false;
//setPlayVideo()
} else {
//setStopVideo()
myVideoStream.getVideoTracks()[0].enabled = true;
}
}
useEffect(() => {
if(myVideoStream)
playStop()
}, [isVideoMute])
useEffect(() => {
if(myVideoStream)
muteUnmute()
}, [isAudioMute])
useEffect(() => {
if(socket== null)
return;
myPeer.on('open',id=>{
socket.emit('join-room',videoId,id);
})
const myVideo = document.createElement('video')
myVideo.muted = true
navigator.mediaDevices.getUserMedia({
video: true,
audio: true
}).then(stream => {
// myVideoStream = stream;
window.localStream=stream;
setmyVideoStream(stream);
console.log(myVideoStream,"myvideostream");
addVideoStream(myVideo, stream)
myPeer.on('call', call => {
call.answer(stream)
const video = document.createElement('video')
call.on('stream', userVideoStream => {
addVideoStream(video, userVideoStream)
})
})
socket.on('user-connected',userId =>{
connectToNewUser(userId, stream)
})
socket.on('user-disconnected', userId => {
if (peers[userId]) peers[userId].close()
})
})
}, [socket,videoId])
function addVideoStream(video, stream) {
video.srcObject = stream
video.addEventListener('loadedmetadata', () => {
video.play()
})
videoGrid.append(video)
}
function connectToNewUser(userId, stream) {
const call = myPeer.call(userId, stream)
const video = document.createElement('video')
call.on('stream', userVideoStream => {
addVideoStream(video, userVideoStream)
})
call.on('close', () => {
video.remove()
})
peers[userId] = call
}
return (
<div id="video-grid" className="videoStyleFromDiv">
{/* <Video srcObject={srcObject}/> */}
</div>
)
}
export default Videobox
Website Link
The TURN servers you are using, at least the ones taken from https://www.html5rocks.com/en/tutorials/webrtc/infrastructure/, have been out of commission for a couple of years.
Copying credentials from random places is not how TURN works; you will need to run your own server.
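If you do run your own TURN server (coturn, for example), the PeerJS config then points at it. A sketch with placeholder host and credentials (every value below is hypothetical and must be replaced with your own deployment's details):

const myPeer = new Peer({
  config: {
    iceServers: [
      { urls: 'stun:stun.l.google.com:19302' },
      {
        urls: [
          'turn:turn.example.com:3478?transport=udp',
          'turn:turn.example.com:3478?transport=tcp'
        ],
        username: 'myuser',
        credential: 'mysecret'
      }
    ]
  }
});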
So I am using jssip 3.2.10 to make calls in a React project.
The server is set up on Asterisk and CentOS.
I can make calls where the call receiver hears me well, but I can't hear their audio, nor the traditional ringing tone it should play until the call is picked up.
It does work with some sipml5/asterisk UDP online tests, so I feel it's an issue on my client's side. I tested it on Chrome and Firefox (both latest, with the same results).
My setup
I have a helper to connect called sip.js:
const JsSIP = require('jssip')
const GLOBAL = require('../globals')
function register(user, pass, cb) {
console.log('Registering to SIP')
JsSIP.debug.disable('JsSIP:*')
const address = GLOBAL.jssip_server + ':' + GLOBAL.jssip_port
let socket = new JsSIP.WebSocketInterface('ws://' + address + '/ws')
const configuration = {
sockets: [socket],
uri: 'sip:' + user + '@' + GLOBAL.jssip_server,
authorization_user: user,
password: pass,
connection_recovery_min_interval: 3,
register: true
}
let ua = new JsSIP.UA(configuration)
ua.start()
cb(ua)
}
export {
register
}
Then on my main component I do the following:
componentDidMount() {
if(GLOBAL.jssip) {
this.props.dispatch(connecting(true))
register('***', '***', (ua) => {
this.setState({ua: ua}, () => {
this.state.ua.on("registered", () => {
this.props.dispatch(connecting(false))
this.setState({critical: false})
})
this.state.ua.on("registrationFailed", () => {
this.props.dispatch(connecting(false))
this.setState({critical: true})
})
})
})
}
}
And when I try to make a call I do the following:
doCall(number) {
this.props.dispatch(placeCall(call))
if(GLOBAL.jssip) {
let eventHandlers = {
'connecting': (e) => {
console.log('call is in progress')
this.setState({sipStatus: "connecting"})
},
'progress': (e) => {
console.log('call is in progress')
this.setState({sipStatus: "progress"})
},
'failed': (e) => {
console.log('call failed with cause: ', e)
this.setState({sipStatus: "failed"})
},
'ended': (e) => {
console.log('call ended with cause: ', e)
this.setState({sipStatus: "ended"})
},
'confirmed': (e) => {
this.setState({sipStatus: "confirmed"})
}
}
let options = {
eventHandlers: eventHandlers,
mediaConstraints: { 'audio': true, 'video': false }
}
let session = this.state.ua.call('sip:'+number+'@'+GLOBAL.jssip_server, options)
}
}
Does anyone have a clue how to fix this?
Thanks to the answer here:
How to handle audio stream in JsSIP?
I found the solution. I needed to add the following to the file rendering the call:
<audio ref={(audio) => {this.audioElement = audio}} id="audio-element"></audio>
And changed the last bit of doCall to this:
this.setState({session: this.state.ua.call('sip:'+number+'@'+GLOBAL.jssip_server, options)}, () =>{
this.state.session.connection.addEventListener('addstream', (event: any) => {
this.audioElement.srcObject = event.stream
this.audioElement.play()
})
})
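Note that the 'addstream' event used above is deprecated; a sketch of the same idea using the standards-based 'track' event (untested against this exact JsSIP version) would be:

this.state.session.connection.addEventListener('track', (event) => {
  // event.streams[0] is the remote MediaStream carrying the audio track
  this.audioElement.srcObject = event.streams[0];
  this.audioElement.play();
});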