I'm trying to start a WebRTC call with AWS Kinesis, but the demo in the AWS Kinesis JavaScript docs only shows how to join the call as a VIEWER, not as the MASTER.
I can't find a clear example anywhere online, and I've spent hours on it with my teammate.
I can see and hear myself, so I know the hardware is working correctly, but we can't see or hear each other. I know it's going to be something simple, but I just can't figure out where I'm going wrong with the connection.
const startKinesisCall = async () => {
const coachingSession = new AWS.KinesisVideo({
region,
accessKeyId,
secretAccessKey,
correctClockSkew: true
});
// Get Signaling Channel Endpoints
// Each signaling channel is assigned an HTTPS and WSS endpoint to connect to for
// data-plane operations. These can be discovered using the GetSignalingChannelEndpoint API.
const getSignalingChannelEndpointResponse = await coachingSession.getSignalingChannelEndpoint({
ChannelARN: channelARN,
SingleMasterChannelEndpointConfiguration: {
Protocols: ['WSS', 'HTTPS'],
Role: Role.VIEWER
}
}).promise();
const endpointsByProtocol = getSignalingChannelEndpointResponse?.ResourceEndpointList?.reduce((endpoints, endpoint) => {
endpoints[endpoint.Protocol] = endpoint?.ResourceEndpoint;
return endpoints;
}, {});
// Create KVS Signaling Client
// The HTTPS endpoint from the GetSignalingChannelEndpoint response is used with this client.
// This client is just used for getting ICE servers, not for actual signaling.
const kinesisVideoSignalingChannelsClient = new AWS.KinesisVideoSignalingChannels({
region,
accessKeyId,
secretAccessKey,
endpoint: endpointsByProtocol.HTTPS,
correctClockSkew: true,
});
// Get ICE server configuration
// For best performance, we collect STUN and TURN ICE server configurations.
// The KVS STUN endpoint is always stun:stun.kinesisvideo.${region}.amazonaws.com:443.
// To get TURN servers, the GetIceServerConfig API is used.
const getIceServerConfigResponse = await kinesisVideoSignalingChannelsClient
.getIceServerConfig({
ChannelARN: channelARN,
}).promise();
const iceServers = [{ urls: `stun:stun.kinesisvideo.${region}.amazonaws.com:443` }];
getIceServerConfigResponse.IceServerList.forEach(iceServer =>
iceServers.push({
urls: iceServer.Uris,
username: iceServer.Username,
credential: iceServer.Password,
}),
);
console.log('ICE SERVERS: ', iceServers);
// Create RTCPeerConnection
// The RTCPeerConnection is the primary interface for WebRTC communications in the Web.
const peerConnection = new RTCPeerConnection({ iceServers });
// Create WebRTC Signaling Client
// This is the actual client that is used to send messages over the signaling channel.
const signalingClient = new SignalingClient({
channelARN,
channelEndpoint: endpointsByProtocol.WSS,
role: Role.MASTER,
region,
clientId,
credentials: {
accessKeyId,
secretAccessKey,
},
systemClockOffset: coachingSession.config.systemClockOffset
});
// GET THE USER MEDIA DEVICES
const localStream = await navigator.mediaDevices.getUserMedia({
video: true,
audio: true
}).catch(e => {
console.log("COULD NOT FIND WEBCAM");
setShowErrorStartingVideoModal(true);
});
// *** AUDIO & VIDEO DEVICE COLLECTION ***
let audioInputDevices: MediaDeviceInfo[];
let audioOutputDevices: MediaDeviceInfo[];
let videoInputDevices: MediaDeviceInfo[];
try {
const mediaDevices = await navigator.mediaDevices.enumerateDevices();
audioInputDevices = mediaDevices.filter(device => device.kind === 'audioinput');
audioOutputDevices = mediaDevices.filter(device => device.kind === 'audiooutput');
videoInputDevices = mediaDevices.filter(device => device.kind === 'videoinput');
setMicrophoneList(audioInputDevices);
setSpeakerList(audioOutputDevices);
setCameraList(videoInputDevices);
} catch (e) {
console.log(e);
console.log("ERROR COLLECTING MEDIA DEVICE INFORMATION: MAKE SURE PERMISSIONS ARE ALLOWED AND TRY AGAIN");
};
// GRAB THE LOCAL PROVIDER AND PATIENT VIDEO TILES
const providerVideoTile: HTMLVideoElement = document.getElementById('provider-video-element') as HTMLVideoElement;
const patientVideoElement = document.getElementById('patient-video-element') as HTMLVideoElement;
// let dataChannel: RTCDataChannel
// Add Signaling Client Event Listeners
signalingClient.on('open', async () => {
if (!localStream || !peerConnection) return;
// Get a stream from the webcam, add it to the peer connection, and display it in the local view
try {
localStream.getTracks().forEach(track => peerConnection.addTrack(track, localStream));
providerVideoTile.srcObject = localStream;
} catch (e) {
// Could not find webcam
console.log(e);
return;
};
// Create an SDP offer and send it to the master
const offer = await peerConnection.createOffer({
offerToReceiveAudio: true,
offerToReceiveVideo: true
});
console.log('CREATED OFFER: ', offer);
await peerConnection.setLocalDescription(offer);
if (peerConnection.localDescription) signalingClient.sendSdpOffer(peerConnection.localDescription, patient.patientID);
});
// When the SDP answer is received back from the master, add it to the peer connection.
signalingClient.on('sdpAnswer', async answer => {
console.log('RECEIVED ANSWER: ', answer);
if (!peerConnection) return;
await peerConnection.setRemoteDescription(answer).catch(e => console.log(e));
});
signalingClient.on('sdpOffer', async (offer, senderClientID) => {
console.log({ offer });
if (!peerConnection) return;
await peerConnection.setRemoteDescription(offer).catch(e => console.log(e));
console.log('REMOTE DESCRIPTION SET: ', peerConnection);
const answer = await peerConnection.createAnswer().catch(e => console.log(e));
console.log({ answer });
if (answer) signalingClient.sendSdpAnswer(answer, senderClientID);
// dataChannel = peerConnection.createDataChannel(`data-channel-of-${senderClientID}`);
// dataChannel.addEventListener("open", (event) => {
// console.log(event);
// dataChannel.send('******HI ALEC*******');
// });
});
// When an ICE candidate is received from the master, add it to the peer connection.
signalingClient.on('iceCandidate', async (candidate, senderClientID) => {
if (!peerConnection) return;
console.log('new iceCandidate received:', candidate);
await peerConnection.addIceCandidate(candidate).catch(e => console.log(e));
console.log("ICE CANDIDATE ADDED: ", candidate);
});
signalingClient.on('close', async () => {
if (!localStream) return;
// Handle client closures
console.log("ENDING THE CALL");
localStream.getTracks().forEach(track => track.stop());
peerConnection.close();
if ('srcObject' in providerVideoTile) providerVideoTile.srcObject = null;
});
signalingClient.on('error', error => {
// Handle client errors
console.log(error);
});
signalingClient.on('chat', (dataMessage: any) => {
const decodedMessage = UTF8Decoder.decode(new Uint8Array(dataMessage.data));
console.log("GOT TEST MESSAGE:", decodedMessage);
});
signalingClient.on('SeriesData', (dataMessage: any) => {
const seriesFromMobile = JSON.parse(UTF8Decoder.decode(new Uint8Array(dataMessage.data)));
console.log("SERIES FROM MOBILE:", seriesFromMobile);
kickOffSeriesCreation(seriesFromMobile);
});
signalingClient.on('EffortMarker', (dataMessage: any) => {
const effortMarker = UTF8Decoder.decode(new Uint8Array(dataMessage.data));
console.log("EFFORT MARKER:", effortMarker);
setEffortMarker(effortMarker);
});
signalingClient.on('CoachingMessage', async (dataMessage: any) => {
const coachingMessage = UTF8Decoder.decode(new Uint8Array(dataMessage.data));
console.log("COACHING MESSAGE FROM MOBILE:", coachingMessage);
if (coachingMessage === 'EndSeries') {
await handleForceEndEffort(signalingClient);
await handleEndSeries(signalingClient);
};
});
// Add Peer Connection Event Listeners
// Send any ICE candidates generated by the peer connection to the other peer
peerConnection.addEventListener('icecandidate', ({ candidate }) => {
if (candidate) {
console.log(candidate);
signalingClient.sendIceCandidate(candidate, patient.patientID);
} else {
// No more ICE candidates will be generated
console.log('NO MORE ICE CANDIDATES WILL BE GENERATED');
}
});
// As remote tracks are received, add them to the remote view
peerConnection.addEventListener('track', event => {
// if (patientVideoElement.srcObject) return;
setNoPatientConnected(false);
console.log({ event });
try {
peerConnection.addTrack(event.track, event.streams[0]);
if (event.track.kind === 'video') patientVideoElement.srcObject = event.streams[0];
} catch (e) {
console.log(e);
}
});
// Open Signaling Connection
signalingClient.open();
};
Try this page. You can run the master on one computer and the viewer on another:
https://awslabs.github.io/amazon-kinesis-video-streams-webrtc-sdk-js/examples/index.html
For anyone else with the same issue: I managed to find the master example in this GitHub repo and was able to get it working.
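For reference, here is a minimal sketch of the parts that differ on the MASTER side, based on the amazon-kinesis-video-streams-webrtc-sdk-js examples rather than on that repo, and reusing the names from the code above (coachingSession, endpointsByProtocol, iceServers, localStream, patientVideoElement). The key differences: request the endpoints with Role.MASTER, create the SignalingClient with role: Role.MASTER and no clientId, and instead of sending an offer, wait for an sdpOffer from each viewer and answer it using the remoteClientId the SDK passes you.
// Request the signaling endpoints as MASTER (the code above asks for VIEWER endpoints)
const endpointResponse = await coachingSession.getSignalingChannelEndpoint({
  ChannelARN: channelARN,
  SingleMasterChannelEndpointConfiguration: {
    Protocols: ['WSS', 'HTTPS'],
    Role: Role.MASTER,
  },
}).promise();
// ...build endpointsByProtocol from endpointResponse.ResourceEndpointList as above...
const signalingClient = new SignalingClient({
  channelARN,
  channelEndpoint: endpointsByProtocol.WSS,
  role: Role.MASTER,                          // must match the endpoint request
  region,
  credentials: { accessKeyId, secretAccessKey },
  // note: no clientId here; only viewers pass a clientId
});
// The master can serve several viewers, so keep one RTCPeerConnection per remote client
const peerConnectionByClientId = {};
signalingClient.on('open', () => console.log('Connected to signaling channel as MASTER'));
// The master does NOT create an offer; it waits for offers from viewers
signalingClient.on('sdpOffer', async (offer, remoteClientId) => {
  const peerConnection = new RTCPeerConnection({ iceServers });
  peerConnectionByClientId[remoteClientId] = peerConnection;
  localStream.getTracks().forEach(track => peerConnection.addTrack(track, localStream));
  peerConnection.addEventListener('icecandidate', ({ candidate }) => {
    if (candidate) signalingClient.sendIceCandidate(candidate, remoteClientId);
  });
  peerConnection.addEventListener('track', event => {
    patientVideoElement.srcObject = event.streams[0];
  });
  await peerConnection.setRemoteDescription(offer);
  const answer = await peerConnection.createAnswer();
  await peerConnection.setLocalDescription(answer);
  signalingClient.sendSdpAnswer(peerConnection.localDescription, remoteClientId);
});
signalingClient.on('iceCandidate', (candidate, remoteClientId) => {
  peerConnectionByClientId[remoteClientId]?.addIceCandidate(candidate);
});
signalingClient.open();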
Related
I'm currently using socket.io for real time alerts in my app. I'm developing it using React Native with Expo.
I import this instance of the socket into required components:
socketInstance.js
import io from 'socket.io-client';
import { url } from './url';
const socket = io(url, { secure: true });
export default socket;
And then use it to emit data to the server, for example, when the payment for an order has been completed:
OrderPurchaseScreen.js
const openPaymentSheet = async () => {
const { error } = await presentPaymentSheet();
if (error) {
Alert.alert(`Error code: ${error.code}`, error.message, [
{
text: "Try Again",
onPress: () => openPaymentSheet(),
},
{
text: "Cancel Order",
onPress: () => handleExit(),
style: "cancel",
},
]);
} else {
Alert.alert(
"Payment Successful",
"Your payment has successfully been processed."
);
socket.emit("order-purchase-complete", Store.getState().orderReducer.orderTicket.restaurantId);
setActive(false);
navigation.navigate('OrderCompleteScreen');
}
In the node server
server.js
io.on('connection', (socket) => {
socket.on("logIn", (userId) => {
console.log("new user logged in. - " + userId.toString());
socket.join(userId.toString());
socket.on("order-cancelled", (userId) => {
console.log("order cancelled");
io.to(userId.toString()).emit("order-cancelled", createNotificationObject("Order Cancelled", "The restaurant has cancelled your order. Your money will be refunded."));
});
socket.on("new-order-completed", (userId) => {
console.log("order completed");
io.to(userId.toString()).emit("new-order-completed", createNotificationObject("Order Completed", "Your order has been completed."));
});
});
socket.on("restaurantLogin", (restaurantId) => {
console.log("new restaurant logging in...");
socket.join(restaurantId.toString());
socket.on("new-order-for-approval", (restaurantId) => {
console.log("New Order For Approval!");
io.to(restaurantId.toString()).emit("new-order-for-approval", createNotificationObject("Order Awaiting Approval", "There is a new order awaiting approval. View it in the Restaurant Manager."));
});
socket.on("order-purchase-complete", (restaurantId) => {
console.log("new order purchase completed");
io.to(restaurantId.toString()).emit("order-purchase-complete", createNotificationObject("Order Completed", "A new order has been placed. View it in the Restaurant Manager."));
});
});
});
I have found that in dev mode everything works as expected. However, when I switch to prod mode on iOS (I have not tested Android), it only seems to handle the user logging in. When it comes to emitting data afterwards, for example when an order is completed, nothing gets emitted. Does anyone know how I can debug this to find the problem, or have a potential solution?
Found the answer while browsing the socket.io documentation:
https://socket.io/blog/socket-io-on-ios/
"A Note About Multitasking in iOS
As you probably know, iOS is very picky about what you can do in the background. As such, don't expect that your socket connection will survive in the background! You'll probably stop receiving events within seconds of the app going into the background. So it's better to create a task that will gracefully close the connection when it enters the background (via AppDelegate), and then reconnect the socket when the app comes back into the foreground."
So all I did was use AppState to track the state of the app, and depending on whether it was in the foreground or the background I would reconnect or disconnect the socket:
App.js
useEffect(() => {
  const subscription = AppState.addEventListener(
    "change",
    (nextAppState) => {
      // Reconnect the socket and log back in when the app returns to the foreground
      if (
        appState.current.match(/inactive|background/) &&
        nextAppState === "active"
      ) {
        if (_userToken !== null && email !== null && password !== null) {
          socket.connect();
          socket.emit("logIn", Store.getState().userReducer._id);
        }
      }
      appState.current = nextAppState;
      setAppStateVisible(appState.current);
      // Gracefully close the connection when the app goes into the background
      if (appState.current === "background") {
        socket.disconnect();
      }
      //console.log("AppState", appState.current);
    }
  );
  return () => subscription.remove();
}, []);
My React code creates a WebSocket connection to our company's ActiveMQ 5.15.5 server, and then subscribes to the following two topics: salary and decoding. The problem is that the code is only able to subscribe to one of the topics. It cannot subscribe to both.
const client = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj6.stomp');
const headers = { id: 'username' };
client.debug = null;
client.connect('user', 'pass', () => {
client.subscribe(
'/topic/salary', //BREAKPOINT was set here
message => {
const body = JSON.parse(message.body);
if (body && body.pcId) {
salaries[body.pcId] = body;
setState({ salaries});
}
},
headers,
);
client.subscribe(
'/topic/decoding', //BREAKPOINT was set here
message => {
const newBody = JSON.parse(message.body);
if (newBody && newBody.PcID) {
consoleMessage[newBody.PcID] = newBody;
setState({ consoleMessage });
}
},
headers,
);
});
So in the code above I put breakpoints at client.subscribe('/topic/decoding'... and client.subscribe('/topic/salary'.... I saw that it only subscribes to /topic/decoding, but not to /topic/salary.
Does anyone know how I can fix this issue so that it subscribes to both topics?
From the STOMP documentation:
Since a single connection can have multiple open subscriptions with a server, an id header MUST be included in the frame to uniquely identify the subscription. The id header allows the client and server to relate subsequent MESSAGE or UNSUBSCRIBE frames to the original subscription.
Within the same connection, different subscriptions MUST use different subscription identifiers.
Stomp API definition:
subscribe(destination, callback, headers = {})
So, to my understanding, you can't use the same id header ('username' in your code) for both of your subscriptions.
Try creating 2 clients, e.g.:
const salaryClient = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj6.stomp');
const salaryHeaders = { id: 'salary' };
salaryClient.debug = null;
salaryClient.connect('user', 'pass', () => {
salaryClient.subscribe(
'/topic/salary',
message => {
const body = JSON.parse(message.body);
if (body && body.pcId) {
salaries[body.pcId] = body;
setState({ salaries});
}
},
salaryHeaders,
);
});
const decodingClient = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj7.stomp');
const decodingHeaders = { id: 'decoding' };
decodingClient.debug = null;
decodingClient.connect('user', 'pass', () => {
decodingClient.subscribe(
'/topic/decoding',
message => {
const newBody = JSON.parse(message.body);
if (newBody && newBody.PcID) {
consoleMessage[newBody.PcID] = newBody;
setState({ consoleMessage });
}
},
decodingHeaders,
);
});
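Alternatively, you should be able to keep a single connection and just give each subscription its own id header, since the spec only requires the identifiers to differ within one connection. An untested sketch, using the same Stomp.client API as above:
const client = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj6.stomp');
client.debug = null;
client.connect('user', 'pass', () => {
  client.subscribe(
    '/topic/salary',
    message => {
      const body = JSON.parse(message.body);
      if (body && body.pcId) {
        salaries[body.pcId] = body;
        setState({ salaries });
      }
    },
    { id: 'salary-sub' },     // unique subscription id
  );
  client.subscribe(
    '/topic/decoding',
    message => {
      const newBody = JSON.parse(message.body);
      if (newBody && newBody.PcID) {
        consoleMessage[newBody.PcID] = newBody;
        setState({ consoleMessage });
      }
    },
    { id: 'decoding-sub' },   // different id for the second subscription
  );
});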
I am working on WebRTC.
I have declared a variable localStream like
var localStream;
outside the component. localStream contains the MediaStream at first, but when my callback function is invoked after the call to the Signaling Server, the variable localStream is empty. I want to store the MediaStream in state so that it won't be empty after calling the Signaling Server.
On page load -> localStream contains the stream.
From the client I place a call, and it calls the Signaling Server.
The Signaling Server calls back the client, and here localStream is empty.
I tried useState() and useState(undefined); neither works. It is still null.
import ....
const hubUrl = 'https://f1d3e599fe6c.ngrok.io/ConnectionHub';
const conn =
new signalR.HubConnectionBuilder()
.withUrl(hubUrl, signalR.HttpTransportType.WebSockets)
.configureLogging(signalR.LogLevel.Debug)
.withAutomaticReconnect()
.build();
var localStream;
var peerConnectionConfig = { iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] };
const webrtcConstraints = { audio: true, video: true };
const Chat = (props) => {
React.useEffect(() => {
...
}
receivedSdpSignal = (connection, partnerClientId, sdp) => {
console.log('WebRTC: processing sdp signal');
console.log('sdp', sdp);
connection.setRemoteDescription(new RTCSessionDescription(sdp)).then(() => {
console.log('WebRTC: set Remote Description');
if (connection.remoteDescription.type == "offer") {
console.log('WebRTC: remote Description type offer');
console.log('localStream', localStream);
connection.addStream(localStream);
console.log('WebRTC: added stream');
connection.createAnswer().then((desc) => {
console.log('WebRTC: create Answer...');
console.log('WebRTC: Description...');
console.log(desc);
connection.setLocalDescription(desc).then(() => {
console.log('WebRTC: set Local Description...');
console.log(connection.localDescription);
this.sendHubSignal(JSON.stringify({ "sdp": connection.localDescription }), partnerClientId);
}).catch(err => console.log("WebRTC: Error while setting local description", err));
}, err => console.log("WebRTC: Error while creating the answer", err));
} else if (connection.remoteDescription.type == "answer") {
console.log('WebRTC: remote Description type answer');
}
}).catch(err => console.log("WebRTC: Error while setting remote description", err));
}
const initializeUserMedia = () => {
console.log('WebRTC: called initializeUserMedia: ');
mediaDevices.getUserMedia(webrtcConstraints).then((stream) => {
console.log("WebRTC: got media stream");
localStream = stream;
let audioTracks = localStream.getAudioTracks();
if (audioTracks.length > 0) {
console.log(`Using Audio device: ${audioTracks[0].label}`);
}
}).catch(err => console.log("Error getting user media stream.", err));
}
These functions are outside useEffect(), and localStream is null here.
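For what it's worth, a common pattern for keeping a MediaStream available to later callbacks in a functional component is a ref rather than state or a module-level variable. A minimal, hypothetical sketch; whether it resolves the signaling flow above is untested:
import React, { useEffect, useRef } from 'react';

const Chat = (props) => {
  // useRef gives a mutable container that survives re-renders and
  // does not trigger a re-render when its .current value changes
  const localStreamRef = useRef(null);

  useEffect(() => {
    navigator.mediaDevices.getUserMedia({ audio: true, video: true })
      .then((stream) => {
        localStreamRef.current = stream;
      })
      .catch((err) => console.log('Error getting user media stream.', err));
  }, []);

  const receivedSdpSignal = (connection, partnerClientId, sdp) => {
    // Read the latest stream at call time instead of capturing a stale variable
    const stream = localStreamRef.current;
    if (stream) connection.addStream(stream);
    // ... rest of the offer/answer handling as in the question
  };

  // ...
};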
I am creating a light for my desk; the light will be green when my microphone is active on Discord, and red when my microphone is muted. So I want to get my microphone status from Discord, but I don't know how.
How can I get my microphone status from Discord even when I am not in a voice chat?
I know how to get the microphone status while I am in a voice chat, but I want this information the whole time.
Thanks.
You can use the Client#voiceStateUpdate event to detect when your mute state changes.
client.on('voiceStateUpdate', (oldState, newState) => {
if (oldState.member.user.id !== "youridhere") return // make sure it's only checking your microphone
if (newState.mute) { // check if mic is muted
setLightColor("red") //if true make light red
} else {
setLightColor("green") //if false make light green
}
})
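One thing to check if voiceStateUpdate never fires (a hedged note, since it depends on your discord.js version): newer versions require the voice-state gateway intent when creating the client. A v14-style sketch:
const { Client, GatewayIntentBits } = require("discord.js");

// GuildVoiceStates is required for voiceStateUpdate events to be delivered
const client = new Client({
  intents: [GatewayIntentBits.Guilds, GatewayIntentBits.GuildVoiceStates],
});

client.login("your-bot-token"); // hypothetical placeholder token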
I finally found a way to do this!
I made a custom Node.js app using rpcord.
It requires a registered Discord app (no need to validate it). You will have to manually link it to Discord using the URL Generator with the rpc and rpc.voice.read scopes, then copy/paste the Client ID, Client Secret, and Discord Token into the constants.
The app will run a command every time the Discord voice settings change. In my case, it changes the color of a key on my Logitech keyboard, but you can adapt it to your own needs.
const { RPClient, RPCEvent, Presence } = require("rpcord");
const { exec } = require("child_process");
const CLIENT_ID = 'YOUR APP CLIENT ID';
const CLIENT_SECRET = 'YOUR APP CLIENT SECRET';
const DEFAULT_COLOR = '004f8b';
const DISCORD_TOKEN = 'YOUR DISCORD TOKEN';
// Reset the keyboard to the default color on startup
exec(`g815-led -a ${DEFAULT_COLOR}`);
const rpc = new RPClient(CLIENT_ID, {
secret: CLIENT_SECRET,
scopes: ["rpc", "rpc.voice.read"],
});
function run(cmd) {
return new Promise((resolve, reject) => {
exec(cmd, (error, stdout, stderr) => {
if (error) return reject(error)
if (stderr) return reject(stderr)
resolve(stdout)
})
})
}
rpc.on("ready", () => {
console.log("Connected!");
});
rpc.connect().then(() => {
// Authenticate with the Discord token so the app can read voice settings,
// log the current settings once, then subscribe to future changes
rpc.authenticate(DISCORD_TOKEN).then(() => {
rpc.getVoiceSettings().then(voiceSettings => {
console.log(voiceSettings);
});
rpc.subscribe(RPCEvent.VoiceSettingsUpdate).then((...args) => {
console.log('subscribed !');
rpc.on('voiceSettingsUpdate', async (voiceSettings) => {
console.log('voice update', voiceSettings);
if (voiceSettings.mute) {
await run('g815-led -k g1 ff0000');
}
else {
await run(`g815-led -k g1 ${DEFAULT_COLOR}`);
}
if (voiceSettings.deaf) {
await run('g815-led -k g2 ff0000');
}
else {
await run(`g815-led -k g2 ${DEFAULT_COLOR}`);
}
});
});
});
});
Running into an issue with React/Socket.io. I have two different socket emitters/listeners: one for a chat, and one for keeping track of live changes to the application. I have two separate windows running localhost. The issue is that when I emit a change in one window, the other window receives it the first time but never again (i.e. it gets the first chat message but none that follow). After that first emit/receive, the sending client starts to receive its own emits.
front end code:
socket = io("localhost:3002");
componentDidMount() {
//get id from url
const { id } = this.props.match.params;
//join specific room for project
this.socket.on("connect", () => {
this.socket.emit("room", this.projectId);
});
//listener for incoming messages
this.socket.on("RECEIVE_MESSAGE", (data) => {
this.props.addChat(this.projectId, data);
});
this.socket.on("UPDATE_PROJECT", () => {
console.log("update");
this.props.fetchProject(id);
});
}
emitTaskChange = () => {
this.socket.emit("TASK_CHANGE", { data: null });
};
onChatSubmit = (e) => {
e.preventDefault();
//create object with current user as author, message, and a timestamp
const chat = {
author: this.props.currentUser.name,
message: this.state.newChat,
createdAt: new Date().toLocaleString(),
};
//send message through socket
this.socket.emit("SEND_MESSAGE", chat);
//call action creator to add new chat
this.props.addChat(this.projectId, chat);
this.setState({ currentMessage: "" });
};
handleTaskEdit = (taskId, currentStatus) => {
const newStatus = currentStatus === "todo" ? "inprogress" : "completed";
this.props.editTask(this.projectId, taskId, newStatus);
this.emitTaskChange();
};
backend code:
const io = socket(server);
//create separate chat rooms using project id
io.on("connection", (socket) => {
socket.on("room", (room) => {
socket.join(room);
socket.in(room).on("SEND_MESSAGE", (message) => {
socket.emit("RECEIVE_MESSAGE", message);
});
socket.in(room).on("TASK_CHANGE", (data) => {
socket.emit("UPDATE_PROJECT", data);
});
});
});
Found the error: I had to change the server-side code to stop using socket.on and instead use the io object that was initialized, e.g. io.sockets.on.
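For reference, a sketch of one way the server side could be restructured (not necessarily the exact fix described above, and assuming the same event names): register the handlers once per connection instead of inside the "room" handler, remember the joined room, and broadcast to the other clients in the room with socket.to(room).emit rather than echoing back to the sender with socket.emit:
const io = socket(server);

io.on("connection", (socket) => {
  let currentRoom = null;

  socket.on("room", (room) => {
    currentRoom = room;
    socket.join(room);
  });

  // Broadcast chat messages to everyone else in the room (not back to the sender)
  socket.on("SEND_MESSAGE", (message) => {
    if (currentRoom) socket.to(currentRoom).emit("RECEIVE_MESSAGE", message);
  });

  // Tell the other clients in the room to refetch the project
  socket.on("TASK_CHANGE", (data) => {
    if (currentRoom) socket.to(currentRoom).emit("UPDATE_PROJECT", data);
  });
});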