Mute audio and video
In the previous lesson we learned how to connect to a conference, even when it is protected by a PIN. Now we will add some new features, starting with letting users protect their own privacy: we will add two new buttons to the interface so that the user can mute their microphone and camera.
During this lesson we will complete the following tasks:
- Hide the local video when there is no active local stream.
- Define the function that will be triggered when the user presses the mute microphone button.
- Define the function that will be triggered when the user presses the mute camera button.
- Modify the toolbar to include both buttons.
You can download the starter code from Step.03-Exercise-Mute-audio-and-video.
App component
We will start by defining, in src/App.tsx, the functions that will be triggered when the user presses either of the mute buttons.
We define these functions here instead of in the Toolbar, since we need access to the infinityClient, localAudioStream and localVideoStream variables.
We define the function that will be triggered when the user clicks the audio mute button:
const handleAudioMute = async (mute: boolean): Promise<void> => {
  if (mute) {
    localAudioStream?.getTracks().forEach(track => {
      track.stop();
    });
    setLocalAudioStream(undefined);
  } else {
    const stream = await navigator.mediaDevices.getUserMedia({
      audio: true,
    });
    setLocalAudioStream(stream);
  }
  await infinityClient.mute({mute});
};
And now we do the same for the video mute:
const handleVideoMute = async (mute: boolean): Promise<void> => {
  if (mute) {
    localVideoStream?.getTracks().forEach(track => {
      track.stop();
    });
    setLocalVideoStream(undefined);
  } else {
    const localVideoStream = await navigator.mediaDevices.getUserMedia({
      video: true,
    });
    setLocalVideoStream(localVideoStream);
    infinityClient.setStream(
      new MediaStream([
        ...(localAudioStream?.getTracks() ?? []),
        ...localVideoStream.getTracks(),
      ]),
    );
  }
  await infinityClient.muteVideo({muteVideo: mute});
};
As you can see, to mute and unmute in Pexip Infinity we only have to call infinityClient.mute() and infinityClient.muteVideo(). However, in our handlers we perform two additional tasks. The first is to stop all the tracks of the localAudioStream or localVideoStream, depending on which device the user mutes. This releases access to the microphone or camera and, in the case of the camera, turns its LED off. The second is the opposite: when the user unmutes, we request access again through getUserMedia(), indicating the device (audio or video) that we want to access.
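To see this release/re-acquire pattern in isolation, here is a minimal sketch of the logic described above. It is not part of the lesson code: the toggleDeviceStream helper and its signature are assumptions made purely for illustration.
// Illustrative sketch only: the generic release/re-acquire pattern used above.
// "toggleDeviceStream" is a hypothetical helper, not part of the lesson code.
const toggleDeviceStream = async (
  currentStream: MediaStream | undefined,
  constraints: MediaStreamConstraints,
  mute: boolean,
): Promise<MediaStream | undefined> => {
  if (mute) {
    // Stopping every track releases the device (and turns the camera LED off)
    currentStream?.getTracks().forEach(track => track.stop());
    return undefined;
  }
  // On unmute, request access to the device again
  return navigator.mediaDevices.getUserMedia(constraints);
};
// Example usage (assumed variable names):
// localVideoStream = await toggleDeviceStream(localVideoStream, {video: true}, muted);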
The next step is to modify the Conference component and assign the new callbacks to onAudioMute and onVideoMute.
<Conference
  localVideoStream={localVideoStream}
  remoteStream={remoteStream}
  onAudioMute={handleAudioMute}
  onVideoMute={handleVideoMute}
  onDisconnect={handleDisconnect}
/>
Conference component
First of all, we will make small changes in the src/components/Conference/Conference.tsx file.
We will start by defining the properties onAudioMute and onVideoMute in the interface:
interface ConferenceProps {
  localVideoStream: MediaStream | undefined;
  remoteStream: MediaStream | undefined;
  onAudioMute: (mute: boolean) => Promise<void>;
  onVideoMute: (mute: boolean) => Promise<void>;
  onDisconnect: () => Promise<void>;
}
One important improvement is that we would like to hide the local video element when the user mutes their camera. This is a small change, but important for the user experience.
To implement this feature we only have to modify one line of code: we add a condition so that the local video is rendered only when props.localVideoStream exists:
{props.localVideoStream != null && (
  <Video
    className="local-video"
    srcObject={props.localVideoStream}
    isMirrored={true}
  />
)}
The next change is to pass the props to the Toolbar:
<Toolbar
  className="toolbar"
  onAudioMute={props.onAudioMute}
  onVideoMute={props.onVideoMute}
  onDisconnect={props.onDisconnect}
/>
Toolbar component
Now we will implement the main part of this feature by modifying the src/components/Conference/Toolbar/Toolbar.tsx file and adding the buttons that provide this functionality.
The first step is to add a new import for the useState hook:
import {useState} from 'react';
The next step is to define onAudioMute and onVideoMute in the list of props for the Toolbar:
interface ToolbarProps {
  className: string;
  onAudioMute: (mute: boolean) => Promise<void>;
  onVideoMute: (mute: boolean) => Promise<void>;
  onDisconnect: () => Promise<void>;
}
Now we define the state variables that store the current mute status; we will use them to update the interface accordingly:
const [audioMuted, setAudioMuted] = useState(false);
const [videoMuted, setVideoMuted] = useState(false);
const [processingAudioMute, setProcessingAudioMute] = useState(false);
const [processingVideoMute, setProcessingVideoMute] = useState(false);
These states will be used to change our UI:
- audioMuted and videoMuted store the current mute state and determine the button icons and colors.
- processingAudioMute and processingVideoMute indicate that the mute state is in the process of changing, which we use to avoid reaching an inconsistent state.
Now we will add the function that will be triggered each time the user presses the mute audio button. This function will call the onAudioMute callback received via props from the Conference component and update the state:
const handleAudioMute = async (): Promise<void> => {
  setProcessingAudioMute(true);
  await props.onAudioMute(!audioMuted);
  setAudioMuted(!audioMuted);
  setProcessingAudioMute(false);
};
Now we do the same for the mute video button:
const handleVideoMute = async (): Promise<void> => {
  setProcessingVideoMute(true);
  await props.onVideoMute(!videoMuted);
  setVideoMuted(!videoMuted);
  setProcessingVideoMute(false);
};
The final step is to create the buttons themselves:
<Tooltip text={`${audioMuted ? 'Unmute' : 'Mute'} audio`}>
  <Button
    onClick={() => {
      handleAudioMute().catch(console.error);
    }}
    variant="translucent"
    modifier="square"
    isActive={!audioMuted}
    colorScheme="light"
    disabled={processingAudioMute}
  >
    <Icon
      source={
        audioMuted
          ? IconTypes.IconMicrophoneOff
          : IconTypes.IconMicrophoneOn
      }
    />
  </Button>
</Tooltip>
<Tooltip text={`${videoMuted ? 'Unmute' : 'Mute'} video`}>
  <Button
    onClick={() => {
      handleVideoMute().catch(console.error);
    }}
    variant="translucent"
    modifier="square"
    isActive={!videoMuted}
    colorScheme="light"
    disabled={processingVideoMute}
  >
    <Icon
      source={videoMuted ? IconTypes.IconVideoOff : IconTypes.IconVideoOn}
    />
  </Button>
</Tooltip>
Notice that the icon and the tooltip text change depending on the current mute state.
Run the app
When you run the app, you will see the two new buttons in the interface. If you click them, the audio or video is muted and the button style changes. In the case of the mute video button, the local video is also hidden while the mute is active.
You can compare your code with the solution in Step.03-Solution-Mute-audio-and-video. You can also check the differences with the previous lesson in the git diff.