WebRTC Sample in Kotlin

DeveloperSpace
Published in ProAndroidDev · 6 min read · Apr 26, 2021

What is WebRTC?

WebRTC is a technology that allows video, voice, and generic data to be sent between peers, enabling developers to build powerful voice and video communication solutions.

What is a Signaling Server?

To establish a WebRTC connection between two devices, we require a signaling server, which helps connect them over the internet. A signaling server's job is to serve as an intermediary that lets two peers find and establish a connection without sharing much information.

This article describes the design & implementation of a sample WebRTC calling app in Kotlin. I have used the native WebRTC library v1.0.32006 in the sample app. Along with it, I am using Cloud Firestore to act as the signaling server, so the data between the two peer connections is exchanged through Cloud Firestore.

The Cloud Firestore data model supports flexible, hierarchical data structures. Store your data in documents and collections.

Firestore Structure Example

Sample structure of Firestore storing the SDP

We have named the main collection “calls”, and it contains meeting IDs. Each meeting ID is a document that includes fields such as “type” & “sdp”. The data under these fields relates to a specific call and is also called the Session Description.

  • type: The message type, which can be either OFFER or ANSWER.
  • sdp: The SDP (Session Description Protocol) is a string that describes the connection at the local end from the local user’s perspective. Similarly, at the remote end the same kind of string describes the connection from the remote user’s point of view.
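As a rough sketch (the names here are illustrative, not the repo’s actual code), a document in the “calls” collection can be modeled as a small data class that converts to and from the map Firestore stores:

```kotlin
// Illustrative model of a document in the "calls" collection.
// "type" is OFFER or ANSWER; "sdp" is the raw session description string.
data class CallDocument(val type: String, val sdp: String) {
    // Firestore stores each document as a map of field name to value.
    fun toMap(): Map<String, Any> = mapOf("type" to type, "sdp" to sdp)

    companion object {
        fun fromMap(data: Map<String, Any>) = CallDocument(
            type = data["type"] as String,
            sdp = data["sdp"] as String
        )
    }
}
```

In the real app this map would be passed to a Firestore `set()` call on the meeting-ID document.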

Below is an example of the Firestore collection after an OFFER is created.

Structure of Firestore after OFFER

Each meeting-ID document also has a candidates collection that includes the “offerCandidate” & “answerCandidate” documents. Each of these documents holds the data used to connect the candidates with each other, hence it is called an ICE (Interactive Connectivity Establishment) candidate.
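A candidate document can be sketched as a plain map whose field names mirror the properties of WebRTC’s `IceCandidate` (the helper function below is an assumption for illustration, not code from the repo):

```kotlin
// Illustrative shape of an ICE candidate document under the candidates
// collection; field names mirror org.webrtc.IceCandidate's properties.
fun candidateToMap(
    sdpMid: String,       // media stream identification tag from the SDP
    sdpMLineIndex: Int,   // index of the m-line this candidate belongs to
    sdp: String,          // the candidate line itself
    type: String          // "offerCandidate" or "answerCandidate"
): Map<String, Any> = mapOf(
    "sdpMid" to sdpMid,
    "sdpMLineIndex" to sdpMLineIndex,
    "sdp" to sdp,
    "type" to type
)
```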

Below is an example of ICE candidates in the Firestore collection.

Structure of ICE Candidates in Firestore

Key Components & Libraries Used

WebRTC Native Library — the native WebRTC library that adds WebRTC support to your app.

Firebase — acts as the signaling server, helping you handle the events needed to maintain communication between peer connections.

Prerequisites

Before diving into more details, please ensure that you have checked the points below.
— Android Studio installed on your system.
— An Android device or emulator to run your app.

  • We will begin by fetching the code from GitHub.
  • You can clone the project from the WebRTC-Kotlin-Sample repository.
// Clone this repository
git clone https://github.com/developerspace-samples/WebRTC-Kotlin-Sample.git
  • The next step is to set up a Firebase account and create a new project. Once the project is created, add a new Android app in the Firebase project and add the google-services.json file to your “app” folder.

Click here to learn how to set up Firebase in an Android app.

RTCActivity.kt

This is the core activity that handles the call operations. Whenever the user starts a call, it checks for the camera & audio permissions. If the permissions are granted, “rtcClient.call(sdpObserver, meetingID)” gets triggered.
This activity is also used to perform other operations like “Mute or Unmute Audio”, “Pause or Resume Video”, “Switch Camera”, etc.

  • rtcClient.switchCamera() — switches the camera from front to back, or vice versa, during the call.
  • rtcClient.enableVideo(isVideoEnabled: Boolean) — pauses or resumes the video during the call.
  • rtcClient.enableAudio(isMute: Boolean) — mutes or unmutes the audio during the call.
  • audioManager.setDefaultAudioDevice(RTCAudioManager.AudioDevice.EARPIECE) — switches the audio output to the earpiece during the call.
  • audioManager.setDefaultAudioDevice(RTCAudioManager.AudioDevice.SPEAKER_PHONE) — switches the audio output to the speaker during the call.
WebRTC Calling Control Information
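How an activity might track the mute/video state behind these toggles can be sketched in plain Kotlin (the class below is an assumption for illustration; the real `rtcClient` calls are shown only as comments):

```kotlin
// Illustrative mute/video toggle state, as a UI like RTCActivity might track it.
// The actual rtcClient.enableAudio(...) / rtcClient.enableVideo(...) calls
// would be made inside these functions in the real activity.
class CallControls {
    var isAudioMuted = false
        private set
    var isVideoPaused = false
        private set

    // Returns the new muted state after the toggle.
    fun toggleAudio(): Boolean {
        isAudioMuted = !isAudioMuted
        // rtcClient.enableAudio(isAudioMuted)   // real call, omitted here
        return isAudioMuted
    }

    // Returns the new paused state after the toggle.
    fun toggleVideo(): Boolean {
        isVideoPaused = !isVideoPaused
        // rtcClient.enableVideo(!isVideoPaused) // real call, omitted here
        return isVideoPaused
    }
}
```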

It also includes a signaling listener that executes methods such as onOfferReceived(), onAnswerReceived(), and onIceCandidateReceived().

  • onOfferReceived() — triggered whenever SignallingClient.kt receives “OFFER” as the “type” in a call object from Firestore.
  • onAnswerReceived() — triggered whenever SignallingClient.kt receives “ANSWER” as the “type” in a call object from Firestore.
  • onCallEnded() — triggered whenever SignallingClient.kt receives “END_CALL” as the “type” in a call object from Firestore.
  • onIceCandidateReceived() — triggered in SignallingClient.kt whenever an ICE candidate is added to Firestore.
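The dispatch behind these callbacks can be sketched in plain Kotlin, with Firestore replaced by a map and the listener reduced to a minimal interface (both are assumptions for the sketch, not the repo’s actual signatures):

```kotlin
// Illustrative signaling dispatch: route an incoming "calls" document to the
// matching callback based on its "type" field.
interface SignalingListener {
    fun onOfferReceived(sdp: String)
    fun onAnswerReceived(sdp: String)
    fun onCallEnded()
}

fun dispatch(data: Map<String, Any>, listener: SignalingListener) {
    val sdp = data["sdp"] as? String ?: ""
    when (data["type"]) {
        "OFFER" -> listener.onOfferReceived(sdp)
        "ANSWER" -> listener.onAnswerReceived(sdp)
        "END_CALL" -> listener.onCallEnded()
    }
}
```

In the real app, `dispatch` would be driven by a Firestore snapshot listener firing on every change to the meeting-ID document.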

Please click the link below to check the full source code of RTCActivity.kt.

https://github.com/developerspace-samples/WebRTC-Kotlin-Sample/blob/master/app/src/main/java/com/developerspace/webrtcsample/RTCActivity.kt

SignallingClient.kt

SignallingClient.kt is a coroutine-based class that keeps listeners running constantly in the background. It includes a snapshot listener for the “calls” object that checks whether the “type” is OFFER, ANSWER, or END_CALL and, based on that, calls the corresponding methods of the listener.

Similarly, it also listens for the candidates added to the call. Currently we only support 1–1 calls, so two candidates are expected under the candidates collection: “offerCandidate” & “answerCandidate”.

This class also includes sendCandidate(), which is used to update a candidate in Firestore. It checks whether the candidate is an “answerCandidate” or an “offerCandidate” and, based on that, updates the corresponding Firestore collection.
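The branching inside sendCandidate() boils down to picking the target document path, which can be sketched as follows (the helper name and path format are assumptions for illustration):

```kotlin
// Illustrative version of the sendCandidate() branching: pick the target
// Firestore document path based on whether this peer joined (answered) the
// call or initiated (offered) it.
fun candidatePath(meetingID: String, isJoin: Boolean): String {
    val doc = if (isJoin) "answerCandidate" else "offerCandidate"
    return "calls/$meetingID/candidates/$doc"
}
```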

RTCClient.kt

This is the core class of the app; it is used to manage the connection between the peers and maintain the session.

It initializes a PeerConnection and helps to maintain local audio and video streams on the server. It includes methods like PeerConnection.call(), PeerConnection.answer(), etc.

  • fun PeerConnection.call(sdpObserver: SdpObserver, meetingID: String):
    This method initiates a call using the createOffer() method.
    Once the peer connection is built and createOffer() succeeds, we push the SDP to Firestore with type OFFER.
  • fun PeerConnection.answer(sdpObserver: SdpObserver, meetingID: String): This method responds to an offer using the createAnswer() method. It gets triggered whenever the user receives an offer, and on success of createAnswer() we update the SDP on Firestore with type ANSWER. Once the answer is received successfully at the other end, the call starts.
  • fun onRemoteSessionReceived(sessionDescription: SessionDescription):
    This method adds the remote session to the PeerConnection to establish the connection. Therefore it is called in both the call() & answer() flows.
  • fun addIceCandidate(iceCandidate: IceCandidate?): This method adds a candidate to the connection. It gets triggered for both the “offerCandidate” & the “answerCandidate”, so the two peers can communicate with each other.
fun addIceCandidate(iceCandidate: IceCandidate?) {
    peerConnection?.addIceCandidate(iceCandidate)
}
  • fun endCall(meetingID: String): This method ends the connection between both users. It fetches the candidate information from Firestore and removes it from the connection; once the candidates are removed successfully, it closes the connection.
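The overall call/answer/endCall flow through the signaling store can be sketched end to end in plain Kotlin, with Firestore replaced by in-memory maps keyed by meeting ID (the class and its names are assumptions for the sketch, not the repo’s actual code):

```kotlin
// Illustrative end-to-end flow of RTCClient's call/answer/endCall, with
// Firestore replaced by in-memory maps keyed by meeting ID.
class FakeSignaling {
    val calls = mutableMapOf<String, MutableMap<String, String>>()
    val candidates = mutableMapOf<String, MutableList<String>>()

    // call(): createOffer() succeeded, so push the SDP with type OFFER.
    fun call(meetingID: String, offerSdp: String) {
        calls[meetingID] = mutableMapOf("type" to "OFFER", "sdp" to offerSdp)
    }

    // answer(): an offer was received; push the SDP with type ANSWER.
    fun answer(meetingID: String, answerSdp: String) {
        calls[meetingID] = mutableMapOf("type" to "ANSWER", "sdp" to answerSdp)
    }

    // endCall(): remove the stored candidates, then mark the call ended.
    fun endCall(meetingID: String) {
        candidates.remove(meetingID)
        calls[meetingID] = mutableMapOf("type" to "END_CALL")
    }
}
```

Each write here stands in for a Firestore `set()`/`delete()` that the other peer’s snapshot listener would observe.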
