Implementing WebRTC in iOS Apps

Posted By : Sumit Chahar | 27-Dec-2018

Integrating WebRTC in iOS

WebRTC is an open-source project that brings real-time communication to web browsers and mobile applications. It is an exciting and powerful technology. It provides free APIs that are usable from a desktop or a mobile browser, and it is supported by all major modern browser vendors. Earlier, external plugins were needed to achieve such results, but the same functionality is now offered natively by WebRTC.


WebRTC uses various standards and protocols, which we will be discussing in this article. Some of them include data streams, signalling, ICE, SIP, STUN/TURN servers, JSEP, UDP/TCP, network sockets, SDP, NAT, and more.


Oodles Technologies is a renowned WebRTC Application Development company in India. We offer holistic WebRTC integration solutions for a wide range of applications and use cases.


Key Features of WebRTC

  1. Peer-to-Peer Conferencing

  2. Video Calling

  3. Voice Calling

  4. Peer-to-Peer File Transfers

  5. Chat support

  6. Desktop sharing

Technical Highlights of WebRTC

Let's have a look at the technical expectations from WebRTC applications:

  • Streaming of audio, video and other data

  • Gathering network information such as IP addresses and ports, and exchanging it with other WebRTC clients (peers) so that connections can be made even through NATs and firewalls

  • Coordinating signalling communication: initiating sessions, closing them, and reporting errors

  • Communicating streaming audio, video, or other data

  • Exchanging information about media and client capabilities, such as resolutions and codecs

Oodles Technologies implements the above-listed functions using WebRTC's main APIs:

  • MediaStream (aka getUserMedia) – captures audio and video streams

  • RTCPeerConnection – streams audio and video between peers

  • RTCDataChannel – transfers arbitrary data between peers
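On iOS, these APIs are exposed as Objective-C classes by the WebRTC framework. As a rough sketch of how a peer connection is created (the STUN server URL here is only an example, and the delegate is assumed to be the surrounding class):

```objectivec
#import <WebRTC/WebRTC.h>

// Create the factory that produces sources, tracks and peer connections.
RTCPeerConnectionFactory *factory = [[RTCPeerConnectionFactory alloc] init];

// Configure ICE with a public STUN server (example URL).
RTCConfiguration *config = [[RTCConfiguration alloc] init];
config.iceServers = @[ [[RTCIceServer alloc]
    initWithURLStrings:@[ @"stun:stun.l.google.com:19302" ]] ];

RTCMediaConstraints *constraints =
    [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil
                                          optionalConstraints:nil];

// The delegate receives ICE candidates and stream/track events.
RTCPeerConnection *peerConnection =
    [factory peerConnectionWithConfiguration:config
                                 constraints:constraints
                                    delegate:self];
```

The ICE candidates and SDP offers/answers produced by this object are what the signalling channel described above has to carry between the peers.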




Let's have a look at how Oodles implements WebRTC in iOS.


CocoaPods is a dependency manager for Objective-C and other programming languages that run on the Objective-C runtime. It offers a standard format for managing external libraries.


Let's look at the installation of CocoaPods.

Step 1: Download CocoaPods

CocoaPods is distributed as a Ruby gem and is installed by running the following commands in the Terminal:

$ sudo gem install cocoapods

$ pod setup

Step 2: Creating a Podfile

The next step is creating a Podfile, through which CocoaPods manages the project's dependencies. Create it in the same directory as the Xcode project (.xcodeproj) file:

  • $ touch Podfile

  • $ open -e Podfile

  • TextEdit opens an empty file

Now copy and paste the following lines into the empty Podfile in the TextEdit window:

" platform :ios, '9.0' "

" pod 'WebRTC' "


Step 3: Installing Dependencies

Now you can install the dependencies in your project:

$ pod install

From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:

$ open ProjectName.xcworkspace


Step 4: Link Binary With Library Frameworks

Click on the Project → Select Target of interest → Choose Build Phases tab → Link Binary With Libraries → At the bottom of this list hit + to add libraries.

Here is the List of Required Apple Library frameworks:

  •    ReplayKit.framework

  •    CoreGraphics.framework

  •    AVFoundation.framework

  •    CoreMedia.framework

  •    CoreVideo.framework

  •    CoreImage.framework

  •    GLKit.framework

  •    AudioToolbox.framework 

  •    VideoToolbox.framework
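Because WebRTC accesses the camera and microphone, the app's Info.plist must also declare usage descriptions, otherwise iOS will terminate the app when capture starts (the description strings below are just examples):

```xml
<key>NSCameraUsageDescription</key>
<string>Camera access is needed for video calling.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is needed for voice calling.</string>
```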


3. Create a Room for Video Calling


- (void)connectToRoomWithId:(NSString *)roomId
                   settings:(ARDSettingsModel *)settings
                 isLoopback:(BOOL)isLoopback
                isAudioOnly:(BOOL)isAudioOnly
          shouldMakeAecDump:(BOOL)shouldMakeAecDump
      shouldUseLevelControl:(BOOL)shouldUseLevelControl {
  NSParameterAssert(roomId.length);
  NSParameterAssert(_state == kARDAppClientStateDisconnected);
  _settings = settings;
  _isLoopback = isLoopback;
  _isAudioOnly = isAudioOnly;
  _shouldMakeAecDump = shouldMakeAecDump;
  _shouldUseLevelControl = shouldUseLevelControl;
  self.state = kARDAppClientStateConnecting;

#if defined(WEBRTC_IOS)
  if (kARDAppClientEnableTracing) {
    NSString *filePath = [self documentsFilePathForFileName:@"webrtc-trace.txt"];
    RTCStartInternalCapture(filePath);
  }
#endif

  // Selector name assumed; replace with your app's login handler.
  [[NSNotificationCenter defaultCenter] addObserver:self
                                           selector:@selector(loginComplete:)
                                               name:@"loginComplete"
                                             object:nil];

  // Request TURN.
  __weak ARDAppClient *weakSelf = self;
  [_turnClient requestServersWithCompletionHandler:^(NSArray *turnServers,
                                                     NSError *error) {
    if (error) {
      RTCLogError(@"Error retrieving TURN servers: %@",
                  error.localizedDescription);
    }
    ARDAppClient *strongSelf = weakSelf;
    [strongSelf.iceServers addObjectsFromArray:turnServers];
    strongSelf.isTurnComplete = YES;
    [strongSelf startSignalingIfReady];
  }];

  // Join room on room server.
  [_roomServerClient joinRoomWithRoomId:roomId
                             isLoopback:isLoopback
      completionHandler:^(ARDJoinResponse *response, NSError *error) {
    ARDAppClient *strongSelf = weakSelf;
    if (error) {
      [strongSelf.delegate appClient:strongSelf didError:error];
      return;
    }
    NSError *joinError =
        [[strongSelf class] errorForJoinResultType:response.result];
    if (joinError) {
      RTCLogError(@"Failed to join room:%@ on room server.", roomId);
      [strongSelf disconnect];
      [strongSelf.delegate appClient:strongSelf didError:joinError];
      return;
    }
    RTCLog(@"Joined room:%@ on room server.", roomId);
    strongSelf.roomId = response.roomId;
    strongSelf.clientId = response.clientId;
    strongSelf.isInitiator = response.isInitiator;
    for (ARDSignalingMessage *message in response.messages) {
      if (message.type == kARDSignalingMessageTypeOffer ||
          message.type == kARDSignalingMessageTypeAnswer) {
        strongSelf.hasReceivedSdp = YES;
        [strongSelf.messageQueue insertObject:message atIndex:0];
      } else {
        [strongSelf.messageQueue addObject:message];
      }
    }
    strongSelf.webSocketURL = response.webSocketURL;
    strongSelf.webSocketRestURL = response.webSocketRestURL;
    [strongSelf registerWithColliderIfReady];
    [strongSelf startSignalingIfReady];
  }];
}

4. Manage Local Video Track.


- (RTCVideoTrack *)createLocalVideoTrack {
  RTCVideoTrack *localVideoTrack = nil;
  // The iOS simulator doesn't provide any sort of camera capture
  // support or emulation, so don't bother
  // trying to open a local stream.
  if (!_isAudioOnly) {
    RTCVideoSource *source = [_factory videoSource];
    RTCCameraVideoCapturer *capturer =
        [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
    [_delegate appClient:self didCreateLocalCapturer:capturer];
    localVideoTrack = [_factory videoTrackWithSource:source
                                             trackId:kARDVideoTrackId];
  }
  return localVideoTrack;
}

5. Start Camera Capture


- (void)startCapture {
  AVCaptureDevicePosition position =
      _usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
  AVCaptureDevice *device = [self findDeviceForPosition:position];
  AVCaptureDeviceFormat *format = [self selectFormatForDevice:device];
  int fps = [self selectFpsForFormat:format];
  [_capturer startCaptureWithDevice:device format:format fps:fps];
}
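The helper methods used above are not part of the framework; findDeviceForPosition: can be sketched as follows, using the device list that RTCCameraVideoCapturer exposes (selectFormatForDevice: and selectFpsForFormat: follow the same pattern over the device's formats and frame-rate ranges):

```objectivec
// Pick the capture device that matches the requested camera position,
// falling back to the first available device.
- (AVCaptureDevice *)findDeviceForPosition:(AVCaptureDevicePosition)position {
  NSArray<AVCaptureDevice *> *captureDevices =
      [RTCCameraVideoCapturer captureDevices];
  for (AVCaptureDevice *device in captureDevices) {
    if (device.position == position) {
      return device;
    }
  }
  return captureDevices[0];
}
```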


6. Stop Camera Capture


- (void)stopCapture {
  [_capturer stopCapture];
}


7. Switch Camera


- (void)switchCamera {
  _usingFrontCamera = !_usingFrontCamera;
  [self startCapture];
}
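Once a video track is available, it has to be attached to a renderer to become visible on screen; a minimal sketch (view layout and the track variable are assumed to be set up elsewhere):

```objectivec
// Render a video track into an OpenGL-backed view from the WebRTC framework.
RTCEAGLVideoView *videoView =
    [[RTCEAGLVideoView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:videoView];
[localVideoTrack addRenderer:videoView];
```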

Are you planning to implement WebRTC in iOS apps? Get in touch with Oodles Technologies to experience excellent web application development and WebRTC software development services at the best prices. Our experts offer the best WebRTC solutions to businesses across the globe. Contact us now for complete details.

About Author

Sumit Chahar

Sumit Chahar is working as an iOS Developer. He is very dedicated towards his work.
