IMS WebRTC gateways

In the last two years I've been attending conferences and presentations where traditional telco equipment providers try to sell what they typically call WebRTC gateways.

I already mentioned this issue in another post last year (WebRTC facts and lies), but in this case I will try to explain in more detail why, from my perspective, the concept of a WebRTC gateway is wrong in itself, why I see it just as a new attempt from vendors to sell as many boxes as possible, and why in my opinion there is a much better approach for a telco.

There are some recurrent misconceptions about WebRTC that are impacting the decisions made in this area.  Let's try to clarify them before discussing the proposed approach:

  • WebRTC is not about signaling at all, while IMS (and especially SIP) is mostly about signaling.  WebRTC only defines how media is transmitted between endpoints.  So it makes no sense to talk about WebRTC to SIP gateways or WebRTC to IMS gateways.  You can use SIP with WebRTC, and in fact that is what many (perhaps even most) people are doing, both in commercial and open source projects (OpenSER, Asterisk, FreeSWITCH).
  • WebRTC is not yet standardized: just one month ago we were still discussing whether we should remove SDP, just two weeks ago there was some initial consensus on how to transmit multiple video channels in WebRTC, support for SDES keying for SRTP is still under discussion, and there is no agreement at all on which video codecs must be supported (and probably there won't be).   It makes no sense to try to be compliant with WebRTC when it changes every week or every month and every new Chrome release can potentially break the rest of the existing endpoints (Chrome 26 changed the ICE role negotiation, Chrome 27 disabled PLIs and NACKs unless explicitly enabled, Chrome 28 crashes if using TURN without bundle...).
  • WebRTC does not define new protocols; it just bundles existing (and mostly telco) protocols like SDP, RTP, RTCP, SRTP and ICE that have been in place for around 10 years.
With all these points in mind, let's consider the typical IMS architecture deployed by telcos.   In this architecture there is a network access role, defined by 3GPP as the P-CSCF, that is typically implemented in a component known as a Session Border Controller (SBC).   These SBCs provide support for SIP over different transports, RTP/RTCP, NAT traversal through latching, SRTP encryption, transcoding... [1].  Basically, being the entry point to the telco network, they try to adapt to differences between endpoints while protecting the network from potential attacks.

Let's review the support required by an SBC to be accessed from a WebRTC browser and make use of any service in the IMS network:
  • SIP over HTTP/WS:  SBCs support a lot of transports (UDP, TCP, TLS, SCTP) but not yet WebSockets [2].   This is a very small addition and most open source projects already support it. To give an idea of the complexity, I've seen node.js SIP WS proxy implementations with less than 50 lines of code (see the first sketch after this list).
  • RTP and RTCP: Already supported by SBCs.   WebRTC adds some extensions to RTCP, but they are mostly used for video and they are not required to interoperate.  We have successfully implemented interconnection ignoring all those extensions.
  • SRTP: Already supported by SBCs.
  • DTLS/SDES key negotiation: All SBCs support SDES key negotiation because it is mandated by 3GPP.   DTLS was the initial IETF proposal as the single, mandatory-to-use solution, but the issue is still open and representatives of major SBC providers are saying that they are OK with DTLS [3].  Chrome supports SDES and doesn't plan to remove it; Firefox only supports DTLS (the second sketch after this list shows how to tell which mechanism an SDP offer carries).
  • ICE: WebRTC endpoints (browsers) require ICE support to establish media flows.  This is already supported by some of the most popular SBCs [4], and in the worst case they only need to support ICE-lite rather than full ICE, which basically boils down to echoing a STUN Binding request back as a response (see the last sketch after this list).
  • Codecs: This is where the biggest pain point between WebRTC and IMS lies.   Because there is no agreement on the WebRTC video codec and because most IMS usage is audio, let's focus on audio codecs.   IMS endpoints typically implement variants of the AMR and G.7xx codecs while browsers implement Opus and G.711.   There are different options to make this a non-issue: updating IMS endpoints, transcoding in the SBC (which usually has dedicated hardware for that [5]) or using G.711 (most scenarios today will be desktop computers with good enough WiFi connectivity).
  • There are other extensions like rtcp-mux or BUNDLE, but they are mostly optimizations for P2P establishment, which is not usually allowed in IMS networks anyway, and they are optional in today's browser implementations (Firefox doesn't support them).
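
To illustrate how small the WebSocket piece really is, here is a minimal sketch of a SIP WebSocket-to-UDP proxy in node.js (TypeScript). The "ws" npm package and the SBC_HOST/SBC_PORT values are assumptions of mine; a real deployment would also need WSS (TLS) and the Via/Contact handling described in the SIP over WebSocket draft [2].

```typescript
// Minimal SIP WebSocket-to-UDP proxy sketch (Node.js + the "ws" package).
// Assumes a hypothetical SBC/P-CSCF listening for plain SIP over UDP.
import * as dgram from "dgram";
import { WebSocketServer } from "ws";

const SBC_HOST = "10.0.0.1";   // hypothetical SBC/P-CSCF address
const SBC_PORT = 5060;         // standard SIP port

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (ws) => {
  // One UDP socket per browser connection so responses can be routed back.
  const udp = dgram.createSocket("udp4");

  // Browser -> SBC: forward each SIP message received over the WebSocket.
  ws.on("message", (data) => {
    udp.send(Buffer.from(data as Buffer), SBC_PORT, SBC_HOST);
  });

  // SBC -> browser: forward SIP responses/requests back over the WebSocket.
  udp.on("message", (msg) => ws.send(msg.toString()));

  ws.on("close", () => udp.close());
});
```
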
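Regarding the SDES vs. DTLS point, the two mechanisms are easy to tell apart at the SDP level: SDES carries the SRTP keys inline in a=crypto attributes, while DTLS-SRTP only announces a certificate fingerprint (a=fingerprint) and derives the keys from the handshake. A trivial helper (the function name is mine) to classify an offer:

```typescript
// Classify which SRTP keying mechanism(s) an SDP offer announces.
// SDES (RFC 4568) uses "a=crypto" lines with the key material inline;
// DTLS-SRTP (RFC 5763) uses "a=fingerprint" and derives keys from the DTLS handshake.
function srtpKeying(sdp: string): "SDES" | "DTLS" | "both" | "none" {
  const hasSdes = /^a=crypto:/m.test(sdp);
  const hasDtls = /^a=fingerprint:/m.test(sdp);
  if (hasSdes && hasDtls) return "both";
  if (hasSdes) return "SDES";
  if (hasDtls) return "DTLS";
  return "none";
}
```
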
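And to show what "echoing a STUN packet" amounts to for ICE-lite, here is a simplified STUN Binding responder. It is deliberately incomplete (the local port is a placeholder, and a real ICE-lite agent must also validate and add MESSAGE-INTEGRITY and FINGERPRINT using the ICE credentials), but the core of answering a connectivity check really is this small:

```typescript
// Simplified STUN Binding responder sketch (the heart of ICE-lite).
import * as dgram from "dgram";

const MAGIC_COOKIE = 0x2112a442;
const sock = dgram.createSocket("udp4");

sock.on("message", (msg, rinfo) => {
  // Only handle STUN Binding Requests (message type 0x0001).
  if (msg.length < 20 || msg.readUInt16BE(0) !== 0x0001) return;
  const transactionId = msg.subarray(8, 20);

  // XOR-MAPPED-ADDRESS attribute (type 0x0020) for IPv4.
  const attr = Buffer.alloc(12);
  attr.writeUInt16BE(0x0020, 0);                              // attribute type
  attr.writeUInt16BE(8, 2);                                   // attribute length
  attr.writeUInt8(0x01, 5);                                   // family: IPv4
  attr.writeUInt16BE(rinfo.port ^ (MAGIC_COOKIE >>> 16), 6);  // X-Port
  const ip = rinfo.address.split(".").map(Number);
  const addr = ((ip[0] << 24) | (ip[1] << 16) | (ip[2] << 8) | ip[3]) >>> 0;
  attr.writeUInt32BE((addr ^ MAGIC_COOKIE) >>> 0, 8);         // X-Address

  // Binding Success Response (type 0x0101) reusing the same transaction id.
  const header = Buffer.alloc(20);
  header.writeUInt16BE(0x0101, 0);
  header.writeUInt16BE(attr.length, 2);
  header.writeUInt32BE(MAGIC_COOKIE, 4);
  transactionId.copy(header, 8);

  sock.send(Buffer.concat([header, attr]), rinfo.port, rinfo.address);
});

sock.bind(3478); // placeholder port; in practice this runs on the media address
```
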
As you can see from that enumeration, there is no "red" point that would argue against using the SBC as the entry point for browsers into the IMS network.   Adding another "gateway" in front just adds delay, points of failure, bottlenecks, more components to scale, inefficiency in deployments and, especially, more potential interoperability issues.

Note: I remember people discussing 5-10 years ago why adding a B2BUA (like an SBC) is a bad idea, and now we are trying to put two of them in cascade!

[1] http://tools.ietf.org/html/rfc5853
[2] http://www.ietf.org/id/draft-ietf-sipcore-sip-websocket-09.txt
[3] http://www.ietf.org/mail-archive/web/rtcweb/current/msg08315.html
[4] http://www.ericsson.com/us/ourportfolio/products/session-border-gateway-sbg
[5] https://support.acmepacket.com/docs/PUB/SCX637/Net-Net%204500%20and%203820%20S-CX6.3.7%20M1%20Transcoding%20Essentials%20Guide.pdf
