The transition of internet communications from dedicated, native desktop apps to mobile and web has taken 10 years.
When we started Wire, mobile and web were our focus. Our first desktop prototype was built with HTML5, which meant architecting the backend for browsers and finding a way to run real-time media on the web.
For a long time, there was no standard for real-time audio in HTML: browser-based calling solutions required plugins or Flash. For web developers, this was one of the critical limitations in building communications applications.
To address this, Google worked with the World Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF), which formed the Web Real-Time Communications Working Group in 2011. Google also acquired Global IP Sound (GIPS) and made its real-time audio assets available as open source.
The Wire team is experienced with WebRTC. Koen Vos, Wire’s Chief Scientist, and I worked at GIPS on implementation and deployment, and we have contributed to the standardization effort since then. We have also led the development of the IETF-standardized, royalty-free open source codecs Opus and iLBC.
Today, WebRTC is the media foundation of Wire’s software and the magic behind Wire for web. We believe in HTML5 real-time communications and continue to invest in them. While we develop our own proprietary enhancements to the Wire media stack, we maintain strict compatibility with WebRTC, which means interoperability with any browser that supports the standard.
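Interoperability here rests on browsers exposing the standard WebRTC APIs, chiefly RTCPeerConnection and navigator.mediaDevices.getUserMedia. As a minimal sketch (the helper name and structure are illustrative, not Wire's code), a web client can feature-detect those APIs before attempting a call:

```javascript
// Illustrative feature detection for the standard WebRTC browser APIs.
// Takes the global object as a parameter so it can be exercised outside
// a browser; in a real page you would call supportsWebRTC(window).
function supportsWebRTC(globalObj) {
  return (
    typeof globalObj.RTCPeerConnection === "function" &&
    !!(
      globalObj.navigator &&
      globalObj.navigator.mediaDevices &&
      typeof globalObj.navigator.mediaDevices.getUserMedia === "function"
    )
  );
}

// Outside a browser (or in one without WebRTC) this reports false.
console.log(supportsWebRTC({})); // → false
```

A client that finds these APIs missing can then fall back gracefully, for example by prompting the user to switch to a WebRTC-capable browser.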
— Alan Duric, Wire co-founder and CTO