
How We Built Local Audio Streaming in Android

What's this blog post about?

The post describes how 100ms built a feature for an Android social audio rooms app that lets hosts run a talk show and stream local music at the same time, which required significant changes to both WebRTC and the Android SDK. The problem is broken into three parts: capturing audio from the device microphone, capturing local audio already playing on the device, and mixing the two audio streams. The first part explains how WebRTC captures microphone audio and adds a callback that exposes the byte buffer read from the mic. The second part uses Android's AudioRecord class to record system audio played by other apps on the device. The third part mixes the two streams by combining their ShortArrays into a single ByteArray. The final solution captures audio from both sources and replaces the microphone byte buffer with the combined stream, so hosts can talk over locally streamed music. A rough sketch of the capture and mixing steps follows.
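
The sketch below is an illustration of the second and third steps, not the post's actual implementation. It assumes 16-bit PCM audio and Android API 29+, where AudioRecord can be paired with AudioPlaybackCaptureConfiguration to capture playback from other apps (one common way to record system audio; the post may do this differently). The mixPcmFrames helper name, the sample rate, and the channel configuration are all illustrative choices, not taken from the post.

    import android.media.AudioAttributes
    import android.media.AudioFormat
    import android.media.AudioPlaybackCaptureConfiguration
    import android.media.AudioRecord
    import android.media.projection.MediaProjection
    import java.nio.ByteBuffer
    import java.nio.ByteOrder

    // Capture audio played by other apps (API 29+). Requires a MediaProjection
    // granted by the user and the RECORD_AUDIO permission.
    fun buildPlaybackRecorder(projection: MediaProjection): AudioRecord {
        val captureConfig = AudioPlaybackCaptureConfiguration.Builder(projection)
            .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
            .build()
        val format = AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setSampleRate(48_000)
            .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
            .build()
        return AudioRecord.Builder()
            .setAudioFormat(format)
            .setAudioPlaybackCaptureConfig(captureConfig)
            .build()
    }

    // Hypothetical mixer: sums two 16-bit PCM frames sample by sample, clamps
    // to the Short range to avoid overflow, and packs the result into a
    // little-endian ByteArray that can replace the microphone byte buffer.
    fun mixPcmFrames(mic: ShortArray, music: ShortArray): ByteArray {
        val length = minOf(mic.size, music.size)
        val out = ByteBuffer.allocate(length * 2).order(ByteOrder.LITTLE_ENDIAN)
        for (i in 0 until length) {
            val sum = (mic[i] + music[i])
                .coerceIn(Short.MIN_VALUE.toInt(), Short.MAX_VALUE.toInt())
            out.putShort(sum.toShort())
        }
        return out.array()
    }

Clamping the summed samples is the simplest way to avoid wrap-around distortion when both sources are loud; the post's actual mixing code may scale or attenuate instead.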

Company
100ms

Date published
Feb. 5, 2023

Author(s)
Pratim Mallick

Word count
1375

Language
English

Hacker News points
None found.


By Matt Makai. 2021-2024.