Kotlin

The BRTC Kotlin SDK enables you to build real-time audio communication applications on Android. This SDK manages WebRTC endpoints, handles audio streaming, and provides a simple API for connecting users through high-quality voice calls.

Installation

  • Kotlin SDK: An Android SDK to manage WebRTC endpoints and connect them to other endpoints in your application.
  • AAR Releases: The SDK is distributed as an AAR artifact attached to GitHub Releases.
  • Sample Application: A sample Android application that demonstrates outbound PSTN dialing and live call quality monitoring using BRTC.

Requirements

  • Android API 24+ (Nougat)
  • Kotlin 2.0+
  • Gradle 9.0+, Android Gradle Plugin 8.7+
  • JVM target 17
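The JVM-target requirement above usually needs matching build configuration. A sketch for a module-level build.gradle.kts (the exact DSL depends on your AGP and Kotlin plugin versions):

```kotlin
// Module-level build.gradle.kts (DSL shown for AGP 8.x with Kotlin 2.x)
android {
    compileOptions {
        sourceCompatibility = JavaVersion.VERSION_17
        targetCompatibility = JavaVersion.VERSION_17
    }
}

kotlin {
    compilerOptions {
        jvmTarget.set(org.jetbrains.kotlin.gradle.dsl.JvmTarget.JVM_17)
    }
}
```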

Install via AAR

  1. Download the latest bandwidthrtc-release.aar from the releases page.

  2. Place it in your app's libs/ directory.

  3. Add the following to your app's build.gradle.kts:

dependencies {
    implementation(files("libs/bandwidthrtc-release.aar"))

    // Required transitive dependencies
    implementation("com.squareup.okhttp3:okhttp:4.12.0")
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-android:1.7.3")
    implementation("org.jetbrains.kotlinx:kotlinx-serialization-json:1.6.2")
    implementation("io.getstream:stream-webrtc-android:1.3.7")
}

Android Permissions

Add the following permissions to your AndroidManifest.xml:

<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
Caution: RECORD_AUDIO is a dangerous permission and must be requested at runtime before calling publish(). Without it, the SDK will publish a silent audio stream with no error.

Getting Started

1. Initialize the Client

Create a new instance of the BandwidthRTC client. The client's connection and media operations, such as connect and publish, are suspending functions and must be called from a coroutine scope.

import com.bandwidth.rtc.BandwidthRTC
import com.bandwidth.rtc.types.LogLevel

// Create client with optional log level (default: WARN)
// Available levels: OFF, ERROR, WARN, INFO, DEBUG, TRACE
val bandwidthRtc = BandwidthRTC(context, LogLevel.DEBUG)

2. Connect to Bandwidth RTC Platform

Connect to the Bandwidth RTC platform using an endpoint token obtained from your backend server.

import com.bandwidth.rtc.types.RtcAuthParams

try {
    bandwidthRtc.connect(RtcAuthParams(endpointToken = "your-endpoint-token-here"))
    println("Connected to Bandwidth RTC!")
} catch (e: Exception) {
    println("Failed to connect: ${e.message}")
}

Connection Options

You can pass additional options when connecting:

import com.bandwidth.rtc.types.RtcOptions
import com.bandwidth.rtc.types.AudioProcessingOptions
import org.webrtc.PeerConnection

val options = RtcOptions(
    // Optional: Override the default WebSocket URL
    websocketUrl = "wss://your-custom-url.com",

    // Optional: Provide custom ICE servers
    iceServers = listOf(
        PeerConnection.IceServer.builder("stun:stun.l.google.com:19302").createIceServer()
    ),

    // Optional: Set ICE transport policy
    iceTransportPolicy = PeerConnection.IceTransportsType.ALL,

    // Optional: Configure audio processing
    audioProcessing = AudioProcessingOptions(
        enableHardwareAec = false,
        enableSoftwareEchoCancellation = false,
        enableSoftwareNoiseSuppression = false,
        enableAutoGainControl = false
    )
)

bandwidthRtc.connect(RtcAuthParams(endpointToken = "your-token"), options)

3. Set Up Event Listeners

Register callbacks to respond to incoming streams and connection events:

// Called when a remote stream becomes available
bandwidthRtc.onStreamAvailable = { rtcStream ->
    println("New stream available: ${rtcStream.streamId}")
    println("Media types: ${rtcStream.mediaTypes}")
    // Use rtcStream.mediaStream to attach audio to your UI
}

// Called when a remote stream is no longer available
bandwidthRtc.onStreamUnavailable = { streamId ->
    println("Stream unavailable: $streamId")
}

// Called when the connection is ready
bandwidthRtc.onReady = { metadata ->
    println("Connection ready!")
    println("Endpoint ID: ${metadata.endpointId}")
    println("Device ID: ${metadata.deviceId}")
    println("Region: ${metadata.region}")
    println("Territory: ${metadata.territory}")
}

// Called when the remote side disconnects
bandwidthRtc.onRemoteDisconnected = {
    println("Remote side disconnected")
}

Publishing Media

Publish Audio (Default)

Publish audio with default settings:

try {
    val rtcStream = bandwidthRtc.publish(audio = true)
    println("Publishing audio stream: ${rtcStream.streamId}")
} catch (e: Exception) {
    println("Failed to publish media: ${e.message}")
}

Publish with Stream Alias

Add an alias to your stream for easier identification in billing records and events:

val rtcStream = bandwidthRtc.publish(
    audio = true,
    alias = "user-microphone"
)

Controlling Published Media

Mute/Unmute Microphone

// Mute microphone
bandwidthRtc.setMicEnabled(false)

// Unmute microphone
bandwidthRtc.setMicEnabled(true)

Speakerphone Control

// Enable speakerphone
bandwidthRtc.setSpeakerphoneOn(true)

// Disable speakerphone
bandwidthRtc.setSpeakerphoneOn(false)

Unpublish Media

Stop publishing a stream:

bandwidthRtc.unpublish(rtcStream)

Audio Level Detection

Monitor local and remote audio levels:

import kotlin.math.sqrt

// Local microphone audio level
bandwidthRtc.onLocalAudioLevel = { samples ->
    // samples is a FloatArray of PCM audio samples
    val rms = sqrt(samples.map { it * it }.average())
    println("Local audio level: $rms")
}

// Remote audio level
bandwidthRtc.onRemoteAudioLevel = { samples ->
    val rms = sqrt(samples.map { it * it }.average())
    println("Remote audio level: $rms")
}
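To drive a level meter from these callbacks, the RMS value can be mapped to decibels. A minimal sketch (the rmsDbfs helper and its floor value are illustrative, not part of the SDK):

```kotlin
import kotlin.math.log10
import kotlin.math.sqrt

// Convert one callback's FloatArray of PCM samples (values in -1.0..1.0)
// to an RMS level in dBFS, clamped to a silence floor for empty/zero input.
fun rmsDbfs(samples: FloatArray, floorDb: Double = -96.0): Double {
    if (samples.isEmpty()) return floorDb
    val rms = sqrt(samples.map { it.toDouble() * it }.average())
    return if (rms > 0.0) (20.0 * log10(rms)).coerceAtLeast(floorDb) else floorDb
}
```

A full-scale signal maps to 0 dBFS, and each halving of amplitude drops the level by about 6 dB, which is convenient for a linear meter scale.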

DTMF (Dual-Tone Multi-Frequency) Signaling

Send DTMF tones during an active call:

// Send a single digit
bandwidthRtc.sendDtmf("5")

// Send multiple digits
bandwidthRtc.sendDtmf("1234")

// Send with special characters
bandwidthRtc.sendDtmf("*123#")

Valid DTMF characters: 0-9, *, #, and , (comma, which inserts a short pause between tones)
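If you build tone strings from user input, a client-side pre-flight check against the documented character set can catch bad input before sending. A sketch (isValidDtmf is a hypothetical helper; the SDK's own validation remains authoritative):

```kotlin
// Character set documented for sendDtmf(): digits, star, pound, and comma.
val DTMF_CHARS = "0123456789*#,".toSet()

// Returns true only for non-empty strings made entirely of valid DTMF characters.
fun isValidDtmf(tones: String): Boolean =
    tones.isNotEmpty() && tones.all { it in DTMF_CHARS }
```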

Call Statistics

Get real-time call quality statistics:

bandwidthRtc.getCallStats(previousSnapshot = null) { stats ->
    println("Packets received: ${stats.packetsReceived}")
    println("Packets lost: ${stats.packetsLost}")
    println("Jitter: ${stats.jitter}s")
    println("Round trip time: ${stats.roundTripTime}s")
    println("Codec: ${stats.codec}")
    println("Inbound bitrate: ${stats.inboundBitrate} bps")
    println("Outbound bitrate: ${stats.outboundBitrate} bps")
}

Pass the previous snapshot to compute delta-based metrics like bitrate:

var lastSnapshot: CallStatsSnapshot? = null

fun refreshStats() {
    bandwidthRtc.getCallStats(previousSnapshot = lastSnapshot) { stats ->
        lastSnapshot = stats
        updateStatsUI(stats)
    }
}
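The delta computation behind a bitrate metric boils down to bits transferred between snapshots divided by elapsed time. A standalone sketch of that arithmetic (the function and its parameters are illustrative stand-ins, not actual CallStatsSnapshot fields):

```kotlin
// Bitrate from two cumulative byte counters taken elapsedMs apart:
// (delta bytes) * 8 bits per byte, scaled to a per-second rate.
fun deltaBitrateBps(bytesNow: Long, bytesPrev: Long, elapsedMs: Long): Double {
    require(elapsedMs > 0) { "snapshots must be at least 1 ms apart" }
    return (bytesNow - bytesPrev) * 8.0 * 1000.0 / elapsedMs
}
```

This is why the first call with previousSnapshot = null cannot report a bitrate: there is no earlier counter to diff against.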

Making Outbound Connections

Connect to a Phone Number

import com.bandwidth.rtc.types.EndpointType

val callResult = bandwidthRtc.requestOutboundConnection(
    "+15551234567",
    EndpointType.PHONE_NUMBER
)
if (callResult.accepted) {
    println("Connection established!")
}

Hang Up Connection

bandwidthRtc.hangupConnection("+15551234567", EndpointType.PHONE_NUMBER)

Disconnecting

When you're done with the session, disconnect from the platform:

bandwidthRtc.disconnect()

This will:

  • Close the WebSocket connection
  • Stop all published media streams
  • Clean up all peer connections
  • Release audio resources

Complete Jetpack Compose Example

Here's a complete example of using the SDK in a Jetpack Compose application for voice calls:

import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.compose.rememberLauncherForActivityResult
import androidx.activity.result.contract.ActivityResultContracts
import androidx.compose.foundation.layout.*
import androidx.compose.material3.*
import androidx.compose.runtime.*
import androidx.compose.ui.Alignment
import androidx.compose.ui.Modifier
import androidx.compose.ui.platform.LocalContext
import androidx.compose.ui.unit.dp
import androidx.core.content.ContextCompat
import com.bandwidth.rtc.BandwidthRTC
import com.bandwidth.rtc.types.*
import kotlinx.coroutines.launch

@Composable
fun VoiceCallScreen() {
    val context = LocalContext.current
    val scope = rememberCoroutineScope()

    var isConnected by remember { mutableStateOf(false) }
    var isMuted by remember { mutableStateOf(false) }
    var endpointId by remember { mutableStateOf<String?>(null) }

    val bandwidthRtc = remember { BandwidthRTC(context, LogLevel.DEBUG) }

    // Request microphone permission
    val permissionLauncher = rememberLauncherForActivityResult(
        ActivityResultContracts.RequestPermission()
    ) { granted ->
        if (granted) {
            scope.launch { connectAndPublish(bandwidthRtc) }
        }
    }

    // Set up event listeners
    LaunchedEffect(Unit) {
        bandwidthRtc.onStreamAvailable = { rtcStream ->
            println("Remote stream available: ${rtcStream.streamId}")
        }

        bandwidthRtc.onStreamUnavailable = { streamId ->
            println("Stream unavailable: $streamId")
        }

        bandwidthRtc.onReady = { metadata ->
            endpointId = metadata.endpointId
            isConnected = true
        }

        bandwidthRtc.onRemoteDisconnected = {
            isConnected = false
            endpointId = null
        }
    }

    // Clean up on dispose. The scope from rememberCoroutineScope is cancelled
    // when this composable leaves the composition, so launch the final
    // disconnect in an independent scope instead.
    DisposableEffect(Unit) {
        onDispose {
            kotlinx.coroutines.MainScope().launch { bandwidthRtc.disconnect() }
        }
    }

    Column(
        modifier = Modifier
            .fillMaxSize()
            .padding(16.dp),
        horizontalAlignment = Alignment.CenterHorizontally,
        verticalArrangement = Arrangement.Center
    ) {
        Text("Voice Call", style = MaterialTheme.typography.headlineMedium)

        Spacer(modifier = Modifier.height(16.dp))

        Text(if (isConnected) "Connected" else "Disconnected")
        endpointId?.let { Text("Endpoint: $it") }

        Spacer(modifier = Modifier.height(24.dp))

        if (!isConnected) {
            Button(onClick = {
                if (ContextCompat.checkSelfPermission(context, Manifest.permission.RECORD_AUDIO)
                    == PackageManager.PERMISSION_GRANTED
                ) {
                    scope.launch { connectAndPublish(bandwidthRtc) }
                } else {
                    permissionLauncher.launch(Manifest.permission.RECORD_AUDIO)
                }
            }) {
                Text("Connect")
            }
        } else {
            Row(horizontalArrangement = Arrangement.spacedBy(8.dp)) {
                Button(onClick = {
                    isMuted = !isMuted
                    bandwidthRtc.setMicEnabled(!isMuted)
                }) {
                    Text(if (isMuted) "Unmute" else "Mute")
                }

                Button(onClick = {
                    scope.launch {
                        bandwidthRtc.disconnect()
                        isConnected = false
                        endpointId = null
                    }
                }) {
                    Text("Hang Up")
                }
            }
        }

        Spacer(modifier = Modifier.height(16.dp))

        // DTMF Dial Pad
        Text("Dial Pad", style = MaterialTheme.typography.titleMedium)
        val digits = listOf("1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#")
        for (row in digits.chunked(3)) {
            Row(horizontalArrangement = Arrangement.spacedBy(8.dp)) {
                for (digit in row) {
                    Button(onClick = { bandwidthRtc.sendDtmf(digit) }) {
                        Text(digit)
                    }
                }
            }
        }
    }
}

private suspend fun connectAndPublish(bandwidthRtc: BandwidthRTC) {
    try {
        // Get endpoint token from your backend
        val endpointToken = fetchEndpointToken()

        bandwidthRtc.connect(RtcAuthParams(endpointToken = endpointToken))
        bandwidthRtc.publish(audio = true, alias = "user-microphone")
    } catch (e: Exception) {
        println("Connection failed: ${e.message}")
    }
}

private suspend fun fetchEndpointToken(): String {
    // Implement your backend token fetch here
    TODO("Fetch endpoint token from your backend server")
}

Error Handling

The SDK uses a sealed class hierarchy for errors. Always wrap SDK calls in try-catch blocks:

import com.bandwidth.rtc.types.BandwidthRTCError

try {
    bandwidthRtc.connect(RtcAuthParams(endpointToken = token))
    bandwidthRtc.publish(audio = true)
} catch (e: BandwidthRTCError.InvalidToken) {
    println("Token is invalid or expired")
} catch (e: BandwidthRTCError.AlreadyConnected) {
    println("Already connected -- disconnect first")
} catch (e: BandwidthRTCError.NotConnected) {
    println("Must connect before publishing")
} catch (e: BandwidthRTCError.ConnectionFailed) {
    println("Connection failed: ${e.message}")
} catch (e: BandwidthRTCError.MediaAccessDenied) {
    println("Microphone permission not granted")
} catch (e: BandwidthRTCError.PublishFailed) {
    println("Failed to publish: ${e.message}")
} catch (e: BandwidthRTCError) {
    println("BRTC error: ${e.message}")
}

Best Practices

  1. Request RECORD_AUDIO permission before publishing: The SDK will publish a silent stream without error if the permission is not granted. Always check and request permission at runtime.
  2. Handle disconnections gracefully: Monitor the onRemoteDisconnected callback and implement reconnection logic.
  3. Clean up resources: Always call disconnect() when done to free up audio devices and network connections.
  4. Use audio level detection: Implement visual feedback for speaker activity using onLocalAudioLevel and onRemoteAudioLevel.
  5. Secure your tokens: Never embed endpoint tokens in your app; always fetch them from your backend server.
  6. Use coroutines properly: All SDK methods are suspending functions. Use an appropriate CoroutineScope (e.g., viewModelScope or lifecycleScope).
  7. Monitor call quality: Use getCallStats() to track jitter, packet loss, and round-trip time for diagnostics.
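For practice 2, a capped exponential backoff is a common reconnection schedule. A sketch of the delay calculation (the function name and defaults are illustrative):

```kotlin
import kotlin.math.min

// Delay before reconnect attempt N: doubles each attempt, clamped at maxMs.
// The shift count is bounded to avoid Long overflow for large attempt numbers.
fun backoffDelayMs(attempt: Int, baseMs: Long = 500, maxMs: Long = 30_000): Long =
    min(baseMs shl attempt.coerceIn(0, 20), maxMs)
```

A retry loop triggered from onRemoteDisconnected would typically call delay(backoffDelayMs(attempt)) between connect() attempts and reset the attempt counter once onReady fires.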

Next Steps