Real-time communication (RTC) is an increasingly important aspect of modern applications, and WebRTC is a powerful, open-source framework that enables real-time audio and video communication directly within web browsers and mobile apps. Flutter, with its cross-platform capabilities and rich ecosystem, is an excellent choice for building applications with WebRTC functionality. In this post, we’ll guide you through implementing WebRTC for video calling in Flutter.
What is WebRTC?
WebRTC (Web Real-Time Communication) is an open-source project that gives web browsers and mobile applications real-time communication (RTC) capabilities via simple APIs. It enables audio and video communication inside web pages through direct peer-to-peer connections, eliminating the need for plugins or proprietary codecs.
Why Use WebRTC?
- Open Source and Free: No licensing fees are required.
- Cross-Platform Compatibility: Works across different browsers and devices.
- Real-Time Communication: Enables low-latency audio and video streaming.
- Peer-to-Peer: Supports direct communication between peers, reducing server load.
Prerequisites
- Flutter SDK installed.
- Basic understanding of Flutter app development.
- Android Studio or Xcode for platform-specific configurations.
Steps to Implement WebRTC for Video Calling in Flutter
Step 1: Set Up a New Flutter Project
Create a new Flutter project:
flutter create webrtc_video_call
Navigate to the project directory:
cd webrtc_video_call
Step 2: Add Required Dependencies
Add the flutter_webrtc package and other necessary packages to your pubspec.yaml file:
dependencies:
  flutter:
    sdk: flutter
  flutter_webrtc: ^0.9.25 # Use the latest version
  sdp_transform: ^0.3.1 # For SDP manipulation
  flutter_bloc: ^8.1.3 # For state management (optional)
  # Other dependencies as needed
Run flutter pub get to install the dependencies.
Step 3: Configure Platform-Specific Settings
WebRTC requires specific permissions and settings for Android and iOS.
Android Configuration
Open android/app/src/main/AndroidManifest.xml and add the necessary permissions:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.webrtc_video_call">
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-feature android:name="android.hardware.camera" />
    <uses-feature android:name="android.hardware.camera.autofocus" />
    <application
        android:name="${applicationName}"
        android:label="webrtc_video_call"
        android:icon="@mipmap/ic_launcher">
        <activity
            android:name=".MainActivity"
            android:exported="true"
            android:launchMode="singleTop"
            android:theme="@style/LaunchTheme"
            android:configChanges="orientation|keyboardHidden|keyboard|screenSize|smallestScreenSize|locale|layoutDirection|fontScale|screenLayout|density|uiMode"
            android:hardwareAccelerated="true"
            android:windowSoftInputMode="adjustResize">
            <!-- Specifies an Android theme to apply to this Activity as soon as
                 the Android process has started. This theme is visible to the user
                 while the Flutter UI initializes. After that, this theme continues
                 to determine the Window background behind the Flutter UI. -->
            <meta-data
                android:name="io.flutter.embedding.android.NormalTheme"
                android:resource="@style/NormalTheme" />
            <!-- Displays an Android View that continues rendering after the Flutter UI initializes. -->
            <meta-data
                android:name="io.flutter.embedding.android.SplashScreenDrawable"
                android:resource="@drawable/launch_background" />
            <intent-filter>
                <action android:name="android.intent.action.MAIN"/>
                <category android:name="android.intent.category.LAUNCHER"/>
            </intent-filter>
        </activity>
        <!-- Don't delete the meta-data below.
             This is used by the Flutter tool to generate GeneratedPluginRegistrant.java -->
        <meta-data
            android:name="flutterEmbedding"
            android:value="2" />
    </application>
</manifest>
Also, ensure that your android/app/build.gradle has the following settings:
android {
    compileSdkVersion 33 // Or the latest SDK version

    defaultConfig {
        applicationId "com.example.webrtc_video_call"
        minSdkVersion 21
        targetSdkVersion 33 // Or the latest SDK version
        versionCode flutterVersionCode.toInteger()
        versionName flutterVersionName
    }
    // Other configurations
}
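Note that declaring CAMERA and RECORD_AUDIO in the manifest is not sufficient on Android 6.0+: both must also be granted at runtime. flutter_webrtc's getUserMedia call (used later in this post) triggers the system prompts itself, but if you prefer to request access up front, here is a minimal sketch using the permission_handler package (an extra dependency, not part of this tutorial's pubspec):

import 'package:permission_handler/permission_handler.dart';

/// Requests camera and microphone access up front.
/// Returns true only if both permissions were granted.
Future<bool> requestMediaPermissions() async {
  final statuses = await [
    Permission.camera,
    Permission.microphone,
  ].request();
  return statuses.values.every((status) => status.isGranted);
}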
iOS Configuration
Open ios/Runner/Info.plist and add the necessary permissions:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>CFBundleDevelopmentRegion</key>
  <string>$(DEVELOPMENT_LANGUAGE)</string>
  <key>CFBundleExecutable</key>
  <string>$(EXECUTABLE_NAME)</string>
  <key>CFBundleIdentifier</key>
  <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
  <key>CFBundleInfoDictionaryVersion</key>
  <string>6.0</string>
  <key>CFBundleName</key>
  <string>webrtc_video_call</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>CFBundleShortVersionString</key>
  <string>$(FLUTTER_BUILD_NAME)</string>
  <key>CFBundleSignature</key>
  <string>????</string>
  <key>CFBundleVersion</key>
  <string>$(FLUTTER_BUILD_NUMBER)</string>
  <key>LSRequiresIPhoneOS</key>
  <true/>
  <key>NSCameraUsageDescription</key>
  <string>This app needs camera access to make video calls.</string>
  <key>NSMicrophoneUsageDescription</key>
  <string>This app needs microphone access to make audio calls.</string>
  <key>UILaunchStoryboardName</key>
  <string>LaunchScreen</string>
  <key>UIMainStoryboardFile</key>
  <string>Main</string>
  <key>UISupportedInterfaceOrientations</key>
  <array>
    <string>UIInterfaceOrientationPortrait</string>
    <string>UIInterfaceOrientationLandscapeLeft</string>
    <string>UIInterfaceOrientationLandscapeRight</string>
  </array>
  <key>UISupportedInterfaceOrientations~ipad</key>
  <array>
    <string>UIInterfaceOrientationPortrait</string>
    <string>UIInterfaceOrientationPortraitUpsideDown</string>
    <string>UIInterfaceOrientationLandscapeLeft</string>
    <string>UIInterfaceOrientationLandscapeRight</string>
  </array>
  <key>UIViewControllerBasedStatusBarAppearance</key>
  <false/>
  <key>CADisableMinimumFrameDurationOnPhone</key>
  <true/>
</dict>
</plist>
Step 4: Create the Main UI
In your lib/main.dart, create a basic UI for the video calling app:
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: VideoCallScreen(),
    );
  }
}

class VideoCallScreen extends StatefulWidget {
  @override
  _VideoCallScreenState createState() => _VideoCallScreenState();
}

class _VideoCallScreenState extends State<VideoCallScreen> {
  final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
  final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();
  MediaStream? _localStream;

  @override
  void initState() {
    super.initState();
    // Initialize the renderers before attaching a stream to them.
    initRenderers().then((_) => _getUserMedia());
  }

  Future<void> initRenderers() async {
    await _localRenderer.initialize();
    await _remoteRenderer.initialize();
  }

  Future<void> _getUserMedia() async {
    final Map<String, dynamic> mediaConstraints = {
      'audio': true,
      'video': {
        'mandatory': {
          'minWidth': '640',
          'minHeight': '480',
          'minFrameRate': '30',
        },
        'facingMode': 'user',
        'optional': [],
      }
    };
    try {
      MediaStream stream =
          await navigator.mediaDevices.getUserMedia(mediaConstraints);
      setState(() {
        _localStream = stream;
        _localRenderer.srcObject = stream;
      });
    } catch (e) {
      print(e.toString());
    }
  }

  @override
  void dispose() {
    // Release the camera/microphone and renderer resources.
    _localStream?.dispose();
    _localRenderer.dispose();
    _remoteRenderer.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('WebRTC Video Call'),
      ),
      body: OrientationBuilder(
        builder: (context, orientation) {
          return Center(
            child: Container(
              width: orientation == Orientation.portrait
                  ? MediaQuery.of(context).size.width
                  : MediaQuery.of(context).size.height,
              height: orientation == Orientation.portrait
                  ? MediaQuery.of(context).size.height
                  : MediaQuery.of(context).size.width,
              child: Stack(
                children: [
                  // Full-screen view of the local camera, mirrored like a selfie.
                  Positioned.fill(
                    child: Container(
                      decoration: BoxDecoration(color: Colors.black54),
                      child: RTCVideoView(_localRenderer, mirror: true),
                    ),
                  ),
                  // Small picture-in-picture view reserved for the remote stream.
                  Positioned(
                    left: 20.0,
                    top: 20.0,
                    child: Container(
                      width: 120.0,
                      height: 160.0,
                      decoration: BoxDecoration(color: Colors.black54),
                      child: RTCVideoView(_remoteRenderer),
                    ),
                  ),
                ],
              ),
            ),
          );
        },
      ),
    );
  }
}
This code does the following:
- Initializes two RTCVideoRenderer objects for the local and remote video streams.
- Requests camera and microphone access via navigator.mediaDevices.getUserMedia.
- Displays the local video stream, with a smaller view reserved for the remote stream.
- Releases the stream and renderers in dispose.
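As an optional extension to this screen, flutter_webrtc ships a Helper class for common device tasks. Here is a sketch of a front/rear camera toggle, assuming the switchCamera utility documented in the plugin's README:

// Flips the local video track between front and rear cameras.
Future<void> _switchCamera() async {
  final videoTracks = _localStream?.getVideoTracks();
  if (videoTracks != null && videoTracks.isNotEmpty) {
    await Helper.switchCamera(videoTracks[0]);
  }
}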
Step 5: Implement Signaling
WebRTC itself does not define signaling: peers need an out-of-band channel to exchange metadata such as session descriptions (SDP) and ICE candidates, and this is typically done over WebSockets. For simplicity, we’ll use a basic WebSocket implementation for signaling.
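Before wiring anything up, it helps to fix the wire format. This post uses three plain JSON envelopes, distinguished by a type field; the snippet below just prints them with placeholder values so you can see their shape:

import 'dart:convert';

void main() {
  // Placeholder values; real ones come from the peer connection.
  final offer = jsonEncode({'type': 'offer', 'sdp': '<offer sdp>'});
  final answer = jsonEncode({'type': 'answer', 'sdp': '<answer sdp>'});
  final candidate = jsonEncode({
    'type': 'candidate',
    'candidate': '<candidate line>',
    'sdpMid': '0',
    'sdpMLineIndex': 0,
  });
  print(offer);
  print(answer);
  print(candidate);
}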
Add WebSocket Dependency
dependencies:
  flutter:
    sdk: flutter
  flutter_webrtc: ^0.9.25
  sdp_transform: ^0.3.1
  web_socket_channel: ^2.4.0
  # Other dependencies as needed
Update the VideoCallScreen to include WebSocket signaling:
import 'dart:convert';
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';
import 'package:web_socket_channel/web_socket_channel.dart';

class VideoCallScreen extends StatefulWidget {
  @override
  _VideoCallScreenState createState() => _VideoCallScreenState();
}

class _VideoCallScreenState extends State<VideoCallScreen> {
  final RTCVideoRenderer _localRenderer = RTCVideoRenderer();
  final RTCVideoRenderer _remoteRenderer = RTCVideoRenderer();
  MediaStream? _localStream;
  RTCPeerConnection? _peerConnection;
  final _channel = WebSocketChannel.connect(
      Uri.parse('wss://your_signaling_server_url')); // Replace with your signaling server URL

  @override
  void initState() {
    super.initState();
    _init();
  }

  // Create the peer connection before requesting media so the local
  // tracks can be added to it, then start listening for signaling messages.
  Future<void> _init() async {
    await initRenderers();
    _peerConnection = await _createPeerConnection();
    await _getUserMedia();
    _listenForSignalingMessages();
  }
  Future<void> initRenderers() async {
    await _localRenderer.initialize();
    await _remoteRenderer.initialize();
  }

  Future<void> _getUserMedia() async {
    final Map<String, dynamic> mediaConstraints = {
      'audio': true,
      'video': {
        'mandatory': {
          'minWidth': '640',
          'minHeight': '480',
          'minFrameRate': '30',
        },
        'facingMode': 'user',
        'optional': [],
      }
    };
    try {
      MediaStream stream =
          await navigator.mediaDevices.getUserMedia(mediaConstraints);
      setState(() {
        _localStream = stream;
        _localRenderer.srcObject = stream;
      });
      // Add the local tracks to the peer connection so the remote
      // side can receive them.
      stream.getTracks().forEach((track) {
        _peerConnection?.addTrack(track, stream);
      });
    } catch (e) {
      print(e.toString());
    }
  }
  Future<RTCPeerConnection> _createPeerConnection() async {
    final Map<String, dynamic> configuration = {
      'iceServers': [
        {'urls': 'stun:stun.l.google.com:19302'},
      ]
    };
    final pc = await createPeerConnection(configuration);

    // Forward each locally gathered ICE candidate to the other peer.
    pc.onIceCandidate = (RTCIceCandidate candidate) {
      _channel.sink.add(jsonEncode({
        'type': 'candidate',
        'candidate': candidate.candidate,
        'sdpMid': candidate.sdpMid,
        'sdpMLineIndex': candidate.sdpMLineIndex,
      }));
    };

    // Render the remote stream as soon as a track arrives.
    pc.onTrack = (RTCTrackEvent event) {
      event.streams.forEach((stream) {
        setState(() {
          _remoteRenderer.srcObject = stream;
        });
      });
    };
    return pc;
  }
  // Caller side: create an offer and send it over the socket.
  Future<void> _createOffer() async {
    try {
      RTCSessionDescription s = await _peerConnection!.createOffer();
      await _peerConnection!.setLocalDescription(s);
      _channel.sink.add(jsonEncode({
        'type': 'offer',
        'sdp': s.sdp,
      }));
    } catch (e) {
      print(e.toString());
    }
  }

  // Callee side: answer an offer that has already been applied
  // as the remote description.
  Future<void> _createAnswer() async {
    try {
      RTCSessionDescription s = await _peerConnection!.createAnswer();
      await _peerConnection!.setLocalDescription(s);
      _channel.sink.add(jsonEncode({
        'type': 'answer',
        'sdp': s.sdp,
      }));
    } catch (e) {
      print(e.toString());
    }
  }

  Future<void> _setRemoteDescription(String sdp, String type) async {
    RTCSessionDescription description = RTCSessionDescription(sdp, type);
    await _peerConnection!.setRemoteDescription(description);
  }

  Future<void> _addCandidate(dynamic candidate) async {
    final iceCandidate = RTCIceCandidate(
      candidate['candidate'],
      candidate['sdpMid'],
      candidate['sdpMLineIndex'],
    );
    await _peerConnection!.addCandidate(iceCandidate);
  }

  // Dispatch incoming signaling messages to the appropriate handler.
  void _listenForSignalingMessages() {
    _channel.stream.listen((message) async {
      final data = jsonDecode(message as String);
      switch (data['type']) {
        case 'offer':
          await _setRemoteDescription(data['sdp'], 'offer');
          await _createAnswer();
          break;
        case 'answer':
          await _setRemoteDescription(data['sdp'], 'answer');
          break;
        case 'candidate':
          await _addCandidate(data);
          break;
      }
    });
  }
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('WebRTC Video Call'),
      ),
      body: OrientationBuilder(
        builder: (context, orientation) {
          return Center(
            child: Container(
              width: orientation == Orientation.portrait
                  ? MediaQuery.of(context).size.width
                  : MediaQuery.of(context).size.height,
              height: orientation == Orientation.portrait
                  ? MediaQuery.of(context).size.height
                  : MediaQuery.of(context).size.width,
              child: Stack(
                children: [
                  Positioned.fill(
                    child: Container(
                      decoration: BoxDecoration(color: Colors.black54),
                      child: RTCVideoView(_localRenderer, mirror: true),
                    ),
                  ),
                  Positioned(
                    left: 20.0,
                    top: 20.0,
                    child: Container(
                      width: 120.0,
                      height: 160.0,
                      decoration: BoxDecoration(color: Colors.black54),
                      child: RTCVideoView(_remoteRenderer),
                    ),
                  ),
                ],
              ),
            ),
          );
        },
      ),
      floatingActionButton: FloatingActionButton(
        onPressed: _createOffer,
        tooltip: 'Call',
        child: Icon(Icons.phone),
      ),
    );
  }
  @override
  void dispose() {
    _localRenderer.dispose();
    _remoteRenderer.dispose();
    _localStream?.dispose();
    _peerConnection?.close();
    _channel.sink.close();
    super.dispose();
  }
}
The code above integrates WebSocket communication to handle the exchange of offers, answers, and ICE candidates: _createOffer initiates the call, and _listenForSignalingMessages applies whatever the other peer sends back.
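Once a call is connected, common in-call controls amount to one-liners on the local stream. Here is a sketch of a mute button handler (the method name is mine, not part of the tutorial code):

// Toggles the microphone by flipping the enabled flag on the local
// audio track; the track keeps flowing but carries silence while off.
void _toggleMute() {
  final audioTracks = _localStream?.getAudioTracks();
  if (audioTracks != null && audioTracks.isNotEmpty) {
    audioTracks[0].enabled = !audioTracks[0].enabled;
  }
}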
Step 6: Set Up the Signaling Server
A signaling server is crucial for exchanging metadata between WebRTC peers. Here’s a simple Node.js-based signaling server:
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', ws => {
  ws.on('message', message => {
    console.log('received: %s', message);
    // Relay every message to all other connected clients. toString()
    // ensures the payload is forwarded as text (recent versions of
    // ws deliver incoming messages as Buffers).
    wss.clients.forEach(client => {
      if (client !== ws && client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });
  ws.on('close', () => console.log('Client disconnected'));
});

console.log('Signaling server started on port 8080');
To run the signaling server:
- Save the code to a file (e.g., server.js).
- Install the ws dependency with npm install ws.
- Open a terminal, navigate to the directory, and run node server.js.
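To check the relay behavior before involving WebRTC at all, you can run a small Dart script that opens two sockets against the server; the second should receive whatever the first sends (localhost and port 8080 here assume the server above is running on the same machine):

import 'package:web_socket_channel/web_socket_channel.dart';

void main() async {
  // Two clients connected to the local signaling server.
  final a = WebSocketChannel.connect(Uri.parse('ws://localhost:8080'));
  final b = WebSocketChannel.connect(Uri.parse('ws://localhost:8080'));

  // The server relays every message to all *other* clients,
  // so b should see what a sends.
  b.stream.listen((message) => print('b received: $message'));

  // Give both sockets a moment to finish connecting before sending.
  await Future.delayed(const Duration(seconds: 1));
  a.sink.add('{"type":"ping"}');
}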
Step 7: Test the Application
Run the Flutter application on two different devices or emulators, making sure both point at the same signaling server URL (for local testing, replace the wss://your_signaling_server_url placeholder with your machine's address, e.g. ws://192.168.x.x:8080). When one device taps the call button, the other applies the offer and answers automatically, and video streams should be displayed on both screens.
Conclusion
Implementing WebRTC in Flutter for video calling involves setting up a Flutter project, adding the flutter_webrtc dependency, configuring platform-specific settings, implementing the UI, integrating WebSocket signaling, and setting up a signaling server. This guide provides a foundational understanding of how to build a WebRTC video calling application in Flutter. By following these steps, you can create real-time communication experiences in your Flutter apps.