In this tutorial, we'll cover WebRTC, the most reliable and widely used RTC framework; Flutter, the rapidly growing cross-platform development framework; and the Flutter-WebRTC plugin that brings them together. I'll explain how to create a Flutter-WebRTC app in just 7 steps.
Video calling has become a common medium of communication since the pandemic. We have seen a huge wave in the real-time communication (RTC) space, covering both audio and video. There are many use cases for RTC in modern businesses, such as video conferencing, real-time streaming, live commerce, education, telemedicine, surveillance, gaming, and so on.
Developers often ask the same question: how do I build real-time applications with minimal effort? (Me too!) If you ask such questions, then you are in the right place.
In this article, we will discuss the most reliable and widely used RTC framework, WebRTC; the rapidly growing cross-platform application development framework, Flutter; and how we can use WebRTC with the Flutter framework. We will also build a demo Flutter app with WebRTC.
What’s Flutter?
Flutter is a cross-platform app development framework based on the Dart programming language and developed by Google. You can build Android, iOS, web, and desktop apps from a single codebase with the Flutter framework. Flutter has a large community, which is one reason it is among the fastest-growing app development frameworks.
What’s WebRTC?
WebRTC is an open-source framework for real-time communication (audio, video, and generic data) adopted by the majority of browsers, and it can also be used on native platforms such as Android, iOS, macOS, Linux, Windows, and so on.
WebRTC relies on three main APIs:
- getUserMedia: captures local audio and video media.
- RTCPeerConnection: establishes a connection with another peer.
- RTCDataChannel: creates a channel for exchanging generic data.
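To make these three APIs concrete, here is a minimal sketch (not part of the demo app we build later) that captures local media, attaches it to a peer connection, and opens a data channel using the flutter_webrtc plugin introduced below. The STUN URL and the "chat" channel label are illustrative assumptions.

import 'dart:developer';
import 'package:flutter_webrtc/flutter_webrtc.dart';

// Minimal sketch of the three core WebRTC APIs via the flutter_webrtc plugin.
Future<void> webrtcBasics() async {
  // 1. getUserMedia: capture local audio and video.
  final MediaStream localStream =
      await navigator.mediaDevices.getUserMedia({'audio': true, 'video': true});

  // 2. RTCPeerConnection: create a connection and add the local tracks to it.
  final RTCPeerConnection pc = await createPeerConnection({
    'iceServers': [
      {'urls': 'stun:stun.l.google.com:19302'} // assumed public STUN server
    ]
  });
  for (final track in localStream.getTracks()) {
    await pc.addTrack(track, localStream);
  }

  // 3. RTCDataChannel: open a channel for arbitrary data exchange.
  final RTCDataChannel channel =
      await pc.createDataChannel('chat', RTCDataChannelInit());
  channel.onMessage = (message) => log('Received: ${message.text}');
}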
What’s Flutter-WebRTC?
Flutter-WebRTC is a plugin for the Flutter framework that enables real-time communication (RTC) capabilities in web and mobile applications. It exposes the collection of communication protocols and APIs that allow direct communication between web browsers and mobile applications without third-party plugins or software. With Flutter-WebRTC, you can easily build video call applications without dealing with the complexities of the underlying technologies.
How does WebRTC work?
To understand how WebRTC works, we need to understand the following technologies.
1. Signalling
WebRTC enables peer-to-peer communication over the web, even though a peer initially has no idea where the other peers are or how to connect to and communicate with them.
To establish a connection between peers, WebRTC needs the clients to exchange metadata and coordinate with each other; this process is called signalling. It also lets peers communicate across firewalls and work with NATs (Network Address Translators). The technology most commonly used for signalling is WebSocket, which allows bidirectional communication between the peers and the signalling server.
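As a rough sketch of the idea (assuming the socket.io-based signalling server from this tutorial's repository, with a placeholder URL and caller IDs), the caller and callee never address each other directly; they only emit and listen for events on the shared server:

import 'package:socket_io_client/socket_io_client.dart';

// Sketch of signalling: peers exchange metadata through a shared server.
// The event names ("makeCall", "newCall") match the demo app built later
// in this tutorial; the URL and caller IDs are placeholders.
void signallingSketch() {
  // Each client identifies itself with a callerId when connecting.
  final socket = io('http://localhost:3000', {
    'transports': ['websocket'],
    'query': {'callerId': '111111'},
  });
  socket.connect();

  // Caller side: send an SDP offer addressed to another peer.
  socket.emit('makeCall', {
    'calleeId': '222222',
    'sdpOffer': {'type': 'offer', 'sdp': '...'},
  });

  // Callee side: the server relays the offer as a "newCall" event.
  socket.on('newCall', (data) {
    print('Incoming call from ${data["callerId"]}');
  });
}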
2. SDP
SDP stands for Session Description Protocol. It describes session information such as:
- sessionId
- session expiry time
- audio/video encoding, codecs, encryption, etc.
- audio/video IP address and port
Suppose two peers, Client A and Client B, are to be connected over WebRTC. Client A generates and sends an SDP offer (session-related information such as the codecs it supports) to Client B, and Client B responds with an SDP answer (accepting or rejecting the offer). SDP is thus used for negotiation between the two peers.
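As a minimal sketch of that negotiation with flutter_webrtc (assuming two RTCPeerConnection objects already exist and that in a real app the descriptions travel between devices over your signalling channel):

import 'package:flutter_webrtc/flutter_webrtc.dart';

// Sketch of the SDP offer/answer exchange between two peer connections.
Future<void> negotiate(
    RTCPeerConnection caller, RTCPeerConnection callee) async {
  // Peer A creates an offer describing its media capabilities...
  final RTCSessionDescription offer = await caller.createOffer();
  await caller.setLocalDescription(offer);

  // ...the offer travels over signalling, and Peer B applies it...
  await callee.setRemoteDescription(offer);

  // ...then Peer B answers, accepting (or constraining) the session...
  final RTCSessionDescription answer = await callee.createAnswer();
  await callee.setLocalDescription(answer);

  // ...and the answer travels back to Peer A to complete the negotiation.
  await caller.setRemoteDescription(answer);
}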
3. ICE
ICE stands for Interactive Connectivity Establishment; it is what allows a peer to connect with other peers. There are many reasons why a direct connection between peers might not work: firewalls can block incoming connections, most devices do not have a public IP address of their own, and some routers do not allow peers to connect directly, so data must be relayed through a server. ICE uses STUN and TURN servers to work around these problems.
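A typical ICE configuration looks like the sketch below: the STUN entry is a public Google server, while the TURN entry is a placeholder — you would substitute your own server and credentials.

import 'package:flutter_webrtc/flutter_webrtc.dart';

// Sketch of an ICE server configuration passed to createPeerConnection.
Future<RTCPeerConnection> createConnectionWithIce() {
  return createPeerConnection({
    'iceServers': [
      // STUN: lets a peer discover its public IP/port from behind a NAT.
      {'urls': 'stun:stun.l.google.com:19302'},
      // TURN: relays media when a direct peer-to-peer path is impossible.
      {
        'urls': 'turn:turn.example.com:3478', // placeholder server
        'username': 'user', // placeholder credentials
        'credential': 'secret',
      },
    ],
  });
}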
Let's Start with the Flutter-WebRTC Project
First of all, we need to set up the signalling server.
- Clone the Flutter-WebRTC repository
git clone https://github.com/videosdk-live/webrtc.git
- Go to the webrtc-signalling-server directory and install its dependencies
cd webrtc-signalling-server && npm install
- Start the signalling server for the Flutter-WebRTC app
npm run start
- Flutter-WebRTC Project Structure
lib
├── main.dart
├── services
│   └── signalling.service.dart
└── screens
    ├── join_screen.dart
    └── call_screen.dart
7 Steps to Build a Flutter-WebRTC Video Calling App
Step 1: Create the Flutter-WebRTC app project
flutter create flutter_webrtc_app
Step 2: Add project dependencies for the Flutter-WebRTC app
flutter pub add flutter_webrtc socket_io_client
Step 3: Flutter-WebRTC setup for iOS and Android
- Flutter-WebRTC iOS Setup
Add the following lines to your Info.plist file, located at /ios/Runner/Info.plist.
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) Camera Usage!</string>
<key>NSMicrophoneUsageDescription</key>
<string>$(PRODUCT_NAME) Microphone Usage!</string>
These entries allow your app to access the camera and microphone.
Note: Refer to the flutter_webrtc plugin documentation if you have trouble with the iOS setup.
- Flutter-WebRTC Android Setup
Add the following lines to AndroidManifest.xml, located at /android/app/src/main/AndroidManifest.xml.
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.BLUETOOTH" android:maxSdkVersion="30" />
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" android:maxSdkVersion="30" />
If necessary, increase the minSdkVersion of defaultConfig to 23 in the app-level build.gradle file.
Step 4: Create SignallingService for Flutter-WebRTC App
The SignallingService will handle communication with the signalling server. Here, we use the socket.io client to connect to the socket.io server, which is essentially a WebSocket server.
import 'dart:developer';
import 'package:socket_io_client/socket_io_client.dart';

class SignallingService {
  // instance of Socket
  Socket? socket;

  SignallingService._();
  static final instance = SignallingService._();

  init({required String websocketUrl, required String selfCallerID}) {
    // init Socket
    socket = io(websocketUrl, {
      "transports": ['websocket'],
      "query": {"callerId": selfCallerID}
    });

    // listen onConnect event
    socket!.onConnect((data) {
      log("Socket connected !!");
    });

    // listen onConnectError event
    socket!.onConnectError((data) {
      log("Connect Error $data");
    });

    // connect socket
    socket!.connect();
  }
}
Step 5: Create JoinScreen for Flutter-WebRTC App
JoinScreen will be a StatefulWidget that allows the user to join a session. Using this screen, a user can start a session or join a session when another user calls them using their caller ID.
import 'package:flutter/material.dart';
import 'call_screen.dart';
import '../services/signalling.service.dart';

class JoinScreen extends StatefulWidget {
  final String selfCallerId;

  const JoinScreen({super.key, required this.selfCallerId});

  @override
  State<JoinScreen> createState() => _JoinScreenState();
}

class _JoinScreenState extends State<JoinScreen> {
  dynamic incomingSDPOffer;
  final remoteCallerIdTextEditingController = TextEditingController();

  @override
  void initState() {
    super.initState();

    // listen for incoming video call
    SignallingService.instance.socket!.on("newCall", (data) {
      if (mounted) {
        // set SDP Offer of incoming call
        setState(() => incomingSDPOffer = data);
      }
    });
  }

  // join Call
  _joinCall({
    required String callerId,
    required String calleeId,
    dynamic offer,
  }) {
    Navigator.push(
      context,
      MaterialPageRoute(
        builder: (_) => CallScreen(
          callerId: callerId,
          calleeId: calleeId,
          offer: offer,
        ),
      ),
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      backgroundColor: Theme.of(context).colorScheme.background,
      appBar: AppBar(
        centerTitle: true,
        title: const Text("P2P Call App"),
      ),
      body: SafeArea(
        child: Stack(
          children: [
            Center(
              child: SizedBox(
                width: MediaQuery.of(context).size.width * 0.9,
                child: Column(
                  mainAxisAlignment: MainAxisAlignment.center,
                  children: [
                    TextField(
                      controller: TextEditingController(
                        text: widget.selfCallerId,
                      ),
                      readOnly: true,
                      textAlign: TextAlign.center,
                      enableInteractiveSelection: false,
                      decoration: InputDecoration(
                        labelText: "Your Caller ID",
                        border: OutlineInputBorder(
                          borderRadius: BorderRadius.circular(10.0),
                        ),
                      ),
                    ),
                    const SizedBox(height: 12),
                    TextField(
                      controller: remoteCallerIdTextEditingController,
                      textAlign: TextAlign.center,
                      decoration: InputDecoration(
                        hintText: "Remote Caller ID",
                        alignLabelWithHint: true,
                        border: OutlineInputBorder(
                          borderRadius: BorderRadius.circular(10.0),
                        ),
                      ),
                    ),
                    const SizedBox(height: 24),
                    ElevatedButton(
                      style: ElevatedButton.styleFrom(
                        side: const BorderSide(color: Colors.white30),
                      ),
                      child: const Text(
                        "Invite",
                        style: TextStyle(
                          fontSize: 18,
                          color: Colors.white,
                        ),
                      ),
                      onPressed: () {
                        _joinCall(
                          callerId: widget.selfCallerId,
                          calleeId: remoteCallerIdTextEditingController.text,
                        );
                      },
                    ),
                  ],
                ),
              ),
            ),
            if (incomingSDPOffer != null)
              Positioned(
                child: ListTile(
                  title: Text(
                    "Incoming Call from ${incomingSDPOffer["callerId"]}",
                  ),
                  trailing: Row(
                    mainAxisSize: MainAxisSize.min,
                    children: [
                      IconButton(
                        icon: const Icon(Icons.call_end),
                        color: Colors.redAccent,
                        onPressed: () {
                          setState(() => incomingSDPOffer = null);
                        },
                      ),
                      IconButton(
                        icon: const Icon(Icons.call),
                        color: Colors.greenAccent,
                        onPressed: () {
                          _joinCall(
                            callerId: incomingSDPOffer["callerId"]!,
                            calleeId: widget.selfCallerId,
                            offer: incomingSDPOffer["sdpOffer"],
                          );
                        },
                      )
                    ],
                  ),
                ),
              ),
          ],
        ),
      ),
    );
  }
}
Step 6: Create CallScreen for Flutter-WebRTC App
In CallScreen, we will show the local stream of the user, the remote stream of the other user, and controls such as toggleCamera, toggleMic, switchCamera, and endCall. Here, we will establish the RTCPeerConnection between the peers, create the SDP offer and SDP answer, and exchange ICE candidate data over the signalling server (socket.io).
import 'package:flutter/material.dart';
import 'package:flutter_webrtc/flutter_webrtc.dart';
import '../services/signalling.service.dart';

class CallScreen extends StatefulWidget {
  final String callerId, calleeId;
  final dynamic offer;

  const CallScreen({
    super.key,
    this.offer,
    required this.callerId,
    required this.calleeId,
  });

  @override
  State<CallScreen> createState() => _CallScreenState();
}

class _CallScreenState extends State<CallScreen> {
  // socket instance
  final socket = SignallingService.instance.socket;

  // videoRenderer for localPeer
  final _localRTCVideoRenderer = RTCVideoRenderer();

  // videoRenderer for remotePeer
  final _remoteRTCVideoRenderer = RTCVideoRenderer();

  // mediaStream for localPeer
  MediaStream? _localStream;

  // RTC peer connection
  RTCPeerConnection? _rtcPeerConnection;

  // list of rtcCandidates to be sent over signalling
  List<RTCIceCandidate> rtcIceCadidates = [];

  // media status
  bool isAudioOn = true, isVideoOn = true, isFrontCameraSelected = true;

  @override
  void initState() {
    // initializing renderers
    _localRTCVideoRenderer.initialize();
    _remoteRTCVideoRenderer.initialize();

    // setup Peer Connection
    _setupPeerConnection();
    super.initState();
  }

  @override
  void setState(fn) {
    if (mounted) {
      super.setState(fn);
    }
  }

  _setupPeerConnection() async {
    // create peer connection
    _rtcPeerConnection = await createPeerConnection({
      'iceServers': [
        {
          'urls': [
            'stun:stun1.l.google.com:19302',
            'stun:stun2.l.google.com:19302'
          ]
        }
      ]
    });

    // listen for remotePeer mediaTrack event
    _rtcPeerConnection!.onTrack = (event) {
      _remoteRTCVideoRenderer.srcObject = event.streams[0];
      setState(() {});
    };

    // get localStream
    _localStream = await navigator.mediaDevices.getUserMedia({
      'audio': isAudioOn,
      'video': isVideoOn
          ? {'facingMode': isFrontCameraSelected ? 'user' : 'environment'}
          : false,
    });

    // add mediaTrack to peerConnection
    _localStream!.getTracks().forEach((track) {
      _rtcPeerConnection!.addTrack(track, _localStream!);
    });

    // set source for local video renderer
    _localRTCVideoRenderer.srcObject = _localStream;
    setState(() {});

    // for Incoming call
    if (widget.offer != null) {
      // listen for remote IceCandidate
      socket!.on("IceCandidate", (data) {
        String candidate = data["iceCandidate"]["candidate"];
        String sdpMid = data["iceCandidate"]["id"];
        int sdpMLineIndex = data["iceCandidate"]["label"];

        // add iceCandidate
        _rtcPeerConnection!.addCandidate(RTCIceCandidate(
          candidate,
          sdpMid,
          sdpMLineIndex,
        ));
      });

      // set SDP offer as remoteDescription for peerConnection
      await _rtcPeerConnection!.setRemoteDescription(
        RTCSessionDescription(widget.offer["sdp"], widget.offer["type"]),
      );

      // create SDP answer
      RTCSessionDescription answer = await _rtcPeerConnection!.createAnswer();

      // set SDP answer as localDescription for peerConnection
      _rtcPeerConnection!.setLocalDescription(answer);

      // send SDP answer to remote peer over signalling
      socket!.emit("answerCall", {
        "callerId": widget.callerId,
        "sdpAnswer": answer.toMap(),
      });
    }
    // for Outgoing Call
    else {
      // listen for local iceCandidates and add them to the list of IceCandidate
      _rtcPeerConnection!.onIceCandidate =
          (RTCIceCandidate candidate) => rtcIceCadidates.add(candidate);

      // when call is accepted by remote peer
      socket!.on("callAnswered", (data) async {
        // set SDP answer as remoteDescription for peerConnection
        await _rtcPeerConnection!.setRemoteDescription(
          RTCSessionDescription(
            data["sdpAnswer"]["sdp"],
            data["sdpAnswer"]["type"],
          ),
        );

        // send the generated iceCandidates to remote peer over signalling
        for (RTCIceCandidate candidate in rtcIceCadidates) {
          socket!.emit("IceCandidate", {
            "calleeId": widget.calleeId,
            "iceCandidate": {
              "id": candidate.sdpMid,
              "label": candidate.sdpMLineIndex,
              "candidate": candidate.candidate
            }
          });
        }
      });

      // create SDP Offer
      RTCSessionDescription offer = await _rtcPeerConnection!.createOffer();

      // set SDP offer as localDescription for peerConnection
      await _rtcPeerConnection!.setLocalDescription(offer);

      // make a call to remote peer over signalling
      socket!.emit('makeCall', {
        "calleeId": widget.calleeId,
        "sdpOffer": offer.toMap(),
      });
    }
  }

  _leaveCall() {
    Navigator.pop(context);
  }

  _toggleMic() {
    // change status
    isAudioOn = !isAudioOn;
    // enable or disable audio track
    _localStream?.getAudioTracks().forEach((track) {
      track.enabled = isAudioOn;
    });
    setState(() {});
  }

  _toggleCamera() {
    // change status
    isVideoOn = !isVideoOn;
    // enable or disable video track
    _localStream?.getVideoTracks().forEach((track) {
      track.enabled = isVideoOn;
    });
    setState(() {});
  }

  _switchCamera() {
    // change status
    isFrontCameraSelected = !isFrontCameraSelected;
    // switch camera
    _localStream?.getVideoTracks().forEach((track) {
      // ignore: deprecated_member_use
      track.switchCamera();
    });
    setState(() {});
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      backgroundColor: Theme.of(context).colorScheme.background,
      appBar: AppBar(
        title: const Text("P2P Call App"),
      ),
      body: SafeArea(
        child: Column(
          children: [
            Expanded(
              child: Stack(children: [
                RTCVideoView(
                  _remoteRTCVideoRenderer,
                  objectFit: RTCVideoViewObjectFit.RTCVideoViewObjectFitCover,
                ),
                Positioned(
                  right: 20,
                  bottom: 20,
                  child: SizedBox(
                    height: 150,
                    width: 120,
                    child: RTCVideoView(
                      _localRTCVideoRenderer,
                      mirror: isFrontCameraSelected,
                      objectFit:
                          RTCVideoViewObjectFit.RTCVideoViewObjectFitCover,
                    ),
                  ),
                )
              ]),
            ),
            Padding(
              padding: const EdgeInsets.symmetric(vertical: 12),
              child: Row(
                mainAxisAlignment: MainAxisAlignment.spaceAround,
                children: [
                  IconButton(
                    icon: Icon(isAudioOn ? Icons.mic : Icons.mic_off),
                    onPressed: _toggleMic,
                  ),
                  IconButton(
                    icon: const Icon(Icons.call_end),
                    iconSize: 30,
                    onPressed: _leaveCall,
                  ),
                  IconButton(
                    icon: const Icon(Icons.cameraswitch),
                    onPressed: _switchCamera,
                  ),
                  IconButton(
                    icon: Icon(isVideoOn ? Icons.videocam : Icons.videocam_off),
                    onPressed: _toggleCamera,
                  ),
                ],
              ),
            ),
          ],
        ),
      ),
    );
  }

  @override
  void dispose() {
    _localRTCVideoRenderer.dispose();
    _remoteRTCVideoRenderer.dispose();
    _localStream?.dispose();
    _rtcPeerConnection?.dispose();
    super.dispose();
  }
}
Step 7: Modify the code in main.dart
We will initialize the SignallingService with the websocketUrl (signalling server URL), generate a random callerId for the user, and show JoinScreen as the home screen.
import 'dart:math';
import 'package:flutter/material.dart';
import 'screens/join_screen.dart';
import 'services/signalling.service.dart';

void main() {
  // start videoCall app
  runApp(VideoCallApp());
}

class VideoCallApp extends StatelessWidget {
  VideoCallApp({super.key});

  // signalling server url
  final String websocketUrl = "WEB_SOCKET_SERVER_URL";

  // generate callerID of local user
  final String selfCallerID =
      Random().nextInt(999999).toString().padLeft(6, '0');

  @override
  Widget build(BuildContext context) {
    // init signalling service
    SignallingService.instance.init(
      websocketUrl: websocketUrl,
      selfCallerID: selfCallerID,
    );

    // return material app
    return MaterialApp(
      darkTheme: ThemeData.dark().copyWith(
        useMaterial3: true,
        colorScheme: const ColorScheme.dark(),
      ),
      themeMode: ThemeMode.dark,
      home: JoinScreen(selfCallerId: selfCallerID),
    );
  }
}
Woohoo! We finally did it.
Problems with P2P WebRTC
- Quality of service: quality decreases as the number of peer connections increases.
- Client-side computation: low-end devices cannot synchronize multiple incoming streams.
- Scalability: as the number of peers grows, it becomes very difficult for a client to handle the computation and network load while uploading its own media at the same time.
Solutions
- MCU (Multipoint Control Unit)
- SFU (Selective Forwarding Unit)
- Video SDK
Integrate with Video SDK
Video SDK is one of the most developer-friendly platforms for live video and audio SDKs. Video SDK makes integrating live video and audio into your Flutter project significantly simpler and faster. You can have a branded, customized, and programmable call up and running in no time with just a few lines of code.
In addition, Video SDK provides best-in-class customization, giving you complete control over layout and permissions. Plugins can be used to enhance the experience, and end-to-end call logs and quality data can be accessed directly from your Video SDK dashboard or via REST APIs. This level of detail lets developers debug any issues that arise during a call and improve their integrations for the best possible customer experience.
Alternatively, you can follow the quick start guide to create a demo Flutter project with Video SDK, or start with the code sample.