Mykhailo Lieibenson
Some web dude interested in JS, Nginx, Node.js, NW.js, WebRTC, distributed systems and DevOps to support development (SDLC).
* almost
At least in Google Chrome, on iOS and in Edge
WebRTC (Web Real-Time Communication) is a peer-to-peer technology with media
(audio, video, screen capture)
and data capabilities
(bi-directional data channels)
HTTPS
DTLS
Optional things like PGP
[Topology diagrams: full mesh, where every peer connects directly to every other peer; a star through a signaling relay; and a star through an SFU (Selective Forwarding Unit) that forwards media between peers]
Can be used outside of a browser with the native C++ implementation (webrtc.org) for mobile, IoT, server side, etc.
Chrome for Android supports it too.
Media device handling and engines
Codec implementations
Media transport layer built on proven protocols
Reliable, realtime, bidirectional data channels
RTP, RTCP and STUN multiplexed on one connection
RTP streams used to transport media
RTCP streams used to provide transport feedback (e.g. PLI, Picture Loss Indication)
STUN used to authenticate
(Used to exchange credentials negotiated in offer / answer)
STUN pings (transmitted every 500 ms)
RTP, RTCP are symmetric to keep NAT traversal pinholes open
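Multiplexing on one port works because the packet types are distinguishable by their first byte (RFC 5764, refined by RFC 7983). A minimal sketch of such a demultiplexer (the function name is mine, for illustration):

```javascript
// Classify a multiplexed packet by its first byte, using the
// ranges from RFC 7983:
//   0..3     -> STUN
//   20..63   -> DTLS
//   64..79   -> TURN channel data
//   128..191 -> RTP / RTCP
function classifyPacket(firstByte) {
  if (firstByte >= 0 && firstByte <= 3) return "stun";
  if (firstByte >= 20 && firstByte <= 63) return "dtls";
  if (firstByte >= 64 && firstByte <= 79) return "turn-channel";
  if (firstByte >= 128 && firstByte <= 191) return "rtp/rtcp";
  return "unknown";
}
```

RTP and RTCP themselves are then told apart by the RTCP packet-type field, which is why they can also share the same port.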
Some SDP and a lot of asynchronous events
Offer created by one party
Answered by the other
Async IP candidate gathering
Async connection establishment
ICE is used to poke holes in NATs.
v=0
o=- 3709720962883168841 3 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio
a=msid-semantic: WMS OwFf4ndJLqJTV8yAucuMC11MIsnKgmz5HrSH
m=audio 49997 RTP/SAVPF 111 103 104 0 8 106 105 13 126
c=IN IP4 54.72.232.91
a=rtcp:1 IN IP4 0.0.0.0
a=ice-ufrag:wG4d9cZxWJe+1/J3
...
v=0
o=- 120247245346114869 3 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE audio
a=msid-semantic: WMS 074F5EBE-9D32-C14B-84B95E6802A10EB0
m=audio 45001 RTP/SAVPF 103
c=IN IP4 10.39.32.26
a=rtcp:1 IN IP4 0.0.0.0
a=sendrecv
a=ssrc:3540656010 cname:kbjGX1Qis4+ZyDHn
...
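SDP is a simple line-based "type=value" format, with attributes written as "a=name:value", so pulling a field out of an offer or answer is straightforward. A minimal sketch (the helper name is mine, not part of any API):

```javascript
// Return the value of the first "a=<name>:<value>" attribute
// in an SDP blob, or null if it is not present.
function getSdpAttribute(sdp, name) {
  var prefix = "a=" + name + ":";
  var lines = sdp.split(/\r?\n/);
  for (var i = 0; i < lines.length; i++) {
    if (lines[i].indexOf(prefix) === 0) {
      return lines[i].slice(prefix.length);
    }
  }
  return null;
}

var sdp = "v=0\r\nm=audio 49997 RTP/SAVPF 111\r\na=ice-ufrag:wG4d9cZxWJe+1/J3\r\n";
getSdpAttribute(sdp, "ice-ufrag"); // "wG4d9cZxWJe+1/J3"
```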
In many slides I show Chrome-only examples with the "webkit" vendor prefix.
For Firefox, please use the "moz" prefix instead.
navigator.webkitGetUserMedia({
audio: true,
video: true
}, function(stream){
// handle stream
var url = URL.createObjectURL(stream);
// url is like blob:https%3A//localhost%3A3443/b4d040dd-633c-4bb1-9fb9-6fd98f0dca43
// get video element and set src-attribute using url from above
// add some filters or something...
}, function(err){
// handle errors
console.error("Oops", err);
});
// Or in Chrome Canary, Firefox and Edge
navigator.mediaDevices.getUserMedia({audio: true, video: true})
.then(console.log.bind(console))
.catch(console.error.bind(console))
Triggers the mic/cam access dialog to ask the user for permissions.
This happens per call:
if you request audio first and then video, you get two dialogs;
if you request both at the same time, you get one combined dialog.
If your web app is served over HTTP, getUserMedia triggers the access dialog on every call.
If it is served over HTTPS, the user's response is persisted.
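Because the prompt is per call, you can avoid a double dialog by merging your constraints into a single getUserMedia call. A hypothetical helper (mergeConstraints is my own name, not a standard API):

```javascript
// Merge several constraint objects into one, so a single
// getUserMedia call (and a single permission dialog) covers them all.
function mergeConstraints() {
  var merged = {};
  for (var i = 0; i < arguments.length; i++) {
    var c = arguments[i];
    for (var key in c) {
      if (c.hasOwnProperty(key)) merged[key] = c[key];
    }
  }
  return merged;
}

var constraints = mergeConstraints({audio: true}, {video: true});
// constraints is {audio: true, video: true} -> one combined dialog
// navigator.mediaDevices.getUserMedia(constraints).then(...);
```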
// get audio stream track
stream.getAudioTracks();
// locally mutes audio upstream (mic) without calling mute on audio element
stream.getAudioTracks()[0].enabled = false;
// get video stream track
stream.getVideoTracks();
// clones the stream
stream.clone();
// stops stream
stream.getTracks().forEach(function(track){
track.stop();
});
Stop your stream when you don't need it to suppress camera usage indicator.
MediaStreamTrack.getSources(function(devices){
// filter your devices if you want only audio or video
// because device enumeration gives you both
console.log(devices);
});
// device example
// SourceInfo
// {
// facing: "",
// id: "b6ab4c86bb2cb76b14535fa61fd2414e3f7b54325824a3da8915c7c508b05642",
// kind: "video",
// label: "HD Pro Webcam C920 (046d:082d)"
// }
// Or Promise-based API in Chrome & Firefox
navigator.mediaDevices.enumerateDevices().then(console.log.bind(console));
navigator.mediaDevices.getUserMedia({
audio: {
optional: [
{
// device id
sourceId: "b6ab4c86bb2cb76b14535fa61fd2414e3"
}
]
},
video: true
}).then(function(stream){
// handle stream
}).catch(function(err){
console.error(err);
});
navigator.mediaDevices.getUserMedia({
audio: true,
video: {
mandatory: {
minWidth: 320,
minHeight: 240,
maxWidth: 1280,
maxHeight: 720
}
}
}).then(...);
Please note that "mandatory" is an object,
while "optional" is an array of objects.
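To keep that distinction straight, here is a small helper (my own, purely for illustration, not part of the spec) that builds the legacy Chrome constraint shape:

```javascript
// Build a legacy-style video constraint object:
// "mandatory" is a single object, "optional" is an
// array of single-key objects.
function buildLegacyVideoConstraints(mandatory, optionalList) {
  var video = {};
  if (mandatory) video.mandatory = mandatory;
  if (optionalList && optionalList.length) video.optional = optionalList;
  return video;
}

var video = buildLegacyVideoConstraints(
  {minWidth: 320, minHeight: 240, maxWidth: 1280, maxHeight: 720},
  [{googNoiseReduction: true}]
);
// video.mandatory is an object, video.optional is an array
```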
navigator.mediaDevices.getUserMedia({
video: {
mandatory: {
maxWidth: 1280,
maxHeight: 720,
maxFrameRate: 3,
chromeMediaSource: 'desktop',
chromeMediaSourceId: id
}
}
}).then(...);
Captures desktops, windows, tabs.
chrome.desktopCapture.chooseDesktopMedia(['screen', 'window'], null, function(id){
// id is a stream id to pass as chromeMediaSourceId
});
Install the extension via the Chrome JS API or inline from HTML.
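Since chrome.desktopCapture only works from an extension, a minimal manifest sketch might look like this (name and file names are placeholders):

```json
{
  "name": "Screen Share Helper",
  "version": "0.1",
  "manifest_version": 2,
  "permissions": ["desktopCapture"],
  "background": {
    "scripts": ["background.js"],
    "persistent": false
  }
}
```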
new webkitRTCPeerConnection({
iceServers: [{
url: "stun:stun.l.google.com:19302"
},{
url: "turn:turn.citrixonline.com:5060",
username: "Alice",
credential: "LetMeIn"
},{
url: "turn:turn.citrixonline.com:443?transport=tcp",
username: "Bob"
}]
},{
mandatory: {
googIPv6: false
},
optional: [{
DtlsSrtpKeyAgreement: true
}]
});
PeerConnection is your transport between peers.
A PeerConnection runs its own signalling for STUN connectivity checks, but not for the handshake!
So the offer/answer must be exchanged separately,
for example over HTTP or WebSocket.
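Whatever transport you pick, the signaling messages are just serialized JSON. A minimal envelope sketch (the message shape here is my own convention, not part of WebRTC):

```javascript
// Wrap a signaling payload (offer, answer or ICE candidate) in a
// small JSON envelope so the remote side knows how to dispatch it.
function encodeSignal(kind, payload) {
  return JSON.stringify({kind: kind, payload: payload});
}

function decodeSignal(raw) {
  var msg = JSON.parse(raw);
  if (msg.kind !== "offer" && msg.kind !== "answer" && msg.kind !== "candidate") {
    throw new Error("Unknown signal kind: " + msg.kind);
  }
  return msg;
}

// e.g. over a WebSocket:
// ws.send(encodeSignal("offer", {type: "offer", sdp: "..."}));
// ws.onmessage = function(e){ var msg = decodeSignal(e.data); /* dispatch */ };
```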
var pc = new webkitRTCPeerConnection({iceServers: []});
// add local stream (upstream) to local peer connection
pc.addStream(stream);
// remove local stream from local peer connection
pc.removeStream(stream);
// after each add/remove operation peer connection should renegotiate handshake
// you can do it either manually or based on peer connection callback
pc.onnegotiationneeded = function(){};
pc.onaddstream = function(event){
// remote peer added a stream: event.stream
};
pc.onremovestream = function(event){
// remote peer removed a stream: event.stream
};
var pc = new webkitRTCPeerConnection({iceServers: []});
var dataChannel = pc.createDataChannel("my-channel", {
ordered: false, // do not guarantee order
maxRetransmitTime: 3000, // ms
});
dataChannel.onopen = function() {
dataChannel.send("Hi.");
};
dataChannel.onmessage = function(msg) {
console.log("New message received:", msg.data);
};
dataChannel.onerror = function(error){ ... };
dataChannel.onclose = function(){ ... };
Very similar to WebSocket API.
var pc = new webkitRTCPeerConnection({iceServers: []});
pc.onicecandidate = function(event){
if (!event.candidate){
// a null candidate means gathering has finished
// and there is nothing more to signal
return;
}
// signal event.candidate to the remote peer
};
function onRemoteIceCandidate(candidate){
// in case of Firefox use mozRTCIceCandidate
pc.addIceCandidate(new RTCIceCandidate(candidate));
}
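The candidate string itself is whitespace-separated, per RFC 5245: "candidate:&lt;foundation&gt; &lt;component&gt; &lt;transport&gt; &lt;priority&gt; &lt;ip&gt; &lt;port&gt; typ &lt;type&gt; ...". A minimal parser sketch of the core fields (the function name is mine):

```javascript
// Parse the core fields of an ICE candidate line, e.g.
// "candidate:842163049 1 udp 1677729535 203.0.113.7 34567 typ srflx"
function parseCandidate(line) {
  var parts = line.split(" ");
  return {
    foundation: parts[0].replace("candidate:", ""),
    component: parseInt(parts[1], 10),
    transport: parts[2].toLowerCase(),
    priority: parseInt(parts[3], 10),
    ip: parts[4],
    port: parseInt(parts[5], 10),
    type: parts[7] // parts[6] is the literal "typ"
  };
}

var c = parseCandidate("candidate:842163049 1 udp 1677729535 203.0.113.7 34567 typ srflx");
// c.transport === "udp", c.port === 34567, c.type === "srflx"
```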
var pc = new webkitRTCPeerConnection({iceServers: []});
pc.createOffer(function(offer){
// send offer to remote peer
// also you can set local description here
}, function(err){
console.error(err);
},{
mandatory: {
OfferToReceiveAudio: true,
OfferToReceiveVideo: true
}
});
// In Firefox constraints would look like this:
// {
// offerToReceiveAudio: true,
// offerToReceiveVideo: true
// }
// without additional "mandatory": {} wrapper
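For illustration, a tiny helper (my own, hypothetical) that converts the Chrome-style shape into the Firefox one:

```javascript
// Convert Chrome's legacy offer constraints
//   {mandatory: {OfferToReceiveAudio: true, OfferToReceiveVideo: true}}
// into Firefox's flat, lower-camel-case shape
//   {offerToReceiveAudio: true, offerToReceiveVideo: true}
function toFirefoxOfferOptions(chromeConstraints) {
  var mandatory = (chromeConstraints && chromeConstraints.mandatory) || {};
  return {
    offerToReceiveAudio: !!mandatory.OfferToReceiveAudio,
    offerToReceiveVideo: !!mandatory.OfferToReceiveVideo
  };
}

var ff = toFirefoxOfferOptions({mandatory: {OfferToReceiveAudio: true, OfferToReceiveVideo: true}});
// ff is {offerToReceiveAudio: true, offerToReceiveVideo: true}
```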
Firefox supports a Promise-based API for PeerConnection.
var pc = new webkitRTCPeerConnection({iceServers: []});
// For Firefox please use mozRTCSessionDescription
pc.setLocalDescription(new RTCSessionDescription(offer),
callback, errorCallback);
var pc = new webkitRTCPeerConnection({iceServers: []});
// handle offer on "second" peer
function onOfferReceived(offer) {
// offer should look like {"type":"offer","sdp":"..."}
pc.setRemoteDescription(new RTCSessionDescription(offer), function(){
pc.createAnswer(function(answer){
pc.setLocalDescription(new RTCSessionDescription(answer), function(){
// signal (send) answer back
}, function(err){
console.error(err);
});
}, function(err){
console.error(err);
});
}, function(err){
console.error(err);
});
}
Callback hell looks a bit better with the Promise-based API.
var pc = new webkitRTCPeerConnection({iceServers: []});
// handle answer on "first" peer
function onAnswerReceived(answer) {
// answer should look like {"type":"answer","sdp":"..."}
pc.setRemoteDescription(new RTCSessionDescription(answer), callback, errorCallback);
}
var pc = new webkitRTCPeerConnection({iceServers: []});
// current state of signalling
console.log(pc.signalingState);
pc.onsignalingstatechange = function(event){ ... };
// current state of ICE
console.log(pc.iceConnectionState);
pc.oniceconnectionstatechange = function(event){ ... };
WebRTC and ORTC will converge.
And it is already happening.
Project named WebRTC NV (Next Version)
ActiveX / NPAPI WebRTC plugin
Now you get the "*almost" part, huh? ;)
If you are interested in what we do and want to be part of it - talk to me later
By Mykhailo Lieibenson
Together we will dive into the world of peer-to-peer web real-time communication and learn how to build a video chat in the browser. Modern browsers are so powerful that they can easily handle audio, video, screen and file sharing right in the same tab, letting you keep in touch with your colleagues and have a great conference experience without any specific software installed. Just your browser.