meonardo / janus-videoroom-ios
An iOS AppRTC example for the Janus videoroom plugin, written in Swift.
Hi, I noticed that the buttons above automatically change state when the camera is published or unpublished. When I join as a subscriber the buttons are disabled, but after publishing all of them become interactive, without any click handler being triggered. I tried to find the code related to these buttons but couldn't. Can you give me some advice? It seems like magic to me.
Hi, I'm trying to disconnect the socket when leaving the call, using:
@objc private func roomStateDidChange(_ sender: Notification) {
    guard let roomState = sender.object as? JanusRoomState else { return }
    ProgressHUD.dismiss()
    if roomState == .left {
        roomManager.disconnect()
        dismiss(animated: true, completion: nil)
    }
}
But while debugging I found that the connected state is reported again; I don't know why. Here is the log:
==============> Send WebSocket Message: {"janus":"destroy","transaction":"Destroy","session_id":5125526605993175}
"peerConnection new connection state: disconnected"
Connection state did change: disconnected
<============== Receive WebSocket Message: {
"janus": "success",
"session_id": 5125526605993175,
"transaction": "Destroy"
}
2021-10-10 16:06:22.092131+0700 janus-videoroom-example[15297:524784] [tcp] tcp_input [C1.1:2] flags=[R] seq=4068253224, ack=0, win=0 state=CLOSED rcv_nxt=4068253224, snd_una=3875245229
VideoViewController is deinit...
WebRTCClient is deinit...
It seems the socket is not closed immediately but reconnects instead. Hi, can you help? Thanks.
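One possible cause is racing the socket teardown against the "destroy" round trip: the log shows the "success" reply arriving after the disconnect has already begun. A minimal sketch, assuming a URLSessionWebSocketTask-backed transport (the class, property, and method names here are mine, not the project's API), of closing the socket only after the server acknowledges the destroy:

```swift
import Foundation

// Hypothetical sketch: defer closing the WebSocket until the server has
// acknowledged the "destroy" request, instead of disconnecting as soon as
// the room state changes. `JanusSession` and its members are assumptions.
final class JanusSession {
    private let socket: URLSessionWebSocketTask
    private var pendingDestroyTransaction: String?

    init(socket: URLSessionWebSocketTask) {
        self.socket = socket
    }

    func leave(sessionId: Int64) {
        let transaction = "Destroy"
        pendingDestroyTransaction = transaction
        let payload: [String: Any] = ["janus": "destroy",
                                      "transaction": transaction,
                                      "session_id": sessionId]
        let data = try! JSONSerialization.data(withJSONObject: payload)
        socket.send(.data(data)) { _ in }
    }

    /// Call this from the WebSocket receive loop for every incoming message.
    func handle(message: [String: Any]) {
        guard let janus = message["janus"] as? String,
              let transaction = message["transaction"] as? String,
              janus == "success",
              transaction == pendingDestroyTransaction else { return }
        pendingDestroyTransaction = nil
        // The server confirmed the session is destroyed; now it is safe to close.
        socket.cancel(with: .normalClosure, reason: nil)
    }
}
```

With this ordering, the `tcp_input ... state=CLOSED` noise after the "success" reply should no longer appear, because nothing touches the socket between sending "destroy" and receiving its acknowledgement.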
Hi, I want to share the screen right when first joining, so I tried to modify your code like below:
private func createVideoTrack() -> RTCVideoTrack {
    let videoSource = WebRTCClient.factory.videoSource()
    if JanusRoomManager.shared.isBroadcasting {
        let videoCapturer = ScreenSampleCapturer(delegate: videoSource)
        self.videoCapturer = videoCapturer
        wormhole = Wormhole(appGroup: APPGROUP)
        wormhole.passMessage(videoCapturer, with: "kExternalSampleCapturerDidCreateNote") { error in
            print("send externalSampleCapturerDidCreateNote: \(error)")
        }
    } else {
        let videoCapturer = RTCCameraVideoCapturer(delegate: videoSource)
        videoCapturer.rotationDelegate = self
        self.videoCapturer = videoCapturer
    }
    let videoTrack = WebRTCClient.factory.videoTrack(with: videoSource, trackId: "video0")
    return videoTrack
}
I checked the log and everything seems to work normally, and I receive the "publisherDidJoin" delegate callback for the local publisher, but the video is still a black screen and I don't know why. Can you help me with this problem? Thanks.
Here is the Janus log:
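A black track usually means no frames ever reach the RTCVideoSource: the broadcast extension runs in a separate process, so a live capturer object cannot be handed over via Wormhole (only serialized messages cross the app-group boundary). A hedged sketch, assuming a `ScreenSampleCapturer` subclass of RTCVideoCapturer (the `push` method name is mine), of forwarding ReplayKit sample buffers to the source:

```swift
import WebRTC
import ReplayKit

/// Hypothetical sketch of a screen capturer that forwards ReplayKit sample
/// buffers to the RTCVideoSource acting as its delegate.
final class ScreenSampleCapturer: RTCVideoCapturer {
    /// Call this with every video CMSampleBuffer received in
    /// processSampleBuffer(_:with:) of the broadcast handler (or with buffers
    /// relayed from the extension via the app group).
    func push(_ sampleBuffer: CMSampleBuffer) {
        guard CMSampleBufferIsValid(sampleBuffer),
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let rtcBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
        let frame = RTCVideoFrame(buffer: rtcBuffer,
                                  rotation: ._0,
                                  timeStampNs: Int64(seconds * Double(NSEC_PER_SEC)))
        // Without this call the negotiated track produces no frames,
        // which shows up on the remote side as a black video.
        delegate?.capturer(self, didCapture: frame)
    }
}
```

If `push` is never invoked in your setup, the signaling still completes (hence "publisherDidJoin" fires) but the video stays black.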
==============> Send WebSocket Message: {"janus":"create","transaction":"Create","room":83885488671}
<============== success Message: {
"janus": "success",
"transaction": "Create",
"data": {
"id": 1210586025144688
}
}
==============> Send WebSocket Message: {"session_id":1210586025144688,"plugin":"janus.plugin.videoroom","janus":"attach","transaction":"Attach"}
<============== success Message: {
"janus": "success",
"session_id": 1210586025144688,
"transaction": "Attach",
"data": {
"id": 2140242295233091
}
}
==============> Send WebSocket Message: {"session_id":1210586025144688,"handle_id":2140242295233091,"body":{"ptype":"publisher","display":"16c69c8f-0723-4cbf-bd49-567492bbba47_8YOJfqM24kzD7EZHOyfoAQVXMOBILEIOS","pin":"lemftmwlbfmophfm3806WXIYAAEVIDMDHJKR","request":"join","room":83885488671},"janus":"message","transaction":"JoinRoom"}
<============== event Message: {
"janus": "event",
"session_id": 1210586025144688,
"transaction": "JoinRoom",
"sender": 2140242295233091,
"plugindata": {
"plugin": "janus.plugin.videoroom",
"data": {
"videoroom": "joined",
"room": 83885488671,
"description": "Room 83885488671",
"id": 6056358345907222,
"private_id": 2053271671,
"publishers": [],
"attendees": [
{
"id": 71997261593985,
"display": "8ff5766b-cb8b-4678-842e-88baaf45be78_viewer"
}
]
}
}
}
==============> Send WebSocket Message: {"session_id":1210586025144688,"handle_id":2140242295233091,"body":{"request":"configure","audio":true,"video":true},"jsep":{"type":"offer","sdp":"v=0\r\no=- 4495793928468605710 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE 0 1\r\na=extmap-allow-mixed\r\na=msid-semantic: WMS 2140242295233091\r\nm=audio 9 UDP\/TLS\/RTP\/SAVPF 111 103 104 9 102 0 8 106 105 13 110 112 113 126\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=ice-ufrag:B4e0\r\na=ice-pwd:+14uS4b+qI0c86MBp8hD1Ugs\r\na=ice-options:trickle renomination\r\na=fingerprint:sha-256 41:33:16:68:41:48:1F:BD:76:58:B3:11:F2:65:3E:9B:CC:DE:0C:81:3D:DF:C8:29:B3:B0:DC:65:6A:20:FF:2A\r\na=setup:actpass\r\na=mid:0\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=extmap:2 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/abs-send-time\r\na=extmap:3 http:\/\/www.ietf.org\/id\/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=extmap:5 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id\r\na=extmap:6 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id\r\na=sendrecv\r\na=msid:2140242295233091 audio0\r\na=rtcp-mux\r\na=rtpmap:111 opus\/48000\/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:103 ISAC\/16000\r\na=rtpmap:104 ISAC\/32000\r\na=rtpmap:9 G722\/8000\r\na=rtpmap:102 ILBC\/8000\r\na=rtpmap:0 PCMU\/8000\r\na=rtpmap:8 PCMA\/8000\r\na=rtpmap:106 CN\/32000\r\na=rtpmap:105 CN\/16000\r\na=rtpmap:13 CN\/8000\r\na=rtpmap:110 telephone-event\/48000\r\na=rtpmap:112 telephone-event\/32000\r\na=rtpmap:113 telephone-event\/16000\r\na=rtpmap:126 telephone-event\/8000\r\na=ssrc:2747753570 cname:GVOUAvNpEaB2EaK1\r\na=ssrc:2747753570 msid:2140242295233091 audio0\r\na=ssrc:2747753570 mslabel:2140242295233091\r\na=ssrc:2747753570 label:audio0\r\nm=video 9 UDP\/TLS\/RTP\/SAVPF 96 97 98 99 100 101 127 123 35 36 125 122 124\r\nc=IN IP4 0.0.0.0\r\na=rtcp:9 IN IP4 
0.0.0.0\r\na=ice-ufrag:B4e0\r\na=ice-pwd:+14uS4b+qI0c86MBp8hD1Ugs\r\na=ice-options:trickle renomination\r\na=fingerprint:sha-256 41:33:16:68:41:48:1F:BD:76:58:B3:11:F2:65:3E:9B:CC:DE:0C:81:3D:DF:C8:29:B3:B0:DC:65:6A:20:FF:2A\r\na=setup:actpass\r\na=mid:1\r\na=extmap:14 urn:ietf:params:rtp-hdrext:toffset\r\na=extmap:2 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/abs-send-time\r\na=extmap:13 urn:3gpp:video-orientation\r\na=extmap:3 http:\/\/www.ietf.org\/id\/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:12 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/playout-delay\r\na=extmap:11 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/video-content-type\r\na=extmap:7 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/video-timing\r\na=extmap:8 http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/color-space\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=extmap:5 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id\r\na=extmap:6 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id\r\na=sendrecv\r\na=msid:2140242295233091 video0\r\na=rtcp-mux\r\na=rtcp-rsize\r\na=rtpmap:96 H264\/90000\r\na=rtcp-fb:96 goog-remb\r\na=rtcp-fb:96 transport-cc\r\na=rtcp-fb:96 ccm fir\r\na=rtcp-fb:96 nack\r\na=rtcp-fb:96 nack pli\r\na=fmtp:96 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=640c2a\r\na=rtpmap:97 rtx\/90000\r\na=fmtp:97 apt=96\r\na=rtpmap:98 H264\/90000\r\na=rtcp-fb:98 goog-remb\r\na=rtcp-fb:98 transport-cc\r\na=rtcp-fb:98 ccm fir\r\na=rtcp-fb:98 nack\r\na=rtcp-fb:98 nack pli\r\na=fmtp:98 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e02a\r\na=rtpmap:99 rtx\/90000\r\na=fmtp:99 apt=98\r\na=rtpmap:100 VP8\/90000\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=rtpmap:101 rtx\/90000\r\na=fmtp:101 apt=100\r\na=rtpmap:127 VP9\/90000\r\na=rtcp-fb:127 goog-remb\r\na=rtcp-fb:127 transport-cc\r\na=rtcp-fb:127 ccm fir\r\na=rtcp-fb:127 
nack\r\na=rtcp-fb:127 nack pli\r\na=rtpmap:123 rtx\/90000\r\na=fmtp:123 apt=127\r\na=rtpmap:35 AV1X\/90000\r\na=rtcp-fb:35 goog-remb\r\na=rtcp-fb:35 transport-cc\r\na=rtcp-fb:35 ccm fir\r\na=rtcp-fb:35 nack\r\na=rtcp-fb:35 nack pli\r\na=rtpmap:36 rtx\/90000\r\na=fmtp:36 apt=35\r\na=rtpmap:125 red\/90000\r\na=rtpmap:122 rtx\/90000\r\na=fmtp:122 apt=125\r\na=rtpmap:124 ulpfec\/90000\r\na=ssrc-group:FID 2233140089 1784273287\r\na=ssrc:2233140089 cname:GVOUAvNpEaB2EaK1\r\na=ssrc:2233140089 msid:2140242295233091 video0\r\na=ssrc:2233140089 mslabel:2140242295233091\r\na=ssrc:2233140089 label:video0\r\na=ssrc:1784273287 cname:GVOUAvNpEaB2EaK1\r\na=ssrc:1784273287 msid:2140242295233091 video0\r\na=ssrc:1784273287 mslabel:2140242295233091\r\na=ssrc:1784273287 label:video0\r\n"},"janus":"message","transaction":"Configure"}
Discovered local candidate: 2140242295233091 - 8623880194215298
<============== event Message: {
"janus": "event",
"session_id": 1210586025144688,
"transaction": "Configure",
"sender": 2140242295233091,
"plugindata": {
"plugin": "janus.plugin.videoroom",
"data": {
"videoroom": "event",
"room": 83885488671,
"configured": "ok",
"audio_codec": "opus",
"video_codec": "vp8",
"streams": [
{
"type": "audio",
"mindex": 0,
"mid": "0",
"codec": "opus"
},
{
"type": "video",
"mindex": 1,
"mid": "1",
"codec": "vp8"
}
]
}
},
"jsep": {
"type": "answer",
"sdp": "v=0\r\no=- 4495793928468605710 2 IN IP4 203.171.20.138\r\ns=VideoRoom 83885488671\r\nt=0 0\r\na=group:BUNDLE 0 1\r\na=ice-options:trickle\r\na=fingerprint:sha-256 71:B0:57:24:53:15:3E:ED:69:1D:1C:A0:79:55:3C:B7:C1:4E:EA:62:68:88:5E:42:48:66:EF:2E:38:09:79:FB\r\na=msid-semantic: WMS janus\r\nm=audio 9 UDP/TLS/RTP/SAVPF 111\r\nc=IN IP4 203.171.20.138\r\na=recvonly\r\na=mid:0\r\na=rtcp-mux\r\na=ice-ufrag:4p9f\r\na=ice-pwd:IzOKF2NrP5rnUOu+4F+cA9\r\na=ice-options:trickle\r\na=setup:active\r\na=rtpmap:111 opus/48000/2\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=extmap:5 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id\r\na=extmap:6 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id\r\na=msid:janus janus0\r\na=ssrc:1889070410 cname:janus\r\na=candidate:1 1 udp 2015363327 203.171.20.138 52647 typ host\r\na=candidate:1 1 udp 2015363327 203.171.20.138 52647 typ host\r\na=candidate:2 1 udp 2015364863 203.171.20.138 48900 typ host\r\na=candidate:2 1 udp 2015364863 172.17.0.1 48900 typ host\r\na=candidate:3 1 udp 2015363583 203.171.20.138 60023 typ host\r\na=candidate:3 1 udp 2015363583 172.18.0.1 60023 typ host\r\na=end-of-candidates\r\nm=video 9 UDP/TLS/RTP/SAVPF 100 101\r\nc=IN IP4 203.171.20.138\r\na=recvonly\r\na=mid:1\r\na=rtcp-mux\r\na=ice-ufrag:4p9f\r\na=ice-pwd:IzOKF2NrP5rnUOu+4F+cA9\r\na=ice-options:trickle\r\na=setup:active\r\na=rtpmap:100 VP8/90000\r\na=rtcp-fb:100 ccm fir\r\na=rtcp-fb:100 nack\r\na=rtcp-fb:100 nack pli\r\na=rtcp-fb:100 goog-remb\r\na=rtcp-fb:100 transport-cc\r\na=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:12 http://www.webrtc.org/experiments/rtp-hdrext/playout-delay\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=extmap:5 urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id\r\na=extmap:6 urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id\r\na=rtpmap:101 rtx/90000\r\na=fmtp:101 apt=100\r\na=msid:janus 
janus1\r\na=ssrc:3035784549 cname:janus\r\na=ssrc:622552844 cname:janus\r\na=candidate:1 1 udp 2015363327 203.171.20.138 52647 typ host\r\na=candidate:1 1 udp 2015363327 203.171.20.138 52647 typ host\r\na=candidate:2 1 udp 2015364863 203.171.20.138 48900 typ host\r\na=candidate:2 1 udp 2015364863 172.17.0.1 48900 typ host\r\na=candidate:3 1 udp 2015363583 203.171.20.138 60023 typ host\r\na=candidate:3 1 udp 2015363583 172.18.0.1 60023 typ host\r\na=end-of-candidates\r\n"
}
}
Discovered local candidate: 8623880194215298 - 8623880194215298
publisherDidJoin: id: 1210586025144688 - 16c69c8f-0723-4cbf-bd49-567492bbba47_8YOJfqM24kzD7EZHOyfoAQVXMOBILEIOS - username: null - liveStreamOrderId: -1 -isLocal: true
Hi, I want to display the local camera in a preview, but I can't find any property related to localSource. I saw you are using this method to attach to a renderer:
@available(iOSApplicationExtension, unavailable)
func attach(renderer: RTCVideoRenderer, isLocal: Bool) {
    if isLocal {
        updateVideoOrientationIfNecessary()
        startCaptureLocalVideo(renderer: renderer)
    } else {
        /// Reset for reused renderer
        videoRotation = ._0
        renderRemoteVideo(to: renderer)
    }
}
This method is written inside the WebRTCClient class; can you help me figure out how to do this? Thanks.
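Based on the method quoted above, a local preview should only need a renderer view passed with `isLocal: true`. A minimal sketch, assuming a `webRTCClient` instance and a `previewContainer` view (both names are mine):

```swift
import WebRTC

// Hypothetical sketch: attach the local camera feed to an RTCMTLVideoView.
let localRenderer = RTCMTLVideoView(frame: previewContainer.bounds)
localRenderer.videoContentMode = .scaleAspectFill
previewContainer.addSubview(localRenderer)

// isLocal: true routes the renderer through startCaptureLocalVideo, which
// starts the RTCCameraVideoCapturer and renders local camera frames,
// so no separate localSource property is needed.
webRTCClient.attach(renderer: localRenderer, isLocal: true)
```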
The container app should be launched first to fill in the room number before starting the Broadcast extension from Control Center.
Known issue:
In landscape mode, using RTCMTLVideoView as the localVideoTrack renderer and setting its transform will freeze the application. Please see the issue link. Alternatively, RTCEAGLVideoView can be used as the localVideoTrack renderer.
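The workaround reads roughly as follows; this is a sketch of the substitution, not the project's exact code (`view` is an assumption):

```swift
import WebRTC

// RTCEAGLVideoView (OpenGL-backed) tolerates a transform in landscape,
// where the Metal-backed RTCMTLVideoView freezes the app.
let localRenderer = RTCEAGLVideoView(frame: view.bounds)
// Setting a transform here is safe, unlike on RTCMTLVideoView.
localRenderer.transform = CGAffineTransform(rotationAngle: .pi / 2)
```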
Hi, I want to add filter effects like TikTok. I made a demo that successfully shows my custom image in another view, but the logo has the wrong orientation. Have you ever worked with this feature?
Here is my sample code:
extension WebRTCClient: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        // let image = UIImage(ciImage: ciImage, scale: 1.0, orientation: imageOrientation())
        let image = convert(cmage: ciImage)
        print("image: \(image.size)")
        let faceDetectionRequest = VNDetectFaceLandmarksRequest(completionHandler: { (request: VNRequest, error: Error?) in
            DispatchQueue.main.async { [weak self] in
                if let observations = request.results as? [VNFaceObservation], !observations.isEmpty {
                    for observation in observations {
                        let box = observation.boundingBox
                        let boxFrame = CGRect(x: box.origin.x * image.size.width,
                                              y: box.origin.y * image.size.height,
                                              width: box.width * image.size.width,
                                              height: box.height * image.size.height)
                        print("box: \(boxFrame)")
                        let logo = UIImage(named: "dog_nose")!.rotate(radians: .pi * 2)
                        if let newImage = self?.drawImageIn(image, logo, inRect: boxFrame) {
                            if let pxBuffer = self?.convertImageToBuffer(from: newImage) {
                                var newSampleBuffer: CMSampleBuffer? = nil
                                var timingInfo: CMSampleTimingInfo = .invalid
                                var videoInfo: CMVideoFormatDescription? = nil
                                CMVideoFormatDescriptionCreateForImageBuffer(allocator: nil, imageBuffer: pxBuffer, formatDescriptionOut: &videoInfo)
                                if let videoInfo = videoInfo {
                                    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault, imageBuffer: pxBuffer, dataReady: true, makeDataReadyCallback: nil, refcon: nil, formatDescription: videoInfo, sampleTiming: &timingInfo, sampleBufferOut: &newSampleBuffer)
                                    if let newSampleBuffer = newSampleBuffer {
                                        self?.outputCaptureDelegate?.captureOutput!(output, didOutput: newSampleBuffer, from: connection)
                                    }
                                }
                            }
                        }
                    }
                }
                self?.outputCaptureDelegate?.captureOutput!(output, didOutput: sampleBuffer, from: connection)
            }
        })
        let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: imageBuffer, orientation: exifOrientationForCurrentDeviceOrientation(), options: [:])
        do {
            try imageRequestHandler.perform([faceDetectionRequest])
        } catch {
            print(error.localizedDescription)
        }
    }
    func imageOrientation() -> UIImage.Orientation {
        let curDeviceOrientation = UIDevice.current.orientation
        var exifOrientation: UIImage.Orientation
        switch curDeviceOrientation {
        case .portraitUpsideDown: // Device oriented vertically, Home button on the top
            exifOrientation = .left
        case .landscapeLeft: // Device oriented horizontally, Home button on the right
            exifOrientation = .upMirrored
        case .landscapeRight: // Device oriented horizontally, Home button on the left
            exifOrientation = .down
        case .portrait: // Device oriented vertically, Home button on the bottom
            exifOrientation = .up
        default:
            exifOrientation = .up
        }
        return exifOrientation
    }

    func exifOrientationForCurrentDeviceOrientation() -> CGImagePropertyOrientation {
        return exifOrientationForDeviceOrientation(UIDevice.current.orientation)
    }

    func exifOrientationForDeviceOrientation(_ deviceOrientation: UIDeviceOrientation) -> CGImagePropertyOrientation {
        switch deviceOrientation {
        case .portraitUpsideDown:
            return .rightMirrored
        case .landscapeLeft:
            return .downMirrored
        case .landscapeRight:
            return .upMirrored
        default:
            return .leftMirrored
        }
    }
    func convert(cmage: CIImage) -> UIImage {
        let context = CIContext(options: nil)
        let cgImage = context.createCGImage(cmage, from: cmage.extent)!
        let image = UIImage(cgImage: cgImage)
        return image
    }

    func drawImageIn(_ image: UIImage, _ logo: UIImage, inRect: CGRect) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: image.size)
        return renderer.image { context in
            image.draw(in: CGRect(origin: .zero, size: image.size))
            logo.draw(in: inRect)
        }
    }

    func convertImageToBuffer(from image: UIImage) -> CVPixelBuffer? {
        let attrs = [
            String(kCVPixelBufferCGImageCompatibilityKey): true,
            String(kCVPixelBufferCGBitmapContextCompatibilityKey): true
        ] as [String: Any]
        var buffer: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(image.size.width), Int(image.size.height), kCVPixelFormatType_32ARGB, attrs as CFDictionary, &buffer)
        guard status == kCVReturnSuccess, let buffer = buffer else {
            return nil
        }
        CVPixelBufferLockBaseAddress(buffer, CVPixelBufferLockFlags(rawValue: 0))
        let pixelData = CVPixelBufferGetBaseAddress(buffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let context = CGContext(data: pixelData, width: Int(image.size.width), height: Int(image.size.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(buffer), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        context?.translateBy(x: 0, y: image.size.height)
        context?.scaleBy(x: 1.0, y: -1.0)
        UIGraphicsPushContext(context!)
        image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
        UIGraphicsPopContext()
        CVPixelBufferUnlockBaseAddress(buffer, CVPixelBufferLockFlags(rawValue: 0))
        return buffer
    }
}
extension UIImage {
    func rotate(radians: CGFloat) -> UIImage {
        let rotatedSize = CGRect(origin: .zero, size: size)
            .applying(CGAffineTransform(rotationAngle: radians))
            .integral.size
        UIGraphicsBeginImageContext(rotatedSize)
        if let context = UIGraphicsGetCurrentContext() {
            let origin = CGPoint(x: rotatedSize.width / 2.0,
                                 y: rotatedSize.height / 2.0)
            context.translateBy(x: origin.x, y: origin.y)
            context.rotate(by: radians)
            // Draw centered on the rotated context. Note: the original code
            // swapped x and y here (-origin.y, -origin.x), which misplaces
            // the image for any non-square size.
            draw(in: CGRect(x: -size.width / 2.0, y: -size.height / 2.0,
                            width: size.width, height: size.height))
            let rotatedImage = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            return rotatedImage ?? self
        }
        return self
    }
}
If you have any experience with this, please teach me what the problem is.
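One likely source of the wrong logo placement is coordinate systems rather than rotation: Vision reports `boundingBox` in a normalized space with a bottom-left origin, while UIKit drawing uses a top-left origin. The `boxFrame` computation above multiplies `box.origin.y` by the image height directly, which mirrors the overlay vertically. A sketch of the conversion (the function name is mine):

```swift
import CoreGraphics

// Convert a Vision-normalized, bottom-left-origin bounding box into a
// UIKit-style top-left-origin rect in pixel coordinates.
func uikitRect(fromVisionBox box: CGRect, imageSize: CGSize) -> CGRect {
    CGRect(x: box.origin.x * imageSize.width,
           // Flip the y axis: Vision's origin is the bottom-left corner.
           y: (1 - box.origin.y - box.height) * imageSize.height,
           width: box.width * imageSize.width,
           height: box.height * imageSize.height)
}
```

Using a rect converted this way for `drawImageIn(_:_:inRect:)` should place the logo where the detected face actually is.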
Build failed with this error:
Undefined symbol: _OBJC_CLASS_$_RTCMTLVideoView
Can anyone help me?
Hi, I get an echo problem when setSpeakerOn() is used. Did you test this case?
let mandatoryConstraints =
    ["googEchoCancellation": "false",
     "echoCancellation": "false",
     "googNoiseSuppression": "false",
     "googCpuOveruseDetection": "false"]
// Define media constraints. DtlsSrtpKeyAgreement is required to be true to be able to connect with web browsers.
let constraints = RTCMediaConstraints(mandatoryConstraints: mandatoryConstraints,
                                      optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])
Can you please help me understand the core logic of your app? Whenever a new user joins the same room, do you create a new peer connection, with the local track as your own track and the remote description taken from the new user? Is that because of multiple users?
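For context, the usual Janus videoroom topology is one publisher peer connection that sends the local track, plus one receive-only subscriber peer connection per remote feed. A hedged sketch of that bookkeeping (all names here are assumptions, not this project's API):

```swift
import WebRTC

// Hypothetical sketch: one PC publishes; each remote publisher gets its own
// subscriber PC, keyed by the Janus feed id from the "publishers" list.
final class RoomConnections {
    var publisher: RTCPeerConnection?                  // sends local audio/video
    var subscribers: [Int64: RTCPeerConnection] = [:]  // one per remote feed

    func subscriberJoined(feedId: Int64,
                          factory: RTCPeerConnectionFactory,
                          config: RTCConfiguration,
                          delegate: RTCPeerConnectionDelegate) {
        let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                              optionalConstraints: nil)
        // A fresh, receive-only connection per feed; its remote description
        // comes from the JSEP offer Janus sends when attaching to that feed.
        subscribers[feedId] = factory.peerConnection(with: config,
                                                     constraints: constraints,
                                                     delegate: delegate)
    }
}
```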
Hi, I read the documentation about Picture in Picture for video calls, but I couldn't get it working on my own; it feels really hard to me. So I want to ask: do you plan to add PiP mode for video calls?
https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_for_video_calls?changes=_1
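The linked article's approach (iOS 15+) centers on AVPictureInPictureVideoCallViewController. A minimal sketch of the setup it describes; `remoteVideoView` and `sourceView` are assumptions standing in for this project's views:

```swift
import AVKit

// Host the remote video in a dedicated PiP view controller.
let callViewController = AVPictureInPictureVideoCallViewController()
callViewController.preferredContentSize = CGSize(width: 120, height: 90)
callViewController.view.addSubview(remoteVideoView)

// Drive PiP from the in-call video container view.
let contentSource = AVPictureInPictureController.ContentSource(
    activeVideoCallSourceView: sourceView,
    contentViewController: callViewController)
let pipController = AVPictureInPictureController(contentSource: contentSource)
pipController.canStartPictureInPictureAutomaticallyFromInline = true
```

The app also needs the "voip" background mode and an active AVAudioSession for PiP to start automatically when it moves to the background.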