I can't get WebRTC to render remote video. For context, I'm using Janus's Streaming plugin.
Here is what I have so far. Whenever peerConnection(_ peerConnection:, didAdd stream:) is called on my RTCPeerConnectionDelegate, I create a remote renderer and add it to the first videoTrack of the stream the delegate hands me, like this:
#if arch(arm64)
    // Metal-backed renderer on arm64 devices
    let remoteRenderer = RTCMTLVideoView(frame: self.view.frame)
    remoteRenderer.videoContentMode = .scaleAspectFill
#else
    // OpenGL ES fallback on non-arm64 targets
    let remoteRenderer = RTCEAGLVideoView(frame: self.view.frame)
#endif

stream.videoTracks.first?.add(remoteRenderer)
self.view.addSubview(remoteRenderer)
But the video never shows up; all I get is a black screen. My delegate also gets peerConnection(_ peerConnection:, didChange newState:) called with a newState of RTCIceConnectionState.connected, which makes me think the connection itself is fine.
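For reference, that callback is just the standard RTCPeerConnectionDelegate ICE-state method; a minimal sketch of it (the logging is only illustrative):

func peerConnection(_ peerConnection: RTCPeerConnection, didChange newState: RTCIceConnectionState) {
    // Fires with .connected once ICE negotiation has succeeded,
    // which is why I assume the transport itself is healthy.
    print("ICE connection state changed: \(newState.rawValue)")
}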
Best answer
Try attaching the renderer when the "didStartReceivingOn transceiver" event fires:
func peerConnection(_ peerConnection: RTCPeerConnection, didStartReceivingOn transceiver: RTCRtpTransceiver) {
    switch transceiver.mediaType {
    case .video:
        // Grab the incoming video track and attach the renderer on the main queue.
        DispatchQueue.main.async { [weak self] in
            self?.remoteVideoTrack = transceiver.receiver.track as? RTCVideoTrack
            if let renderer = self?.delegate?.viewForRemoteVideoTrack() {
                self?.remoteVideoTrack?.add(renderer)
            }
        }
    default:
        break
    }
}
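For completeness, here is a minimal sketch of the other side of that call. The `delegate` and `viewForRemoteVideoTrack()` hook are not part of the WebRTC SDK, so the protocol and view controller below are only an assumption about how they might be wired up, reusing the RTCMTLVideoView setup from the question:

import UIKit
import WebRTC

// Hypothetical delegate protocol exposing the hook used above.
protocol WebRTCClientDelegate: AnyObject {
    // Returns the view the remote track should render into, or nil if none is ready yet.
    func viewForRemoteVideoTrack() -> RTCVideoRenderer?
}

final class StreamViewController: UIViewController, WebRTCClientDelegate {
    // Keep a strong reference so the renderer stays alive while frames arrive.
    private lazy var remoteVideoView: RTCMTLVideoView = {
        let view = RTCMTLVideoView(frame: self.view.bounds)
        view.videoContentMode = .scaleAspectFill
        return view
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(remoteVideoView)
    }

    func viewForRemoteVideoTrack() -> RTCVideoRenderer? {
        return remoteVideoView
    }
}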