Problem description
I am trying to build a service that streams a user's screen from the browser to viewers (something like Twitch).
What I have accomplished so far: I have built a working nginx server with the RTMP module and tested it with OBS. That works pretty well.
My question is: how do I stream the screen from a browser (rather than from OBS or another broadcaster) over WebRTC to the nginx server with RTMP?
Recommended answer
For RTSP <-> WebRTC and RTMP <-> WebRTC conversions, you need to run some kind of WebRTC gateway / media server software that understands all of these formats/protocols and can transmux between them. Try Wowza, Unreal Media Server, or Flashphoner: https://en.wikipedia.org/wiki/Comparison_of_streaming_media_systems
So in your case you want to publish the screen from the browser to the media server via WebRTC (the H264 codec is a must), and then pull the RTMP stream from the media server to your nginx server with the nginx-rtmp module.
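As an illustration, here is a minimal TypeScript sketch of the browser side. It captures the screen with getDisplayMedia, prefers H264 so the media server can remux video to RTMP without transcoding, and exchanges SDP with a hypothetical /publish signaling endpoint; the actual signaling depends entirely on which media server you pick (Wowza, Unreal Media Server, and Flashphoner each have their own APIs):

```typescript
// Minimal browser-side publish sketch. The signaling endpoint is a placeholder;
// replace it with whatever your media server actually expects.
async function publishScreen(signalingUrl: string): Promise<RTCPeerConnection> {
  // Capture the screen (and, where supported, tab/system audio) from the browser.
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true,
    audio: true,
  });

  const pc = new RTCPeerConnection();
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }

  // Prefer H264 so the media server can repackage video to RTMP without
  // transcoding (works in browsers that implement setCodecPreferences).
  const videoTransceiver = pc
    .getTransceivers()
    .find((t) => t.sender.track?.kind === "video");
  if (videoTransceiver && "setCodecPreferences" in videoTransceiver) {
    const codecs = RTCRtpSender.getCapabilities("video")?.codecs ?? [];
    const h264First = [
      ...codecs.filter((c) => c.mimeType === "video/H264"),
      ...codecs.filter((c) => c.mimeType !== "video/H264"),
    ];
    videoTransceiver.setCodecPreferences(h264First);
  }

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Hypothetical signaling exchange: POST the offer, receive the answer.
  const response = await fetch(signalingUrl, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp ?? "",
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });

  return pc;
}
```

In a real deployment you would also handle ICE candidate exchange and reconnects; the point here is only the capture, the H264 preference, and the offer/answer flow.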
Note that the opposite is possible too: you could push a stream to the media server via RTMP (for example, an OBS screen capture) and then send that stream from the media server to web browsers via WebRTC.
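For completeness, the playback direction in the browser looks roughly like this, again assuming a hypothetical /play endpoint on the media server that answers an SDP offer for an existing stream:

```typescript
// Minimal browser-side playback sketch: receive audio/video over WebRTC
// and attach it to a <video> element.
async function playStream(signalingUrl: string, video: HTMLVideoElement) {
  const pc = new RTCPeerConnection();

  // Ask to receive one video and one audio track from the server.
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Attach incoming media to the <video> element as it arrives.
  pc.ontrack = (event) => {
    video.srcObject = event.streams[0];
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Hypothetical signaling exchange, as in the publish sketch above.
  const response = await fetch(signalingUrl, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp ?? "",
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await response.text() });
}
```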
The main issue in these conversions is codec compatibility: H264 must be used for video, but if you need audio you will have to transcode Opus to AAC.
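You can see why the audio transcoding is unavoidable by listing what the browser is able to send; AAC is never among the WebRTC audio codecs:

```typescript
// Browsers offer Opus (plus G.711/G.722 variants) for WebRTC audio, never AAC,
// so the media server has to transcode audio before repackaging to RTMP.
const audioCodecs = RTCRtpSender.getCapabilities("audio")?.codecs ?? [];
console.log(audioCodecs.map((c) => c.mimeType));
// Typically something like: ["audio/opus", "audio/G722", "audio/PCMU", "audio/PCMA", ...]
```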