Problem Description
Is it possible to render two video streams simultaneously, using different "sections" of a single SurfaceView?
I have made sample code that renders two videos simultaneously using two SurfaceViews side-by-side, but I am wondering if it is possible to have both videos play on the same SurfaceView.
Using a MediaPlayer, you can set either the SurfaceHolder or the Surface itself as the display. I believe the code to achieve what I am asking would go inside the surfaceCreated method:
@Override
public void surfaceCreated(SurfaceHolder holder)
{
    // Point both players at the same Surface -- this is what fails in practice.
    mediaPlayerTop.setDisplay(holder);
    mediaPlayerBottom.setDisplay(holder);
    play();
}
However, simply setting both MediaPlayers to the same Surface results in an IllegalStateException when you try to prepare the second MediaPlayer (this is ignoring the fact that they'd probably overlap each other anyway, since I am not setting the position anywhere).
Basically, is what I am trying to achieve possible?
Recommended Answer
Yes, but it takes some effort.
The basic plan is to direct the output of each MediaPlayer to a SurfaceTexture, which converts each incoming frame to a GLES texture. You then render that to the SurfaceView, drawing a rect that fills half the view. You do the same thing for the other MediaPlayer.
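To make that plan concrete, here is a rough Java sketch (not code from Grafika) of wiring each MediaPlayer to its own SurfaceTexture and drawing each stream into half of the view. It assumes an EGL context is already current on the SurfaceView's surface on the render thread; createExternalOesTexture(), drawTexturedRect(), requestRender(), viewWidth and viewHeight are hypothetical placeholders for the GL_TEXTURE_EXTERNAL_OES texture/shader setup and render-loop plumbing.

// Hypothetical helpers: createExternalOesTexture() generates a GL_TEXTURE_EXTERNAL_OES
// texture name; drawTexturedRect() draws a quad filling the current viewport with it.
int topTexId = createExternalOesTexture();
int bottomTexId = createExternalOesTexture();

SurfaceTexture topTexture = new SurfaceTexture(topTexId);
SurfaceTexture bottomTexture = new SurfaceTexture(bottomTexId);
topTexture.setOnFrameAvailableListener(st -> requestRender());
bottomTexture.setOnFrameAvailableListener(st -> requestRender());

// Each player produces into its own SurfaceTexture instead of the SurfaceView.
mediaPlayerTop.setSurface(new Surface(topTexture));
mediaPlayerBottom.setSurface(new Surface(bottomTexture));

// On the GL thread, once per redraw:
topTexture.updateTexImage();                                    // latch the newest top frame
GLES20.glViewport(0, viewHeight / 2, viewWidth, viewHeight / 2);
drawTexturedRect(topTexId);                                     // top half of the SurfaceView

bottomTexture.updateTexImage();                                 // latch the newest bottom frame
GLES20.glViewport(0, 0, viewWidth, viewHeight / 2);
drawTexturedRect(bottomTexId);                                  // bottom half
// ...then swapping the EGL buffers pushes the composed frame to the SurfaceView.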
The pieces you need can be found in Grafika; for example, the "texture from camera" Activity takes a video stream from the camera preview, converts it to a GLES texture, and renders it onto a SurfaceView.
Simply directing the output of two MediaPlayers to separate SurfaceViews is much easier, but less flexible.
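For the simpler route, a minimal sketch might look like the following, assuming surfaceViewTop and surfaceViewBottom are two SurfaceViews already laid out in the XML and their surfaces have been created:

// Each MediaPlayer gets its own SurfaceView, so each Surface has exactly one producer.
mediaPlayerTop.setDisplay(surfaceViewTop.getHolder());
mediaPlayerBottom.setDisplay(surfaceViewBottom.getHolder());
mediaPlayerTop.setOnPreparedListener(MediaPlayer::start);
mediaPlayerBottom.setOnPreparedListener(MediaPlayer::start);
mediaPlayerTop.prepareAsync();
mediaPlayerBottom.prepareAsync();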
Surfaces are endpoints in a producer-consumer pair. There can only be one producer at a time, so you can't simply direct two MediaPlayers at a single SurfaceView.